The following components are needed to use the adapter for Hadoop/Hive/Impala:
The location of the JDK must be specified in an environment variable.
If you are using Linux, add a line to your profile with the location where Java is installed. For example:
export JAVA_HOME=/usr/java/jdk1.7.0_07
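A minimal sketch of setting and verifying the variable in a profile (the JDK path is the example above; substitute the directory where your JDK is actually installed):

```shell
# Sketch: setting and verifying JAVA_HOME in a Linux profile.
# The JDK path is illustrative; use your actual install directory.
export JAVA_HOME=/usr/java/jdk1.7.0_07
export PATH="$JAVA_HOME/bin:$PATH"

# Print the value to confirm it is set for the current session
echo "JAVA_HOME=$JAVA_HOME"
```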
If you are using Windows, right-click Computer and select Properties. Then select Advanced System Settings and click Environment Variables. Add the locations to your PATH variable. For example:
C:\Program Files\Java\jdk7\bin\server;C:\Program Files\Java\jdk7\bin;
Hive. The JDBC driver requires several jar files, which are included in a Hadoop distribution. If you are installing the server on the same system where Hive is installed, you can point to the jar files as described in the next section. If you are installing the server on some other system, copy those files to a location of your choice.
Impala. The JDBC driver can be downloaded from Cloudera. Instructions for configuring Impala to work with JDBC are available on the Cloudera.com website, in the Configuring Impala to Work with JDBC section. The driver itself can be downloaded from https://downloads.cloudera.com/impala-jdbc/impala-jdbc-0.5-2.zip. This driver can also be used with Hive 0.10 and 0.11.
The location of the Hadoop and Hive jar files must be specified to the server. If the server runs on the same system as the Hadoop and Hive servers, you can point to the jar files where they are installed. If the server runs on another system, copy the files listed below to a location on that system and point to them there.
This can be done in the system CLASSPATH, or in the DataMigrator or WebFOCUS Reporting Server IBI_CLASSPATH variable, as follows:
From the Data Management Console, expand the Workspace folder and select Java Services. The Java Services Configuration page opens.
In the IBI_CLASSPATH box, enter the full path of each of the Hive and Hadoop jar files shown below, where hive_home is the directory where Hive is installed and hadoop_home is the directory where Hadoop is installed. You must type the paths explicitly; environment variables such as $HIVE_HOME cannot be used. Enter the file names one per line.
If you are installing the adapter on a different system than where Hadoop and Hive are installed, copy the jar files to a location on that system.
Note: For a server running on Windows, use Windows syntax for directory names. For example:
C:\jdbc\hive-jdbc-0.10.0.jar
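If you use the system CLASSPATH alternative instead of IBI_CLASSPATH, a minimal Linux sketch follows. The install paths (HIVE_LIB, HADOOP_HOME) and version numbers are illustrative; use the full set of jar files listed for your Hive release.

```shell
# Sketch: adding Hive and Hadoop jar files to the system CLASSPATH on
# Linux. Paths and versions are illustrative; match your installation.
HIVE_LIB=/usr/local/hive/lib
HADOOP_HOME=/usr/local/hadoop

CLASSPATH="$HIVE_LIB/hive-jdbc-0.12.0.jar"
CLASSPATH="$CLASSPATH:$HIVE_LIB/hive-exec-0.12.0.jar"
CLASSPATH="$CLASSPATH:$HIVE_LIB/hive-service-0.12.0.jar"
CLASSPATH="$CLASSPATH:$HADOOP_HOME/hadoop-core-1.1.2.jar"
export CLASSPATH

echo "$CLASSPATH"
```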
For Hive 0.12:
Use the version numbers (such as 0.12.0) that match your installation. Use this driver to connect to a Hive 0.12 server only.
/hive_home/lib/commons-logging-1.1.1.jar
/hive_home/lib/hive-exec-0.12.0.jar
/hive_home/lib/hive-jdbc-0.12.0.jar
/hive_home/lib/hive-metastore-0.12.0.jar
/hive_home/lib/hive-service-0.12.0.jar
/hive_home/lib/httpclient-4.2.5.jar
/hive_home/lib/httpcore-4.2.4.jar
/hive_home/lib/libfb303-0.9.0.jar
/hive_home/lib/log4j-1.2.16.jar
/hive_home/lib/slf4j-api-1.6.1.jar
/hive_home/lib/slf4j-log4j12-1.6.1.jar
/hadoop_home/hadoop-core-1.1.2.jar
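Before entering the paths, you can check that each jar file actually exists on the server. A minimal sketch, where /hive_home and /hadoop_home are placeholders for the actual Hive and Hadoop install directories:

```shell
# Sketch: verify that each Hive 0.12 jar file exists before entering it
# in IBI_CLASSPATH. /hive_home and /hadoop_home are placeholders for
# the actual Hive and Hadoop install directories.
checked=0
for jar in \
    /hive_home/lib/commons-logging-1.1.1.jar \
    /hive_home/lib/hive-exec-0.12.0.jar \
    /hive_home/lib/hive-jdbc-0.12.0.jar \
    /hive_home/lib/hive-metastore-0.12.0.jar \
    /hive_home/lib/hive-service-0.12.0.jar \
    /hive_home/lib/httpclient-4.2.5.jar \
    /hive_home/lib/httpcore-4.2.4.jar \
    /hive_home/lib/libfb303-0.9.0.jar \
    /hive_home/lib/log4j-1.2.16.jar \
    /hive_home/lib/slf4j-api-1.6.1.jar \
    /hive_home/lib/slf4j-log4j12-1.6.1.jar \
    /hadoop_home/hadoop-core-1.1.2.jar
do
    checked=$((checked + 1))
    # Report any file that is not present at the expected location
    [ -f "$jar" ] || echo "missing: $jar"
done
echo "checked $checked jar files"
```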
For Hive 0.10 or 0.11:
Use the version numbers (such as 0.10.0) that match your installation.
/hive_home/lib/hive-exec-0.10.0.jar
/hive_home/lib/hive-metastore-0.10.0.jar
/hive_home/lib/hive-jdbc-0.10.0.jar
/hive_home/lib/slf4j-log4j12-1.6.1.jar
/hive_home/lib/libfb303.jar
/hive_home/lib/hive-service-0.10.0.jar
/hive_home/lib/slf4j-api-1.6.1.jar
/hadoop_home/hadoop-core-1.0.3.jar
For Impala:
Enter the location where you installed the JDBC driver, followed by the name of each jar file it includes, one per line. These are the file names from impala-jdbc-0.5-2.
/jdbc/commons-logging-1.0.4.jar
/jdbc/hive-jdbc-0.10.0-cdh4.2.0.jar
/jdbc/hive-metastore-0.10.0-cdh4.2.0.jar
/jdbc/hive-service-0.10.0-cdh4.2.0.jar
/jdbc/libfb303-0.9.0.jar
/jdbc/libthrift-0.9.0.jar
/jdbc/log4j-1.2.16.jar
/jdbc/slf4j-api-1.6.4.jar
/jdbc/slf4j-log4j12-1.6.1.jar
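For example, on a server running on Windows, if the driver files were unzipped to C:\jdbc (an illustrative location), the IBI_CLASSPATH entries would be typed one per line as:

```
C:\jdbc\commons-logging-1.0.4.jar
C:\jdbc\hive-jdbc-0.10.0-cdh4.2.0.jar
C:\jdbc\hive-metastore-0.10.0-cdh4.2.0.jar
C:\jdbc\hive-service-0.10.0-cdh4.2.0.jar
C:\jdbc\libfb303-0.9.0.jar
C:\jdbc\libthrift-0.9.0.jar
C:\jdbc\log4j-1.2.16.jar
C:\jdbc\slf4j-api-1.6.4.jar
C:\jdbc\slf4j-log4j12-1.6.1.jar
```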