Data Services

Hive JDBC and ODBC Drivers

Hortonworks provides Hive JDBC and ODBC drivers that connect popular Business Intelligence (BI) tools to the Hortonworks Data Platform, so you can query, analyze, and visualize the data stored there.

Hive ODBC drivers for Linux and Mac operating systems are installed when you install Hive. For information about using the Hive ODBC drivers, see the Hortonworks Addons page.

After you install HDP, the current Hive JDBC client jar can be found on one of the edge nodes in your cluster at /usr/hdp/current/hive-client/lib/hive-jdbc.jar, or you can download it from the Hive JDBC driver archive.
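
For quick reference, the following minimal Java sketch opens a connection through the Hive JDBC driver; the host name, port, database, and credentials are placeholder values, and the driver jar described above must be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // Register the Hive JDBC driver. Recent driver versions register
        // themselves automatically, but the explicit call does no harm.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Placeholder HiveServer2 endpoint and database; substitute your own values.
        String url = "jdbc:hive2://hs2.example.com:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}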

JDBC URLs have the following format:

jdbc:hive2://<host>:<port>/<dbName>;<sessionConfs>?<hiveConfs>#<hiveVars> 
JDBC Connection Parameters

  • host: The cluster node hosting HiveServer2.

  • port: The port number on which HiveServer2 listens.

  • dbName: The name of the Hive database to run the query against.

  • sessionConfs: Optional configuration parameters for the JDBC/ODBC driver, in the format <key1>=<value1>;<key2>=<value2>;...

  • hiveConfs: Optional configuration parameters for Hive on the server, in the format <key1>=<value1>;<key2>=<value2>;... These settings last for the duration of the user session.

  • hiveVars: Optional configuration parameters for Hive variables, in the format <key1>=<value1>;<key2>=<value2>;... These settings last for the duration of the user session.
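
For example, a single URL that sets a session configuration, a server-side Hive configuration, and a Hive variable might look like the following; the host name and the specific values are hypothetical.

jdbc:hive2://hs2.example.com:10000/default;transportMode=binary?hive.exec.parallel=true#sourceTable=sales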

The specific JDBC connection URL for a HiveServer2 client depends on several factors:

  • How is HiveServer2 deployed on the cluster?

  • What type of transport does HiveServer2 use?

  • Does HiveServer2 use transport-layer security?

  • Is HiveServer2 configured to authenticate users?

The rest of this section describes how to use session configuration variables to format the JDBC connection URLs for all these scenarios.

Note

Some HiveServer2 clients may need to run on a host outside of the Hadoop cluster. These clients require access to the following .jar files to successfully use the Hive JDBC driver in both HTTP and HTTPS modes: hive-jdbc-<version>-standalone.jar, hadoop-common.jar, and hadoop-auth.jar.

Embedded and Remote Modes

In embedded mode, HiveServer2 runs within the Hive client rather than in a separate process. No host or port number is necessary for the JDBC connection. In remote mode, HiveServer2 runs as a separate daemon on a specified host and port, and the JDBC client and HiveServer2 interact using remote procedure calls with the Thrift protocol.

Embedded Mode

jdbc:hive2:// 
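
A minimal Java sketch for embedded mode, reusing the java.sql imports from the earlier example and assuming the full Hive libraries are on the classpath, needs only the bare URL because HiveServer2 runs inside the client JVM:

// Embedded mode: no host or port; HiveServer2 starts inside the client process.
Connection conn = DriverManager.getConnection("jdbc:hive2://");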

Remote Mode

jdbc:hive2://<host>:<port>/<dbName>;<sessionConfs>?<hiveConfs>#<hiveVars> 
Note

The rest of the example JDBC connection URLs in this topic are valid only for HiveServer2 configured in remote mode.

TCP and HTTP Transport

The JDBC client and HiveServer2 can use either HTTP or TCP-based transport to exchange RPC messages. Specify the transport used by HiveServer2 with the transportMode and httpPath session configuration variables. The default transport is TCP.

transportMode Variable Values

  • http: Connect to HiveServer2 using HTTP transport.

  • binary: Connect to HiveServer2 using TCP transport.

HTTP Transport

jdbc:hive2://<host>:<port>/<dbName>;transportMode=http;httpPath=<http_endpoint>;<otherSessionConfs>?<hiveConfs>#<hiveVars> 
Note

The JDBC driver assumes a value of cliservice if the httpPath configuration variable is not specified.
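
For example, assuming HiveServer2 accepts HTTP connections on port 10001 at the default cliservice endpoint (the host name is hypothetical):

jdbc:hive2://hs2.example.com:10001/default;transportMode=http;httpPath=cliservice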

TCP Transport

jdbc:hive2://<host>:<port>/<dbName>;<otherSessionConfs>?<hiveConfs>#<hiveVars> 

Because the default transport is TCP, there is no need to specify transportMode=binary if TCP transport is desired.
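
For example, a plain TCP connection to a HiveServer2 instance listening on the default port of 10000 might look like this (the host name is hypothetical):

jdbc:hive2://hs2.example.com:10000/default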

User Authentication

HiveServer2 supports Kerberos, LDAP, Pluggable Authentication Modules (PAM), and custom plugins for authenticating the JDBC user connecting to HiveServer2. The format of the JDBC connection URL for authentication with Kerberos differs from the format for other authentication models.

User Authentication Variables

  • principal: A string that uniquely identifies a Kerberos user.

  • saslQop: Quality of protection for the SASL framework. The level of quality is negotiated between the client and server during authentication. Used by Kerberos authentication with TCP transport.

  • user: Username for a non-Kerberos authentication model.

  • password: Password for a non-Kerberos authentication model.

Kerberos Authentication

jdbc:hive2://<host>:<port>/<dbName>;principal=<HiveServer2_kerberos_principal>;<otherSessionConfs>?<hiveConfs>#<hiveVars> 

Kerberos Authentication with Sasl QOP

jdbc:hive2://<host>:<port>/<dbName>;principal=<HiveServer2_kerberos_principal>;saslQop=<qop_value>;<otherSessionConfs>?<hiveConfs>#<hiveVars> 
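
For example, with a hypothetical HiveServer2 service principal and authentication-only protection:

jdbc:hive2://hs2.example.com:10000/default;principal=hive/hs2.example.com@EXAMPLE.COM;saslQop=auth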

Non-Kerberos Authentication

jdbc:hive2://<host>:<port>/<dbName>;user=<username>;password=<password>;<otherSessionConfs>?<hiveConfs>#<hiveVars> 
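
In Java, the credentials can also be passed through the standard DriverManager call rather than embedded in the URL; this sketch assumes the same placeholder endpoint and hypothetical credentials as the earlier example:

// Username and password supplied as arguments instead of URL session variables.
Connection conn = DriverManager.getConnection(
    "jdbc:hive2://hs2.example.com:10000/default", "analyst", "analystPassword");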

Transport Layer Security

HiveServer2 supports SSL and Sasl QOP for transport-layer security. The format of the JDBC connection URL for SSL differs from the format used by Sasl QOP.

SSL Variables

  • ssl: Specifies whether to use SSL.

  • sslTrustStore: The path to the SSL TrustStore.

  • trustStorePassword: The password for the SSL TrustStore.

jdbc:hive2://<host>:<port>/<dbName>;ssl=true;sslTrustStore=<ssl_truststore_path>;trustStorePassword=<truststore_password>;<otherSessionConfs>?<hiveConfs>#<hiveVars> 
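
For example, with a hypothetical host name, truststore path, and password:

jdbc:hive2://hs2.example.com:10000/default;ssl=true;sslTrustStore=/etc/security/ssl/hive-truststore.jks;trustStorePassword=changeit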

When using TCP for transport and Kerberos for security, HiveServer2 uses Sasl QOP for encryption rather than SSL.

Sasl QOP Variables

  • principal: A string that uniquely identifies a Kerberos user.

  • saslQop: The level of protection desired. For authentication, checksum, and encryption, specify auth-conf. The other valid values do not provide encryption.

jdbc:hive2://<host>:<port>/<dbName>;principal=<HiveServer2_kerberos_principal>;saslQop=auth-conf;<otherSessionConfs>?<hiveConfs>#<hiveVars>
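
For example, using a hypothetical host and service principal and requesting authentication, checksum, and encryption:

jdbc:hive2://hs2.example.com:10000/default;principal=hive/hs2.example.com@EXAMPLE.COM;saslQop=auth-conf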