SQL Workbench connection to Hadoop
Category : Hive
Many times we do not want to run our Hive queries through Beeline or the Hive CLI, for various reasons. I am not going to discuss those reasons here, as that is a big debatable point; instead, in this article I explain the steps to connect SQL Workbench to our Hadoop cluster. I have used the Hortonworks HDP 2.3 stack.
Step 1: Download SQL Workbench from the following URL and run it.
http://www.sql-workbench.net/downloads.html
Step 2: Download the following driver jars from your Hadoop client node to your local machine.
FOR HDP 2.2.x
From /usr/hdp/current/hive-client/lib/ sftp or scp to your local desktop
- hive-*-0.14*.jar
- httpclient-4.2.5.jar
- httpcore-4.2.5.jar
- libthrift-0.9.0.jar
- libfb303-0.9.0.jar
- commons-logging-1.1.3.jar
FOR HDP 2.3.x
From /usr/hdp/current/hive-client/lib/ sftp or scp to your local desktop
- hive-jdbc.jar (this may be a symlink, so download the actual jar it points to)
FOR BOTH
From /usr/hdp/current/hadoop-client
- hadoop-common.jar (this may be a symlink, so download the actual jar it points to)
- hadoop-auth.jar
From /usr/hdp/current/hadoop-client/lib
- log4j-1.2.17.jar
- slf4j-*.jar
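Since the /usr/hdp/current paths are themselves symlinks, it can help to resolve each jar to its real file on the cluster node before copying it down. A minimal sketch, assuming a Linux client node; the edge-node hostname and the versioned path in the scp comment are placeholders for your cluster:

```shell
#!/bin/sh
# Sketch: resolve each symlinked jar to the real file it points to,
# so you copy the actual jar rather than a symlink.
# Paths are the HDP defaults from the steps above.
resolve_jar() {
  # readlink -f follows the whole symlink chain to the real file
  readlink -f "$1" 2>/dev/null || echo "$1 (not found on this machine)"
}

resolve_jar /usr/hdp/current/hive-client/lib/hive-jdbc.jar
resolve_jar /usr/hdp/current/hadoop-client/hadoop-common.jar
resolve_jar /usr/hdp/current/hadoop-client/hadoop-auth.jar

# Then, from your desktop, copy the resolved paths down, e.g.:
#   scp user@edge-node:/usr/hdp/<version>/hive/lib/hive-jdbc-*.jar .
```

Run this on the Hadoop client node; it prints the real jar locations to pass to scp or sftp.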
Step 3: Go to the File menu and open Manage Drivers to configure your driver:
Step 4: Create a new driver with any name (like Hadoop or HiveServer2), upload all the required .jar files, and then click OK.
Step 5: Now go to the File menu, click on "Connection Window", select the driver you created, and pass the required parameters (JDBC URL, username, password, etc.):
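For reference, the JDBC URL for a plain (non-Kerberos) HiveServer2 endpoint has the shape sketched below; the hostname is a placeholder for your server, and 10000 is the default HiveServer2 port:

```shell
#!/bin/sh
# Sketch of the JDBC URL the Connection Window expects.
# "hiveserver2-host" is a placeholder hostname; 10000 is the default port.
HS2_HOST="hiveserver2-host"
HS2_PORT=10000
DB="default"
URL="jdbc:hive2://${HS2_HOST}:${HS2_PORT}/${DB}"
echo "$URL"   # prints jdbc:hive2://hiveserver2-host:10000/default

# The same URL can be sanity-checked from a cluster node with beeline:
#   beeline -u "$URL" -n <username> -p <password>
```

If the Test button fails in SQL Workbench, trying the same URL with beeline on the cluster helps separate driver problems from connectivity problems.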
Step 6: Now click the Test button to test the connection, then click OK to save your configuration.
Now you can run your commands and queries through SQL Workbench. I hope this helps you connect to HDP via SQL Workbench.
1 Comment
Atilla
September 30, 2019 at 5:43 am
Hi,
when I start Workbench, I see this connection failure message on the monitor:
“org/apache/thrift/protocol/TProtocol”
Could you please help me.
regards.