Hive: JDBC Example
1) The local file /home/hadoop/test/test4.txt has the following contents (the fields in each line are separated by a tab):
[hadoop@master test]$ sudo vim test4.txt
1 dajiangtai
2 hadoop
3 hive
4 hbase
5 spark
2) Start HiveServer2
[hadoop@master test]$ cd ${HIVE_HOME}/bin
[hadoop@master bin]$ ll
total 32
-rwxr-xr-x 1 hadoop hadoop  881 Jan 30 2015 beeline
drwxr-xr-x 3 hadoop hadoop 4096 May 14 23:28 ext
-rwxr-xr-x 1 hadoop hadoop 7311 Jan 30 2015 hive
-rwxr-xr-x 1 hadoop hadoop 1900 Jan 30 2015 hive-config.sh
-rwxr-xr-x 1 hadoop hadoop  885 Jan 30 2015 hiveserver2
-rwxr-xr-x 1 hadoop hadoop  832 Jan 30 2015 metatool
-rwxr-xr-x 1 hadoop hadoop  884 Jan 30 2015 schematool
[hadoop@master bin]$ ./hiveserver2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hive1.0.0/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
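Before running the full example, it can help to confirm that HiveServer2 is actually reachable on master:10000. The following minimal connectivity check is only a sketch (the class name HiveConnectionCheck is mine, and it assumes the hive-jdbc standalone jar and the Hadoop common jars are on the classpath); it reuses the same URL and user as the program below:

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver, then open and immediately close a connection
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://master:10000/default", "hadoop", "")) {
            System.out.println("Connected to HiveServer2: " + !conn.isClosed());
        }
    }
}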
3) Program code
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Hive {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver"; // Hive JDBC driver class
    private static String url = "jdbc:hive2://master:10000/default"; // HiveServer2 connection URL (HiveServer2 is available from Hive 0.11.0 onwards)
    private static String user = "hadoop"; // a user with write access on HDFS
    private static String password = ""; // in non-secure mode the password is ignored; only the user is used to run the query
    private static String sql = "";
    private static ResultSet res;

    public static void main(String[] args) {
        try {
            Class.forName(driverName); // load the HiveServer2 JDBC driver
            Connection conn = DriverManager.getConnection(url, user, password); // connect to the database given by the URL
            Statement stmt = conn.createStatement();
            // name of the table to create
            String tableName = "testHiveDriverTable";

            /** Step 1: drop the table if it already exists **/
            sql = "drop table if exists " + tableName;
            stmt.execute(sql);

            /** Step 2: create the table **/
            sql = "create table " + tableName + " (key int, value string) row format delimited fields terminated by '\t' STORED AS TEXTFILE";
            stmt.execute(sql);

            // run "show tables"
            sql = "show tables '" + tableName + "'";
            res = stmt.executeQuery(sql);
            if (res.next()) {
                System.out.println(res.getString(1));
            }

            // run "describe table"
            sql = "describe " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1) + "\t" + res.getString(2));
            }

            // run "load data into table"
            String filepath = "/home/hadoop/test/test4.txt"; // local path on the node where the Hive service runs
            sql = "load data local inpath '" + filepath + "' into table " + tableName;
            stmt.execute(sql);

            // run "select * query"
            sql = "select * from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }

            // run a regular Hive query; this one is compiled into a MapReduce job
            sql = "select count(*) from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1));
            }

            conn.close();
            conn = null;
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(1);
        }
    }
}
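The program above leaves the Statement and ResultSet open if an exception is thrown partway through. A slightly tidier variant of the same flow, shown only as a sketch (the class name HiveJdbcTidy is mine; driver, URL, user and table name are the same assumptions as above), uses try-with-resources so the connection and statement are always closed:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTidy {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String tableName = "testHiveDriverTable";
        // try-with-resources closes the connection and statement even if a query fails
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://master:10000/default", "hadoop", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("drop table if exists " + tableName);
            stmt.execute("create table " + tableName
                + " (key int, value string) row format delimited fields terminated by '\t' stored as textfile");
            stmt.execute("load data local inpath '/home/hadoop/test/test4.txt' into table " + tableName);
            try (ResultSet rs = stmt.executeQuery("select * from " + tableName)) {
                while (rs.next()) {
                    System.out.println(rs.getInt(1) + "\t" + rs.getString(2));
                }
            }
        }
    }
}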
4) Run results (right-click --> Run As --> Run on Hadoop)
Running the program directly at this point will fail; the fix is covered in the next post: HiveSQLException: Error while compiling statement: No privilege 'Create' found for outputs { database:default }.
The run log is as follows:
2018-05-24 09:25:52,416 INFO [org.apache.hive.jdbc.Utils] - Supplied authorities: master:10000
2018-05-24 09:25:52,418 INFO [org.apache.hive.jdbc.Utils] - Resolved authority: master:10000
2018-05-24 09:25:52,508 INFO [org.apache.hive.jdbc.HiveConnection] - Will try to open client transport with JDBC Uri: jdbc:hive2://master:10000/default
2018-05-24 09:25:52,509 DEBUG [org.apache.thrift.transport.TSaslTransport] - opening transport org.apache.thrift.transport.TSaslClientTransport@3834d63f
2018-05-24 09:25:52,529 DEBUG [org.apache.thrift.transport.TSaslClientTransport] - Sending mechanism name PLAIN and initial response of length 17
2018-05-24 09:25:52,533 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status START and payload length 5
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status COMPLETE and payload length 17
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Start message handled
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Main negotiation loop complete
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: SASL Client receiving last message
2018-05-24 09:25:52,536 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Received message with status COMPLETE and payload length 0
2018-05-24 09:25:52,552 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 71
2018-05-24 09:25:52,749 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:52,768 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 137
2018-05-24 09:25:52,795 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:52,805 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:52,830 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:52,837 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:52,840 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:52,841 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 229
2018-05-24 09:25:52,914 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:52,914 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:53,270 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:53,270 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:53,271 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:53,272 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 140
2018-05-24 09:25:53,328 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:53,328 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:53,369 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:53,376 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:25:53,429 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 131
2018-05-24 09:25:53,451 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:53,536 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 119
2018-05-24 09:25:53,551 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
testhivedrivertable
2018-05-24 09:25:53,551 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:53,554 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:53,554 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 135
2018-05-24 09:25:53,674 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:53,674 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:53,740 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:53,740 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:25:53,741 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 273
2018-05-24 09:25:53,741 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:53,745 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 179
2018-05-24 09:25:53,746 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
key int
2018-05-24 09:25:53,746 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
value string
2018-05-24 09:25:53,746 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:53,747 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 138
2018-05-24 09:25:53,747 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:53,751 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:53,751 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 190
2018-05-24 09:25:53,833 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:53,833 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:54,721 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:54,721 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:54,722 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:54,723 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 140
2018-05-24 09:25:55,219 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:55,219 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:55,221 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:55,222 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:25:55,223 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 186
2018-05-24 09:25:55,224 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:55,324 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 187
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
1 dajiangtai
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
2 hadoop
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
3 hive
2018-05-24 09:25:55,330 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
4 hbase
2018-05-24 09:25:55,330 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
5 spark
2018-05-24 09:25:55,330 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:55,330 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 117
2018-05-24 09:25:55,331 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:55,346 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:55,346 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 147
2018-05-24 09:25:55,603 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:55,603 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:00,604 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:00,604 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:05,605 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:05,605 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:10,607 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:10,607 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:15,609 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:15,609 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:20,611 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:20,611 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:25,613 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:25,613 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:30,614 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:30,614 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:33,020 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:33,020 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:26:33,021 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 102
2018-05-24 09:26:33,021 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:26:33,043 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 104
2018-05-24 09:26:33,052 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string:
5
2018-05-24 09:26:33,052 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:26:33,053 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 96
2018-05-24 09:26:33,057 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 83
2018-05-24 09:26:33,103 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 40
Output of the "show tables" step:
testhivedrivertable
Output of the "describe table" step:
key int
value string
Output of the "select * query" step:
1 dajiangtai
2 hadoop
3 hive
4 hbase
5 spark
Alternatively, verify the results on the cluster from the Hive CLI:
hive> show tables;
OK
copy_student1
copy_student2
copy_student3
copy_student4
employee
group_gender_agg
group_gender_sum
group_test
index_test
index_tmp
partition_test
student1
student2
test
test_view
testhivedrivertable
user
Time taken: 0.153 seconds, Fetched: 17 row(s)
hive> desc testhivedrivertable;
OK
key int
value string
Time taken: 0.184 seconds, Fetched: 2 row(s)
hive> select * from testhivedrivertable;
OK
1 dajiangtai
2 hadoop
3 hive
4 hbase
5 spark
Time taken: 0.346 seconds, Fetched: 5 row(s)
That wraps up the main content of this section. It reflects my own learning process, and I hope it offers some guidance. If it was useful, please leave a like; if not, please bear with me, and do point out any mistakes. If you would like more, follow me to get updates as soon as they are posted. Thanks!
Copyright notice: This is an original post by the author (子墨言良) and may not be reproduced without permission. If you are interested, follow the author to get updates as soon as they are posted.