
Deploying Hive on a Standalone Machine

Environment

  • CentOS 7, hadoop-2.6.5, hive-1.2.2, MariaDB-5.5.60, jdk-1.8
  • Assume MariaDB is already installed on the Hive machine (running, with a hive account created that has full privileges on the hive database), and the JDK is installed as well

Copy hadoop-2.6.5 to the Hive machine

Only the bin, etc, libexec, and share directories need to be kept; the share/doc directory can be removed (a copy-and-prune sketch follows the listing below).

[root@wadeyu hadoop-2.6.5]# pwd
/usr/local/src/hadoop-2.6.5
[root@wadeyu hadoop-2.6.5]# ll
total 16
drwxrwxr-x. 2 root root 4096 Oct  3  2016 bin
drwxrwxr-x. 3 root root 4096 Oct  3  2016 etc
drwxrwxr-x. 2 root root 4096 Oct  3  2016 libexec
drwxrwxr-x. 3 root root 4096 Sep 18 11:27 share
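A copy-and-prune sketch, assuming SSH access from the Hadoop master to the Hive machine at 192.168.1.9 (the metastore host used later in this post); the exact extra directories in a hadoop-2.6.5 distribution may differ slightly:

# on the Hadoop master
scp -r /usr/local/src/hadoop-2.6.5 root@192.168.1.9:/usr/local/src/

# on the Hive machine: keep only bin, etc, libexec and share, and drop the docs
cd /usr/local/src/hadoop-2.6.5
rm -rf include lib sbin *.txt share/doc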

Install and configure Hive 1.2.2 on the Hive machine

  • Edit conf/hive-env.sh

# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/usr/local/src/hadoop-2.6.5

# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/usr/local/src/hive-1.2.2/conf
  • Edit hive-site.xml
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://127.0.0.1:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
        <description>JDBC connect string for a JDBC metastore</description>    
    </property>   
    <property> 
        <name>javax.jdo.option.ConnectionDriverName</name> 
        <value>com.mysql.jdbc.Driver</value> 
        <description>Driver class name for a JDBC metastore</description>     
    </property>               

    <property> 
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>username to use against metastore database</description>
    </property>
    <property>  
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
        <description>password to use against metastore database</description>  
    </property>    
	<property>
		<name>hive.querylog.location</name>
		<value>/usr/local/src/hive-1.2.2/iotmp</value>
		<description>Location of Hive run time structured log file</description>
	</property>
	<property>
		<name>hive.exec.local.scratchdir</name>
		<value>/usr/local/src/hive-1.2.2/iotmp</value>
		<description>Local scratch space for Hive jobs</description>
	</property>  
	<property>
		<name>hive.downloaded.resources.dir</name>
		<value>/usr/local/src/hive-1.2.2/iotmp</value>
		<description>Temporary local directory for added resources in the remote file system.</description>
	</property>
	<property>
		<name>hive.hwi.listen.host</name>
		<value>0.0.0.0</value>
	</property>
	<property>
		<name>hive.hwi.listen.port</name>
		<value>9999</value>
	</property>
	<property>
		<name>hive.hwi.war.file</name>
		<value>lib/hive-hwi-1.2.2.war</value>	
	</property>
	<property>
		<name>hive.cli.print.current.db</name>
		<value>true</value>	
	</property>
	<property>
		<name>hive.cli.print.header</name>
		<value>true</value>	
	</property>
</configuration>
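It also helps to create the iotmp scratch directory referenced above and to put the JDBC driver named in javax.jdo.option.ConnectionDriverName on Hive's classpath. A minimal preparation sketch (the connector jar name matches the one mentioned under problem 4 below; substitute whichever driver you actually downloaded):

mkdir -p /usr/local/src/hive-1.2.2/iotmp
# copy the MySQL/MariaDB JDBC driver jar (downloaded separately) into Hive's lib directory
cp mysql-connector-java-8.0.12.jar /usr/local/src/hive-1.2.2/lib/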

Start the client

  • Client shell: /usr/local/src/hive-1.2.2/bin/hive
  • To let other machines use Hive through this metastore, start the metastore service (a quick check follows the command below)
/usr/local/src/hive-1.2.2/bin/hive --service metastore &
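To keep the metastore running after the shell exits and to confirm it is listening on its default port 9083 (the same port the remote client config below points at), a sketch:

nohup /usr/local/src/hive-1.2.2/bin/hive --service metastore > /tmp/metastore.log 2>&1 &
ss -lnt | grep 9083   # the metastore thrift port should be listening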

Start the web interface (HWI)

[root@wadeyu lib]# /usr/local/src/hive-1.2.2/bin/hive --service hwi &
[1] 7349
[root@wadeyu lib]# jps
7410 Jps
7349 RunJar
5161 RunJar
[root@wadeyu lib]# 18/09/18 16:44:22 INFO hwi.HWIServer: HWI is starting up
18/09/18 16:44:28 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
18/09/18 16:44:28 INFO mortbay.log: jetty-6.1.26
18/09/18 16:44:29 INFO mortbay.log: Extract /usr/local/src/hive-1.2.2/lib/hive-hwi-1.2.2.war to /tmp/Jetty_0_0_0_0_9999_hive.hwi.1.2.2.war__hwi__21w1ka/webapp
18/09/18 16:44:31 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999
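Once Jetty reports the SocketConnector on 0.0.0.0:9999, the HWI page should be reachable in a browser at http://<hive-host>:9999/hwi. A quick check from the shell (sketch):

curl -I http://127.0.0.1:9999/hwi/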

Setting up other Hive client machines

  • Copy hive-1.2.2 to the other machine
  • If Hadoop and JDK 1.8 are not installed there yet, install those two components as well
  • Modify the hive-site.xml configuration file (a smoke test follows the config below)
[root@master conf]# cat hive-site.xml 
<configuration>
    <property>
        <name>hive.metastore.uris</name>  
        <value>thrift://192.168.1.9:9083</value>  
    </property>
</configuration>
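With only the metastore URI configured, the client talks to the remote metastore. A quick smoke test from the client machine (sketch):

/usr/local/src/hive-1.2.2/bin/hive -e 'show databases;'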

Basic usage

Usage is basically similar to the mysql command-line client.
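For example, a few typical statements at the hive> prompt (a sketch; the dep table is the same example used in problem 4 below):

hive> show databases;
hive> create table dep(id int, name string) row format delimited fields terminated by '\t' lines terminated by '\n';
hive> show tables;
hive> select * from dep;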

Problems encountered

  • Problem 1
Exception in thread "main" java.lang.RuntimeException: java.net.ConnectException: Call From master/192.168.1.15 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Cause: the Hadoop cluster was not started.

Solution: start the Hadoop cluster.
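For example, on the Hadoop master node (a sketch using the standard start scripts shipped with hadoop-2.6.5):

/usr/local/src/hadoop-2.6.5/sbin/start-dfs.sh
/usr/local/src/hadoop-2.6.5/sbin/start-yarn.sh
jps   # expect NameNode/DataNode/ResourceManager/NodeManager processes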

  • Problem 2
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D

Cause: missing configuration.

Solution: add the following properties to hive-site.xml

<property>
	<name>system:java.io.tmpdir</name>
	<value>/usr/local/src/hive-1.2.2/iotmp</value>
	<description/>
</property>
<property>
	<name>system:user.name</name>
	<value>hive</value>
	<description/>
</property>
  • Problem 3
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
        at jline.TerminalFactory.create(TerminalFactory.java:101)
        at jline.TerminalFactory.get(TerminalFactory.java:158)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
        at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Cause: the jline jar shipped with Hadoop (jline-0.9.94) is too old for Hive 1.2.2, which uses jline 2.

Solution: copy Hive's newer jline jar into Hadoop's yarn lib directory:

[root@master lib]# find /usr/local/src -name '*jline*'
/usr/local/src/hadoop-2.6.5/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/jline-0.9.94.jar
/usr/local/src/hadoop-2.6.5/share/hadoop/yarn/lib/jline-0.9.94.jar
/usr/local/src/hadoop-2.6.5/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/jline-0.9.94.jar
/usr/local/src/zookeeper-3.4.12/lib/jline-0.9.94.jar
/usr/local/src/zookeeper-3.4.12/lib/jline-0.9.94.LICENSE.txt
/usr/local/src/zookeeper-3.4.12/src/java/lib/jline-0.9.94.LICENSE.txt
/usr/local/src/hive-1.2.2/lib/jline-2.12.jar
[root@master lib]# cp jline-2.12.jar /usr/local/src/hadoop-2.6.5/share/hadoop/yarn/lib/jline-2.12.jar
[root@master lib]# ll /usr/local/src/hadoop-2.6.5/share/hadoop/yarn/lib/ | grep jline
-rw-rw-r--. 1 wadeyu wadeyu   87325 Oct  3  2016 jline-0.9.94.jar
-rw-r--r--. 1 root   root    213854 Sep 12 17:58 jline-2.12.jar
  • Problem 4
hive> create table dep(id int, name string) row format delimited fields terminated by '\t' lines terminated by '\n';
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.)

Cause: the character encoding used by the JDBC client does not match that of the Hive metastore database.

Solution:

# Option 1: change the hive database encoding to latin1
MariaDB [hive]> alter database hive character set latin1;
Query OK, 1 row affected (0.00 sec)

# Option 2: make the client encoding consistent with the hive database,
# for example by switching the client connection to utf8:
jdbc:mysql://192.168.1.9:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8
With the mysql-connector-java-8.0.12.jar driver, this approach resolved the problem.
  • Problem 5
[root@master conf]# hive --service cli
[Fatal Error] hive-site.xml:406:94: The reference to entity "characterEncoding" must end with the ';' delimiter.
18/09/12 18:55:00 FATAL conf.Configuration: error parsing conf file:/usr/local/src/hive-1.2.2/conf/hive-site.xml
org.xml.sax.SAXParseException; systemId: file:/usr/local/src/hive-1.2.2/conf/hive-site.xml; lineNumber: 406; columnNumber: 94; The reference to entity "characterEncoding" must end with the ';' delimiter.
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2432)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2420)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2488)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2454)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
        at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:2615)
        at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:2636)
        at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:2707)
        at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:2651)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:74)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:637)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Cause: the value contains a special character that is not escaped for XML.

Solution: escape the special character & as an XML entity

<value>jdbc:mysql://192.168.1.9:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
  • Problem 6
hive> load data local inpath '/home/wadeyu/test2.log' into table dep;
Loading data to table default.dep
Table default.dep stats: [numFiles=1, totalSize=56]
OK
Time taken: 1.779 seconds
hive> select * from dep;
OK
NULL    NULL
NULL    NULL
NULL    NULL
NULL    NULL
NULL    NULL
NULL    NULL
NULL    NULL

Cause: vim converted the tabs into spaces.

Solution:

Configure the text editor not to expand tabs into spaces, because the table was defined with \t as the field delimiter and \n as the line delimiter.
Temporary setting in vim: :set noexpandtab
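To verify whether the data file really contains tab characters, cat -A prints tabs as ^I (a sketch using the file from the load statement above):

cat -A /home/wadeyu/test2.log | head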
  • Problem 7
[root@wadeyu conf]# ls: cannot access /usr/local/src/hive-1.2.2/lib/hive-hwi-*.war: No such file or directory
18/09/18 16:21:31 INFO hwi.HWIServer: HWI is starting up
18/09/18 16:21:36 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
18/09/18 16:21:36 INFO mortbay.log: jetty-6.1.26
18/09/18 16:21:38 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999

Cause: the hive-hwi .war file is missing.

Solution:

Build it from the source of the matching Hive version.

cd hwi/web
Package the contents of the web directory into a .war file with the jar command, then move it into Hive's lib directory (see the sketches below).
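A sketch of fetching the matching source and changing into hwi/web; the download URL follows the usual Apache archive layout and is an assumption, so verify it before use:

cd /usr/local/src
wget https://archive.apache.org/dist/hive/hive-1.2.2/apache-hive-1.2.2-src.tar.gz
tar xzf apache-hive-1.2.2-src.tar.gz
cd apache-hive-1.2.2-src/hwi/web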

[root@wadeyu web]# jar cvf hive-hwi-1.2.2.war ./*
added manifest
adding: authorize.jsp(in = 2729) (out= 1201)(deflated 55%)
adding: css/(in = 0) (out= 0)(stored 0%)
adding: css/bootstrap.min.css(in = 90193) (out= 14754)(deflated 83%)
adding: diagnostics.jsp(in = 2365) (out= 1062)(deflated 55%)
adding: error_page.jsp(in = 1867) (out= 931)(deflated 50%)
adding: img/(in = 0) (out= 0)(stored 0%)
adding: img/glyphicons-halflings-white.png(in = 4352) (out= 4190)(deflated 3%)
adding: img/glyphicons-halflings.png(in = 4352) (out= 4192)(deflated 3%)
adding: index.jsp(in = 1876) (out= 981)(deflated 47%)
adding: left_navigation.jsp(in = 1553) (out= 709)(deflated 54%)
adding: navbar.jsp(in = 1345) (out= 681)(deflated 49%)
adding: session_create.jsp(in = 2690) (out= 1248)(deflated 53%)
adding: session_diagnostics.jsp(in = 2489) (out= 1155)(deflated 53%)
adding: session_history.jsp(in = 3150) (out= 1334)(deflated 57%)
adding: session_kill.jsp(in = 2236) (out= 1108)(deflated 50%)
adding: session_list.jsp(in = 2298) (out= 1059)(deflated 53%)
adding: session_manage.jsp(in = 6738) (out= 2198)(deflated 67%)
adding: session_remove.jsp(in = 2359) (out= 1151)(deflated 51%)
adding: session_result.jsp(in = 2488) (out= 1149)(deflated 53%)
adding: show_database.jsp(in = 2346) (out= 1133)(deflated 51%)
adding: show_databases.jsp(in = 2096) (out= 1039)(deflated 50%)
adding: show_table.jsp(in = 4996) (out= 1607)(deflated 67%)
adding: view_file.jsp(in = 2653) (out= 1257)(deflated 52%)
adding: WEB-INF/(in = 0) (out= 0)(stored 0%)
adding: WEB-INF/web.xml(in = 1438) (out= 741)(deflated 48%)
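Then copy the generated archive to the location configured in hive.hwi.war.file (sketch):

cp hive-hwi-1.2.2.war /usr/local/src/hive-1.2.2/lib/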
  • Problem 8
 Unable to find a javac compiler;
com.sun.tools.javac.Main is not on the classpath.
Perhaps JAVA_HOME does not point to the JDK.
It is currently set to "/usr/local/src/jdk1.8.0_181/jre"
Caused by:
Unable to find a javac compiler;
com.sun.tools.javac.Main is not on the classpath.
Perhaps JAVA_HOME does not point to the JDK.
It is currently set to "/usr/local/src/jdk1.8.0_181/jre"
	at org.apache.tools.ant.taskdefs.compilers.CompilerAdapterFactory.getCompiler(CompilerAdapterFactory.java:129)

Cause: tools.jar is missing from the classpath.

Solution: copy tools.jar from the JDK's lib directory into Hive's lib directory.
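For example, with the JDK path taken from the error message above (sketch):

cp /usr/local/src/jdk1.8.0_181/lib/tools.jar /usr/local/src/hive-1.2.2/lib/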

  • Problem 9
18/10/09 15:44:58 [main]: ERROR DataNucleus.Datastore: Error thrown executing CREATE TABLE `PARTITION_PARAMS`
(
    `PART_ID` BIGINT NOT NULL,
    `PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
    `PARAM_VALUE` VARCHAR(4000) BINARY NULL,
    CONSTRAINT `PARTITION_PARAMS_PK` PRIMARY KEY (`PART_ID`,`PARAM_KEY`)
) ENGINE=INNODB : Specified key was too long; max key length is 767 bytes
java.sql.SQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
	at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
	at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:781)

Cause: with a multi-byte character set (such as utf8) on the metastore database, the indexed VARCHAR(256) column exceeds InnoDB's 767-byte key length limit.

Solution: MariaDB [hive]> alter database hive default character set latin1;

  • Problem 10
18/10/09 16:06:53 [main]: ERROR DataNucleus.Datastore: Error thrown executing ALTER TABLE `PARTITIONS` ADD COLUMN `TBL_ID` BIGINT NULL : Table 'hive.PARTITIONS' doesn't exist
java.sql.SQLSyntaxErrorException: Table 'hive.PARTITIONS' doesn't exist
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
	at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
	at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:781)
	at com.mysql.cj.jdbc.StatementImpl.execute(StatementImpl.java:666)
	at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)

Cause: the metastore table is missing (the schema is incomplete).

Solution: drop the Hive metastore database and re-initialize it with the schematool utility.

[root@wadeyu bin]# ./schematool -dbType mysql -initSchema
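Dropping and re-creating the metastore database first, at the MariaDB prompt (a sketch consistent with the latin1 workaround from problems 4 and 9):

MariaDB [(none)]> drop database hive;
MariaDB [(none)]> create database hive default character set latin1;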
  • Problem 11
[root@wadeyu hive-1.2.2]# ./bin/beeline -u jdbc:hive2://
Connecting to jdbc:hive2://
18/10/09 16:24:45 [main]: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
18/10/09 16:25:30 [main]: WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
Error applying authorization policy on hive configuration: java.net.NoRouteToHostException: No Route to Host from  wadeyu/192.168.1.7 to master:9000 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see:  http://wiki.apache.org/hadoop/NoRouteToHost
Beeline version 1.2.2 by Apache Hive

Cause: the Hadoop cluster is not started, or there is a network problem.

Solution: start the Hadoop cluster and check the network.
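A couple of quick connectivity checks from the Hive machine toward the NameNode (sketch; nc is provided by the nmap-ncat package on CentOS 7):

ping -c 3 master
nc -zv master 9000   # check whether the NameNode RPC port is reachable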

