Hive Installation and Deployment
1. Environment Preparation
1.1 Software Version
hive-0.14 (download link)
2. Configuration
A working Hadoop environment is a prerequisite for installing Hive; you can refer to my earlier post on setting up community-edition Hadoop and build the Hadoop environment first. With that in place, we can start configuring Hive.
2.1 Environment Variables
sudo vi /etc/profile
HIVE_HOME=/home/hadoop/source/hive-0.14.0
PATH=$PATH:$HIVE_HOME/bin
export HIVE_HOME PATH
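After editing /etc/profile, reload it so the new variables take effect in the current shell and verify that the hive command can be found. A minimal sketch, assuming the install path above and that Hadoop is already on the PATH:

# Reload the profile in the current shell
source /etc/profile

# Verify the variables and the hive command
echo $HIVE_HOME
which hive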
2.2 hive-site.xml
<configuration>
  <property>
    <name>datanucleus.fixedDatastore</name>
    <value>false</value>
  </property>
  <property>
    <name>hive.metastore.execute.setugi</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <!-- metadata database connection configuration -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://10.211.55.18:3306/hive?useUnicode=true&amp;characterEncoding=UTF-8&amp;createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
    <description>password to use against metastore database</description>
  </property>
  <property>
    <name>hive.hwi.listen.host</name>
    <value>10.211.55.18</value>
    <description>This is the host address the Hive Web Interface will listen on</description>
  </property>
  <property>
    <name>hive.hwi.listen.port</name>
    <value>9999</value>
    <description>This is the port the Hive Web Interface will listen on</description>
  </property>
  <!-- configure hwi war package location -->
  <!--
  <property>
    <name>hive.hwi.war.file</name>
    <value>lib/hive-hwi-0.14.0.war</value>
    <description>This is the WAR file with the jsp content for Hive Web Interface</description>
  </property>
  -->
</configuration>
Note that the ampersands in the JDBC connection URL must be written as &amp; so the XML stays well-formed.
2.3 hive-env.sh
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/home/hadoop/source/hadoop-2.5.1
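Before starting Hive for the first time, the directory set in hive.metastore.warehouse.dir should exist in HDFS and be group-writable, and the MySQL metastore schema can be created up front with the schematool that ships with Hive 0.14. A hedged sketch, assuming HDFS is already running and the MySQL driver jar is in place (see the note at the end):

# Create the warehouse directory declared in hive-site.xml
hdfs dfs -mkdir -p /home/hive/warehouse
hdfs dfs -chmod g+w /home/hive/warehouse

# Hive also writes scratch data under /tmp on HDFS
hdfs dfs -mkdir -p /tmp
hdfs dfs -chmod g+w /tmp

# Optional: initialize the MySQL metastore schema explicitly;
# with createDatabaseIfNotExist=true Hive can also create it on first use
$HIVE_HOME/bin/schematool -dbType mysql -initSchema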
2.4 Startup
[hadoop@cloud001 ~]$ hive
Logging initialized using configuration in file:/home/hadoop/source/hive-0.14.0/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/source/hadoop-2.5.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/source/hive-0.14.0/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive>
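Once the hive> prompt appears, a quick smoke test confirms that the metastore connection and the warehouse directory both work. A minimal sketch (t_smoke_test is just a throwaway example table):

# Run a few statements non-interactively against the metastore
hive -e "CREATE TABLE IF NOT EXISTS t_smoke_test (id INT, name STRING);"
hive -e "SHOW TABLES;"
hive -e "DROP TABLE IF EXISTS t_smoke_test;"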
At this point, the Hive configuration is complete.
Note: since the metastore connection is configured with the MySQL driver, the MySQL JDBC driver jar must be placed under $HIVE_HOME/lib.
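A sketch of that step, assuming the connector jar has already been downloaded; the file name below is only an example, use whichever version you downloaded:

# Copy the MySQL JDBC driver into Hive's lib directory
cp mysql-connector-java-5.1.34-bin.jar $HIVE_HOME/lib/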
Contact:
Email: smartloli.org@gmail.com
QQ group (Hive and AI in Practice [new group]): 935396818
QQ group (Hadoop - Community 1): 424769183
QQ group (Kafka Isn't Hard to Learn): 825943084
Friendly reminder: when applying to join a group, please include a reason (name + company/school) so the admins can review the request. Thanks!