Hive 2.3.4 Installation

1. Install Hadoop

Hive runs on top of Hadoop, so a working Hadoop environment is required. In this walkthrough, Hive is installed on the NameNode of a fully distributed Hadoop cluster.

See: Hadoop cluster setup

2. Install Hive

  • Download
[hadoop@hadoop01 /home/hadoop]$cd /app/
[hadoop@hadoop01 /app]$wget http://archive.apache.org/dist/hive/stable-2/apache-hive-2.3.4-bin.tar.gz
  • Extract and create a symlink
[hadoop@hadoop01 /app]$cd /app
[hadoop@hadoop01 /app]$tar zxvf apache-hive-2.3.4-bin.tar.gz
[hadoop@hadoop01 /app]$ln -s apache-hive-2.3.4-bin hive

  • Configure the Hive environment variables for the hadoop user
echo -e '################## HIVE environment variables #############\nexport HIVE_HOME=/app/hive\nexport PATH=$HIVE_HOME/bin:$PATH' >> ~/.bash_profile && source ~/.bash_profile && tail -3 ~/.bash_profile
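To confirm the variables took effect, hive should now resolve from PATH (a quick check, not part of the original steps):
[hadoop@hadoop01 /home/hadoop]$which hive
[hadoop@hadoop01 /home/hadoop]$hive --version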

3. Configure Hive

3.1 Create hive-site.xml and hive-env.sh from the bundled templates

[hadoop@hadoop01 /app/hive/conf]$cd /app/hive/conf/
[hadoop@hadoop01 /app/hive/conf]$cp hive-default.xml.template hive-site.xml
[hadoop@hadoop01 /app/hive/conf]$cp hive-env.sh.template hive-env.sh
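hive-env.sh usually needs at least HADOOP_HOME so Hive can locate the Hadoop installation. A minimal sketch, assuming Hadoop lives at /app/hadoop (edit the paths to match your own Hadoop setup):
# /app/hive/conf/hive-env.sh -- assumed paths, adjust to your environment
export HADOOP_HOME=/app/hadoop
export HIVE_CONF_DIR=/app/hive/conf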

3.2 Replace the variables ${system:java.io.tmpdir} and ${system:user.name} in hive-site.xml, which Hive cannot resolve at runtime

  • Test (preview the substitution without modifying the file)
[hadoop@hadoop01 /app/hive/conf]$sed -n 's#${system:java.io.tmpdir}#/app/hive.java.io.tmpdir#pg' hive-site.xml 
    <value>/app/hive.java.io.tmpdir/${system:user.name}</value>
    <value>/app/hive.java.io.tmpdir/${hive.session.id}_resources</value>
    <value>/app/hive.java.io.tmpdir/${system:user.name}</value>
    <value>/app/hive.java.io.tmpdir/${system:user.name}/operation_logs</value>
[hadoop@hadoop01 /app/hive/conf]$sed -n 's#${system:user.name}#hadoop#pg' hive-site.xml 
    <value>/app/hive.java.io.tmpdir/hadoop</value>
    <value>/app/hive.java.io.tmpdir/hadoop</value>
    <value>/app/hive.java.io.tmpdir/hadoop/operation_logs</value>

  • Replace in place
[hadoop@hadoop01 /app/hive/conf]$sed -i 's#${system:java.io.tmpdir}#/app/hive.java.io.tmpdir#g' hive-site.xml 
[hadoop@hadoop01 /app/hive/conf]$sed -i 's#${system:user.name}#hadoop#g' hive-site.xml
  • Verify
[hadoop@hadoop01 /app/hive/conf]$grep 'hive.java.io.tmpdir' hive-site.xml
    <value>/app/hive.java.io.tmpdir/hadoop</value>
    <value>/app/hive.java.io.tmpdir/${hive.session.id}_resources</value>
    <value>/app/hive.java.io.tmpdir/hadoop</value>
    <value>/app/hive.java.io.tmpdir/hadoop/operation_logs</value>
[hadoop@hadoop01 /app/hive/conf]$grep 'hive.java.io.tmpdir/hadoop' hive-site.xml
    <value>/app/hive.java.io.tmpdir/hadoop</value>
    <value>/app/hive.java.io.tmpdir/hadoop</value>
    <value>/app/hive.java.io.tmpdir/hadoop/operation_logs</value>
  • Create the temporary directory
[hadoop@hadoop01 /app]$mkdir -p /app/hive.java.io.tmpdir/hadoop
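As a final check (not in the original write-up), grep can confirm that no unresolved placeholders remain; both counts should be 0, assuming no other properties reference these variables:
[hadoop@hadoop01 /app/hive/conf]$grep -c '${system:java.io.tmpdir}' hive-site.xml
[hadoop@hadoop01 /app/hive/conf]$grep -c '${system:user.name}' hive-site.xml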

3.3 Initialize the default Derby database

[hadoop@hadoop01 /home/hadoop]$schematool -initSchema -dbType derby

.......
Metastore connection URL: jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver : org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User: APP
Starting metastore schema initialization to 2.3.0
Initialization script hive-schema-2.3.0.derby.sql
Initialization script completed
schemaTool completed

  • After initialization completes, a metastore_db directory appears in the current working directory; this is the Derby database itself.
    Corresponding directories are also created in HDFS
[hadoop@hadoop01 /home/hadoop]$ll /home/hadoop/metastore_db/
[hadoop@hadoop01 /home/hadoop]$hdfs dfs -ls  /tmp
Found 2 items
drwx-wx-wx   - hadoop supergroup          0 2018-11-26 19:12 /tmp/hive
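Hive also stores table data under hive.metastore.warehouse.dir, which defaults to /user/hive/warehouse on HDFS. If that directory does not exist yet, it can be created up front; a small sketch assuming the default location:
[hadoop@hadoop01 /home/hadoop]$hdfs dfs -mkdir -p /user/hive/warehouse
[hadoop@hadoop01 /home/hadoop]$hdfs dfs -chmod g+w /user/hive/warehouse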

4. Enter the Hive CLI

  • Check whether the Hadoop cluster is running; if not, start it
[hadoop@hadoop01 /home/hadoop]$start-dfs.sh
[hadoop@hadoop01 /home/hadoop]$start-yarn.sh
  • Enter the Hive CLI
[hadoop@hadoop01 /home/hadoop]$hive
...
hive> 
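A short smoke test can confirm that the metastore and HDFS are wired up correctly; the table name test_tbl below is purely illustrative and not part of the original steps:
hive> CREATE TABLE test_tbl (id INT, name STRING);
hive> SHOW TABLES;
hive> INSERT INTO test_tbl VALUES (1, 'hello');
hive> SELECT * FROM test_tbl;
hive> DROP TABLE test_tbl;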

5. Configure MySQL to store the metastore

5.1 Modify the connection URL, driver name, username, and password

[hadoop@hadoop01 /home/hadoop]$vim /app/hive/conf/hive-site.xml

<property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://hadoop01:3306/hive</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>123456</value>
</property>

See: installing MySQL with Docker
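The configuration above assumes a MySQL instance reachable at hadoop01:3306 with a hive database that user hive/123456 can access. If those do not exist yet, they can be created roughly as follows (a sketch; adjust host, user, and password to your environment):
mysql> CREATE DATABASE hive;
mysql> CREATE USER 'hive'@'%' IDENTIFIED BY '123456';
mysql> GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
mysql> FLUSH PRIVILEGES;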

5.2 Copy the MySQL JDBC driver into apache-hive-2.3.4-bin/lib
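For example, assuming a Connector/J 5.1.x jar has already been downloaded to /app (the exact file name and version here are an assumption):
[hadoop@hadoop01 /app]$cp mysql-connector-java-5.1.47.jar /app/hive/lib/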

5.3 Initialize the MySQL metastore

schematool -initSchema -dbType mysql
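Afterwards, schematool can report the installed schema version as an optional verification:
schematool -dbType mysql -info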
