Hive Installation and Deployment
1. First install HDFS and MapReduce (i.e. a working Hadoop deployment).
2. Download the Hive release package.
Download address: https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-InstallingHivefromaStableRelease
3. Configure the system environment variables
Edit /etc/profile with vi /etc/profile (as the root user) and add the Hive environment variables:
# Hive environment
export HIVE_HOME=/home/hadoop/cloud/apache-hive-2.1.1-bin
export PATH=$HIVE_HOME/bin:$HIVE_HOME/conf:$PATH
Make the variables take effect: source /etc/profile
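After sourcing the profile, the setup can be verified with a quick check; a minimal sketch (the path is the one used in the export above):

```shell
# Re-create the exports from /etc/profile and confirm they took effect.
export HIVE_HOME=/home/hadoop/cloud/apache-hive-2.1.1-bin
export PATH=$HIVE_HOME/bin:$HIVE_HOME/conf:$PATH

echo "$HIVE_HOME"
# Check that Hive's bin directory is now on the PATH.
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "PATH ok" ;;
  *)                    echo "PATH missing $HIVE_HOME/bin" ;;
esac
```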
4. Create the required HDFS directories
$HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
$HADOOP_HOME/bin/hadoop fs -mkdir -p /tmp/hive
hadoop fs -chmod 777 /user/hive/warehouse
hadoop fs -chmod 777 /tmp/hive
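A quick sanity check (a sketch; it assumes HDFS is up and hadoop is on the PATH) lists the new directories so you can confirm the drwxrwxrwx permissions:

```shell
# Verify the warehouse and scratch directories exist and are world-writable.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -ls /user/hive/
  hadoop fs -ls /tmp/
  STATUS=checked
else
  # No Hadoop client on this machine; nothing to verify here.
  STATUS=skipped
  echo "hadoop not on PATH; run 'source /etc/profile' first"
fi
```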
5. Add the following property to ./conf/hive-default.xml
<property>
  <name>system:java.io.tmpdir</name>
  <value>a file path of your choice</value>
  <description/>
</property>
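For example, with a hypothetical directory /home/hadoop/cloud/hive-tmp (create it first with mkdir -p), the property would look like the fragment below; many setups also define system:user.name the same way so that every ${system:...} placeholder in the stock configuration resolves:

```xml
<!-- /home/hadoop/cloud/hive-tmp is a hypothetical path; create it first -->
<property>
  <name>system:java.io.tmpdir</name>
  <value>/home/hadoop/cloud/hive-tmp</value>
  <description/>
</property>
<property>
  <name>system:user.name</name>
  <value>${user.name}</value>
  <description/>
</property>
```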
6. Connect Hive to MySQL
Create a hive-site.xml file under hive/conf/:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.169.134:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
    <description>
      Enforce metastore schema version consistency.
      True: Verify that version information stored in metastore matches with one from Hive jars. Also disable automatic schema migration attempt. Users are required to manually migrate schema after Hive upgrade which ensures proper metastore schema migration. (Default)
      False: Warn if the version information stored in metastore doesn't match with one from in Hive jars.
    </description>
  </property>
</configuration>
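Before initializing the schema, it is worth confirming that the MySQL server named in the connection URL is actually reachable. A sketch using the mysql command-line client, with the host, user, and password taken from the hive-site.xml values above:

```shell
# Connectivity check against the metastore MySQL server.
MYSQL_HOST=192.168.169.134
if command -v mysql >/dev/null 2>&1; then
  mysql -h "$MYSQL_HOST" -P 3306 -uroot -p123456 -e "SELECT VERSION();" \
    && STATUS=reachable || STATUS=unreachable
else
  # mysql client not installed on this machine.
  STATUS=no-client
  echo "mysql client not installed; skipping check"
fi
echo "$STATUS"
```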
Add the mysql-connector jar
Copy the MySQL JDBC driver (e.g. mysql-connector-java-5.1.15-bin.jar) into $HIVE_HOME/lib.
Initialize the metastore database
$HIVE_HOME/bin/schematool -dbType mysql -initSchema
(Note: schematool ships with Hive under $HIVE_HOME/bin, not with Hadoop.)
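If initialization succeeds, schematool can also report the schema version it wrote; a sketch (it needs the metastore database to be reachable):

```shell
# Show the metastore schema version recorded by -initSchema.
if command -v schematool >/dev/null 2>&1; then
  schematool -dbType mysql -info && STATUS=ok || STATUS=failed
else
  STATUS=skipped
  echo "schematool not on PATH; run 'source /etc/profile' first"
fi
```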
Run Hive
$HIVE_HOME/bin/hive
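A non-interactive smoke test (it requires HDFS to be running and the schema initialized as above); the built-in default database should appear in the output:

```shell
# Smoke test: list databases without opening the interactive shell.
if command -v hive >/dev/null 2>&1; then
  hive -e "SHOW DATABASES;" && STATUS=ok || STATUS=failed
else
  STATUS=skipped
  echo "hive not on PATH; run 'source /etc/profile' first"
fi
```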