Using Oozie with Sqoop and Hive to export data analysis results to MySQL

文件/RDBMS -> flume/sqoop -> HDFS -> Hive -> HDFS -> Sqoop -> RDBMS

This article implements the following parts of that pipeline:

  • Reading data from the RDBMS with Sqoop (run outside Oozie; the specific error is explained at the end of this article)
  • Processing the data with Hive and writing the result to HDFS
  • Exporting the HDFS output to the RDBMS with Sqoop

1. Copy one of the bundled Sqoop examples, then copy the hive-site.xml file and the MySQL JDBC driver jar into the application's lib directory.
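
A minimal sketch of this step (the source locations are assumptions; the application directory matches the paths used later in this article):

# start from the bundled sqoop example (source path assumed)
cp -r examples/apps/sqoop oozie-apps/sqoop2hive2sqoop
# the hive action reads its configuration from hive-site.xml
cp /opt/cdh5.3.6/hive-1.1.0-cdh5.12.0/conf/hive-site.xml oozie-apps/sqoop2hive2sqoop/
# the sqoop action needs the MySQL JDBC driver on its classpath
mkdir -p oozie-apps/sqoop2hive2sqoop/lib
cp mysql-connector-java-*.jar oozie-apps/sqoop2hive2sqoop/lib/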

2. Add a sqoop-import.sql options file for reading data from the RDBMS into Hive:

--connect
jdbc:mysql://cen-ubuntu:3306/test
--username
root
--password
ubuntu
--table
user
--hive-database
default
--hive-table
import_from_mysql
--hive-import
--hive-overwrite
--delete-target-dir
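
For reference, the source table that this options file reads might look like the sketch below (a hypothetical minimal schema; only the id and name columns are used downstream):

# create and populate the MySQL source table (schema is an assumption)
mysql -uroot -pubuntu test -e "
CREATE TABLE IF NOT EXISTS user (
  id   INT PRIMARY KEY,
  name VARCHAR(255)
);
INSERT INTO user VALUES (1, 'alice'), (2, 'bob');"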

3. Add select.sql, which uses Hive to process the data and write it out to HDFS (note that the output field delimiter must be stated explicitly):

INSERT OVERWRITE DIRECTORY '/user/cen/oozie-apps/sqoop2hive2sqoop/output/' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' SELECT id, name FROM default.import_from_mysql;
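
Hive writes the result as plain comma-delimited files under that directory; a quick check might look like this (the 000000_0 file name is Hive's usual default and is an assumption):

# inspect the comma-delimited output written by Hive (file name assumed)
bin/hdfs dfs -cat /user/cen/oozie-apps/sqoop2hive2sqoop/output/000000_0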

4. Add sqoop-export.sql, which Sqoop uses to export the HDFS files into the RDBMS:

--connect
jdbc:mysql://cen-ubuntu:3306/test
--username
root
--password
ubuntu
--table
export_from_hdfs
--export-dir
/user/cen/oozie-apps/sqoop2hive2sqoop/output/
--fields-terminated-by
','
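
Sqoop export requires the target table to exist before the job runs; a sketch of creating it (the column types are assumptions matching the id and name columns selected in step 3):

# the export target table must exist up front (types assumed)
mysql -uroot -pubuntu test -e "
CREATE TABLE IF NOT EXISTS export_from_hdfs (
  id   INT,
  name VARCHAR(255)
);"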

5. Edit the job.properties file:

nameNode=hdfs://cen-ubuntu.cenzhongman.com:8020
jobTracker=localhost:8032
queueName=default
oozieAppsRoot=oozie-apps

oozie.use.system.libpath=true

oozie.wf.application.path=${nameNode}/user/cen/${oozieAppsRoot}/sqoop2hive2sqoop/
outputDir=sqoop2hive2sqoop/output

6. Edit the workflow.xml file:

<workflow-app xmlns="uri:oozie:workflow:0.5" name="sqoop2hive2sqoop-wf">
    <start to="hive-node"/>

    <action name="hive-node">
        <hive xmlns="uri:oozie:hive-action:0.5">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/cen/${oozieAppsRoot}/${outputDir}"/>
            </prepare>
            <job-xml>${nameNode}/user/cen/${oozieAppsRoot}/sqoop2hive2sqoop/hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <script>select.sql</script>
        </hive>
        <ok to="sqoop-export-node"/>
        <error to="hive-fail"/>
    </action>

    <action name="sqoop-export-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.3">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <command>export --options-file sqoop-export.sql</command>
            <file>${nameNode}/user/cen/${oozieAppsRoot}/sqoop2hive2sqoop/sqoop-export.sql#sqoop-export.sql</file>
        </sqoop>
        <ok to="end"/>
        <error to="sqoop-export-fail"/>
    </action>

    <kill name="hive-fail">
        <message>hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <kill name="sqoop-export-fail">
        <message>Sqoop export failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

Notes

  • Mind the schema version declared on each action node (see the validation sketch after this list)
  • Any file an action depends on must be shipped with a file element
  • Do not forget the Hive configuration file (the job-xml element)
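
A wrong schema version only surfaces at submission time; the definition can be checked earlier with the Oozie CLI (a sketch, run against the local copy of workflow.xml):

# validate the workflow XML against Oozie's bundled schemas (path assumed)
bin/oozie validate oozie-apps/sqoop2hive2sqoop/workflow.xml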

7. Upload the application files to HDFS.
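
A sketch of the upload (the destination must match oozie.wf.application.path in job.properties):

# push the whole application directory to HDFS (overwriting any previous copy)
bin/hdfs dfs -put -f oozie-apps/sqoop2hive2sqoop /user/cen/oozie-apps/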

8. Run sqoop to import the data from MySQL into Hive (this step failed with "could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly."; see Note 2 for the cause and fix):

bin/sqoop import --options-file /opt/cdh5.3.6/oozie-4.1.0-cdh5.12.0/oozie-apps/sqoop2hive2sqoop/sqoop-import.sql

9. Verify that the data is now present in Hive, then run the Oozie job:

export OOZIE_URL=http://cen-ubuntu:11000/oozie/
bin/oozie job --config /opt/cdh5.3.6/oozie-4.1.0-cdh5.12.0/oozie-apps/sqoop2hive2sqoop/job.properties -run

10. Check the Workflow run and the output rows in MySQL.
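
A sketch of that check (the job id below is a placeholder printed by the previous command):

# follow the workflow status (job id is a placeholder)
bin/oozie job -info 0000001-170730000000000-oozie-cen-W
# confirm the exported rows landed in MySQL
mysql -uroot -pubuntu test -e "SELECT * FROM export_from_hdfs;"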

Note 1: Running the sqoop import into Hive through Oozie failed (the same job succeeds when run locally), and the logs contained no output. The full workflow.xml is included below for reference only:

<workflow-app xmlns="uri:oozie:workflow:0.5" name="sqoop2hive2sqoop-wf">
    <start to="sqoop-import-node"/>

    <action name="sqoop-import-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.3">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <command>import --options-file sqoop-import.sql</command>
            <file>${nameNode}/user/cen/${oozieAppsRoot}/sqoop2hive2sqoop/sqoop-import.sql#sqoop-import.sql</file>
        </sqoop>
        <ok to="hive-node"/>
        <error to="sqoop-import-fail"/>
    </action>

    <action name="hive-node">
        <hive xmlns="uri:oozie:hive-action:0.5">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/cen/${oozieAppsRoot}/${outputDir}"/>
            </prepare>
            <job-xml>${nameNode}/user/cen/${oozieAppsRoot}/sqoop2hive2sqoop/hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <script>select.sql</script>
        </hive>
        <ok to="sqoop-export-node"/>
        <error to="hive-fail"/>
    </action>

    <action name="sqoop-export-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.3">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <command>export --options-file sqoop-export.sql</command>
            <file>${nameNode}/user/cen/${oozieAppsRoot}/sqoop2hive2sqoop/sqoop-export.sql#sqoop-export.sql</file>
        </sqoop>
        <ok to="end"/>
        <error to="sqoop-export-fail"/>
    </action>

    <kill name="sqoop-import-fail">
        <message>Sqoop import failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <kill name="hive-fail">
        <message>hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <kill name="sqoop-export-fail">
        <message>Sqoop export failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

Root-cause analysis: the failure occurs in sqoop-import-node.

  • The Hive configuration file cannot be found.
    Attempt (1): declare it via job-xml as in hive-node --> no effect
    Attempt (2): add --hive-home /opt/xxx/xxx/xxx to sqoop-import.sql --> no effect
    Attempt (3): edit conf/action-conf/hive.xml --> it had never been configured
  • Can the Sqoop launcher simply not invoke the local Hive? To be explored another time (see the log-fetching sketch below).
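
When the Oozie web console shows nothing useful, the launcher's container logs can be pulled from YARN directly (a sketch; the application id is a placeholder):

# fetch the Sqoop launcher's container logs from YARN (application id is a placeholder)
yarn logs -applicationId application_1501378800000_0001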

Note 2: The sqoop run failed with "could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly."

  • Cause: the system references the $HADOOP_CLASSPATH variable, which was not defined on this machine.

  • Fix: define it in the user environment file ~/.bash_profile:

    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/opt/cdh5.3.6/hive-1.1.0-cdh5.12.0/lib/*

