DolphinScheduler reports: Error: /opt/soft/hadoop does not exist! Please set $HADOOP_COMMON_HOME to the root of your Hadoop installation.

Background:

Extracting MySQL data into Hive with Sqoop works when the job is run directly, but the same Sqoop task fails when scheduled through DolphinScheduler, with the following error:

[INFO] - [taskAppId=TASK-1-4-4]:[138] - -> Error: /opt/soft/hadoop does not exist!
    Please set $HADOOP_COMMON_HOME to the root of your Hadoop installation.
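The message itself comes from Sqoop's launcher script. Roughly (a simplified sketch, not Sqoop's exact code), Sqoop falls back to $HADOOP_HOME when $HADOOP_COMMON_HOME is unset, so it is the scheduler's default HADOOP_HOME=/opt/soft/hadoop that ends up failing the directory check:

```shell
#!/bin/sh
# Simplified sketch of the fallback in Sqoop's configure-sqoop script:
# when HADOOP_COMMON_HOME is unset, Sqoop falls back to HADOOP_HOME,
# which here is DolphinScheduler's shipped default.
HADOOP_HOME=/opt/soft/hadoop

if [ -z "${HADOOP_COMMON_HOME:-}" ]; then
  HADOOP_COMMON_HOME=$HADOOP_HOME
fi

# The directory does not exist on this host, so the check fails.
if [ ! -d "$HADOOP_COMMON_HOME" ]; then
  echo "Error: $HADOOP_COMMON_HOME does not exist!"
  echo "Please set \$HADOOP_COMMON_HOME to the root of your Hadoop installation."
fi
```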

Cause:

DolphinScheduler's default shell execution environment injects Hadoop-related variables, which it reads from its env config file (dolphinscheduler_env.sh; named .escheduler_env.sh in older releases). The shipped defaults, such as /opt/soft/hadoop, can differ from the actual installation paths, so the file needs to be edited.
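A quick way to confirm the mismatch on the worker host is to list which of the exported paths actually exist. A minimal sketch, assuming the env file lives at the path used in the fix below:

```shell
#!/bin/sh
# check_dir: report whether a path exported in the env file exists here.
check_dir() {
  if [ -d "$1" ]; then echo "OK: $1"; else echo "MISSING: $1"; fi
}

# Assumed location; adjust to your installation.
ENV_FILE=${1:-/opt/module/dolphinscheduler/conf/env/dolphinscheduler_env.sh}

# Extract the values of lines like: export HADOOP_HOME=/some/path
grep -E '^export [A-Z0-9_]+=/' "$ENV_FILE" 2>/dev/null | cut -d= -f2- |
while read -r dir; do
  check_dir "$dir"
done
```

Any line reported MISSING points at a default that must be overridden.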

Solution:

Edit the env config file:

vim /opt/module/dolphinscheduler/conf/env/dolphinscheduler_env.sh

Adjust the paths to match the actual installation; the final contents:

export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/
export HADOOP_CONF_DIR=/etc/hadoop/conf.cloudera.hdfs/
export SPARK_HOME1=/opt/soft/spark1
export SPARK_HOME2=/opt/soft/spark2
export PYTHON_HOME=/opt/soft/python
export JAVA_HOME=/usr/java/jdk1.8.0_181-cloudera
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive/
export HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
export FLINK_HOME=/opt/soft/flink
export DATAX_HOME=/opt/soft/datax
export PATH=$HADOOP_HOME/bin:$SPARK_HOME1/bin:$SPARK_HOME2/bin:$PYTHON_HOME:$JAVA_HOME/bin:$HIVE_HOME/bin:$FLINK_HOME/bin:$DATAX_HOME/bin:$PATH
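After saving, you can sanity-check the file the same way a worker shell task would, by sourcing it in a subshell and printing the resolved variables. A sketch; verify_env and the file path are illustrative, adjust to your installation:

```shell
#!/bin/sh
# Source the edited env file in a subshell (so the caller's environment
# stays clean) and confirm the key variables resolve.
verify_env() {
  (
    [ -f "$1" ] || { echo "env file missing: $1"; exit 1; }
    . "$1"
    echo "HADOOP_HOME=$HADOOP_HOME"
    # The Sqoop task needs hadoop on PATH; report if it is not.
    command -v hadoop >/dev/null 2>&1 \
      && echo "hadoop: on PATH" \
      || echo "hadoop: NOT on PATH"
  )
}

# Assumed location; adjust to your installation.
verify_env /opt/module/dolphinscheduler/conf/env/dolphinscheduler_env.sh || true
```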

 
