spark-2.2.2-bin-hadoop2.7 HA Configuration

Installing spark-2.2.2-bin-hadoop2.7: https://blog.csdn.net/drl_blogs/article/details/91948394

1. Edit conf/spark-env.sh on the master node

export JAVA_HOME=/usr/local/jdk1.8.0_211

# Comment out the fixed master settings: with ZooKeeper-based HA,
# the active master is elected rather than pinned to one host.
# export SPARK_MASTER_HOST=hadoop01
# export SPARK_MASTER_PORT=7077

export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=hadoop01:2181,hadoop02:2181,hadoop03:2181 -Dspark.deploy.zookeeper.dir=/spark"
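The value of spark.deploy.zookeeper.url is a comma-separated host:port list naming the ZooKeeper ensemble; before starting the masters it is worth confirming each member is reachable. Below is a minimal sketch that just parses that list (the hostnames are the ones assumed throughout this guide; an actual liveness check, e.g. ZooKeeper's ruok four-letter command via nc, is left as a comment since it needs a live ensemble):

```shell
# Parse the ZooKeeper ensemble out of the spark.deploy.zookeeper.url value.
zk_url="hadoop01:2181,hadoop02:2181,hadoop03:2181"

for hp in $(printf '%s' "$zk_url" | tr ',' ' '); do
  host=${hp%%:*}   # strip the ":port" suffix
  port=${hp##*:}   # strip the "host:" prefix
  echo "would check $host on port $port"
  # On the cluster, a real check could be: echo ruok | nc "$host" "$port"
done
```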


2. Copy the configuration to the other nodes

scp -r /usr/local/spark-2.2.2-bin-hadoop2.7/conf/spark-env.sh hadoop02:/usr/local/spark-2.2.2-bin-hadoop2.7/conf/
scp -r /usr/local/spark-2.2.2-bin-hadoop2.7/conf/spark-env.sh hadoop03:/usr/local/spark-2.2.2-bin-hadoop2.7/conf/
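The two scp commands above can be generalized into a loop over the worker hosts. A sketch, printed as a dry run so it is safe to try anywhere (the hostnames and install path are the ones assumed in this guide; remove the echo to actually execute the copies):

```shell
# Distribute spark-env.sh to the remaining cluster nodes (dry run).
SPARK_CONF=/usr/local/spark-2.2.2-bin-hadoop2.7/conf

for node in hadoop02 hadoop03; do
  # Prints the command that would run; drop 'echo' to perform the copy.
  echo scp "$SPARK_CONF/spark-env.sh" "$node:$SPARK_CONF/"
done
```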

3. Start the cluster

1) On the master node (starts the master and all workers):

sbin/start-all.sh

2) On the standby node (starts only a standby master):

sbin/start-master.sh
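To see which master is currently active, each master's web UI serves a JSON summary (by default on port 8080) whose status field reads ALIVE for the elected master and STANDBY for the backup. A sketch of extracting that field; the JSON below is a hypothetical sample, and on the cluster you would fetch the real one with curl -s http://hadoop01:8080/json:

```shell
# Hypothetical sample of a master's /json endpoint output; on a live
# cluster: sample=$(curl -s http://hadoop01:8080/json)
sample='{"url":"spark://hadoop01:7077","status":"ALIVE"}'

# Pull out the status field with grep/cut (avoids a jq dependency).
status=$(printf '%s' "$sample" | grep -o '"status":"[A-Z]*"' | cut -d'"' -f4)
echo "$status"
```

If this prints STANDBY on one host and ALIVE on the other, failover is configured as expected.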

4. Done. The cluster now runs with one active master and one standby; if the active master fails, ZooKeeper elects the standby.

posted @ 2019-06-14 11:08  drl_blogs