Installing Spark 2.1.0

 

1. Extract the Spark tarball

 tar zxf spark-2.1.0-bin-2.6.0-CDH5.10.0.tgz
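The same `tar zxf` flags (gzip-decompress, extract, from file) can be rehearsed on a throwaway archive before touching the real one; the `/tmp` paths below are purely illustrative:

```shell
# Build a dummy tarball, then extract it with the same flags as above.
mkdir -p /tmp/tar-demo/spark-2.1.0-bin
touch /tmp/tar-demo/spark-2.1.0-bin/README
tar czf /tmp/tar-demo.tgz -C /tmp/tar-demo spark-2.1.0-bin
mkdir -p /tmp/tar-out
tar zxf /tmp/tar-demo.tgz -C /tmp/tar-out
ls /tmp/tar-out   # the archive's top-level directory appears here
```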

2. Edit the environment configuration file

 vim /etc/profile

 export SPARK_HOME=/opt/spark/spark-2.1.0

 export PATH=$PATH:$SPARK_HOME/bin

 source /etc/profile
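After sourcing the profile, the two exports can be verified in the current shell; this sketch sets the same variables directly and checks them:

```shell
# Same two exports as in /etc/profile above; verify they took effect.
export SPARK_HOME=/opt/spark/spark-2.1.0
export PATH=$PATH:$SPARK_HOME/bin
echo "$SPARK_HOME"
# Confirm $SPARK_HOME/bin is now a PATH component.
case ":$PATH:" in *":$SPARK_HOME/bin:"*) echo "PATH ok";; esac
```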

3. Configure Spark

 cd /opt/spark/spark-2.1.0/conf

 mv spark-env.sh.template spark-env.sh

 

 vim spark-env.sh

  export JAVA_HOME=/usr/java/jdk1.7.0_79

  export PATH=$PATH:$JAVA_HOME/bin

  export SCALA_HOME=/usr/scala/scala-2.11.8

  export HADOOP_CONF_DIR=/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/lib/hadoop/etc/hadoop

  export SPARK_MASTER_IP=192.168.1.7

  export SPARK_MASTER_PORT=7077

  export SPARK_DIST_CLASSPATH=$(/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/bin/hadoop classpath)
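The SPARK_DIST_CLASSPATH line relies on shell command substitution: `hadoop classpath` prints the CDH jar paths, and `$(...)` captures that output when spark-env.sh is sourced. A minimal demo of the mechanism, with a stand-in command in place of the real hadoop script:

```shell
# $(...) runs the command and substitutes its stdout into the assignment;
# spark-env.sh does the same with `hadoop classpath` to hand Spark the
# CDH Hadoop jars. The value below is a stand-in, not real jar paths.
DEMO_CLASSPATH=$(echo "/lib/a.jar:/lib/b.jar")
echo "$DEMO_CLASSPATH"   # /lib/a.jar:/lib/b.jar
```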

4. Edit the slaves file

 cd /opt/spark/spark-2.1.0/conf

  mv slaves.template slaves

  vim slaves

  192.168.1.7

  192.168.1.8

  192.168.1.9
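The same worker list (one host per line) can be generated with a heredoc instead of an editor; this sketch writes to `/tmp` for illustration, while the real file lives at `$SPARK_HOME/conf/slaves`:

```shell
# One worker hostname/IP per line, exactly as in the slaves file above.
cat > /tmp/slaves <<'EOF'
192.168.1.7
192.168.1.8
192.168.1.9
EOF
wc -l < /tmp/slaves   # 3 worker entries
```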

5. Copy the installation to the other nodes

scp -r /opt/spark/spark-2.1.0 root@192.168.1.8:/opt/spark/

scp -r /opt/spark/spark-2.1.0 root@192.168.1.9:/opt/spark/
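With more workers, the two scp commands above generalize to a loop over the node list; `echo` here only prints the command for each node as a dry run, since the real copy needs live hosts and SSH access:

```shell
# Dry run: print the scp command that would be executed for each worker.
for node in 192.168.1.8 192.168.1.9; do
  echo scp -r /opt/spark/spark-2.1.0 "root@${node}:/opt/spark/"
done
```

Dropping the `echo` performs the actual copies.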

 

6. Start Spark

 cd $SPARK_HOME

 sbin/start-master.sh

 sbin/start-slaves.sh

 

7. Check the web UI

 http://192.168.1.7:8080

posted @ 2017-05-24 17:42 鱼果说