Spark Local Cluster Setup

1. Download and extract the installation packages (under /usr/local/src, the prefix used throughout)

tar -xvf spark-2.0.2-bin-hadoop2.6.tgz

tar -xvf scala-2.11.8.tgz

2. Edit the Spark configuration files

cd spark-2.0.2-bin-hadoop2.6/conf/
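
In a stock Spark 2.0.2 distribution the conf directory ships only templates, so create both files from their templates first:

cp spark-env.sh.template spark-env.sh
cp slaves.template slaves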

vim spark-env.sh

export SCALA_HOME=/usr/local/src/scala-2.11.8
export JAVA_HOME=/usr/local/src/jdk1.8.0_221
export HADOOP_HOME=/usr/local/src/hadoop-2.6.1
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_MASTER_IP=master                                     # hostname of the master node
export SPARK_LOCAL_DIRS=/usr/local/src/spark-2.0.2-bin-hadoop2.6  # directory for Spark scratch space
export SPARK_DRIVER_MEMORY=1G                                     # default memory for the driver process

vim slaves

slave1
slave2
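
The start scripts log in to every host listed in slaves over SSH, so passwordless SSH from master to both slaves is assumed; a minimal sketch:

ssh-keygen -t rsa
ssh-copy-id root@slave1
ssh-copy-id root@slave2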

3. Copy the installation directory to slave1 and slave2

scp -r /usr/local/src/spark-2.0.2-bin-hadoop2.6 root@slave1:/usr/local/src/spark-2.0.2-bin-hadoop2.6

scp -r /usr/local/src/spark-2.0.2-bin-hadoop2.6 root@slave2:/usr/local/src/spark-2.0.2-bin-hadoop2.6
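
The copied spark-env.sh expects SCALA_HOME, JAVA_HOME and HADOOP_HOME at the same paths on every node; assuming Scala is not yet present on the slaves, it can be distributed the same way:

scp -r /usr/local/src/scala-2.11.8 root@slave1:/usr/local/src/scala-2.11.8
scp -r /usr/local/src/scala-2.11.8 root@slave2:/usr/local/src/scala-2.11.8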

4. Start the cluster

Start the Hadoop cluster first.
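
For example, using the standard scripts under the HADOOP_HOME set in spark-env.sh:

/usr/local/src/hadoop-2.6.1/sbin/start-dfs.sh
/usr/local/src/hadoop-2.6.1/sbin/start-yarn.sh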

Then start the Spark cluster:

cd /usr/local/src/spark-2.0.2-bin-hadoop2.6/sbin

./start-all.sh
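
If the cluster came up, jps should show a Master process on the master node and a Worker process on each slave:

jps    # master: Master; slave1/slave2: Worker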

5. Web monitoring UI:

http://master:8080 (the Spark master web UI; each worker serves its own UI on port 8081)

6. Verification

Local mode: ./bin/run-example --master local[2] SparkPi 10
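
On success the example prints a line like "Pi is roughly 3.14...", which can be filtered out of the log output:

./bin/run-example --master local[2] SparkPi 10 2>&1 | grep "Pi is roughly"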

Standalone cluster mode:

cd /usr/local/src/spark-2.0.2-bin-hadoop2.6/bin

./spark-submit --class org.apache.spark.examples.SparkPi --master spark://master:7077 /usr/local/src/spark-2.0.2-bin-hadoop2.6/examples/jars/spark-examples_2.11-2.0.2.jar 100

Cluster mode, Spark on YARN:

./spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster /usr/local/src/spark-2.0.2-bin-hadoop2.6/examples/jars/spark-examples_2.11-2.0.2.jar 10
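
In cluster deploy mode the driver runs inside YARN, so the "Pi is roughly ..." line ends up in the application logs rather than on the console; it can be retrieved afterwards with the application ID that spark-submit prints:

yarn logs -applicationId <application ID>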
