Installing Spark on CentOS

(Note: the first installation attempt went wrong and I had to reinstall, so the screenshots are from the first attempt while the commands are from the second.)

(Note: the screenshots were taken during my own installation; I followed this guide: https://www.cnblogs.com/shaosks/p/9242536.html)

1. Download the archive

Command: wget https://downloads.lightbend.com/scala/2.11.8/scala-2.11.8.tgz
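Note that this wget fetches Scala 2.11.8, which the referenced guide installs before Spark itself. The Spark tarball matching the SPARK_HOME used in step 3 (spark-2.1.0-bin-hadoop2.6) has to be downloaded separately; the original post does not show that command, but the Apache release archive is a likely source (URL assumed, verify against archive.apache.org):

Command: wget https://archive.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.6.tgz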

2. Extract the archive

Command: tar -xzvf scala-2.11.8.tgz
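The environment variables in step 3 expect everything under /opt, so the extracted directories presumably need to be moved there. A minimal sketch, assuming the Spark tarball from step 1 sits in the same working directory:

mkdir -p /opt/scala /opt/spark
mv scala-2.11.8 /opt/scala/
tar -xzvf spark-2.1.0-bin-hadoop2.6.tgz -C /opt/spark/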

3. Configuration

The environment configuration is as follows; copy only the entries you need:

export JAVA_HOME=/opt/java/jdk1.8.0_301
export HADOOP_HOME=/opt/Hadoop/hadoop-2.7.3
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
export SPARK_HOME=/opt/spark/spark-2.1.0-bin-hadoop2.6
export SCALA_HOME=/opt/scala/scala-2.11.8
export HIVE_HOME=/opt/hive/apache-hive-2.3.9-bin
export HIVE_CONF_DIR=${HIVE_HOME}/conf
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib/
export PATH=.:${JAVA_HOME}/bin:${HIVE_HOME}/bin:${HADOOP_HOME}/bin:/opt/mongodb/bin:${SPARK_HOME}/bin:${SCALA_HOME}/bin:$PATH
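The post does not say which file these lines go into; assuming they were appended to /etc/profile, reload it and check that the tools resolve:

source /etc/profile
scala -version          # Scala code runner version 2.11.8 ...
spark-submit --version  # prints the Spark 2.1.0 version banner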


4. Start Hadoop
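The original step was only a screenshot. With the HADOOP_HOME configured above, starting HDFS and YARN would look roughly like this (a sketch, assuming the standard Hadoop 2.x sbin scripts):

$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh
jps    # should list NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager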


5. Start Spark

Command (no space in the name!! no space!!! I wasted half a day digging through bash errors because of one): spark-shell
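If the shell starts cleanly it announces something like "Spark context available as 'sc'". A quick sanity check inside the shell (a minimal example, not from the original post):

scala> sc.parallelize(1 to 100).sum()
res0: Double = 5050.0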

6. Startup errors

  1. If Hadoop has not been started, spark-shell reports an error (screenshot not reproduced here).

  2. Before log4j.properties is modified, startup also reports errors and noisy warnings (screenshot not reproduced here).


  To fix this, modify log4j.properties:

    Go into the conf directory and run the following:

[root@june spark]# cd spark-2.1.0-bin-hadoop2.6/
[root@june spark-2.1.0-bin-hadoop2.6]# cd conf
[root@june conf]# ls
docker.properties.template  metrics.properties.template   spark-env.sh.template
fairscheduler.xml.template  slaves.template
log4j.properties.template   spark-defaults.conf.template
[root@june conf]#  cp log4j.properties.template log4j.properties
[root@june conf]# ls
docker.properties.template  metrics.properties.template
fairscheduler.xml.template  slaves.template
log4j.properties            spark-defaults.conf.template
log4j.properties.template   spark-env.sh.template
[root@june conf]# vim log4j.properties


Find the relevant line and change it; the before and after states were shown as screenshots (not reproduced here).
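Based on the stock Spark 2.1.0 log4j.properties.template, the line in question is most likely the root logger level, lowered from INFO to WARN so spark-shell prints less noise (an assumption, since the post only shows images):

# before
log4j.rootCategory=INFO, console
# after
log4j.rootCategory=WARN, console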

