【Spark】Spark Environment Setup

Versions and environment variables (a consolidated ~/.bashrc and verification sketch follows this list):

openjdk 1.8.0_212
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
 
scala-2.12.8
export SCALA_HOME=/usr/local/scala/scala-2.12.8
export PATH=${SCALA_HOME}/bin:$PATH
 
hadoop-3.2.0
export HADOOP_HOME=/usr/local/hadoop/hadoop-3.2.0
export PATH=$PATH:/usr/local/hadoop/hadoop-3.2.0/bin:/usr/local/hadoop/hadoop-3.2.0/sbin
 
spark-2.4.3-bin-hadoop2.7
export SPARK_HOME=/usr/local/spark/spark-2.4.3-bin-hadoop2.7
export PATH=${SPARK_HOME}/bin:$PATH
 
ideaIC-2019.1.2
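
The exports above normally go into ~/.bashrc (or /etc/profile). Below is a minimal sketch assuming the install paths listed above; the version-check commands and the bundled SparkPi example are standard parts of each distribution:

# Append to ~/.bashrc (consolidated from the exports above), then reload the shell config
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SCALA_HOME=/usr/local/scala/scala-2.12.8
export HADOOP_HOME=/usr/local/hadoop/hadoop-3.2.0
export SPARK_HOME=/usr/local/spark/spark-2.4.3-bin-hadoop2.7
export PATH=${SCALA_HOME}/bin:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:${SPARK_HOME}/bin:$PATH
source ~/.bashrc

# Verify each component is visible on the PATH
java -version          # expect 1.8.0_212
scala -version         # expect 2.12.8
hadoop version         # expect 3.2.0
spark-shell --version  # expect 2.4.3

# Quick Spark smoke test: the bundled SparkPi example approximates Pi locally
${SPARK_HOME}/bin/run-example SparkPi 10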

References:

Install JDK 8: https://blog.csdn.net/u012707739/article/details/78489833

Install Hadoop: https://blog.csdn.net/fenquegong2126/article/details/80988562

Install Spark: https://blog.csdn.net/weixin_42001089/article/details/82346367

Install IDEA: https://blog.csdn.net/qinkang1993/article/details/54631606

Download links (an install sketch follows the list):

Hadoop: https://mirrors.cnnic.cn/apache/hadoop/common/hadoop-3.2.0/

Spark: http://spark.apache.org/downloads.html

Scala: https://www.scala-lang.org/download/2.12.8.html
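
A rough install sketch, assuming the tarballs (hadoop-3.2.0.tar.gz, spark-2.4.3-bin-hadoop2.7.tgz, scala-2.12.8.tgz; names inferred from the paths above) have already been downloaded from the pages listed, and that the JDK comes from the Ubuntu/Debian package manager, which provides the /usr/lib/jvm/java-8-openjdk-amd64 path used above:

# JDK 8 from the distribution repository (provides /usr/lib/jvm/java-8-openjdk-amd64)
sudo apt-get install -y openjdk-8-jdk

# Create the install directories used by the exports above
sudo mkdir -p /usr/local/scala /usr/local/hadoop /usr/local/spark

# Unpack each tarball into its directory (file names assumed from the paths above)
sudo tar -xzf scala-2.12.8.tgz -C /usr/local/scala
sudo tar -xzf hadoop-3.2.0.tar.gz -C /usr/local/hadoop
sudo tar -xzf spark-2.4.3-bin-hadoop2.7.tgz -C /usr/local/spark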

 
