Installing Scala and Spark on Linux

  • Prerequisites
JDK, Hadoop, and Hive are already installed.

Hostname: master
Hosts mapping (/etc/hosts): 192.168.128.129 master
  • Install Scala
# Upload scala-2.12.15.tgz to the server
# Extract
tar -zxvf scala-2.12.15.tgz
# Move it into place
mv scala-2.12.15 /usr/local/software/

# Configure environment variables
vi /etc/profile
# Add the following
export SCALA_HOME=/usr/local/software/scala-2.12.15
export PATH=$SCALA_HOME/bin:$PATH
# Apply the changes
source /etc/profile

# Verify
scala -version
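
Beyond checking the version string, you can compile and run a one-file program to confirm the toolchain works end to end. This is an optional extra check; the file name Hello.scala is just an example:

// Hello.scala -- minimal smoke test for the new Scala install
object Hello {
  def main(args: Array[String]): Unit =
    println(s"Scala is working: ${util.Properties.versionString}")
}

Compile with scalac Hello.scala, then run with scala Hello; it should print the installed Scala version.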
  • Install Spark
# Upload spark-2.2.0-bin-hadoop2.7.tgz to the server
# Extract
tar -zxvf spark-2.2.0-bin-hadoop2.7.tgz
# Move and rename in one step (mv dir/* fails if the target directory does not exist yet)
mv spark-2.2.0-bin-hadoop2.7 /usr/local/software/spark-2.2.0

# Configure environment variables
vi /etc/profile
# Add the following
export SPARK_HOME=/usr/local/software/spark-2.2.0
export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH
# Apply the changes
source /etc/profile

# Create spark-env.sh from the bundled template
cp /usr/local/software/spark-2.2.0/conf/spark-env.sh.template /usr/local/software/spark-2.2.0/conf/spark-env.sh
# Edit it
vi /usr/local/software/spark-2.2.0/conf/spark-env.sh
# Add the following
export JAVA_HOME=/usr/local/software/jdk1.8.0_181
export SCALA_HOME=/usr/local/software/scala-2.12.15
export HADOOP_HOME=/usr/local/software/hadoop-2.9.2
export HADOOP_CONF_DIR=/usr/local/software/hadoop-2.9.2/etc/hadoop
export SPARK_MASTER_IP=master
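# Note: SPARK_MASTER_IP is deprecated in Spark 2.x in favor of SPARK_MASTER_HOST; 2.2.0 still honors it but logs a warning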

# Move the MySQL JDBC driver into spark/jars (Spark needs it to reach the Hive metastore database)
mv mysql-connector-java-8.0.30.jar /usr/local/software/spark-2.2.0/jars
# Copy the Hive configuration into spark/conf so Spark SQL can locate the metastore
cp /usr/local/software/hive-2.3.9/conf/hive-site.xml /usr/local/software/spark-2.2.0/conf
# Copy the Hadoop configuration into spark/conf
cp /usr/local/software/hadoop-2.9.2/etc/hadoop/core-site.xml /usr/local/software/spark-2.2.0/conf
cp /usr/local/software/hadoop-2.9.2/etc/hadoop/hdfs-site.xml /usr/local/software/spark-2.2.0/conf
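
With hive-site.xml in $SPARK_HOME/conf, spark-shell builds a Hive-enabled session automatically. For a standalone application, the equivalent looks roughly like the sketch below (the object name HiveSmokeTest is made up for illustration; it would be packaged and run with spark-submit):

// HiveSmokeTest.scala -- illustrative sketch, not part of the install steps
import org.apache.spark.sql.SparkSession

object HiveSmokeTest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveSmokeTest")
      .enableHiveSupport() // picks up hive-site.xml from $SPARK_HOME/conf
      .getOrCreate()
    spark.sql("show databases").show() // should list the Hive databases
    spark.stop()
  }
}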
  • Test
# Start MySQL
mysqld_safe --defaults-file=/usr/local/software/mysql-8.0.34/my.cnf --user=root &
# Start Hadoop
start-dfs.sh
start-yarn.sh
# Start Hive
hive --service metastore &
hive --service hiveserver2 &
# Start Spark's standalone daemons; use the full path, since both Spark and Hadoop ship a start-all.sh
$SPARK_HOME/sbin/start-all.sh
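# jps should now also list Master and Worker; the standalone Master web UI defaults to http://master:8080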
# Verify
[root@master conf]# spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
23/12/14 21:19:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://192.168.128.129:4040
Spark context available as 'sc' (master = local[*], app id = local-1702559961801).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/
         
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sql("show databases").show
+------------+
|databaseName|
+------------+
|     default|
+------------+
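
Note that spark-shell reports Scala 2.11.8 even though the system Scala is 2.12.15: Spark 2.2.0 bundles its own Scala 2.11 runtime, so applications built for this cluster should target Scala 2.11. As one more optional end-to-end SQL check, you can continue in the same session (the view name nums is arbitrary):

scala> val df = spark.range(1, 6).toDF("id")   // DataFrame with ids 1..5
scala> df.createOrReplaceTempView("nums")      // register a temporary view
scala> spark.sql("select sum(id) as total from nums").show()  // expect 15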