sbt Compilation and Packaging (Part 6)
1. Install sbt
cd /home/hadoop/apps
mkdir sbt
cd sbt
cp ~/Download/sbt-1.3.8.tgz .
# Unpack the archive
tar -zxvf sbt-1.3.8.tgz
# Copy sbt-launch.jar up to the outer directory
cp sbt/bin/sbt-launch.jar .
# Create run.sh, a wrapper script for compiling and packaging Scala programs
vim run.sh
#!/bin/bash
SBT_OPTS="-Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M"
java $SBT_OPTS -jar `dirname $0`/sbt-launch.jar "$@"
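After saving the script, give it execute permission so it can be run directly (the listing below shows it as executable):
chmod +x run.sh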
# Current directory layout
[hadoop@hadoop1 sbt]$ ll
total 57592
-rwxrwxr-x 1 hadoop hadoop 153 Oct 24 15:17 run.sh
drwxrwxr-x 5 hadoop hadoop 4096 Feb 4 2020 sbt
-rw-r--r-- 1 hadoop hadoop 57567520 Oct 24 15:13 sbt-1.3.8.tgz
-rwxrwxr-x 1 hadoop hadoop 1389808 Oct 24 15:15 sbt-launch.jar
drwxrwxr-x 3 hadoop hadoop 4096 Oct 24 15:29 target
# Check the sbt version; the first run downloads some dependencies
./run.sh sbtVersion
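If everything is set up correctly, the last line of the output should report the launcher version, along the lines of:
[info] 1.3.8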
2. Compile a standalone Scala application with sbt
1) Create the application directory:
cd /home/hadoop/apps
mkdir my_code
cd my_code
2) Write the Scala application:
mkdir -p first_spark/src/main/scala/
vim first_spark/src/main/scala/SimpleApp.scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    // Keep only the elements greater than 20
    val rdd = sc.parallelize(List(30, 50, 7, 6, 1, 20), 2).filter(x => x > 20)
    rdd.collect().foreach(println)
    sc.stop()
  }
}
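As a quick sanity check of the filter logic (no Spark needed), the same predicate can be tried on a plain Scala collection in any scala REPL; note that 20 itself is excluded because the comparison is strict:
scala> List(30, 50, 7, 6, 1, 20).filter(x => x > 20)
res0: List[Int] = List(30, 50)
So the packaged program should print 30 and 50.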
3) Write the sbt build file:
cd /home/hadoop/apps/my_code/first_spark
vim simple.sbt
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
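A note on the %% operator: sbt appends the Scala binary version to the artifact name, so the line above resolves to the artifact spark-core_2.11. It is equivalent to spelling the suffix out with the single % operator:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"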
Note: the Scala and spark-core versions to use can be checked with spark-shell.
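For example, inside spark-shell both versions can be queried directly (sc.version gives the Spark version, util.Properties.versionString the Scala version):
scala> sc.version
res0: String = 2.2.0

scala> util.Properties.versionString
res1: String = version 2.11.8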
4) Package the Scala program with sbt:
cd /home/hadoop/apps/my_code/first_spark
# Compile; the first build has to download dependencies, so it is a bit slow
[hadoop@hadoop1 first_spark]$ /home/hadoop/apps/sbt/run.sh package
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
[info] Loading project definition from /home/hadoop/apps/my_code/first_spark/project
[info] Loading settings for project first_spark from simple.sbt ...
[info] Set current project to Simple Project (in build file:/home/hadoop/apps/my_code/first_spark/)
[info] Compiling 1 Scala source to /home/hadoop/apps/my_code/first_spark/target/scala-2.11/classes ...
[success] Total time: 25 s, completed Oct 24, 2021 3:57:27 PM
After a successful build, two new directories appear under the current directory, project/ and target/, and the jar is located at target/scala-2.11/simple-project_2.11-1.0.jar.
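The MaxPermSize warning at the top of the build output is harmless: Java 8 removed the permanent generation, so that flag is simply ignored. On Java 8, the SBT_OPTS line in run.sh can be trimmed accordingly:
SBT_OPTS="-Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled"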
5) Run the Scala program:
/home/hadoop/apps/spark-2.2.0/bin/spark-submit --class "SimpleApp" /home/hadoop/apps/my_code/first_spark/target/scala-2.11/simple-project_2.11-1.0.jar
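spark-submit also accepts a --master flag to choose where the job runs; the command above relies on whatever master is configured elsewhere. For a quick local test (local[2], i.e. two worker threads, is just an illustrative choice here), the call can be written as:
/home/hadoop/apps/spark-2.2.0/bin/spark-submit \
  --class "SimpleApp" \
  --master local[2] \
  /home/hadoop/apps/my_code/first_spark/target/scala-2.11/simple-project_2.11-1.0.jar
The filtered values 30 and 50 should appear among the log output.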