Spark sbt configuration and examples
Use sbt 0.13.5. Download the launcher jar:
wget http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.5/sbt-launch.jar
Create an sbt wrapper script under /usr/local/sbt/ and give it execute permission:
#!/bin/bash
SBT_OPTS="-Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled "
java $SBT_OPTS -jar `dirname $0`/sbt-launch.jar "$@"
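For example, assuming the script is saved as /usr/local/sbt/sbt:
chmod +x /usr/local/sbt/sbt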
/usr/local/sbt/sbt sbt-version
Run this to initialize sbt; it will download some files.
If you hit a timeout error, configure a proxy:
export JAVA_OPTS="$JAVA_OPTS -Dhttp.proxyHost=yourserver -Dhttp.proxyPort=8080 -Dhttp.proxyUser=username -Dhttp.proxyPassword=password"
Then retry. Once it succeeds, it prints the sbt version, 0.13.5.
Create the sbt build file (build.sbt):
name := "Simple Project" version := "1.0" scalaVersion := "2.11.8" val sparkVersion = "2.3.3" resolvers ++= Seq( "apache-snapshots" at "http://repository.apache.org/snapshots/" ) libraryDependencies ++= Seq( "org.apache.spark" %% "spark-core" % sparkVersion, "org.apache.spark" %% "spark-sql" % sparkVersion, "org.apache.spark" %% "spark-mllib" % sparkVersion, "org.apache.spark" %% "spark-streaming" % sparkVersion, "org.apache.spark" %% "spark-hive" % sparkVersion, "mysql" % "mysql-connector-java" % "5.1.6" )
/usr/local/sbt/sbt clean package
Delete any dependencies you do not actually use; otherwise the build will be very slow.
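After packaging succeeds, the jar is written under target/scala-2.11/. A sketch of how to run it, assuming the hypothetical SimpleApp class above, local mode, and that spark-submit is on your PATH; the jar name follows from the name/version/scalaVersion settings in the build file:
spark-submit --class SimpleApp --master "local[2]" target/scala-2.11/simple-project_2.11-1.0.jar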