Spark 2.0 vs Spark 1.x: the SparkSession difference

Before Spark 2.0:

val sparkConf = new SparkConf().setAppName("soyo")
val spark = new SparkContext(sparkConf)

Spark 2.0 and later (the older style above is still supported):
Use SparkSession directly:

val spark = SparkSession
  .builder
  .appName("soyo")
  .getOrCreate()
val tc = spark.sparkContext.parallelize(data).cache()  // "data" is a placeholder collection
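The key point of `getOrCreate` is that repeated calls hand back the one active session instead of building a new one. A minimal sketch of that reuse pattern in plain Scala (no Spark dependency; `SessionDemo`, `Session`, and `Builder` are made-up names for illustration, not Spark's real implementation):

```scala
// Toy sketch of the getOrCreate pattern: the builder returns the
// already-active instance when one exists, so every caller shares it.
object SessionDemo {
  final class Session(val appName: String)

  private var active: Option[Session] = None

  class Builder {
    private var name = "default"
    def appName(n: String): Builder = { name = n; this }
    def getOrCreate(): Session = SessionDemo.synchronized {
      active match {
        case Some(s) => s                // reuse the existing session
        case None =>
          val s = new Session(name)      // first call creates it
          active = Some(s)
          s
      }
    }
  }

  def builder: Builder = new Builder

  def main(args: Array[String]): Unit = {
    val s1 = builder.appName("soyo").getOrCreate()
    val s2 = builder.appName("ignored").getOrCreate() // same instance back
    println(s1 eq s2)    // true
    println(s1.appName)  // soyo
  }
}
```

This is why, in real Spark code, config set on a second `builder` call can be silently ignored when a session already exists.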
A complete word-count example:

import org.apache.spark.sql.SparkSession
// Only needed if you uncomment the pre-2.0 style below:
// import org.apache.spark.{SparkConf, SparkContext}

object text {
  def main(args: Array[String]): Unit = {
    // Pre-2.0 style, kept for reference:
    // val conf = new SparkConf().setAppName("test").setMaster("local[2]")
    // val sc = new SparkContext(conf)
    // val file = sc.textFile("file:///home/soyo/桌面/spark编程测试数据/1.txt")
    val spark = SparkSession.builder().getOrCreate()
    // Alternative: read via the Dataset API and drop down to an RDD:
    // val file = spark.read.textFile("file:///home/soyo/桌面/spark编程测试数据/1.txt").rdd

    val file = spark.sparkContext.textFile("file:///home/soyo/桌面/spark编程测试数据/1.txt")
    val word = file.flatMap(lines => lines.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
    word.foreach(println)
  }
}
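The flatMap -> map -> reduceByKey pipeline above can be mirrored on plain Scala collections, which is a handy way to check the word-count logic without a Spark cluster (`reduceByKey` has no direct collection equivalent, so `groupBy` plus a sum stands in for it; the sample lines are made up):

```scala
// Local word count mirroring the RDD pipeline, no Spark needed.
object WordCountLocal {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))             // like RDD.flatMap
      .map(word => (word, 1))            // like RDD.map
      .groupBy(_._1)                     // reduceByKey ~ groupBy + sum
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit = {
    val lines = Seq("spark spark hadoop", "hadoop spark")
    WordCountLocal.wordCount(lines).toSeq.sortBy(-_._2).foreach(println)
    // (spark,3)
    // (hadoop,2)
  }
}
```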

All of the above work! (Tested on Spark 2.2.0.)

posted @ 2017-10-19 16:34 soyosuyang