Spark Exception Summary (continuously updated)

 

Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
com.demo.sadsa.SparkDemo(sadsa.scala:26)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

Cause:

When a SparkContext is already running in a JVM, attempting to instantiate another SparkContext in that same JVM makes the SparkContext constructor throw this exception.
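For illustration, a minimal sketch of how this situation arises (the object and app names are placeholders, assuming Spark 1.x/2.x in local mode):

import org.apache.spark.{SparkConf, SparkContext}

object SparkDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("first-context").setMaster("local[*]")
    val sc1 = new SparkContext(conf)

    // A second SparkContext in the same JVM fails fast in the constructor with
    // "Only one SparkContext may be running in this JVM (see SPARK-2243)".
    val sc2 = new SparkContext(conf.clone().setAppName("second-context"))
  }
}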

Solution: set spark.driver.allowMultipleContexts = true to suppress the exception (the check is downgraded to a warning).
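A minimal sketch of applying this setting through SparkConf (app name and master are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object AllowMultipleContextsDemo {
  def main(args: Array[String]): Unit = {
    // With this flag set, Spark only logs a warning instead of throwing
    // when a second SparkContext is created in the same JVM (Spark 1.x/2.x).
    val conf = new SparkConf()
      .setAppName("allow-multiple-contexts")
      .setMaster("local[*]")
      .set("spark.driver.allowMultipleContexts", "true")

    val sc = new SparkContext(conf)
    // ... run jobs with sc ...
    sc.stop()
  }
}

Note that this flag only silences the safety check; it does not make multiple contexts well supported. Newer Spark releases (3.0+) removed the flag entirely, so the more robust fix is to reuse a single context, for example via SparkContext.getOrCreate.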
