Spark Development
The main function in Scala
java.lang.NoSuchMethodError: scala.tools.nsc.interpreter.ILoop.main
In a Scala object, you must define a main function before IntelliJ IDEA offers the right-click "Run" action.
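A minimal sketch of such a runnable object (the object name and message are illustrative, not from the original):

```scala
// An object becomes runnable once it defines a main method
// with this exact signature (or extends App).
object Main {
  // Pulled into a method so the behavior can be exercised without running main.
  def greeting: String = "Hello from Scala main"

  def main(args: Array[String]): Unit = {
    println(greeting)
  }
}
```

With this in place, IntelliJ shows the Run action on the object.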
CacheLoader
Exception in thread "main" java.lang.NoClassDefFoundError: org/spark_project/guava/cache/CacheLoader
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
    at com.spark.Main$.main(Main.scala:18)
    at com.spark.Main.main(Main.scala)
Adding the spark-network-common dependency resolved the problem:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-network-common_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
NoSuchMethodError on Scala's Array
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
Caused by a Scala version mismatch: the build file references Scala 2.11, but the IntelliJ project is set up with Scala 2.12. Both must use the same Scala version.
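One way to fix the mismatch is to pin the Scala version explicitly in the pom so it matches the `_2.11` suffix of the Spark artifacts (the 2.11.8 patch version below is an assumption; any 2.11.x release works, and the IntelliJ Scala SDK must be switched to the same 2.11 line):

```
<properties>
    <!-- Must match the _2.11 suffix of spark-network-common_2.11 -->
    <scala.version>2.11.8</scala.version>
</properties>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
```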