Spark Error Collection
1、ExitCodeException exitCode=13
The code submitted via spark-submit had master set to "local[6]"; it should be set to "yarn" (or left unset in code and supplied with --master on the command line). See the sketch below.
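A minimal sketch of the fix, assuming the master was hard-coded in the driver code (the object and application names here are illustrative); a local master hard-coded in the application commonly conflicts with a YARN submission and surfaces as exitCode=13:

```scala
import org.apache.spark.sql.SparkSession

object YarnSubmitExample {
  def main(args: Array[String]): Unit = {
    // Problematic: hard-coding a local master conflicts with
    // "spark-submit --master yarn" and can end in ExitCodeException exitCode=13.
    // val spark = SparkSession.builder().master("local[6]").appName("job").getOrCreate()

    // Preferred: leave the master out of the code and pass it on the command line
    // (spark-submit --master yarn ...), or set it to "yarn" explicitly.
    val spark = SparkSession.builder().appName("job").getOrCreate()

    // ... job logic ...
    spark.stop()
  }
}
```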
2、Declaring an object outside, collecting data into it inside a transformation, then processing it afterwards: the problem with this approach
In the end you will find that nothing was collected; the object used to gather the data is empty! The closure is serialized and shipped to the executors, so each executor mutates its own copy and the driver-side object is never updated.
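A minimal Scala sketch of this pitfall and two standard alternatives (the data and variable names are illustrative):

```scala
import org.apache.spark.sql.SparkSession
import scala.collection.mutable.ArrayBuffer

object CollectPitfall {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("collect-pitfall").getOrCreate()
    val sc = spark.sparkContext
    val rdd = sc.parallelize(1 to 100)

    // Pitfall: the buffer is serialized into the closure; each executor fills
    // its own copy, and the driver-side buffer stays empty.
    val buffer = ArrayBuffer[Int]()
    rdd.foreach(x => buffer += x)
    println(buffer.size) // 0 when running on a cluster

    // Alternative 1: bring the data back to the driver explicitly.
    val evens = rdd.filter(_ % 2 == 0).collect()

    // Alternative 2: use an accumulator for simple counts/sums.
    val acc = sc.longAccumulator("evenCount")
    rdd.foreach(x => if (x % 2 == 0) acc.add(1))
    println(s"collected=${evens.length}, accumulated=${acc.value}")

    spark.stop()
  }
}
```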
3、Using SparkContext / SQLContext inside a transformation
Sometimes, wanting to use their functionality, for example SQLContext's createDataFrame method to build a DataFrame, people call SQLContext inside a transformation; running the program on a Spark cluster then fails with an error like: Failed to get broadcast_32_piece0 of broadcast_32. These context objects live only on the driver and cannot be used in code that runs on executors.
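A hedged sketch of the broken pattern and a driver-side rewrite (the case class and sample data are illustrative); the commented-out block is the shape of code that tends to fail on a cluster with broadcast errors like the one above:

```scala
import org.apache.spark.sql.SparkSession

object ContextInTransformation {
  case class Record(id: Int, name: String)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("context-pitfall").getOrCreate()
    import spark.implicits._
    val rdd = spark.sparkContext.parallelize(Seq((1, "a"), (2, "b")))

    // Broken: spark (and its SQLContext) exists on the driver only; referencing it
    // inside mapPartitions ships it to executors, where it cannot work.
    // rdd.mapPartitions { it =>
    //   val df = spark.createDataFrame(it.map { case (i, n) => Record(i, n) }.toSeq)
    //   Iterator(df.count())
    // }.collect()

    // Working: build the DataFrame on the driver from the RDD itself.
    val df = rdd.map { case (i, n) => Record(i, n) }.toDF()
    df.show()

    spark.stop()
  }
}
```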
4、Reason: Executor heartbeat timed out
--conf spark.network.timeout=10000000 --conf spark.executor.heartbeatInterval=10000000 --conf spark.driver.maxResultSize=4g
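The same settings can also be applied in code instead of on the command line; a minimal sketch with the values above copied as-is (note that the Spark configuration docs advise keeping spark.executor.heartbeatInterval significantly smaller than spark.network.timeout, so tune both rather than copying these numbers blindly):

```scala
import org.apache.spark.sql.SparkSession

object TimeoutTuning {
  def main(args: Array[String]): Unit = {
    // Values mirror the spark-submit flags above; in practice keep the
    // heartbeat interval well below the network timeout.
    val spark = SparkSession.builder()
      .appName("timeout-tuning")
      .config("spark.network.timeout", "10000000")
      .config("spark.executor.heartbeatInterval", "10000000")
      .config("spark.driver.maxResultSize", "4g")
      .getOrCreate()

    // ... job logic ...
    spark.stop()
  }
}
```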
5、Kryo serialization failure: java.util.ConcurrentModificationException
Exception in thread "Thread-3" com.esotericsoftware.kryo.KryoException: java.util.ConcurrentModificationException
Serialization trace:
classes (sun.misc.Launcher$AppClassLoader)
classloader (java.security.ProtectionDomain)
context (java.security.AccessControlContext)
acc (org.apache.spark.util.MutableURLClassLoader)
classLoader (org.apache.hadoop.conf.Configuration)
conf (com.chinapex.etl.kafka.model.ParquetWriterConfig)
parquetWriterConfig (com.mycompany.data.kafka.model.SparkDataExecutorArgs)
at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:101)
...
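Judging from the serialization trace, the ParquetWriterConfig being shipped holds a Hadoop Configuration, and Kryo follows it into the class loader, whose internal collections are being modified concurrently by another thread. A common way out, sketched below with hypothetical names (WriterSettings stands in for the real config class), is to ship only plain serializable values and rebuild the Configuration on the executor side:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.sql.SparkSession

// Hypothetical stand-in for the ParquetWriterConfig in the trace above: keep only
// plain, serializable values here and rebuild the Hadoop Configuration where it is
// needed, so the Configuration (and the ClassLoader it holds) never goes through Kryo.
case class WriterSettings(outputPath: String, hadoopProps: Map[String, String]) {
  def newHadoopConf(): Configuration = {
    val conf = new Configuration()
    hadoopProps.foreach { case (k, v) => conf.set(k, v) }
    conf
  }
}

object KryoConfigPitfall {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kryo-conf-demo").getOrCreate()
    val settings = WriterSettings("/tmp/out", Map("parquet.block.size" -> "134217728"))

    spark.sparkContext.parallelize(1 to 10).foreachPartition { _ =>
      // Only the case class's plain fields cross the wire; the Configuration
      // is created fresh on the executor.
      val hadoopConf = settings.newHadoopConf()
      // ... open a writer with hadoopConf and settings.outputPath ...
    }
    spark.stop()
  }
}
```

Other options include broadcasting a serializable wrapper around the Configuration or registering a custom Kryo serializer; the key point is that the Hadoop Configuration itself should not be serialized by Kryo.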