Spark - Total size of serialized results of tasks is bigger than spark.driver.maxResultSize
Error
ERROR TaskSetManager: Total size of serialized results of 8113 tasks (1131.0 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)
ERROR TaskSetManager: Total size of serialized results of 8114 tasks (1131.1 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)
ERROR TaskSetManager: Total size of serialized results of 8115 tasks (1131.2 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)
ERROR TaskSetManager: Total size of serialized results of 8116 tasks (1131.3 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)
Cause
- Caused by actions such as RDD's collect() that ship a large amount of data back to the driver (the root cause is not necessarily the RDD itself). A hedged illustration follows below.
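As a rough sketch (assuming a Scala Spark application; the RDD contents and partition count are purely illustrative), this is the kind of collect() that makes every task serialize its partition's result and send it to the driver, which is what the limit above guards against:

```scala
import org.apache.spark.sql.SparkSession

object CollectDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("collect-demo")
      .getOrCreate()

    // Many partitions: each task's result is serialized and shipped
    // back to the driver when collect() is called.
    val rdd = spark.sparkContext.parallelize(1 to 100000000, numSlices = 8000)

    // If the combined serialized results exceed spark.driver.maxResultSize
    // (1g by default), the job aborts with the TaskSetManager error shown above.
    val all = rdd.collect()
    println(all.length)

    spark.stop()
  }
}
```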
Solution
- Set via SparkConf: conf.set("spark.driver.maxResultSize", "3g")
- Set in spark-defaults.conf: spark.driver.maxResultSize 3g
- Set when calling spark-submit: --conf spark.driver.maxResultSize=3g
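A minimal sketch of the first option, assuming a Scala application that builds its own SparkConf (the app name and the 3g value are just examples carried over from above):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Raise the driver-side result limit before the context is created;
// it cannot be changed on an already-running SparkContext.
val conf = new SparkConf()
  .setAppName("big-collect-job")
  .set("spark.driver.maxResultSize", "3g")

val sc = new SparkContext(conf)
```

Setting the value to 0 removes the limit entirely, but then an oversized collect() can exhaust driver memory instead of failing fast, so raising it to a bounded value is usually the safer choice.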