OffsetOutOfRangeException when Spark Streaming reads data from Kafka

Reference article: http://www.jianshu.com/p/791137760c14

 

After the Spark Streaming program had been running for a while, the following exception appeared:

ERROR JobScheduler: Error running job streaming job 1496767480000 ms.0
org.apache.spark.SparkException: Job aborted due to stage failure:
Task 13 in stage 37560.0 failed 4 times, most recent failure:
Lost task 13.3 in stage 37560.0 (TID 3891416, 192.169.2.33, executor 1):
kafka.common.OffsetOutOfRangeException

If a message body is too large and exceeds the default limit of fetch.message.max.bytes=1m, Spark Streaming throws an OffsetOutOfRangeException and the service stops.

 

Solution: in the Kafka consumer configuration, set fetch.message.max.bytes to a larger value.

 

For example, set it to 50 MB (1024 * 1024 * 50):

fetch.message.max.bytes=52428800
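This setting can be passed through the kafkaParams map when creating the direct stream. Below is a minimal sketch, assuming the Kafka 0.8 direct-stream API (which matches the `kafka.common` exception in the log above); the broker address, group id, topic name, and the `ssc` StreamingContext are hypothetical placeholders, not from the original post.

```scala
import org.apache.spark.streaming.kafka.KafkaUtils
import kafka.serializer.StringDecoder

// Consumer configuration with the raised fetch size.
val kafkaParams = Map[String, String](
  "metadata.broker.list"    -> "192.169.2.33:9092", // assumed broker address
  "group.id"                -> "my-consumer-group",  // hypothetical group id
  "fetch.message.max.bytes" -> "52428800"            // 50 MB, up from the 1 MB default
)

// ssc is an existing StreamingContext; "my-topic" is a placeholder topic name.
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, Set("my-topic"))
```

Note that fetch.message.max.bytes must be at least as large as the broker's maximum message size, or oversized messages will still fail to be fetched.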

posted @ 2017-06-07 16:17  静若清池