Spark: changing the number of partitions (needed when saving results)
Summary: just change the number of partitions.

    val rdd1 = sc.parallelize(Array(1, 2, 3, 4, 5, 6, 7, 8))
    // check the current number of partitions
    rdd1.partitions.length
    // repartition down to a single partition
    val rdd2 = rdd1.repartition(1)
    rdd2.partitions.length
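As a minimal sketch of why this matters when saving results (assuming a running SparkContext and a hypothetical output path), reducing an RDD to one partition before saveAsTextFile yields a single part file instead of one file per partition; coalesce(1) does this without the full shuffle that repartition(1) triggers:

    import org.apache.spark.{SparkConf, SparkContext}

    object SinglePartitionOutput {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("single-partition-output").setMaster("local[*]"))
        val rdd = sc.parallelize(1 to 8)
        // coalesce(1) merges all partitions into one without a full shuffle,
        // so saveAsTextFile writes a single part-00000 file
        rdd.coalesce(1).saveAsTextFile("/tmp/single-partition-output") // hypothetical path
        sc.stop()
      }
    }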
posted @ 2016-04-18 10:56