Abstract: PySpark has a gotcha where `write.mode("overwrite")` in the code below appears to have no effect. The code sets `spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")` and then `db_df.repartition(1).write.mode("o`… Read more
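The abstract is truncated, so the post's actual fix is not visible here. A common cause of "overwrite does not take effect" on partitioned Hive tables is Spark's default static partition-overwrite behavior; the sketch below shows the usual combination of settings, under that assumption. The table name `db.partitioned_table` is hypothetical, and `db_df` is the DataFrame named in the abstract.

```python
from pyspark.sql import SparkSession

# Hedged sketch, assuming the issue is the partition-overwrite mode;
# not necessarily the fix the original post describes.
spark = (
    SparkSession.builder
    .appName("dynamic-partition-overwrite")
    .enableHiveSupport()
    .getOrCreate()
)

# Allow dynamic partitions; valid values for the mode are "strict"/"nonstrict".
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

# Spark >= 2.3: overwrite only the partitions present in the DataFrame,
# instead of truncating the whole table (the default "static" mode).
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

(db_df.repartition(1)                     # db_df: DataFrame from the abstract
      .write
      .mode("overwrite")
      .insertInto("db.partitioned_table"))  # hypothetical target table
```

With `partitionOverwriteMode` left at its default `"static"`, `mode("overwrite")` combined with `insertInto` can drop or rewrite more partitions than intended, which often presents as the overwrite "not working" as expected.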
posted @ 2021-02-22 15:04 listenviolet Views(1524) Comments(0) Recommended(0)