Uncle's Experience Sharing (84): Setting hive.exec.max.dynamic.partitions in Spark SQL Has No Effect
Spark 2.4
Running the following in Spark SQL:
set hive.exec.max.dynamic.partitions=10000;
and then executing the SQL still fails with:
org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1001, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1001.
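To make the symptom concrete, here is a minimal Scala sketch of the pattern that hits this error. The table and column names (`target_table`, `source_table`, `dt`, `col1`, `col2`) are hypothetical and only illustrate the shape of a dynamic-partition insert:

import org.apache.spark.sql.SparkSession

// Minimal repro sketch. Table names (source_table, target_table) and columns
// are hypothetical, used only to show the shape of the failing statement.
val spark = SparkSession.builder()
  .appName("dynamic-partitions-repro")
  .enableHiveSupport()
  .getOrCreate()

// SET only updates the Spark SQL session state; the embedded HiveClient
// keeps the default limit of 1000 that it was created with.
spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("SET hive.exec.max.dynamic.partitions=10000")

// A dynamic-partition insert that creates more than 1000 partitions
// still fails with the HiveException quoted above.
spark.sql(
  """INSERT OVERWRITE TABLE target_table PARTITION (dt)
    |SELECT col1, col2, dt FROM source_table""".stripMargin)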
The default value of hive.exec.max.dynamic.partitions is 1000, and the change made with SET does not take effect.
The reason is as follows:
`HiveClient` does not know the new value 1001. There is no way to change the default value of `hive.exec.max.dynamic.partitions` of `HiveClient` with the `SET` command.
The root cause is that `hive` parameters are passed to `HiveClient` on creation. So, the workaround is to use `--hiveconf` when starting `spark-shell`.
The solution is to set the parameter with --hiveconf when starting spark-sql:
spark-sql --hiveconf hive.exec.max.dynamic.partitions=10000
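Besides the --hiveconf flag, applications that create a SparkSession programmatically can try to supply the value before the session (and thus the `HiveClient`) is created. This is a sketch based on the root cause above, not something stated in the JIRA: it assumes that the documented `spark.hadoop.` prefix, which forwards settings into the Hadoop configuration, also reaches the Hive client, so verify it on your Spark version.

import org.apache.spark.sql.SparkSession

// Sketch: pass the Hive setting at session-creation time instead of via SET.
// Assumption: the spark.hadoop. prefix forwards the entry into the Hadoop
// configuration handed to the Hive client when the session is created.
val spark = SparkSession.builder()
  .appName("dynamic-partitions-workaround")
  .config("spark.hadoop.hive.exec.max.dynamic.partitions", "10000")
  .enableHiveSupport()
  .getOrCreate()

Either way, the key point is the same: the value has to be in place when the `HiveClient` is constructed, not set afterwards inside the SQL session.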
Reference:
https://issues.apache.org/jira/browse/SPARK-19881