
Too many dynamic partitions: [Fatal Error] Operator FS_2 (id=2): Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode.


hive> insert into table sogouq_test partition(query_time) select user_id,query_word,query_order,click_order,url,query_time from sogouq_test_tmp;
Total MapReduce jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1409113942738_0026, Tracking URL = http://centos1:8088/proxy/application_1409113942738_0026/
Kill Command = /home/hadoop-2.2/bin/hadoop job  -kill job_1409113942738_0026
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2014-08-27 03:55:16,868 Stage-1 map = 0%,  reduce = 0%
[Fatal Error] Operator FS_2 (id=2): Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode.
Sample of 100 partitions created under hdfs://centos1:8020/hive/scratchdir/hive_2014-08-27_03-55-09_118_348369539322185503-1/_tmp.-ext-10002:
        .../query_time=20111230000005
        .../query_time=20111230000007
        .../query_time=20111230000008
        .../query_time=20111230000009
        .../query_time=20111230000010
        .../query_time=20111230000011


Check the current maximum number of dynamic partitions per node:

hive> set hive.exec.max.dynamic.partitions.pernode;
hive.exec.max.dynamic.partitions.pernode=100

The default is only 100, which is smaller than the number of distinct query_time values being written. Setting this parameter to a larger value resolves the problem.
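For example, a minimal sketch of raising the limits for the current session before re-running the insert (the value 10000 here is an arbitrary choice and should be sized to the actual number of distinct query_time values; the overall limit hive.exec.max.dynamic.partitions may also need to be raised, since it caps the total across all nodes):

hive> set hive.exec.max.dynamic.partitions.pernode=10000;
hive> set hive.exec.max.dynamic.partitions=10000;
hive> insert into table sogouq_test partition(query_time) select user_id,query_word,query_order,click_order,url,query_time from sogouq_test_tmp;

These settings only affect the current session; to make them permanent they would have to be added to hive-site.xml.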
