HBase HFile bulk-load error: Trying to load more than 32 hfiles to one family of one region


When bulk-loading HFiles, if one column family of a single region ends up with more than the default limit of 32 HFiles, the load fails with the following error:

ERROR mapreduce.LoadIncrementalHFiles: Trying to load more than 32 hfiles to family d of region with start key
Exception in thread "main" java.io.IOException: Trying to load more than 32 hfiles to one family of one region
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:288)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:842)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:847)

Solution:

Add the following property to hbase-site.xml:

<property>
  <name>hbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily</name>
  <value>3200</value>
</property>
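
If editing hbase-site.xml on the client is not convenient, the same property can also be set on the client-side Configuration before running the bulk load. Below is a minimal sketch, assuming the HBase 1.x client API; the table name my_table and the HFile directory /tmp/hfile_output are hypothetical placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

public class BulkLoadDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Raise the per-region, per-family HFile limit (default 32);
        // same effect as the hbase-site.xml property above.
        conf.setInt("hbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily", 3200);

        TableName tableName = TableName.valueOf("my_table"); // hypothetical table name
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin();
             Table table = conn.getTable(tableName);
             RegionLocator locator = conn.getRegionLocator(tableName)) {
            LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
            // "/tmp/hfile_output" is the HFile output directory (hypothetical path)
            loader.doBulkLoad(new Path("/tmp/hfile_output"), admin, table, locator);
        }
    }
}

Since LoadIncrementalHFiles runs through ToolRunner (visible in the stack trace above), the property can also be passed on the command line as -Dhbase.mapreduce.bulkload.max.hfiles.perRegion.perFamily=3200.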
