Hive error: return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. Exception when loading 1 in table uniaction1 with loadPath=
When using INSERT INTO to load data from a source table into the target table, the following error is reported:
insert into uniaction1 values('136.206.220.16','1542011089896','www.mi.com','Buy','2018-11-12','海南','67084475796635524');

ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. Exception when loading 1 in table uniaction1 with loadPath=hdfs://host:8020/warehouse/tablespace/managed/hive/db1.db/action1
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. Exception when loading 1 in table uniaction1 with loadPath=hdfs://host:8020/warehouse/tablespace/managed/hive/db1.db/action1 (state=08S01,code=1)
Target table schema:
0: jdbc:hive2://node3:2181,node2:2181,node1:2> desc uniaction1;
+--------------------------+------------+----------+
|         col_name         | data_type  | comment  |
+--------------------------+------------+----------+
| ipaddress                | string     |          |
| thetimestamp             | string     |          |
| web                      | string     |          |
| operator                 | string     |          |
| thedate                  | string     |          |
| prov                     | string     |          |
| userid                   | string     |          |
|                          | NULL       | NULL     |
| # Partition Information  | NULL       | NULL     |
| # col_name               | data_type  | comment  |
| thedate                  | string     |          |
| prov                     | string     |          |
| userid                   | string     |          |
+--------------------------+------------+----------+
13 rows selected (0.173 seconds)
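For context, this insert only works at all as a dynamic-partition insert: the last three values in the VALUES list map to the partition columns (thedate, prov, userid), so the Chinese value '海南' has to become part of a partition name. A minimal sketch of the equivalent explicit form (the SET lines are the usual dynamic-partition settings, assumed here, not shown in the original post):

```sql
-- Assumed settings for dynamic-partition inserts (sketch, not from the original post)
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- The trailing values ('2018-11-12', '海南', '670...') feed the partition
-- columns; '海南' must be stored as a partition name in the metastore,
-- which is exactly where the creation fails.
INSERT INTO uniaction1 PARTITION (thedate, prov, userid)
VALUES ('136.206.220.16', '1542011089896', 'www.mi.com', 'Buy',
        '2018-11-12', '海南', '67084475796635524');
```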
Yes, the problem is that a column containing Chinese data is being used as a dynamic partition column, and the partition cannot be created automatically. I can't say exactly why,
but others have managed to solve it:
Many of the MySQL encoding-change recipes found online are unreliable: they do not guarantee that the encoding of every column in every table actually gets corrected.
The reason a partition with a Chinese value cannot be created is stated clearly in the log: a Hive metastore exception. So the problem lies in the Hive metadata. After some digging it turned out that the PART_NAME column of the PARTITIONS table was still in latin1 encoding; after changing it to utf8, partitions with Chinese names could be created. As for how exactly to change it, search online yourself,
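A minimal sketch of that fix, run against the MySQL database backing the Hive metastore (the column lengths match the standard metastore schema, but verify them against your own metastore version before running, and back up the metastore database first):

```sql
-- Run in the Hive metastore database in MySQL (often named `hive` or `metastore`).
-- Convert only the columns that store partition names/values to utf8,
-- instead of blindly converting whole tables.
ALTER TABLE PARTITIONS
  MODIFY PART_NAME VARCHAR(767) CHARACTER SET utf8;

ALTER TABLE PARTITION_KEY_VALS
  MODIFY PART_KEY_VAL VARCHAR(256) CHARACTER SET utf8;
```

Converting just these columns avoids the all-tables encoding rewrites that, as noted above, often miss columns.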
https://www.oschina.net/question/2909997_2289170
All deeds are of one's own making; fortune and misfortune arise entirely from the mind.