On handling NULL values when exporting data to a MySQL database with Sqoop.
Today I was working on Sqoop imports and exports. The export to MySQL kept failing: whenever a NULL field was encountered, the following error appeared.
WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:967)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:705)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:894)
18/09/05 12:02:12 INFO input.FileInputFormat: Total input paths to process : 1
ERROR tool.ExportTool: Error during export:
Export job failed!
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:439)
at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
I then found that some fields in the Hive table were NULL, and the export command did not tell Sqoop how NULL values are represented, so those fields could not be parsed. After adding NULL handling for string and non-string columns, the data loaded into the database successfully:
--input-null-string '\\N' \
--input-null-non-string '\\N' \
Adding these two options to the Sqoop script makes the export work. The full Sqoop script is as follows:
sqoop export \
--connect jdbc:mysql://172.16.230.138/report \
--username u_report \
--password 1234 \
--table middle_table1 \
--export-dir /user/hive/warehouse/middle_table_copy/* \
--input-null-string '\\N' \
--input-null-non-string '\\N' \
--input-fields-terminated-by '\t'
With that, the problem was solved and the data was successfully exported into the relational database MySQL.
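To see why the '\\N' marker matters: Hive's default text serialization writes NULL columns as the literal two characters \N, and the two options above tell Sqoop to map that marker back to SQL NULL when reading the export files. Here is a minimal Python sketch (an illustration only, not Sqoop's actual parsing code) of that mapping:

```python
# Hive's default text representation of NULL is the literal two characters \N;
# --input-null-string / --input-null-non-string '\\N' tell Sqoop to treat that
# marker as SQL NULL. This sketch mimics that mapping for one delimited line.

NULL_MARKER = "\\N"  # the literal backslash-N written by Hive for NULL columns

def parse_line(line, field_sep="\t"):
    """Split a tab-delimited Hive text row and map the NULL marker to None."""
    return [None if field == NULL_MARKER else field
            for field in line.rstrip("\n").split(field_sep)]

# A row whose middle column is NULL looks like "1\t\N\tbeijing" on HDFS:
row = parse_line("1\t\\N\tbeijing\n")
print(row)  # the \N field becomes None, i.e. SQL NULL on the MySQL side
```

Without the two options, Sqoop would try to load the raw string "\N" into the target column, which fails for non-string column types, producing an export-job failure like the one above.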