大叔's Troubleshooting Notes (16): Spark writing to a Hive external table fails with ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat
Spark 2.1.1
Spark fails when writing data into a Hive external table whose underlying storage is HBase:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat
at org.apache.spark.sql.hive.SparkHiveWriterContainer.outputFormat$lzycompute(hiveWriterContainers.scala:82)
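For context, here is a minimal sketch of the kind of write that takes this code path. The database, table, and column names are hypothetical, and the target is assumed to be a Hive external table mapped to HBase via the HBase storage handler:

    import org.apache.spark.sql.SparkSession

    // hypothetical target table, created in Hive roughly like this:
    //   CREATE EXTERNAL TABLE test.user_profile (key string, name string)
    //   STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    //   WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:name")
    //   TBLPROPERTIES ("hbase.table.name" = "user_profile");

    val spark = SparkSession.builder()
      .appName("hive-hbase-write")
      .enableHiveSupport()
      .getOrCreate()

    // the INSERT is executed through SparkHiveWriterContainer,
    // which is where the ClassCastException above is thrown
    spark.sql(
      "INSERT INTO TABLE test.user_profile SELECT key, name FROM test.staging_profile")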
The cast happens in org.apache.spark.sql.hive.SparkHiveWriterContainer:

    @transient private lazy val outputFormat =
      conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
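This cast can never succeed here: HiveHBaseTableOutputFormat implements org.apache.hadoop.mapred.OutputFormat but not Hive's HiveOutputFormat subinterface that the cast demands. A quick standalone check (assuming the hive-hbase-handler jar is on the classpath, e.g. in a spark-shell) confirms the two types are unrelated:

    import org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat
    import org.apache.hadoop.hive.ql.io.HiveOutputFormat

    // prints false: HiveHBaseTableOutputFormat is not a HiveOutputFormat,
    // so the unconditional asInstanceOf above must throw ClassCastException
    println(classOf[HiveOutputFormat[_, _]]
      .isAssignableFrom(classOf[HiveHBaseTableOutputFormat]))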
The error comes from this line. Reading the surrounding code shows that, on this code path, the outputFormat value is never actually used, so we can return null instead of failing the cast:
    @transient private lazy val outputFormat =
      // conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
      conf.value.getOutputFormat match {
        case format if format.isInstanceOf[HiveOutputFormat[AnyRef, Writable]] =>
          format.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
        case _ => null
      }
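Because of type erasure, the isInstanceOf guard only tests against the raw HiveOutputFormat class, which is all we need: real HiveOutputFormat implementations still get cast as before, while anything else (such as HiveHBaseTableOutputFormat) now yields null rather than a ClassCastException. Since outputFormat is a @transient lazy val that is not consulted on this write path, the null is never dereferenced.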
With that patch, the problem is solved. The upstream discussion is here: https://issues.apache.org/jira/browse/SPARK-6628
---------------------------------------------------------------- That's all, this is 大魔王先生's divider :) ----------------------------------------------------------------
- 大魔王先生's abilities are limited, so there may be mistakes in this article; corrections and additions are welcome!
- Thanks for reading. If this article was useful to you, please give 大魔王先生 a little like. Thank you!