Summary of Flink and Hive Integration Errors


1. Caused by: java.lang.ClassNotFoundException: org.apache.hive.common.util.HiveVersionInfo

Cause: Flink's lib directory is missing hive-exec-3.1.2.jar.
Fix: cp /usr/local/hive/lib/hive-exec-3.1.2.jar /usr/local/flink/lib/
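Before copying, it can help to confirm which jar under Hive's lib directory actually matches. A minimal sketch of the lookup, using a stand-in temp directory so it runs anywhere (on a real node you would set `HIVE_LIB=/usr/local/hive/lib` and skip the `touch` lines, which exist only to make this sketch self-contained):

```shell
# Stand-in directory with fixture jar names; real path: /usr/local/hive/lib
HIVE_LIB="$(mktemp -d)"
touch "$HIVE_LIB/hive-exec-3.1.2.jar" "$HIVE_LIB/hive-common-3.1.2.jar"

# List candidate jars by name prefix to find the exact version installed:
found="$(ls "$HIVE_LIB" | grep '^hive-exec')"
echo "$found"
```

On a real cluster this tells you the exact hive-exec version string to use in the cp command above.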

2. [ERROR] Could not execute SQL statement. Reason:java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream

Cause: the Hadoop dependencies are missing, or the Hadoop classpath environment variable is not set.
Fix: export HADOOP_CLASSPATH=$(hadoop classpath) (this can be added directly to the environment, e.g. in ~/.bashrc)
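The command substitution `$(...)` matters here: the variable must hold the *output* of `hadoop classpath`, not the literal string. Since this sketch may run on a machine without Hadoop installed, a stand-in function (an assumption, for illustration only) plays the role of the real command:

```shell
# Stand-in for the real `hadoop classpath` command, which prints the
# colon-separated classpath of the local Hadoop installation.
hadoop_classpath_demo() {
  echo '/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*'
}

# Command substitution captures the command's output into the variable:
export HADOOP_CLASSPATH="$(hadoop_classpath_demo)"
echo "$HADOOP_CLASSPATH"
```

On a real node, run `export HADOOP_CLASSPATH=$(hadoop classpath)` before starting the Flink cluster or SQL client, or add that line to ~/.bashrc so it survives new shells.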

3. Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'hive' that implements 'org.apache.flink.table.factories.CatalogFactory' in the classpath

Cause: Flink's lib directory is missing flink-connector-hive_2.12-1.13.0.jar (the jar's Flink and Scala versions must match your installation).
Fix: download the matching connector jar and copy it in: cp flink-connector-hive_2.12-1.13.0.jar /usr/local/flink/lib/

4. Caused by: java.lang.NoClassDefFoundError: com/ctc/wstx/io/InputBootstrapper

Cause: missing woodstox-core-5.0.3.jar.
Fix: cp /usr/local/hadoop/share/hadoop/common/lib/woodstox-core-5.0.3.jar /usr/local/flink/lib/

5. Caused by: java.lang.NoClassDefFoundError: org/codehaus/stax2/XMLInputFactory2

Cause: missing stax2-api-3.1.4.jar.
Fix: cp /usr/local/hadoop/share/hadoop/common/lib/stax2-api-3.1.4.jar /usr/local/flink/lib/

6. Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration2.Configuration

Cause: missing commons-configuration2-2.1.1.jar.
Fix: cp /usr/local/hadoop/share/hadoop/common/lib/commons-configuration2-2.1.1.jar /usr/local/flink/lib/
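Errors 4 through 6 all follow the same pattern: a Hadoop helper jar that Flink needs on its classpath. A hedged sketch of copying them in one loop, skipping any that are absent. Temp directories stand in for the real paths so the sketch is runnable anywhere; on a real node use HADOOP_LIB=/usr/local/hadoop/share/hadoop/common/lib and FLINK_LIB=/usr/local/flink/lib and drop the `touch` fixture lines:

```shell
HADOOP_LIB="$(mktemp -d)"   # real: /usr/local/hadoop/share/hadoop/common/lib
FLINK_LIB="$(mktemp -d)"    # real: /usr/local/flink/lib

# Fixture files so the sketch is self-contained; commons-configuration2 is
# deliberately left out here to demonstrate the skip branch below.
touch "$HADOOP_LIB/woodstox-core-5.0.3.jar" "$HADOOP_LIB/stax2-api-3.1.4.jar"

# Copy each required jar into Flink's lib, reporting any that are missing:
for jar in woodstox-core-5.0.3.jar stax2-api-3.1.4.jar commons-configuration2-2.1.1.jar; do
  if [ -f "$HADOOP_LIB/$jar" ]; then
    cp "$HADOOP_LIB/$jar" "$FLINK_LIB/"
  else
    echo "not found: $jar"
  fi
done
```

After copying jars into /usr/local/flink/lib, restart the Flink cluster so the new classpath takes effect.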
posted @ wang_zai