Could not find any factory for identifier 'kafka' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath

Configuring a Flink SQL table in Java to connect to Kafka.

For example:

tableEnv.executeSql("CREATE TABLE invalidCtp (\n" +
        "  sys_name STRING,\n" +
        "  broker_id STRING,\n" +
        "  investor_id STRING,\n" +
        "  row_rank BIGINT\n" +
        ") WITH (\n" +
        "  'connector' = 'kafka',\n" +
        "  'topic' = 'invalidCtpDetail',\n" +
        "  'properties.bootstrap.servers' = '47.104.234.54:9092',\n" +
        // "  'connector.startup-mode' = 'latest-offset',\n" +  // legacy option of the pre-1.11 connector
        "  'scan.startup.mode' = 'earliest-offset',\n" +
        // "  'kafka.auto.offset.reset' = 'latest',\n" +  // not a table option in the 1.12 connector; Kafka client settings go through the properties.* prefix
        "  'format' = 'json'\n" +
        ")");
This runs locally, but on the server it fails with: Flink 1.12 Could not find any factory for identifier 'kafka' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath
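For context, a minimal, self-contained sketch of the job around this DDL (the class name is illustrative, not from the original post). Note that the factory lookup that fails here happens when a query over the table is planned, not when the CREATE TABLE string is registered:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class InvalidCtpJob {
    public static void main(String[] args) {
        // Standard Flink 1.12 setup: a streaming environment bridged to the
        // Table API, using the blink planner.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);

        // Register the Kafka-backed table (condensed version of the DDL above).
        tableEnv.executeSql("CREATE TABLE invalidCtp (" +
                " sys_name STRING, broker_id STRING, investor_id STRING, row_rank BIGINT" +
                ") WITH (" +
                " 'connector' = 'kafka'," +
                " 'topic' = 'invalidCtpDetail'," +
                " 'properties.bootstrap.servers' = '47.104.234.54:9092'," +
                " 'scan.startup.mode' = 'earliest-offset'," +
                " 'format' = 'json')");

        // Planning this query triggers the DynamicTableFactory lookup; without
        // flink-sql-connector-kafka on the classpath it throws the error above.
        tableEnv.executeSql("SELECT * FROM invalidCtp").print();
    }
}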

Solution:

Add the following dependency to pom.xml (the matching version can also be downloaded from the site below):

https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-kafka_2.11/1.12.1

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <!--<scope>provided</scope>-->
</dependency>
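For reference, the two placeholders are normally defined in the pom's <properties> section. A minimal sketch, assuming the versions from the download link above (Flink 1.12.1 built for Scala 2.11):

<properties>
    <flink.version>1.12.1</flink.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>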
Then place that dependency jar (flink-sql-connector-kafka_${scala.binary.version}) into the lib directory of the Flink installation.

After adding this dependency, submitting the job may fail with a new error: java.lang.ClassCastException: LinkedMap cannot be cast to LinkedMap. This happens when the same class is loaded twice, by two different classloaders, because the connector classes are present both in Flink's lib directory and inside the job jar.

Solution:

Add the following to conf/flink-conf.yaml and restart Flink, so that classes are resolved from Flink's own classpath before the job jar:

classloader.resolve-order: parent-first
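An alternative fix is hinted at by the commented-out scope in the pom snippet above: since the cast fails because the connector classes are shipped twice, you can stop bundling them into the job jar instead of forcing parent-first resolution. A sketch, assuming the jar stays in Flink's lib directory:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>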
Copyright notice: this is an original article by CSDN blogger "wangandh", licensed under CC 4.0 BY-SA. Include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/DH2442897094/article/details/120220852
