Could not find any factory for identifier 'kafka' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath

Configuring a Flink SQL table in Java to read from Kafka.

For example:

tableEnv.executeSql("CREATE TABLE invalidCtp (\n" +
        "  sys_name STRING,\n" +
        "  broker_id STRING,\n" +
        "  investor_id STRING,\n" +
        "  row_rank BIGINT\n" +
        ") WITH (\n" +
        "  'connector' = 'kafka',\n" +
        "  'topic' = 'invalidCtpDetail',\n" +
        "  'properties.bootstrap.servers' = '47.104.234.54:9092',\n" +
        // "  'connector.startup-mode' = 'latest-offset',\n" +
        "  'scan.startup.mode' = 'earliest-offset',\n" +
        // "  'kafka.auto.offset.reset' = 'latest',\n" +
        "  'format' = 'json'\n" +
        ")");
This runs fine locally, but on the server it fails with: Flink 1.12 Could not find any factory for identifier 'kafka' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath

Solution:

Add the dependency to pom.xml (you can also download the matching version from the site below):

https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-kafka_2.11/1.12.1

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <!--<scope>provided</scope>-->
</dependency>
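
The ${scala.binary.version} and ${flink.version} placeholders must be defined in the pom's <properties> section. A sketch matching the Scala 2.11 / Flink 1.12.1 artifact linked above (the exact values are an assumption; they must match your cluster):

<properties>
    <!-- Must match the Flink version running on the server -->
    <flink.version>1.12.1</flink.version>
    <!-- Scala version baked into the connector artifact name -->
    <scala.binary.version>2.11</scala.binary.version>
</properties>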
Then copy that dependency JAR (flink-sql-connector-kafka_${scala.binary.version}) into the lib directory of the Flink installation.


After adding this JAR, submitting a Flink job may fail with a new error: java.lang.ClassCastException: LinkedMap cannot be cast to LinkedMap

Solution:

Add the following to conf/flink-conf.yaml and restart Flink:

classloader.resolve-order: parent-first
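
Parent-first loading resolves the error because the same class (commonly commons-collections' LinkedMap, which the Kafka consumer uses for offset tracking) otherwise gets loaded once by Flink's parent classloader from lib/ and again by the job's child-first classloader, and the two copies are not cast-compatible. A narrower alternative, not from the original post, is to keep the default child-first order and push only the offending package to the parent:

# conf/flink-conf.yaml -- assumption: the duplicated class is
# org.apache.commons.collections.map.LinkedMap
classloader.parent-first-patterns.additional: org.apache.commons.collections

Marking the connector dependency as provided in the pom (the commented-out <scope> above) achieves the same end by keeping the JAR out of the fat job JAR entirely, so only the copy in lib/ is ever loaded.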
————————————————
Copyright notice: This is an original article by CSDN blogger "wangandh", licensed under CC 4.0 BY-SA. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/DH2442897094/article/details/120220852
