Table API & SQL: Summary of Common Problems
Official documentation reference:
Summary of common problems:
Problem 1:
Exception in thread "main" org.apache.flink.table.api.ValidationException: Unable to create a sink for writing table 'default_catalog.default_database.order_result'.

Table options are:

'connector' = 'jdbc'
'driver' = 'com.mysql.jdbc.Driver'
'password' = 'xxxxx'
'sink.buffer-flush.interval' = '2s'
'sink.buffer-flush.max-rows' = '200'
'table-name' = 'order_result'
'url' = 'jdbc:mysql://xx.xxx.xx.xxx:3306/fk_test'
'username' = 'root'
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:166)
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:362)
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:220)
    at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:164)
    at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:164)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:164)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1267)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:675)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:759)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:665)
    at flink.cdc.OrderInfo.main(OrderInfo.java:83)
Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option: 'connector' = 'jdbc'
    at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:385)
    at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:372)
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:159)
    ... 18 more
Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.

Available factory identifiers are:

blackhole
datagen
filesystem
mysql-cdc
print
    at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:245)
    at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:382)
    ... 20 more

Process finished with exit code 1
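The last "Caused by" is the key: no factory for identifier 'jdbc' is on the classpath (only blackhole, datagen, filesystem, mysql-cdc and print were found), which means the flink-connector-jdbc dependency is missing. A minimal sketch of the fix, assuming a Maven project on Flink 1.13 built against Scala 2.11 (the version numbers below are illustrative, match them to your own Flink version):

    <!-- Provides the 'jdbc' factory identifier for Table/SQL sources and sinks -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-jdbc_2.11</artifactId>
        <version>1.13.1</version>
    </dependency>
    <!-- MySQL driver matching 'driver' = 'com.mysql.jdbc.Driver' in the DDL -->
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.49</version>
    </dependency>

When the job is submitted to a cluster instead of run from the IDE, the connector jar must also be visible to the cluster, for example by bundling it into the job's fat jar or placing it under Flink's lib/ directory.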
Problem 2:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/table/api/bridge/java/StreamTableEnvironment
    at flink.cdc.OrderInfo.main(OrderInfo.java:11)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.table.api.bridge.java.StreamTableEnvironment
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 1 more
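StreamTableEnvironment lives in the flink-table-api-java-bridge module, so this NoClassDefFoundError means the Table API bridge dependency is missing at runtime (or is marked provided while running from the IDE). A hedged sketch of the Maven dependencies, again with illustrative version numbers; the blink planner artifact is what Flink 1.13 uses by default to execute Table API / SQL programs:

    <!-- org.apache.flink.table.api.bridge.java.StreamTableEnvironment -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-java-bridge_2.11</artifactId>
        <version>1.13.1</version>
    </dependency>
    <!-- Planner needed at runtime to translate and run Table API / SQL jobs -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner-blink_2.11</artifactId>
        <version>1.13.1</version>
    </dependency>

If these are declared with <scope>provided</scope>, running from IntelliJ IDEA additionally requires ticking "Include dependencies with 'provided' scope" in the run configuration.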
Problem 3:
Exception in thread "main" java.lang.IllegalStateException: please declare primary key for sink table when query contains update/delete record.
    at org.apache.flink.util.Preconditions.checkState(Preconditions.java:198)
    at org.apache.flink.connector.jdbc.table.JdbcDynamicTableSink.validatePrimaryKey(JdbcDynamicTableSink.java:72)
    at org.apache.flink.connector.jdbc.table.JdbcDynamicTableSink.getChangelogMode(JdbcDynamicTableSink.java:63)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChangelogModeInferenceProgram$SatisfyModifyKindSetTraitVisitor.visit(FlinkChangelogModeInferenceProgram.scala:124)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChangelogModeInferenceProgram.optimize(FlinkChangelogModeInferenceProgram.scala:50)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChangelogModeInferenceProgram.optimize(FlinkChangelogModeInferenceProgram.scala:39)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkGroupProgram$$anonfun$optimize$1$$anonfun$apply$1.apply(FlinkGroupProgram.scala:63)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkGroupProgram$$anonfun$optimize$1$$anonfun$apply$1.apply(FlinkGroupProgram.scala:60)
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
    at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkGroupProgram$$anonfun$optimize$1.apply(FlinkGroupProgram.scala:60)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkGroupProgram$$anonfun$optimize$1.apply(FlinkGroupProgram.scala:55)
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
    at scala.collection.immutable.Range.foreach(Range.scala:160)
    at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
    at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkGroupProgram.optimize(FlinkGroupProgram.scala:55)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:62)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:58)
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
    at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104)
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram.optimize(FlinkChainedProgram.scala:57)
    at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.optimizeTree(StreamCommonSubGraphBasedOptimizer.scala:163)
    at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.doOptimize(StreamCommonSubGraphBasedOptimizer.scala:79)
    at org.apache.flink.table.planner.plan.optimize.CommonSubGraphBasedOptimizer.optimize(CommonSubGraphBasedOptimizer.scala:77)
    at org.apache.flink.table.planner.delegation.PlannerBase.optimize(PlannerBase.scala:286)
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1267)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:675)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:759)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:665)
    at flink.cdc.OrderInfo.main(OrderInfo.java:87)
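Here the query reads from a CDC source and therefore produces update/delete records, but the JDBC sink table was declared without a primary key, so it stays in append mode and JdbcDynamicTableSink rejects the changelog. Declaring a primary key (NOT ENFORCED) in the sink DDL switches the sink to upsert mode. An illustrative DDL reusing the options from Problem 1 (the column names here are assumptions, not the author's actual schema):

    CREATE TABLE order_result (
        order_id     BIGINT,
        order_amount DECIMAL(10, 2),
        -- Upsert mode requires a declared key; NOT ENFORCED because Flink
        -- does not own the data and cannot validate the constraint itself
        PRIMARY KEY (order_id) NOT ENFORCED
    ) WITH (
        'connector' = 'jdbc',
        'url' = 'jdbc:mysql://xx.xxx.xx.xxx:3306/fk_test',
        'table-name' = 'order_result',
        'username' = 'root',
        'password' = 'xxxxx',
        'driver' = 'com.mysql.jdbc.Driver',
        'sink.buffer-flush.max-rows' = '200',
        'sink.buffer-flush.interval' = '2s'
    );

The declared key should match the unique or primary key of the underlying MySQL table so that upserts land on the right rows.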
Problem 4: to be continued...
References:
https://blog.lixuemin.com/2020/12/11/flink/Flink-CDC%E8%B8%A9%E5%9D%91%E9%9B%86%E5%90%88/ (Flink CDC pitfall collection)
https://ci.apache.org/projects/flink/flink-docs-release-1.13/zh/docs/dev/table/sql/create/
JDBC SQL Connector
Scan Source: Bounded | Lookup Source: Sync Mode | Sink: Batch | Sink: Streaming Append & Upsert Mode
The JDBC connector allows reading data from and writing data into any relational database with a JDBC driver. This section describes how to set up the JDBC connector to run SQL queries against relational databases.
If a primary key is defined in the DDL, the JDBC sink exchanges UPDATE/DELETE messages with the external system in upsert mode; otherwise it operates in append mode and does not support consuming UPDATE/DELETE messages (this is exactly what Problem 3 above runs into).
https://ci.apache.org/projects/flink/flink-docs-release-1.13/zh/docs/connectors/table/jdbc/
Flink SQL CDC goes live! We summarized 13 production practice lessons
https://my.oschina.net/u/2828172/blog/4545836
FLIP-84: Improve and refactor the TableEnvironment and Table APIs
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=134745878
If writing data into MySQL fails even though the demo code looks correct, take the relevant SQL and execute it directly in MySQL to see what goes wrong.
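As a reference point for such manual checks: in upsert mode the JDBC connector's MySQL dialect writes rows with INSERT ... ON DUPLICATE KEY UPDATE, so a hand-written statement along these lines (column names assumed from the Problem 3 sketch) is a reasonable thing to try in the MySQL client:

    -- Roughly what an upsert into the sink table looks like (assumed columns)
    INSERT INTO order_result (order_id, order_amount)
    VALUES (1001, 99.90)
    ON DUPLICATE KEY UPDATE order_amount = VALUES(order_amount);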
posted on 2021-05-20 17:05 by RICH-ATONE