Flink SQL Production Environment Pitfalls
Background: our Flink SQL program ran fine in local tests, but once deployed to production it threw all kinds of strange errors. This post records the troubleshooting process.
The problems boiled down to two causes:
1. JDK version mismatch
2. Conflicting Flink SQL jar dependencies
Problem 1
2020-09-27 06:06:33,125 INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager [] - Registering TaskManager with ResourceID c6131ee551f94eb9c3db0568f40b4ad2 (akka.tcp://flink@10.42.4.11:6122/user/rpc/taskmanager_0) at ResourceManager
2020-09-27 06:06:46,727 INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager [] - Registering TaskManager with ResourceID 6e7bbcf953908a8bdd42327b40d325c7 (akka.tcp://flink@10.42.1.117:6122/user/rpc/taskmanager_0) at ResourceManager
2020-09-27 06:07:47,447 WARN org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Configuring the job submission via query parameters is deprecated. Please migrate to submitting a JSON request instead.
2020-09-27 06:07:47,524 INFO org.apache.flink.client.ClientUtils [] - Starting program (detached: true)
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - ------------program params-------------------------
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -metadataUrl
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - http://dev-env.jcinfo.com//metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -------------------------------------------
2020-09-27 06:07:47,569 INFO com.jc.dw.metadata.MetadataInfoImpl [] - http get metadata. url:http://dev-env.jcinfo.com//metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 06:07:48,107 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata before sorting:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into user_behavior_sink3 SELECT keyword FROM sql_test_07"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 06:07:48,108 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into user_behavior_sink3 SELECT keyword FROM sql_test_07"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 06:07:49,096 INFO org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file jar:file:/tmp/jars/flink-web-35e52647-4cdb-484b-a37d-bf3949e2acea/flink-web-upload/2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar!/hive-site.xml
2020-09-27 06:07:49,653 INFO com.jc.dw.sql.catalog.HiveCatalogManager [] - getHiveCataLog. name:myhive, defaultDatabase:default
2020-09-27 06:07:49,671 INFO org.apache.flink.table.catalog.hive.HiveCatalog [] - Created HiveCatalog 'myhive'
2020-09-27 06:07:49,820 WARN org.apache.hadoop.util.NativeCodeLoader [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-09-27 06:07:49,916 ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils [] - Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
    at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
    at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at com.jc.dw.sql.Main.main(Main.java:24) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(Unknown Source) [?:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:?]
    at java.util.concurrent.FutureTask.run(Unknown Source) [?:?]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:?]
    at java.lang.Thread.run(Unknown Source) [?:?]
2020-09-27 06:07:49,922 ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils [] - Converting exception to MetaException
2020-09-27 06:07:49,924 WARN org.apache.flink.client.deployment.application.DetachedApplicationRunner [] - Could not execute application:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Failed to create Hive Metastore client
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(Unknown Source) [?:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:?]
    at java.util.concurrent.FutureTask.run(Unknown Source) [?:?]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:?]
    at java.lang.Thread.run(Unknown Source) [?:?]
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to create Hive Metastore client
    at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:105) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
    at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
    at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    ... 12 more
Caused by: java.lang.reflect.InvocationTargetException
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
    at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
    at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    ... 12 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:86) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
    at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
    at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    ... 12 more
Caused by: java.lang.reflect.InvocationTargetException
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
    at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
    at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
    at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
    at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    ... 12 more
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
    at org.apache.hadoop.hive.metastore.utils.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:168) ~[?:?]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:267) ~[?:?]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
    at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
    at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
    at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
    at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
    at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
    ... 12 more
2020-09-27 06:07:49,931 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.
Root cause: a Hive bug that is still unfixed in the Hive 3.x line. HiveMetaStoreClient.resolveUris casts the result of a plain toArray() call to URI[]; that cast only happened to succeed on JDK 8 (where Arrays.asList(...).toArray() returned a typed array), so on JDK 9+ the metastore client fails with the ClassCastException shown above.
https://issues.apache.org/jira/browse/HIVE-22190
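A minimal sketch of that failure mode (the class and method names here are illustrative, not Hive's actual code): an argument-less List.toArray() returns Object[], which cannot be cast to URI[].

```java
import java.net.URI;
import java.util.ArrayList;
import java.util.List;

public class ResolveUrisDemo {
    // Unsafe pattern: toArray() with no argument always returns Object[],
    // so the cast to URI[] throws ClassCastException. (With Arrays.asList,
    // JDK 8 happened to return a typed array, masking the bug; this demo
    // uses ArrayList so the failure reproduces on any JDK.)
    static URI[] unsafe(List<URI> uris) {
        return (URI[]) uris.toArray();
    }

    // Safe pattern: pass a typed array so toArray allocates a URI[].
    static URI[] safe(List<URI> uris) {
        return uris.toArray(new URI[0]);
    }

    public static void main(String[] args) {
        List<URI> uris = new ArrayList<>();
        uris.add(URI.create("thrift://172.31.5.20:9083"));
        System.out.println(safe(uris).length);   // prints 1
        try {
            unsafe(uris);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the log above");
        }
    }
}
```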
Fix: switch the runtime to JDK 1.8.
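To fail fast instead of hitting the obscure ClassCastException, a small startup guard can be added to the job (this helper is hypothetical, not part of the original program):

```java
public class JdkCheck {
    // Returns true only for 1.8.x version strings. With this Hive 3.x
    // metastore client on the classpath, anything newer breaks on
    // HIVE-22190, so warn at startup rather than at catalog registration.
    static boolean isJdk8(String version) {
        return version != null && version.startsWith("1.8.");
    }

    public static void main(String[] args) {
        String v = System.getProperty("java.version");
        System.out.println(v + " -> "
            + (isJdk8(v) ? "ok for Hive catalog" : "expect metastore client failures"));
    }
}
```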
Problem 2
2020-09-27 09:16:44,194 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.
2020-09-27 09:30:05,853 WARN org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Configuring the job submission via query parameters is deprecated. Please migrate to submitting a JSON request instead.
2020-09-27 09:30:06,024 INFO org.apache.flink.client.ClientUtils [] - Starting program (detached: true)
2020-09-27 09:30:06,028 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - ------------program params-------------------------
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -metadataUrl
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - http://dev-env.jcinfo.com/metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -database
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - dw
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -------------------------------------------
2020-09-27 09:30:06,045 INFO com.jc.dw.metadata.MetadataInfoImpl [] - http get metadata. url:http://dev-env.jcinfo.com/metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 09:30:06,412 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata before sorting:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into dwd_baidu_news_test01 SELECT * from dwd_baidu_news LIMIT 10"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 09:30:06,413 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into dwd_baidu_news_test01 SELECT * from dwd_baidu_news LIMIT 10"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 09:30:06,446 INFO org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file jar:file:/tmp/jars/flink-web-2ba55738-0699-4597-bd86-a61c528b22f8/flink-web-upload/daed4a12-a55f-415e-a8b4-09cbe433194b_core-1.11.1-SNAPSHOT.jar!/hive-site.xml
2020-09-27 09:30:06,863 INFO com.jc.dw.sql.catalog.HiveCatalogManager [] - getHiveCataLog. name:myhive, defaultDatabase:dw
2020-09-27 09:30:06,889 INFO org.apache.flink.table.catalog.hive.HiveCatalog [] - Created HiveCatalog 'myhive'
2020-09-27 09:30:06,991 WARN org.apache.hadoop.util.NativeCodeLoader [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-09-27 09:30:07,059 INFO org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Trying to connect to metastore with URI thrift://172.31.5.20:9083
2020-09-27 09:30:07,078 INFO org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Opened a connection to metastore, current connections: 1
2020-09-27 09:30:07,103 INFO org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Connected to metastore.
2020-09-27 09:30:07,104 INFO org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=flink (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-09-27 09:30:07,233 INFO org.apache.flink.table.catalog.hive.HiveCatalog [] - Connected to Hive metastore
2020-09-27 09:30:07,252 INFO org.apache.flink.table.catalog.CatalogManager [] - Set the current default catalog as [myhive] and the current default database as [dw].
2020-09-27 09:30:08,001 WARN org.apache.flink.client.deployment.application.DetachedApplicationRunner [] - Could not execute application:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Unable to instantiate java compiler
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604) [?:1.8.0_265]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_265]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_265]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_265]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_265]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_265]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_265]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: java.lang.IllegalStateException: Unable to instantiate java compiler
    at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:433) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:374) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.lambda$static$0(JaninoRelMetadataProvider.java:109) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:149) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.get(LocalCache.java:3953) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:474) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:487) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.metadata.RelMetadataQueryBase.revise(RelMetadataQueryBase.java:95) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:780) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.rel.rules.ReduceExpressionsRule$ProjectReduceExpressionsRule.onMatch(ReduceExpressionsRule.java:300) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:328) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:562) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:427) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:127) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:223) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:210) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.program.FlinkHepProgram.optimize(FlinkHepProgram.scala:69) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.program.FlinkHepRuleSetProgram.optimize(FlinkHepRuleSetProgram.scala:87) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:62) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:58) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram.optimize(FlinkChainedProgram.scala:57) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.optimizeTree(StreamCommonSubGraphBasedOptimizer.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.doOptimize(StreamCommonSubGraphBasedOptimizer.scala:80) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.plan.optimize.CommonSubGraphBasedOptimizer.optimize(CommonSubGraphBasedOptimizer.scala:77) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.delegation.PlannerBase.optimize(PlannerBase.scala:279) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:700) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:787) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:690) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:172) ~[?:?] at com.jc.dw.sql.Main.main(Main.java:28) ~[?:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_265] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_265] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_265] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_265] at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.11-1.11.2.jar:1.11.2] ... 
13 more Caused by: java.lang.ClassCastException: org.codehaus.janino.CompilerFactory cannot be cast to org.codehaus.commons.compiler.ICompilerFactory at org.codehaus.commons.compiler.CompilerFactoryFactory.getCompilerFactory(CompilerFactoryFactory.java:129) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory(CompilerFactoryFactory.java:79) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:431) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:374) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.lambda$static$0(JaninoRelMetadataProvider.java:109) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:149) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.get(LocalCache.java:3953) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at 
org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:474) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:487) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.RelMetadataQueryBase.revise(RelMetadataQueryBase.java:95) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:780) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.rel.rules.ReduceExpressionsRule$ProjectReduceExpressionsRule.onMatch(ReduceExpressionsRule.java:300) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:328) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:562) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:427) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:127) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:223) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:210) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.plan.optimize.program.FlinkHepProgram.optimize(FlinkHepProgram.scala:69) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at 
org.apache.flink.table.planner.plan.optimize.program.FlinkHepRuleSetProgram.optimize(FlinkHepRuleSetProgram.scala:87) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:62) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:58) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104) ~[flink-dist_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram.optimize(FlinkChainedProgram.scala:57) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.optimizeTree(StreamCommonSubGraphBasedOptimizer.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.doOptimize(StreamCommonSubGraphBasedOptimizer.scala:80) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at 
org.apache.flink.table.planner.plan.optimize.CommonSubGraphBasedOptimizer.optimize(CommonSubGraphBasedOptimizer.scala:77) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.delegation.PlannerBase.optimize(PlannerBase.scala:279) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:700) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:787) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:690) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2] at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:172) ~[?:?] at com.jc.dw.sql.Main.main(Main.java:28) ~[?:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_265] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_265] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_265] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_265] at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.11-1.11.2.jar:1.11.2] ... 13 more 2020-09-27 09:30:08,005 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.
Cause: a JAR conflict. The fat JAR bundled its own copy of the Blink planner and the Janino compiler it depends on, while the Flink distribution already ships them in flink-table-blink. As a result, org.codehaus.janino.CompilerFactory was loaded by one classloader and the org.codehaus.commons.compiler.ICompilerFactory interface by another, and a class can never be cast to a type from a different classloader, hence the ClassCastException above.
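One quick way to confirm this kind of conflict is to list the fat JAR's entries and look for planner/Janino classes that should not be there (a JAR is just a ZIP archive). A small Python sketch, where the JAR built in the main block is only a stand-in so the example is self-contained; in practice you would point scan_jar at your real job JAR:

```python
# Sketch: scan a fat JAR for bundled planner/compiler classes that will
# clash with the copies shipped in flink-dist / flink-table-blink.
import zipfile

# Class-path prefixes that indicate the planner or its compiler is bundled.
CLASH_PREFIXES = (
    "org/codehaus/janino/",
    "org/codehaus/commons/compiler/",
    "org/apache/flink/table/planner/",
)

def scan_jar(path):
    """Return the JAR entries that indicate a planner/Janino conflict."""
    with zipfile.ZipFile(path) as jar:
        return [name for name in jar.namelist()
                if name.startswith(CLASH_PREFIXES)]

if __name__ == "__main__":
    # Build a minimal stand-in JAR containing a fake Janino class
    # (hypothetical file name "demo-job.jar").
    with zipfile.ZipFile("demo-job.jar", "w") as jar:
        jar.writestr("org/codehaus/janino/CompilerFactory.class", b"")
        jar.writestr("com/jc/dw/sql/Main.class", b"")
    # The fake Janino entry is reported -> the dependency must be "provided".
    print(scan_jar("demo-job.jar"))  # -> ['org/codehaus/janino/CompilerFactory.class']
```

If the scan reports any hits for your real artifact, the corresponding dependencies should be excluded from the shaded JAR.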
Fix: do not package flink-table-planner-blink / flink-table-api-java-bridge into the job's fat JAR. Compile against them with "provided" scope and rely on the copies that the Flink distribution already ships in its lib/ directory.
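For a Maven build, the fix above boils down to scoping the two dependencies as provided. A minimal sketch, assuming the versions seen in the logs (Flink 1.11.2, Scala 2.11) and the standard Flink artifact IDs:

```xml
<!-- Compile against the planner and bridge, but do NOT ship them in the
     fat JAR: the Flink distribution provides them at runtime in lib/. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.11</artifactId>
    <version>1.11.2</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge_2.11</artifactId>
    <version>1.11.2</version>
    <scope>provided</scope>
</dependency>
```

With provided scope the classes are on the compile classpath but excluded by the shade/assembly plugin, so only one copy of the planner (and its Janino) exists at runtime.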
Issues still under investigation:
Caused by: java.lang.IllegalArgumentException: Job client must be a CoordinationRequestGateway. This is a bug.
ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.
Exception occurred in REST handler: No jobs included in application.
Caused by: java.lang.IllegalStateException: BUG: vertex bc764cd8ddf7a0cff126f51c16239658_720 tries to allocate a slot when its previous slot request is still pending