Spark 2.2 exception: ERROR SparkUI: Failed to bind SparkUI
The detailed error output is as follows:
19/03/19 11:04:18 INFO util.log: Logging initialized @5402ms
19/03/19 11:04:18 INFO server.Server: jetty-9.3.z-SNAPSHOT
19/03/19 11:04:18 INFO server.Server: Started @5604ms
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
19/03/19 11:04:18 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:333)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:365)
	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2237)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2229)
	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:368)
	at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
	at com.dx.b.streaming.domain.perf.SparkHelper.getAndConfigureSparkSession(SparkHelper.java:96)
	at com.dx.b.streaming.Main.main(Main.java:97)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Cause:
Every Spark application binds its own SparkUI port, starting from the default 4040. If that port is already taken, Spark retries on the next port (4041, 4042, ...). By default it retries at most 16 times; once all 16 retries fail, the application gives up and aborts.
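The retry loop Spark runs here (`Utils.startServiceOnPort`, visible in the stack trace above) can be sketched in plain Java. This is an illustration of the mechanism, not Spark's actual code; the class and method names (`PortRetryDemo`, `bindWithRetries`) are made up for the example:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortRetryDemo {
    // Try basePort first; on a bind failure, advance to the next port,
    // allowing up to maxRetries extra attempts -- the same fallback pattern
    // that produces the "could not bind on port N, attempting N+1" warnings.
    static ServerSocket bindWithRetries(int basePort, int maxRetries) throws IOException {
        IOException last = null;
        for (int offset = 0; offset <= maxRetries; offset++) {
            try {
                return new ServerSocket(basePort + offset); // bound successfully
            } catch (IOException e) {                        // port in use, try next
                last = e;
            }
        }
        throw new IOException("Service failed after " + maxRetries
                + " retries (starting from " + basePort + ")", last);
    }

    public static void main(String[] args) throws IOException {
        // Occupy a port, then watch the retry loop fall through to a later one.
        try (ServerSocket taken = new ServerSocket(0)) {
            int base = taken.getLocalPort();
            try (ServerSocket bound = bindWithRetries(base, 16)) {
                System.out.println("base port " + base + " busy, bound to "
                        + bound.getLocalPort());
            }
        }
    }
}
```

With 17 or more Spark applications (or other processes) already holding ports 4040-4056, all 16 fallback attempts fail and the loop throws, which is exactly the `BindException` in the log.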
Solution
1) When initializing the SparkConf, add conf.set("spark.port.maxRetries", "100");
2) When submitting with spark-submit, add --conf spark.port.maxRetries=100 to the command line;
3) Add spark.port.maxRetries=100 to spark-defaults.conf.
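The three options above can be sketched as follows; the class name, jar name, and chosen port values are placeholders, not values from the original application:

```shell
# Option 1: raise the limit in application code (Java), before the
# SparkSession/SparkContext is created:
#   SparkConf conf = new SparkConf().set("spark.port.maxRetries", "100");

# Option 2: pass it on the spark-submit command line; spark.ui.port can
# additionally pin the UI to a starting port known to be free:
spark-submit \
  --conf spark.port.maxRetries=100 \
  --conf spark.ui.port=4050 \
  --class com.example.Main app.jar   # class and jar are placeholders

# Option 3: make it the default for all jobs, in conf/spark-defaults.conf:
#   spark.port.maxRetries  100
```

Note that a per-application setting (options 1 and 2) only affects that submission, while spark-defaults.conf changes the default for every job submitted from that client.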
Fundamentals are what programmers should really study in depth, for example:
1) The internal structure of List/Set/Map and the differences between them
2) MySQL index storage structure and how to tune it / B-tree properties, lookup complexity, and what affects that complexity...
3) JVM runtime structure, how it works, and how to tune it
4) How Java class loaders work
5) How the GC process works in Java and the collection algorithms it uses
6) How consistent hashing is implemented in Redis and how it differs from plain hashing
7) Java multithreading and thread-pool development and management; the difference between Lock and synchronized
8) Spring IOC/AOP principles; the loading process...