Spark 2.2 exception: ERROR SparkUI: Failed to bind SparkUI

The full error output is as follows:

19/03/19 11:04:18 INFO util.log: Logging initialized @5402ms
19/03/19 11:04:18 INFO server.Server: jetty-9.3.z-SNAPSHOT
19/03/19 11:04:18 INFO server.Server: Started @5604ms
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
19/03/19 11:04:18 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
        at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
        at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
        at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:333)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:365)
        at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
        at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2237)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2229)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:368)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
        at com.dx.b.streaming.domain.perf.SparkHelper.getAndConfigureSparkSession(SparkHelper.java:96)
        at com.dx.b.streaming.Main.main(Main.java:97)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Cause:

Each running Spark application (driver) occupies one SparkUI port, 4040 by default. If that port is taken, Spark retries on successively higher ports (4041, 4042, ...), but only up to a default limit of 16 retries. Once all 16 retries fail, the application gives up and exits with the error above.
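The retry behavior behind the WARN lines above can be sketched with plain JDK sockets. This is an illustrative reimplementation, not Spark's actual code (Spark does this in Utils.startServiceOnPort); the class and method names here are made up for the demo:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortRetryDemo {
    // Mimics Spark's port-probing: try basePort, then basePort+1, ...,
    // allowing up to maxRetries extra attempts before giving up.
    static int bindWithRetries(int basePort, int maxRetries) throws IOException {
        for (int attempt = 0; attempt <= maxRetries; attempt++) {
            int port = basePort + attempt;
            try (ServerSocket s = new ServerSocket(port)) {
                return port;  // bind succeeded; close the probe socket and report the port
            } catch (IOException e) {
                // Port in use -- this is where Spark logs
                // "Service 'SparkUI' could not bind on port X. Attempting port X+1."
            }
        }
        throw new IOException(
            "failed after " + maxRetries + " retries (starting from " + basePort + ")");
    }

    public static void main(String[] args) throws IOException {
        // Occupy a free port so the demo is forced to fall through to a later one.
        try (ServerSocket occupied = new ServerSocket(0)) {
            int base = occupied.getLocalPort();
            int port = bindWithRetries(base, 16);
            System.out.println("port " + base + " busy, bound on " + port);
        }
    }
}
```

When many drivers start on the same host (or stale processes hold ports 4040-4056), every probe in this loop fails and the BindException above is thrown.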

Solution

Raise the retry limit (to 100, for example) in any one of the following ways:

- When initializing the SparkConf in code, add conf.set("spark.port.maxRetries", "100");
- When submitting with spark-submit, add --conf spark.port.maxRetries=100 on the command line;
- Add spark.port.maxRetries=100 to spark-defaults.conf.
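For cluster-wide defaults, the last option looks like this in conf/spark-defaults.conf. Per the hint in the exception message itself, you can also pin the starting port explicitly with spark.ui.port (the value 4050 below is just an example):

```
# conf/spark-defaults.conf
# Allow more probe attempts before giving up (default is 16)
spark.port.maxRetries   100
# Optionally start probing from a known-free port instead of 4040
spark.ui.port           4050
```

The same two properties can be passed per job as --conf flags to spark-submit.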

posted @ 2019-03-19 14:16  cctext