Symptom: a Spark on YARN job fails at submission; the driver loses its executors and the log ends with "Marking as slave lost."
Reference: "Solution for the ClosedChannelException reported when submitting a Spark on YARN job" (Linux公社): http://www.linuxidc.com/Linux/2017-01/140068.htm
Fix, following the article above: add the two properties below to yarn-site.xml on every NodeManager and restart YARN. They disable the physical- and virtual-memory checks with which YARN kills containers that exceed their limits; a killed executor container is what closes the RPC channel and produces the exception.

<!-- Disable the physical-memory check on container usage -->
<property>
  <name>yarn.nodemanager.pmem-check-enabled</name>
  <value>false</value>
</property>
<!-- Disable the virtual-memory check, the usual culprit for killed Spark containers -->
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
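
Disabling both checks removes YARN's memory safety net cluster-wide. A gentler alternative, assuming the containers are being killed for exceeding their memory limit, is to leave the checks on and give each executor more headroom from the Spark side. A minimal sketch: spark.yarn.executor.memoryOverhead is the Spark 2.x property name, and the memory values are illustrative assumptions, not tuned settings.

from pyspark import SparkConf, SparkContext

# Keep YARN's memory checks enabled and raise the per-container headroom
# instead. Values below are illustrative; tune them to the actual workload.
conf = (
    SparkConf()
    .setMaster("yarn")
    .setAppName("pyspark_yarn_test")
    .set("spark.executor.memory", "2g")
    # Off-heap allowance YARN grants on top of the executor heap (MiB).
    .set("spark.yarn.executor.memoryOverhead", "1024")
)
sc = SparkContext(conf=conf)
print(sc.parallelize(range(100)).sum())  # trivial job: verifies executors start
sc.stop()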
Reproduction: running the script from PyCharm's remote debugger against 192.168.2.51:

ssh://root@192.168.2.51:22/usr/bin/python -u /root/.pycharm_helpers/pydev/pydevd.py --multiproc --qt-support=auto --client '0.0.0.0' --port 40894 --file /home/data/crontab_chk_url/pyspark/pyspark_yarn_test.py
pydev debugger: process 3778 is connecting
Connected to pydev debugger (build 172.4343.24)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/12/03 21:30:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/03 21:30:46 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
17/12/03 21:30:47 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/12/03 21:30:55 ERROR client.TransportClient: Failed to send RPC 8836767686150811845 to /192.168.2.51:34970: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
17/12/03 21:30:55 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to get executor loss reason for executor id 1 at RPC address 192.168.2.51:35068, but got no response. Marking as slave lost.

The ClosedChannelException is a symptom rather than the cause: YARN had typically already killed the executor container, so the driver's RPC to it failed and the executor was marked lost. The NodeManager log (look for "running beyond virtual memory limits") confirms which check fired.
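
After applying either fix, a quick way to confirm executors stay up is to rerun a small job. The original pyspark_yarn_test.py is not shown in the log, so the script below is a hypothetical stand-in, not the author's file:

from pyspark.sql import SparkSession

# Hypothetical stand-in for pyspark_yarn_test.py: a job small enough to
# finish quickly but real enough to require a live executor on YARN.
spark = (
    SparkSession.builder
    .master("yarn")
    .appName("pyspark_yarn_test")
    .getOrCreate()
)
rdd = spark.sparkContext.parallelize(range(1000), numSlices=4)
print(rdd.map(lambda x: x * x).sum())  # succeeds only if executors stay up
spark.stop()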