java.nio.channels.ClosedChannelException:
The memory allocated to the node is too small, so YARN kills the process outright, which surfaces as a ClosedChannelException.
Solution:
On every Hadoop node, edit yarn-site.xml and add the following properties (this disables YARN's physical/virtual memory checks):
<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>
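After editing yarn-site.xml on every node, the NodeManagers must be restarted for the change to take effect. A quick grep can confirm the properties were added; the sketch below runs against a sample file written to a temp path (an assumption for self-containment), while on a real cluster YARN_SITE would point at $HADOOP_CONF_DIR/yarn-site.xml:

```shell
# Write a sample yarn-site.xml to a temp file (illustrative only; on a real
# cluster set YARN_SITE="$HADOOP_CONF_DIR/yarn-site.xml" instead).
YARN_SITE=$(mktemp)
cat > "$YARN_SITE" <<'EOF'
<configuration>
  <property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
  </property>
  <property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
  </property>
</configuration>
EOF

# Verify both memory-check flags are present in the config.
for prop in yarn.nodemanager.pmem-check-enabled yarn.nodemanager.vmem-check-enabled; do
  grep -q "$prop" "$YARN_SITE" && echo "$prop: present"
done
```

Note that disabling these checks only hides the symptom; the containers are still over their memory budget, so increasing the executor memory is the longer-term fix.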
-------------
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
Check whether the contents of spark-env.sh are correct (a typo in one of the exported variables is a common cause).
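This error usually means the ApplicationMaster never launched, so it helps to compare spark-env.sh against a known-good minimal version. The paths below are illustrative assumptions; the key requirement is that HADOOP_CONF_DIR (or YARN_CONF_DIR) points at the directory containing yarn-site.xml, otherwise Spark cannot find the ResourceManager:

```shell
# Minimal spark-env.sh sketch for running on YARN.
# JAVA_HOME and HADOOP_HOME below are assumed example paths; adjust them
# to the actual install locations on your cluster.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
export HADOOP_HOME=/opt/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop  # must contain yarn-site.xml
export YARN_CONF_DIR=$HADOOP_CONF_DIR
```

A misspelled variable name (e.g. HADOOP_CONF_DlR) fails silently because shell exports are not validated, which is why a character-by-character check of this file is worthwhile.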