Errors when submitting jobs to YARN with spark-submit

1. Error initializing SparkContext

20/06/29 05:52:43 INFO yarn.Client: Deleted staging directory hdfs://master:9000/user/hadoop/.sparkStaging/application_1593402611188_0003
20/06/29 05:52:43 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.

When you hit this error, don't fixate on it; look at the other errors in the log first. In my case the other error was:

For more detailed output, check application tracking page:http://master:8088/cluster/app/application_1593402611188_0003Then, click on links to logs of each attempt.
Diagnostics: Wrong FS: file://home/hadoop/data/hadoop/tmp/nm-local-dir, expected: file:///
Failing this attempt. Failing the application.

In other words, YARN cannot find its nm-local-dir (note the malformed `file://home/...` URI in the diagnostics, which is missing a slash), so yarn.nodemanager.local-dirs should be configured explicitly in yarn-site.xml:

<property>
<name>yarn.nodemanager.local-dirs</name>
<value>/home/hadoop/data/hadoop/tmp/nm-local-dir</value>
</property>
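The "Wrong FS" diagnostic comes from the malformed URI: in `file://home/hadoop/...` there are only two slashes, so the first path segment `home` is parsed as the URI authority (host), and Hadoop sees a filesystem `file://home` instead of the expected `file:///`. A minimal standard-library Python sketch shows the difference:

```python
from urllib.parse import urlparse

# Malformed: two slashes, so "home" becomes the URI authority (host)
bad = urlparse("file://home/hadoop/data/hadoop/tmp/nm-local-dir")
# Correct: three slashes leave the authority empty and keep the full path
good = urlparse("file:///home/hadoop/data/hadoop/tmp/nm-local-dir")

print(bad.netloc, bad.path)    # home /hadoop/data/hadoop/tmp/nm-local-dir
print(good.netloc, good.path)  #  /home/hadoop/data/hadoop/tmp/nm-local-dir
```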

2. Spark SQL on YARN fails to start: ERROR client.TransportClient: Failed to send RPC

ERROR client.TransportClient: Failed to send RPC 8705795366260026656 to /192.168.182.163:58492: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to get executor loss reason for executor id 1 at RPC address 192.168.182.163:58520, but got no response. Marking as slave lost.
java.io.IOException: Failed to send RPC 8705795366260026656 to /192.168.182.163:58492: java.nio.channels.ClosedChannelException

This mainly happens when the nodes are given too little memory: the container exceeds its memory limit and YARN kills the Spark application, closing the RPC channel.

Add the following to yarn-site.xml to disable the physical/virtual memory checks (this is a workaround; increasing executor memory or yarn.nodemanager.vmem-pmem-ratio also works):

<property>
<name>yarn.nodemanager.pmem-check-enabled</name>
<value>false</value>
</property>
<property>
<name>yarn.nodemanager.vmem-check-enabled</name>
<value>false</value>
</property>
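For context on why YARN kills the container: when the checks are enabled, the NodeManager enforces a virtual-memory ceiling of container memory × yarn.nodemanager.vmem-pmem-ratio (default 2.1). A small illustrative sketch of that check (not Hadoop's actual code; the memory figures below are hypothetical):

```python
# Illustrative sketch of YARN's vmem check (not Hadoop's actual implementation).
# Default yarn.nodemanager.vmem-pmem-ratio is 2.1.
def vmem_limit_mb(container_mb: float, vmem_pmem_ratio: float = 2.1) -> float:
    """Virtual-memory ceiling the NodeManager enforces for a container."""
    return container_mb * vmem_pmem_ratio

def container_killed(vmem_used_mb: float, container_mb: float) -> bool:
    """True if the vmem check (when enabled) would kill the container."""
    return vmem_used_mb > vmem_limit_mb(container_mb)

# A 1 GB container may only use about 2150 MB of virtual memory, so a JVM
# that maps 2.3 GB of vmem is killed even if its resident memory is small.
print(vmem_limit_mb(1024))          # roughly 2150.4
print(container_killed(2300, 1024)) # True
```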

3. org.apache.hadoop.yarn.exceptions.InvalidAuxServiceException: The auxService:spark_shuffle does not exist

The YARN error "Class org.apache.spark.network.yarn.YarnShuffleService not found" has the same fix.

Modify or add the following configuration in yarn-site.xml, and copy "${SPARK_HOME}/lib/spark-2.2.0-yarn-shuffle.jar" into "${HADOOP_HOME}/share/hadoop/yarn/lib/" (then restart the NodeManagers so the aux service is loaded):

<property>
<name>yarn.nodemanager.aux-services</name>
<value>spark_shuffle,mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
<value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
<property>
<name>spark.shuffle.service.port</name>
<value>7337</value>
</property>
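As a quick sanity check after editing the file, the relevant properties can be read back with a short standard-library script (reading the real yarn-site.xml from disk is left out here; the inline sample mirrors the fragment above):

```python
import xml.etree.ElementTree as ET

# Hypothetical helper: read a Hadoop-style *-site.xml into a dict.
def read_hadoop_conf(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value")
            for p in root.iter("property")}

# Inline sample; in practice pass open("/path/to/yarn-site.xml").read().
conf = read_hadoop_conf("""
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>spark_shuffle,mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
    <value>org.apache.spark.network.yarn.YarnShuffleService</value>
  </property>
</configuration>
""")

services = conf["yarn.nodemanager.aux-services"].split(",")
assert "spark_shuffle" in services, "spark_shuffle aux-service not registered"
print(conf["yarn.nodemanager.aux-services.spark_shuffle.class"])
# prints org.apache.spark.network.yarn.YarnShuffleService
```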

 

posted @ 2020-06-29 14:52  鱼丸河粉