Problems submitting Spark jobs in YARN mode

Problem 1:

 

2019-06-22 20:51:22 INFO  Client:54 - Application report for application_1561087892111_0001 (state: ACCEPTED)
2019-06-22 20:51:23 INFO  Client:54 - Application report for application_1561087892111_0001 (state: FAILED)
2019-06-22 20:51:23 INFO  Client:54 - 
     client token: N/A
     diagnostics: Application application_1561087892111_0001 failed 2 times due to AM Container for appattempt_1561087892111_0001_000002 exited with  exitCode: 1
For more detailed output, check application tracking page:http://node003:8088/proxy/application_1561087892111_0001/Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1561087892111_0001_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1: 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:575)
    at org.apache.hadoop.util.Shell.run(Shell.java:478)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:766)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1561207823232
     final status: FAILED
     tracking URL: http://node003:8088/cluster/app/application_1561087892111_0001
     user: root
2019-06-22 20:51:23 INFO  Client:54 - Deleted staging directory hdfs://mycluster/user/root/.sparkStaging/application_1561087892111_0001
2019-06-22 20:51:24 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2019-06-22 20:51:24 INFO  AbstractConnector:318 - Stopped Spark@5b057ab9{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-06-22 20:51:24 INFO  SparkUI:54 - Stopped Spark web UI at http://node004:4040
2019-06-22 20:51:24 WARN  YarnSchedulerBackend$YarnSchedulerEndpoint:66 - Attempted to request executors before the AM has registered!
2019-06-22 20:51:24 INFO  YarnClientSchedulerBackend:54 - Shutting down all executors
2019-06-22 20:51:24 INFO  YarnSchedulerBackend$YarnDriverEndpoint:54 - Asking each executor to shut down
2019-06-22 20:51:24 INFO  SchedulerExtensionServices:54 - Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
2019-06-22 20:51:24 INFO  YarnClientSchedulerBackend:54 - Stopped
2019-06-22 20:51:24 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-06-22 20:51:24 INFO  MemoryStore:54 - MemoryStore cleared
2019-06-22 20:51:24 INFO  BlockManager:54 - BlockManager stopped
2019-06-22 20:51:24 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2019-06-22 20:51:24 WARN  MetricsSystem:66 - Stopping a MetricsSystem that is not running
2019-06-22 20:51:24 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-06-22 20:51:24 INFO  SparkContext:54 - Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2019-06-22 20:51:24 INFO  ShutdownHookManager:54 - Shutdown hook called
2019-06-22 20:51:24 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-638ce602-0e7d-4fd6-b0ea-2c5c185522a1
2019-06-22 20:51:24 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-d440bd38-f2ba-4be8-a330-a1a00d3d0b2a
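The client-side log above only says the AM container exited with code 1; the real cause is in the container logs on the cluster. With YARN log aggregation enabled, they can be pulled with the yarn CLI (using the application id from this run), or viewed through the tracking URL printed in the report:

```shell
# Fetch the aggregated container logs for the failed application
yarn logs -applicationId application_1561087892111_0001
```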

Many sites describe the same fix: edit yarn-site.xml and disable YARN's memory checks (search online for the details). That did not work for me.
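For reference, the fix those sites describe is to add the following standard Hadoop properties to yarn-site.xml on every NodeManager and restart YARN; it stops the NodeManager from killing containers that exceed their physical or virtual memory allocation (as noted above, it did not help in this particular case):

```xml
<!-- yarn-site.xml: disable NodeManager memory enforcement -->
<property>
  <name>yarn.nodemanager.pmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
```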

I had changed the JDK version in the past, so I checked java -version and javac -version; both showed the correct version.

The export JAVA_HOME configured in hadoop-env.sh was also correct.

On re-running, the YARN application still sat in ACCEPTED for a moment and then went straight to FAILED. Ugh.

After carefully going back through my notes, I found that when Hadoop and YARN were installed, JAVA_HOME had been configured in two more files: mapred-env.sh and yarn-env.sh. I updated the export JAVA_HOME line in both to point at the new JDK.
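A quick way to verify that all three env scripts agree (paths assume the usual layout under $HADOOP_HOME/etc/hadoop; adjust for your install):

```shell
# Show every JAVA_HOME export across the Hadoop env scripts
cd $HADOOP_HOME/etc/hadoop
grep -n 'export JAVA_HOME' hadoop-env.sh mapred-env.sh yarn-env.sh
```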

I restarted the Hadoop cluster. Oh no: everything else came up normally, but the NameNode did not start (or started and then died). According to what I found online, reformatting HDFS fixes it (for details, see the post by another blogger that I reposted previously).
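For completeness, the sequence is roughly the following. Be warned that formatting the NameNode erases all HDFS metadata, so only do this on a cluster whose data you can afford to lose:

```shell
stop-dfs.sh              # stop HDFS
hdfs namenode -format    # reformat the NameNode (destroys HDFS metadata!)
start-dfs.sh             # start HDFS again
jps                      # confirm the NameNode process is running
```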

With that, everything starts, and the SparkPi example finally runs on YARN!
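The submit command is along these lines (the jar path and Spark/Scala versions below are illustrative, not taken from this run; substitute your own):

```shell
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.3.0.jar \
  100
```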

posted on 2019-06-24 11:16 by 大猫食小鱼