Spark stop-all fails

Spark's stop-all.sh does not succeed; it prints the following:

[root@sm61 sbin]# ./stop-all.sh
s92.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s95.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s76.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s93.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s74.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s78.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s71.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s73.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s72.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s99.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s75.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s94.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s77.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s96.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s97.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s98.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s91.spark.starv.com: ssh: connect to host s91.spark.starv.com port 22: No route to host
no org.apache.spark.deploy.master.Master to stop
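Even though the script reports that there is nothing to stop, the Master and Worker JVMs may well still be running. A quick way to confirm this on each node (a sketch, assuming the JDK's jps tool is on the PATH):

jps -l | grep -E 'spark.deploy.(master.Master|worker.Worker)'   # list running Spark daemons
# or, without jps:
ps -ef | grep -v grep | grep org.apache.spark.deploy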

The script cannot stop the daemons because it cannot find their process IDs. So where are the process IDs stored?

spark-daemon.sh contains this snippet:

if [ "$SPARK_PID_DIR" = "" ]; then
  SPARK_PID_DIR=/tmp
fi
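
Further down, the same script builds the PID file path from that directory and, on stop, reads the PID back from the file. Paraphrased from spark-daemon.sh (exact variable names vary slightly between Spark versions):

pid="$SPARK_PID_DIR/spark-$SPARK_IDENT_STRING-$command-$instance.pid"

# stop case: without a readable PID file there is nothing to kill
if [ -f "$pid" ]; then
  TARGET_ID="$(cat "$pid")"
  kill "$TARGET_ID" && rm -f "$pid"
else
  echo "no $command to stop"     # the exact message seen in the output above
fi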


If the $SPARK_PID_DIR environment variable is not set, the PID files are written to /tmp. The problem is that the system periodically cleans out /tmp, so the PID files get deleted and the stop scripts can no longer find the daemons.
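What exactly cleans /tmp depends on the distribution. Two common places to look (a sketch; file paths differ between releases):

# RHEL/CentOS 6: a daily tmpwatch cron job deletes files in /tmp that have not been accessed recently
cat /etc/cron.daily/tmpwatch

# systemd-based systems: tmpfiles.d rules age out /tmp
grep -r '/tmp' /usr/lib/tmpfiles.d/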

So I set this environment variable in conf/spark-env.sh and then restarted the cluster.
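A minimal sketch of the fix; the directory below is my own choice (any path that survives /tmp cleanup and is writable by the user running Spark will do):

# conf/spark-env.sh
export SPARK_PID_DIR=/opt/spark/pids    # hypothetical path; create it first: mkdir -p /opt/spark/pids

Note that stop-all.sh still cannot stop daemons that are already running without PID files; those have to be killed by hand first (e.g. using the PIDs shown by jps above), so that the next start writes fresh PID files under the new directory.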

posted on 2017-07-13 10:20 by 汤汤水水