Spark 0.9.0 startup scripts: bin/compute-classpath.sh

1. Set SCALA_VERSION
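
In the 0.9.0 release this is a hard-coded value near the top of the script. A paraphrased sketch (2.10 matches the Scala version Spark 0.9.0 was built against; check your copy for the exact wording):

    # Scala version, used below to locate the build output directories
    SCALA_VERSION=2.10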

2. Source conf/spark-env.sh (if it exists)
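
The script sources the file rather than running it in a subshell, so any variables it sets (SPARK_CLASSPATH, HADOOP_CONF_DIR, and so on) stay visible to the rest of the script. A sketch, assuming FWDIR is first resolved to the Spark installation root:

    # Resolve the Spark installation root from this script's own location
    FWDIR="$(cd "$(dirname "$0")"/..; pwd)"

    # Pull in user configuration, if present
    if [ -e "$FWDIR/conf/spark-env.sh" ]; then
      . "$FWDIR/conf/spark-env.sh"
    fi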

3. Initialize CLASSPATH to the <conf directory>
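
A sketch of the initialization; as far as I can tell, the 0.9.0 script also prepends the user-supplied SPARK_CLASSPATH at this point (worth verifying against your copy):

    # Seed the classpath with user-supplied entries plus the conf directory
    CLASSPATH="$SPARK_CLASSPATH:$FWDIR/conf"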

4. If assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar exists, append

[core|repl|mllib|bagel|graphx|streaming]/target/scala-$SCALA_VERSION/classes:assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar

If it does not exist, check for the RELEASE marker file (created for binary distributions): if it is present, append jars/spark-assembly*.jar; otherwise append assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*.jar
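
A paraphrased sketch of the whole branch. The variable names (ASSEMBLY_DIR, DEPS_ASSEMBLY_JAR, ASSEMBLY_JAR) follow my reading of the 0.9.0 script, and the six per-module lines are collapsed into a loop here:

    ASSEMBLY_DIR="$FWDIR/assembly/target/scala-$SCALA_VERSION"

    if [ -f "$ASSEMBLY_DIR"/spark-assembly*hadoop*-deps.jar ]; then
      # Development build: per-module compiled classes plus a dependencies-only assembly
      for module in core repl mllib bagel graphx streaming; do
        CLASSPATH="$CLASSPATH:$FWDIR/$module/target/scala-$SCALA_VERSION/classes"
      done
      DEPS_ASSEMBLY_JAR=$(ls "$ASSEMBLY_DIR"/spark-assembly*hadoop*-deps.jar)
      CLASSPATH="$CLASSPATH:$DEPS_ASSEMBLY_JAR"
    else
      if [ -f "$FWDIR/RELEASE" ]; then
        # Binary distribution: the full assembly lives under jars/
        ASSEMBLY_JAR=$(ls "$FWDIR"/jars/spark-assembly*.jar)
      else
        # Source build without a -deps jar: use the full assembly from the build output
        ASSEMBLY_JAR=$(ls "$ASSEMBLY_DIR"/spark-assembly*hadoop*.jar)
      fi
      CLASSPATH="$CLASSPATH:$ASSEMBLY_JAR"
    fi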

5. If SPARK_TESTING is 1, append

[core|repl|mllib|bagel|graphx|streaming]/target/scala-$SCALA_VERSION/test-classes
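
A sketch, with the per-module lines again collapsed into a loop:

    # When running under the test harness, append each module's test classes
    if [[ $SPARK_TESTING == 1 ]]; then
      for module in core repl mllib bagel graphx streaming; do
        CLASSPATH="$CLASSPATH:$FWDIR/$module/target/scala-$SCALA_VERSION/test-classes"
      done
    fi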

6. Append HADOOP_CONF_DIR and YARN_CONF_DIR when they are set
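
Each directory is appended only when the corresponding variable is set, which puts the Hadoop/YARN configuration files (core-site.xml, yarn-site.xml, ...) on the classpath. The script finishes by echoing the result, which is how callers such as bin/spark-class capture it. A sketch:

    # Append Hadoop and YARN configuration directories, if provided
    if [ "x" != "x$HADOOP_CONF_DIR" ]; then
      CLASSPATH="$CLASSPATH:$HADOOP_CONF_DIR"
    fi
    if [ "x" != "x$YARN_CONF_DIR" ]; then
      CLASSPATH="$CLASSPATH:$YARN_CONF_DIR"
    fi

    # Print the computed classpath for the calling script to capture
    echo "$CLASSPATH"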

 

posted @ 2014-03-26 08:13 飞天虎