
Spark 1.4 Windows Local Debugging Environment Setup Notes

1. Scala version: scala-2.10.4 (officially recommended). scala-2.11.7 is not recommended (in a non-sbt project it must be loaded after the Spark assembly jar).

2. Spark version: spark-1.4.0-bin-hadoop2.6.tgz

3. Hadoop

3.1 Version: hadoop-2.6.0.tar.gz

3.2 Environment variable: HADOOP_HOME=E:/ysg.tools/spark/hadoop-2.6.0, or set it in code (note the escaped backslashes in the Java/Scala string literal): System.setProperty("hadoop.home.dir", "E:\\ysg.tools\\spark\\hadoop-2.6.0");

3.3 winutils.exe

Copy winutils.exe into spark/hadoop-2.6.0/bin.

Download link: https://files.cnblogs.com/files/yjmyzz/hadoop2.6%28x64%29V0.2.zip
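As an optional sanity check, a small Scala helper can confirm the winutils.exe layout before launching Spark. The object and method names here are illustrative, and the path matches the example directory from step 3.2; adjust it to your install:

```scala
import java.io.File

object WinutilsCheck {
  // Returns true when <hadoopHome>\bin\winutils.exe exists. On Windows,
  // Spark's Hadoop shims fail with "Could not locate executable ...
  // winutils.exe" when this file is missing.
  def hasWinutils(hadoopHome: String): Boolean =
    new File(new File(hadoopHome, "bin"), "winutils.exe").isFile

  def main(args: Array[String]): Unit =
    println(hasWinutils("E:/ysg.tools/spark/hadoop-2.6.0"))
}
```

Running this before step 4 saves a confusing stack trace later if the copy in step 3.3 was skipped.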

4. Create a new non-SBT project in IDEA

Under Libraries, add the Scala SDK and
spark-1.4.0-bin-hadoop2.6\lib\spark-assembly-1.4.0-hadoop2.6.0.jar

In spark.test.iml, load spark-assembly-1.4.0-hadoop2.6.0 first, then scala-sdk-2.11.7:

<?xml version="1.0" encoding="UTF-8"?>
<module type="JAVA_MODULE" version="4">
  <component name="NewModuleRootManager" inherit-compiler-output="true">
    <exclude-output />
    <content url="file://$MODULE_DIR$">
      <sourceFolder url="file://$MODULE_DIR$/src" isTestSource="false" />
    </content>
    <orderEntry type="inheritedJdk" />
    <orderEntry type="sourceFolder" forTests="false" />
    <orderEntry type="library" name="spark-assembly-1.4.0-hadoop2.6.0" level="project" />
    <orderEntry type="library" name="scala-sdk-2.11.7" level="project" />
  </component>
</module>
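With the module configured as above, a minimal smoke-test program can verify the whole setup end to end. This is a sketch, assuming the spark-assembly jar and Scala SDK are on the classpath as in spark.test.iml; the object name is illustrative and the hadoop.home.dir path is the example directory from step 3.2:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LocalSmokeTest {
  // Pure word-count logic, mirrored by the RDD pipeline in main below.
  def wordCounts(lines: Seq[String]): Map[String, Int] =
    lines.flatMap(_.split("\\s+")).filter(_.nonEmpty)
      .groupBy(identity).map { case (w, ws) => (w, ws.size) }

  def main(args: Array[String]): Unit = {
    // Equivalent to setting the HADOOP_HOME environment variable (step 3.2).
    System.setProperty("hadoop.home.dir", "E:\\ysg.tools\\spark\\hadoop-2.6.0")

    // local[2] runs Spark in-process with two worker threads, no cluster needed.
    val conf = new SparkConf().setAppName("LocalSmokeTest").setMaster("local[2]")
    val sc = new SparkContext(conf)
    try {
      val counts = sc.parallelize(Seq("hello spark", "hello windows"))
        .flatMap(_.split("\\s+"))
        .map(w => (w, 1))
        .reduceByKey(_ + _)
        .collectAsMap()
      println(counts) // expect hello -> 2, spark -> 1, windows -> 1
    } finally {
      sc.stop()
    }
  }
}
```

If this prints the word counts without a winutils.exe error, the environment is working.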

Reprinted from: http://my.oschina.net/itnms/blog/476192
posted @ 2016-04-29 13:28 小毛驴