Issues encountered while setting up a Hadoop source development environment on a MacBook
First, install the required packages as described in the 3.2.2-RC5 BUILDING.txt.
git clone the source code
git checkout the corresponding release tag
Then run:
mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip
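For reference, the full sequence looked roughly like this (the repository URL and the release-3.2.2-RC5 tag come from the BUILDING.txt link in the references; the local directory is simply whatever the clone lands in):
git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout release-3.2.2-RC5
mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip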
issue 1:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9.276 s
[INFO] Finished at: 2021-04-30T11:43:55+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command: Error executing command line. Exit code:127 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common
The JAVA_HOME environment variable needs to be set:
vi ~/.bash_profile
# add the following lines
export JAVA_HOME=$(/usr/libexec/java_home)
export PATH=$JAVA_HOME/bin:$PATH
export CLASS_PATH=$JAVA_HOME/lib
source ~/.bash_profile
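Exit code 127 means the javah binary could not be found. A quick sanity check after reloading the profile (javah ships with JDK 8 and was removed in later JDKs, so a JDK 8 install is assumed here):
echo $JAVA_HOME
/usr/libexec/java_home -V   # list installed JDKs
javah -version              # should print a version when JAVA_HOME points at JDK 8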
issue 2:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 33.046 s
[INFO] Finished at: 2021-04-30T11:54:35+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.2.1:cmake-compile (cmake-compile) on project hadoop-hdfs-native-client: CMake failed with error code 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-hdfs-native-client
Tried setting the OPENSSL environment variable, which did not help; no solution for now.
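For the record, the attempt looked roughly like this (assuming OpenSSL was installed via Homebrew; OPENSSL_ROOT_DIR is the variable CMake's FindOpenSSL module reads), and it did not fix the cmake-compile failure here:
export OPENSSL_ROOT_DIR=$(brew --prefix openssl@1.1)
mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip -rf :hadoop-hdfs-native-client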
issue 3:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:46 min
[INFO] Finished at: 2021-04-30T18:33:54+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.2.2:cmake-compile (cmake-compile) on project hadoop-mapreduce-client-nativetask: make failed with error code 2 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-mapreduce-client-nativetask
No solution for now.
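An untested workaround (an assumption, not something verified in these notes) would be to drop the native profile entirely, which skips libhadoop and the nativetask module and is enough for a plain development build:
mvn package -Pdist -DskipTests -Dmaven.javadoc.skip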
After switching to version 2.10.1, the build succeeded. The build produces a runnable Hadoop distribution directory; the configuration files need to be edited per the references before starting the services.
Configuration files (a minimal pseudo-distributed sketch follows this list):
core-site.xml
hdfs-site.xml
mapred-site.xml
yarn-site.xml
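A minimal single-node (pseudo-distributed) version of these files, written here as shell heredocs; the hdfs://localhost:9000 address matches the error in issue 5 below, the other values are common single-node defaults, and HADOOP_CONF_DIR is an assumption to adjust:
HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop   # path inside the built distribution

cat > $HADOOP_CONF_DIR/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > $HADOOP_CONF_DIR/hdfs-site.xml <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

cat > $HADOOP_CONF_DIR/mapred-site.xml <<'EOF'
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
EOF

cat > $HADOOP_CONF_DIR/yarn-site.xml <<'EOF'
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
EOF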
Start commands (a first-run sketch follows the process list below):
start-dfs.sh
start-yarn.sh
Processes after startup:
SecondaryNameNode
NameNode
DataNode
ResourceManager
NodeManager
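Before the very first start-dfs.sh the NameNode has to be formatted; afterwards jps should show the five daemons listed above:
hdfs namenode -format
start-dfs.sh
start-yarn.sh
jps   # NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager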
issue 4:
start-dfs.sh
Startup fails with: ssh: connect to host localhost port 22: Connection refused
Cause: the ssh service (Remote Login) is not enabled on the Mac.
Fix: System Preferences -> Sharing -> enable Remote Login.
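The same can be done from a terminal, and the start scripts also expect passwordless ssh to localhost; a sketch (skip the key generation if a key already exists):
sudo systemsetup -setremotelogin on               # enable Remote Login (sshd)
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
ssh localhost                                     # should log in without a password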
issue 5:
An hdfs command reports an error:
hdfs dfs -ls
ls: Failed on local exception: java.io.IOException: Couldn't set up IO streams: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V; Host Details : local host is: "C02F60CBMD6T/192.168.0.110"; destination host is: "localhost":9000;
The cause was a mismatched dependency version in the build; fixed by running mvn clean and rebuilding so Maven re-downloads the dependencies.
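The missing Preconditions.checkArgument overload lives in Guava, so after the clean rebuild it is worth confirming which guava jar ends up in the distribution (hadoop-dist/target is the usual output location, but treat the path as an assumption):
mvn clean
mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip
find hadoop-dist/target -name 'guava-*.jar'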
Other notes:
Default web UI addresses after startup:
hdfs:
http://localhost:50070/dfshealth.html#tab-overview
yarn:
http://localhost:8088/cluster/apps
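A quick check that both UIs are actually up (should print 200; the ports are the 2.x defaults shown above):
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:50070/dfshealth.html
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8088/cluster/apps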
References:
https://github.com/apache/hadoop/blob/release-3.2.2-RC5/BUILDING.txt
https://github.com/apache/hadoop/blob/release-2.10.1-RC0/BUILDING.txt
https://zhuanlan.zhihu.com/p/33117305
https://blog.csdn.net/Hu_wen/article/details/73481296