Common Hadoop problems

  • Error description
# After starting the Hadoop cluster, the DataNode does not show up
# The command hdfs dfs -ls -R / has no effect
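
To confirm the symptom before reformatting anything, check which daemons are running and how many DataNodes the NameNode sees (a minimal sketch; both commands are standard Hadoop CLI):
# On each node, list the running Hadoop daemons; DataNode should appear on the worker nodes
jps

# Ask the NameNode for its view of the cluster; live DataNodes are listed in the report
hdfs dfsadmin -report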
  • Solution
# First restart the cluster and reformat the NameNode
# (warning: formatting erases all existing HDFS metadata)
hdfs namenode -format

# If that does not help: reformatting gives the NameNode a new clusterID,
# and a DataNode whose data directory still carries the old clusterID will
# refuse to register. Delete the data directory and recreate it, then restart.
cd /opt/software/hadoop-2.9.2/dfs
rm -rf data
mkdir data
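
If the data directory has to be recreated, every DataNode host needs the same cleanup, as noted above. A possible end-to-end sequence (a sketch, assuming Hadoop's sbin scripts are on the PATH):
# Stop HDFS before formatting
stop-dfs.sh
hdfs namenode -format
# ...recreate the dfs/data directory on each DataNode host as above...
start-dfs.sh
# The DataNode should now be listed as live
hdfs dfsadmin -report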
  • Error 2
# Starting Hive and inserting a single row into a table fails
# Check the application in the browser at http://192.168.128.103:8088/cluster
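
A one-line way to reproduce the failure from the shell (the table name t1 is hypothetical):
hive -e "CREATE TABLE IF NOT EXISTS t1 (id INT); INSERT INTO t1 VALUES (1);"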

# Error details
Application application_1705164916779_0001 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1705164916779_0001_000001 exited with exitCode: 1
Failing this attempt.Diagnostics: [2024-01-14 00:57:25.540]Exception from container-launch.
Container id: container_1705164916779_0001_01_000001
Exit code: 1
[2024-01-14 00:57:25.550]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster
[2024-01-14 00:57:25.551]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster
For more detailed output, check the application tracking page: http://slave3:8088/cluster/app/application_1705164916779_0001 Then click on links to logs of each attempt.
. Failing the application.
  • Solution
# Cause: this environment runs Hive on Spark, and the Spark jars stored in HDFS
# were deleted, so YARN cannot load org.apache.spark.deploy.yarn.ApplicationMaster

# Create a directory in HDFS and copy everything from spark/jars into it
hdfs dfs -mkdir -p /spark/hive-jars
hadoop fs -put /usr/local/software/spark-2.2.0/jars/* /spark/hive-jars
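
For the jars to be picked up, the Hive on Spark configuration must reference that HDFS path. A sketch of the check follows; spark.yarn.jars is the standard Spark 2.x property, but where it is set (hive-site.xml here) is an assumption about this environment:
# Verify the jars actually landed in HDFS
hadoop fs -ls /spark/hive-jars

# Hive on Spark should then reference them, e.g. in hive-site.xml:
#   <property>
#     <name>spark.yarn.jars</name>
#     <value>hdfs:///spark/hive-jars/*</value>
#   </property>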