Hadoop does not recognize the hostname


Scenario: a Hadoop cluster and its service ports have been configured on the server (Linux), but access fails with the errors below.

Linux environment

# Linux environment
[cannice@hadoop100 root]$ ssh hadoop300
ssh: Could not resolve hostname hadoop300: Name or service not known

or
java.net.UnknownHostException: hadoop102: hadoop102
        at java.net.InetAddress.getLocalHost(InetAddress.java:1475)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:146)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
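
This UnknownHostException is typically thrown when InetAddress.getLocalHost() cannot resolve the machine's own hostname (hadoop102 here). A quick way to check, sketched below with standard Linux commands (the hostnames and output are only illustrative):

# Show the hostname the JVM will try to resolve
hostname
# Check whether that name already has an entry in /etc/hosts
grep "$(hostname)" /etc/hosts
# Confirm the name actually resolves to an IP address
getent hosts "$(hostname)"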

Windows environment: accessing Hadoop's NameNode web UI at http://hadoop300:9870/

# Windows environment
This site can’t be reached
Check if there is a typo in hadoop300.
If spelling is correct, try running Windows Network Diagnostics.
DNS_PROBE_FINISHED_NXDOMAIN

Solution: add the hostname mappings to the hosts file in each environment.

# Linux environment
Map each host's IP address to its hostname in the hosts file by opening /etc/hosts:
[root@hadoop100 ~]# vim /etc/hosts
### Add the following entries (`IP and hostname`):
172.16.6.139 hadoop100
172.16.6.158 hadoop200
172.16.6.185 hadoop300
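
After saving, it is worth verifying that the names resolve before retrying the Hadoop services. A minimal sketch, assuming the three hosts above and root SSH access between them (the scp step only applies to multi-node clusters):

# Verify resolution from this node
ping -c 1 hadoop300
ssh hadoop300
# Optionally push the same hosts file to the other nodes
scp /etc/hosts root@hadoop200:/etc/hosts
scp /etc/hosts root@hadoop300:/etc/hosts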

# Windows environment
hadoop300 also needs a mapping entry in the Windows hosts file.
Hosts file path: `C:\Windows\System32\drivers\etc\hosts`
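
The entries mirror the Linux ones; the IPs below are the same as those configured above, and editing the file requires administrator rights. A sketch of the Windows-side check:

# Add to C:\Windows\System32\drivers\etc\hosts (open the editor as administrator)
172.16.6.139 hadoop100
172.16.6.158 hadoop200
172.16.6.185 hadoop300
# Verify from a command prompt, then open http://hadoop300:9870/ in the browser
ping hadoop300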

 
