Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
Like the error in my previous post, "ERROR [org.apache.hadoop.util.Shell] - Failed to locate the winutils binary in the hadoop binary path", this one is caused by a missing file. The error log is as follows:
2018-04-11 16:32:28,514 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - Submitting tokens for job: job_local1975654255_0001
2018-04-11 16:32:28,561 WARN [org.apache.hadoop.conf.Configuration] - file:/tmp/hadoop-Zimo/mapred/staging/Zimo1975654255/.staging/job_local1975654255_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2018-04-11 16:32:28,562 WARN [org.apache.hadoop.conf.Configuration] - file:/tmp/hadoop-Zimo/mapred/staging/Zimo1975654255/.staging/job_local1975654255_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2018-04-11 16:32:28,663 DEBUG [org.apache.hadoop.security.UserGroupInformation] - PrivilegedAction as:Zimo (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
2018-04-11 16:32:28,758 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - Cleaning up the staging area file:/tmp/hadoop-Zimo/mapred/staging/Zimo1975654255/.staging/job_local1975654255_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:435)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:177)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:164)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:98)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:157)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:636)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at com.hadoop.phoneStatistics.ExcelPhoneStatistics.run(ExcelPhoneStatistics.java:117)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at com.hadoop.phoneStatistics.ExcelPhoneStatistics.main(ExcelPhoneStatistics.java:128)
2018-04-11 16:32:28,767 DEBUG [org.apache.hadoop.ipc.Client] - Stopping client
2018-04-11 16:32:28,767 DEBUG [org.apache.hadoop.ipc.Client] - IPC Client (1166151249) connection to centpy/192.168.86.134:9000 from Zimo: closed
2018-04-11 16:32:28,768 DEBUG [org.apache.hadoop.ipc.Client] - IPC Client (1166151249) connection to centpy/192.168.86.134:9000 from Zimo: stopped, remaining connections 0
The fix is to download https://github.com/srccodes/hadoop-common-2.2.0-bin, copy the hadoop.dll file from it into the bin folder of your Hadoop installation (with the HADOOP_HOME environment variable set correctly), and then restart the computer. After that the problem is resolved! A related workaround is sketched below.
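If you would rather avoid the reboot while testing locally, Hadoop's Shell class also honors the hadoop.home.dir JVM system property as an alternative to the HADOOP_HOME environment variable. Below is a minimal sketch under that assumption; the install path D:\hadoop and the delegation to ExcelPhoneStatistics are illustrative, not part of the original setup.

// Minimal sketch: point the JVM at the Hadoop installation before the driver runs.
// Assumes hadoop.dll (and winutils.exe) are already in D:\hadoop\bin (hypothetical path).
public class DriverWithHadoopHome {
    public static void main(String[] args) throws Exception {
        // Must be set before any Hadoop class tries to load the native libraries.
        System.setProperty("hadoop.home.dir", "D:\\hadoop");
        // Then delegate to the real driver, e.g.:
        // com.hadoop.phoneStatistics.ExcelPhoneStatistics.main(args);
    }
}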
That covers the main content of this topic. These notes come from my own learning process, and I hope they offer some guidance. If they helped, please leave a like; if not, please bear with me, and do point out any mistakes. If you'd like more, follow me to get updates as soon as they are posted. Thanks!
Copyright notice: this is an original post by the blogger 子墨言良 and may not be reproduced without permission. If you are interested, follow the blogger to get updates as soon as they are published!