Fixing "failed to locate the winutils binary in the hadoop binary path" when running Spark

1. Download hadoop-common-2.2.0-bin and extract it to a directory of your choice:

https://github.com/srccodes/hadoop-common-2.2.0-bin

2. Set hadoop.home.dir to that directory:

System.setProperty("hadoop.home.dir", "D:\\hadoop-common-2.2.0-bin-master")
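
For context, here is a minimal Scala sketch of where this call fits in a Spark driver program; the property must be set before the first SparkContext is created. The object name, app name, and local[*] master are illustrative assumptions, and the path is the one from step 1.

import org.apache.spark.{SparkConf, SparkContext}

object WinutilsCheck {
  def main(args: Array[String]): Unit = {
    // Point Hadoop at the extracted hadoop-common-2.2.0-bin directory
    // (the folder whose bin\ subfolder contains winutils.exe).
    System.setProperty("hadoop.home.dir", "D:\\hadoop-common-2.2.0-bin-master")

    // Illustrative local-mode job, only to show that the property is set
    // before the SparkContext is constructed.
    val conf = new SparkConf().setAppName("winutils-check").setMaster("local[*]")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}

As an alternative to the system property, setting the HADOOP_HOME environment variable to the same directory should also work.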

 
