Fixes for common problems when running Spark in local (single-machine) mode
- ERROR1: Hadoop dependency (missing winutils.exe)
[ERROR] - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
Solution:
1) Download the required winutils.exe from http://social.msdn.microsoft.com/Forums/windowsazure/en-US/28a57efb-082b-424b-8d9e-731b1fe135de/please-read-if-experiencing-job-failures?forum=hdinsight
2) Place it under a local path such as d:\winutil\bin
3) In your code, add System.setProperty("hadoop.home.dir", "d:\\winutil\\") before the SparkContext is created, as shown in the sketch below
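A minimal sketch of where the property has to be set, assuming winutils.exe was unpacked to d:\winutil\bin; the object name WordTest and the app name are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordTest {
  def main(args: Array[String]): Unit = {
    // Must run before the SparkContext is constructed, so the Hadoop code
    // can resolve %HADOOP_HOME%\bin\winutils.exe on Windows.
    System.setProperty("hadoop.home.dir", "d:\\winutil\\")

    val conf = new SparkConf().setMaster("local[*]").setAppName("Word Test")
    val sc = new SparkContext(conf)

    // ... job logic goes here ...

    sc.stop()
  }
}
```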
- ERROR2: Port conflict
java.net.BindException: Address already in use: Cannot bind
Solution: open Task Manager and end the leftover java process that is still holding the port (a configuration-based alternative is sketched below).
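As an alternative sketch, not part of the original fix: if the conflict is on the Spark UI port (4040 by default), Spark can be pointed at a different port and allowed more retries before failing. The port numbers below are only examples:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: move the Spark UI off the default port 4040 and allow more retries,
// so a second local application does not fail with a BindException.
val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("Word Test")
  .set("spark.ui.port", "4041")        // example port; any free port works
  .set("spark.port.maxRetries", "32")  // default is 16; retry more ports
val sc = new SparkContext(conf)
```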
- ERROR3: Master URL not set
A master URL must be set in your configuration
Solution:
In the code, change: val conf = new SparkConf().setAppName("Word Test")
to: val conf = new SparkConf().setMaster("local[*]").setAppName("Word Test")
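A related sketch, an assumption rather than part of the original fix: hard-coding the master can be avoided with SparkConf.setIfMissing, so a master supplied externally (for example via spark-submit --master) still takes precedence while plain local runs fall back to local[*]:

```scala
import org.apache.spark.SparkConf

// Sketch: only fall back to local[*] when no master was supplied externally,
// so the same jar runs both locally and on a cluster without code changes.
val conf = new SparkConf()
  .setAppName("Word Test")
  .setIfMissing("spark.master", "local[*]")
```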