Spark: Error while running command to get file permissions

Environment: local machine with Windows 10 + Hadoop 2.7.1; server with Hadoop 2.6.0 + Spark 2.2.1 + Hive 1.1.0

Code:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class Dwd_rec_kqmj {

    public static void main(String[] args) {
        // Build a local SparkSession with Hive support enabled
        SparkSession spark = SparkSession
            .builder()
            .master("local[*]")
            .appName("Dwd_rec_kqmj")
            .enableHiveSupport()
            .config("spark.some.config.option", "some-value")
            .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .getOrCreate();

        // List the Hive tables visible to this session
        Dataset<Row> tables = spark.sql("show tables");
        tables.show();

        spark.stop();
    }
}

 

Problem:

Error while running command to get file permissions :
java.io.IOException: (null) entry in command string: null ls -F E:\tmp\hive

Based on the error message, this is a file permission problem. The "(null)" in the command string means Hadoop could not locate winutils.exe (the Windows helper it uses to run file commands such as ls and chmod), so the executable path resolved to null. Since the job runs with master("local[*]"), Hive checks the permissions of its scratch directory on the local Windows file system, which is exactly where winutils.exe is required.
 
Solution:
    1. Put winutils.exe and hadoop.dll into C:\Windows\System32 and into the local %HADOOP_HOME%\bin (a programmatic alternative is sketched after this list).
    2. Create the c:\tmp\hive directory (match the path shown in the error message, E:\tmp\hive in this case).
    3. In %HADOOP_HOME%\bin, run: winutils.exe chmod -R 777 c:\tmp\hive
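
As an alternative to step 1, the winutils.exe location and the Hive scratch directory can also be set from inside the program before the SparkSession is created. This is only a minimal sketch: the paths C:\hadoop (assumed to contain bin\winutils.exe) and E:/tmp/hive are assumptions and have to be adjusted to the local installation.

import org.apache.spark.sql.SparkSession;

public class Dwd_rec_kqmj_Win {

    public static void main(String[] args) {
        // Assumption: winutils.exe and hadoop.dll live under C:\hadoop\bin.
        // hadoop.home.dir must be set before any Hadoop/Hive classes are loaded.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        SparkSession spark = SparkSession
            .builder()
            .master("local[*]")
            .appName("Dwd_rec_kqmj")
            .enableHiveSupport()
            // Assumption: point Hive's scratch directory at the folder that was
            // created and chmod-ed with winutils (E:\tmp\hive in the error above).
            .config("hive.exec.scratchdir", "E:/tmp/hive")
            .getOrCreate();

        spark.sql("show tables").show();
        spark.stop();
    }
}

Setting hadoop.home.dir in code has the same effect as defining the HADOOP_HOME environment variable, but keeps the workaround inside the project, which is convenient when running from an IDE.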
 
posted @ 2020-05-25 20:33  夏天换上冬装