Spark SQL error: No such file or directory

Today, while running a Spark SQL statement of the form insert overwrite table a select b left join c as a non-hadoop user, I hit the following error.

Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 1601.0 failed 4 times, most recent failure: Lost task 1.3 in stage 1601.0 (TID 30784, ytc-11, executor 2): java.io.FileNotFoundException: /tmp/hadoop-hadoop/nm-local-dir/usercache/nonhadoop/appcache/application_1536194566851_1099/blockmgr-ef7931da-ffe9-4ab1-b3dd-92d720069430/35/temp_shuffle_c056197b-f61f-40c1-91ee-a0a52a0afe2f (No such file or directory)

Fix:

The FileNotFoundException is somewhat misleading: the executor fails because it cannot create the temp shuffle file under /tmp/hadoop-hadoop, a directory the nonhadoop user has no write access to. Grant the nonhadoop user write permission on /tmp/hadoop-hadoop at the OS level. The safe approach is to add the user to the appropriate group; in a test environment you can simply set the permissions to 777.
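A minimal sketch of the group-based fix, assuming the directory is owned by the hadoop user/group and the failing user is named nonhadoop (adjust the names to match your cluster, and run as root or via sudo):

```shell
# Directory the NodeManager uses for local/shuffle files (from the error message)
DIR=/tmp/hadoop-hadoop

# Safe approach: add the user to the owning group, then open the
# directory tree to that group.
usermod -a -G hadoop nonhadoop
chgrp -R hadoop "$DIR"
chmod -R g+rwx "$DIR"

# Test-environment shortcut mentioned above (avoid in production):
# chmod -R 777 "$DIR"
```

Note that the nonhadoop user must log in again (or start a new session) for the new group membership to take effect on running shells.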

posted on 2018-10-17 22:06 by camash
