Big Data Learning: Uploading a Local File to the Cluster Root Directory

TestHDFS.java

package cn.itcast.hdfs;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFS {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // 1. First we need an HDFS client object
        conf.set("fs.defaultFS", "hdfs://mini1:9000");
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path("E://he.txt"), new Path("/"));
        fs.close();
    }
}

Running this throws an error: insufficient permissions (the local OS user has no write access to the HDFS root directory):

 
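Besides impersonating root in code (shown next), one common alternative on lab clusters (assuming shell access to a cluster node with the Hadoop CLI on the PATH) is to relax the permissions on the target directory:

```shell
# Grant everyone write access to the root directory.
# Suitable for test/lab setups only; it disables permission protection for /.
hadoop fs -chmod 777 /
```

Another option is to set `dfs.permissions.enabled` to `false` in `hdfs-site.xml`, which turns off HDFS permission checking entirely; again, this is only appropriate for learning environments.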

Modified code (impersonating the root user):

package cn.itcast.hdfs;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFS {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();

        // Impersonate the root user
        System.setProperty("HADOOP_USER_NAME","root");

        // 1. First we need an HDFS client object
        conf.set("fs.defaultFS", "hdfs://mini1:9000");
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path("E://he.txt"), new Path("/"));
        fs.close();
    }
}
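A third option (a sketch not in the original post, and not run against a cluster here): `FileSystem.get` has a three-argument overload that takes the user name directly, so neither the system property nor a VM argument is needed. The class name `TestHDFSAsUser` is hypothetical; the host, port, and paths are reused from the example above.

```java
package cn.itcast.hdfs;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFSAsUser {
    public static void main(String[] args)
            throws IOException, InterruptedException, URISyntaxException {
        Configuration conf = new Configuration();
        // The three-argument overload performs all operations as the given
        // user, so no system property or VM argument is required.
        FileSystem fs = FileSystem.get(new URI("hdfs://mini1:9000"), conf, "root");
        fs.copyFromLocalFile(new Path("E://he.txt"), new Path("/"));
        fs.close();
    }
}
```

Note that this overload declares `InterruptedException` in addition to `IOException`, and requires a live cluster reachable at `mini1:9000` to actually run.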

 

Alternatively, add a JVM argument when running the program:

 

VM arguments:

-DHADOOP_USER_NAME=root
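Both approaches work for the same reason: when Hadoop determines the client identity, the `HADOOP_USER_NAME` setting (as an environment variable or, as here, a JVM system property) overrides the OS login user, which is why the property must be set before the first `FileSystem.get()` call. A minimal, JDK-only sketch of that lookup order (a simplified model of what Hadoop's `UserGroupInformation` does, not the real implementation):

```java
public class HadoopUserNameDemo {
    // Simplified model of the client-identity lookup: the HADOOP_USER_NAME
    // system property wins; otherwise the OS login user ("user.name") is used.
    static String resolveUser() {
        String override = System.getProperty("HADOOP_USER_NAME");
        return override != null ? override : System.getProperty("user.name");
    }

    public static void main(String[] args) {
        // Equivalent to passing -DHADOOP_USER_NAME=root on the command line.
        System.setProperty("HADOOP_USER_NAME", "root");
        System.out.println(resolveUser()); // prints "root"
    }
}
```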


posted on 2019-01-03 19:12 by o_0的园子