Hadoop HDFS Java API: Maven Environment Setup
1. Create a Java project and paste the following into pom.xml:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<hadoop.version>2.6.0</hadoop.version>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>
<!-- Not needed separately: hadoop-client pulls these in transitively.
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>${hadoop.version}</version>
</dependency>
-->
</dependencies>
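The two blocks above belong inside the `<project>` root element of pom.xml. A minimal skeleton for reference (the `groupId`, `artifactId`, and `version` below are placeholders, not from the original post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>            <!-- placeholder -->
    <artifactId>hdfs-demo</artifactId>        <!-- placeholder -->
    <version>1.0-SNAPSHOT</version>

    <!-- paste the <properties> and <dependencies> blocks from step 1 here -->
</project>
```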
2. Test code (in a JUnit test class):
Configuration configuration = null;
FileSystem fileSystem = null;
String url = "hdfs://CentOS7:8020";
String user = "hadoop";
@Before
public void before() throws URISyntaxException, IOException, InterruptedException {
configuration = new Configuration();
configuration.set("dfs.replication","1"); // set the replication factor to 1
fileSystem = FileSystem.get(new URI(url),configuration,user);
System.out.println("before doing");
}
@After
public void after(){
System.out.println("after doing");
configuration = null;
try {
fileSystem.close();
} catch (IOException e) {
fileSystem = null;
e.printStackTrace();
}
}
/**
 * Create a directory
 * @throws IOException
 */
@Test
public void mkdir() throws IOException {
Path path = new Path("/mkdirTest1/123");
boolean re = fileSystem.mkdirs(path);
System.out.println(re);
}
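Beyond creating a directory, the same `fileSystem` handle can write and read files. A hedged sketch (the path and file contents are examples; it needs the running cluster from the setup above, plus imports for `org.apache.hadoop.fs.FSDataInputStream` and `org.apache.hadoop.fs.FSDataOutputStream`):

```java
/**
 * Write a small text file to HDFS, then read it back.
 * Uses the fileSystem field initialized in before().
 */
@Test
public void writeAndRead() throws IOException {
    Path file = new Path("/mkdirTest1/hello.txt"); // example path

    // create() returns an FSDataOutputStream; the second argument overwrites any existing file
    try (FSDataOutputStream out = fileSystem.create(file, true)) {
        out.writeUTF("hello hdfs");
    }

    // open() returns an FSDataInputStream for reading the file back
    try (FSDataInputStream in = fileSystem.open(file)) {
        System.out.println(in.readUTF());
    }
}
```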
3. Run the test; it should pass.