Big Data: Common Hadoop Operation Commands

(Note: "hadoop dfs" is deprecated in Hadoop 2.x and later; "hdfs dfs" or "hadoop fs" is the preferred equivalent. The commands below keep the original form.)

1. List all directories and files under the root directory

hadoop dfs -ls /

2. List all directories and files under the /user directory

hadoop dfs -ls /user

3. Recursively list all files under /user and its subdirectories (use with caution)

hadoop dfs -ls -R /user

4. Create the /soft directory

hadoop dfs -mkdir /soft

5. Create multi-level (nested) directories

hadoop dfs -mkdir -p /apps/windows/2017/01/01

6. Upload the local wordcount.jar file to the /wordcount directory

hadoop dfs -put wordcount.jar /wordcount

7. Download the /words.txt file to the local filesystem

hadoop dfs -get /words.txt 
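With no local path given, -get writes the file into the current working directory. A local destination can also be named explicitly; a sketch (the /tmp/words.txt path is just an illustration):

hadoop dfs -get /words.txt /tmp/words.txt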

8. Copy the /stu/students.txt file to the local filesystem

hadoop dfs -copyToLocal /stu/students.txt

9. Copy the local word.txt file to the /wordcount/input/ directory

hadoop dfs -copyFromLocal word.txt /wordcount/input 

10. Move the local word.txt file to the /wordcount/input/ directory

hadoop dfs -moveFromLocal word.txt /wordcount/input/

11. Copy /stu/students.txt to a backup copy /stu/students.txt.bak

hadoop dfs -cp /stu/students.txt /stu/students.txt.bak 

12. Copy the subdirectories and files under /flume/tailout/ to the /logs directory (created if it does not already exist)

hadoop dfs -cp /flume/tailout/ /logs 

13. Rename the /word.txt file to /words.txt

hadoop dfs -mv /word.txt /words.txt

14. Move the /words.txt file to the /wordcount/input/ directory

hadoop dfs -mv /words.txt /wordcount/input/

15. Delete the /ws directory together with all of its subdirectories and files (use with caution)

hadoop dfs -rm -r /ws 

16. Delete directories whose names start with "xbs-", including their subdirectories

hadoop dfs -rm -r /xbs-*
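Because the glob is expanded against every path under the root, it can help to preview what it matches before deleting (an optional precaution, not part of the command itself):

// list the paths the glob would match, then delete only if the list looks right
hadoop dfs -ls /xbs-*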

17. Delete the a.txt file under the /wordcount/output2/ directory

hadoop dfs -rm /wordcount/output2/a.txt 

18. Delete all files under the /wordcount/input/ directory

hadoop dfs -rm /wordcount/input/*

19. Show disk space usage of the HDFS cluster

hadoop dfs -df -h 

20. View the contents of the /word.txt file

hadoop dfs -cat /word.txt 

21. Append the contents of the local name.txt file to the /wordcount/input/words.txt file

hadoop dfs -appendToFile name.txt /wordcount/input/words.txt
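-appendToFile also accepts "-" as the source, in which case it reads from standard input; for example, to append a single line:

echo "hello hadoop" | hadoop dfs -appendToFile - /wordcount/input/words.txt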

22. Follow the contents of the /wordcount/input/words.txt file as it grows

hadoop dfs -tail -f /wordcount/input/words.txt

23. Show the total size of the /flume directory

hadoop dfs -du -s -h /flume

24. Show the size of each subdirectory (or file) under /flume separately

hadoop dfs -du -s -h /flume/*

25. Run a program from a jar file

// hadoop jar <jar to run> <main class> <input directory> <output directory>
hadoop jar wordcount.jar com.xuebusi.hadoop.mr.WordCountDriver /wordcount/input /wordcount/out
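Note that the output directory (/wordcount/out here) must not exist before the job is submitted, or the job will fail. Once it finishes, the results can be inspected; part-r-00000 is the default name MapReduce gives the first reducer's output file:

// view the word-count results after the job completes
hadoop dfs -cat /wordcount/out/part-r-00000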

26. Check the status of the HDFS cluster

hdfs dfsadmin -report


27. View usage help for the hadoop fs commands

hadoop fs -help

 

posted on 2020-08-21 11:24 by 鑫春