Summary:

1. Prepare 4 Linux PCs and make sure the machines can ping each other (VMware Bridged networking).
   (1) Edit /etc/hosts on every machine as follows:
       49.123.90.186 redhnamenode
       49.123.90.181 redhdatanode1
       49.123.90.182 redhdatanode2
       49.123.90.184 redhdatanode3
   (2) Disable the firewall (requires root):
       service iptables stop
   (3) Create the same user, hadoop-user, on every machine.
   (4) Install the JDK under /home.
2. SSH configuration (as hadoop-user):
   (1) On every redhdatanode, create the .ss… Read more
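The summary above is cut off at the start of the SSH step. A minimal sketch of passwordless SSH setup, which step 2 appears to begin: the node names come from the /etc/hosts entries above, but the exact procedure here is an assumption, not the post's own commands. For illustration the key is written to a scratch path; on the real cluster hadoop-user would use ~/.ssh/id_rsa.

```shell
# Generate an RSA key pair with an empty passphrase for hadoop-user
# (demo writes to a scratch file so it does not clobber an existing key):
keyfile="$(mktemp -d)/id_rsa"
ssh-keygen -t rsa -P "" -f "$keyfile" -q

# Distribute the public key so the namenode can log in to each datanode
# without a password (needs the running cluster, so shown commented out):
# for node in redhdatanode1 redhdatanode2 redhdatanode3; do
#   ssh-copy-id -i "$keyfile.pub" hadoop-user@"$node"
# done
```

With the public key appended to each datanode's ~/.ssh/authorized_keys, start-all.sh can launch the daemons on all nodes without password prompts.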
Summary:

1. cd /home/chenyong/paper/hadoop-1.1.2
2. mkdir input
   cd input
   echo "hello world !" > test1.txt
   echo "hello hadoop" > test2.txt
   echo "hello redhat" > test3.txt
3. ./bin/hadoop dfs -put input /in
   (copies the local input directory into the HDFS root, renamed to in; the out output directory must not already exist before the job runs)
   ./bin/hadoop jar hadoop-examples-1… Read more
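The local part of the steps above can be run as-is; the Hadoop commands need the running cluster, so they are shown commented out. The example jar name and the wordcount arguments are assumptions (the summary is truncated at "hadoop-examples-1"); only the input files are taken verbatim from the post.

```shell
# Recreate the sample input files from step 2:
mkdir -p input
cd input
echo "hello world !" > test1.txt
echo "hello hadoop" > test2.txt
echo "hello redhat" > test3.txt
cd ..

# On the cluster (assumption: the standard wordcount example from the
# Hadoop 1.1.2 examples jar; jar name inferred from the install path):
# ./bin/hadoop dfs -put input /in
# ./bin/hadoop jar hadoop-examples-1.1.2.jar wordcount /in /out
# ./bin/hadoop dfs -cat /out/part-r-00000   # inspect the word counts
```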
Summary:

1. hadoop fs -rmr /tmp
2. stop-all.sh
3. rm -rf /tmp/hadoop*    (on all PCs)
4. hadoop namenode -format
5. start-all.sh

Read more