Setting up an Eclipse-Hadoop environment on Mountain Lion (Hadoop 1.0.3)

I assume you have already set up Hadoop 1.0.3 on your OS X machine.

Make sure SSH is set up: on OS X, enable Remote Login under System Preferences -> Sharing, since Hadoop's start scripts connect to localhost over ssh.
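You also need passwordless SSH to localhost. A minimal sketch, assuming the default key paths and that you have no existing key:

# Generate a passwordless key pair (skip if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize it for logins to this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# This should now log in without prompting for a password
ssh localhost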

First of all, you need to change your Hadoop configuration.

Since this is a single-node setup, the masters and slaves files both contain the same name: localhost.

  • your core-site.xml should be like this:
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/tmp/hadoop-shuaizki</value>
    </property>
</configuration>

Port 9000 (important) is the NameNode's port, and hadoop.tmp.dir defines where HDFS stores its data on your local file system (if you set it under /tmp, you have to reformat the file system every time you restart your computer, because OS X clears /tmp on reboot).
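If you want the DFS to survive reboots, point hadoop.tmp.dir at a persistent directory instead. A small sketch, using a hypothetical ~/hadoop-data directory:

# Create a persistent data directory (hypothetical path, pick your own)
mkdir -p ~/hadoop-data
# Then set hadoop.tmp.dir in core-site.xml to its absolute path,
# e.g. /Users/shuaizki/hadoop-data, and reformat the DFS once afterwards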

your mapred-site.xml should look like this:

<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
    </property>
</configuration>

Port 9001 is the Map/Reduce JobTracker's port.

  • your hdfs-site.xml should look like this:
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

dfs.replication defines how many copies of each data block HDFS keeps on your datanodes; 1 is enough for a single-node cluster.
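Later, once the cluster is running (next steps), you can check the replication that HDFS actually applies with fsck:

cd ${HADOOP_HOME}
# Report files, blocks, and their replication factor
./bin/hadoop fsck / -files -blocks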

Then format your DFS, using:

cd ${HADOOP_HOME}
./bin/hadoop namenode -format

Start your Hadoop services:

bash bin/start-all.sh
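If everything started, five Hadoop daemons should be running. You can check with jps (it ships with the JDK):

# Should list NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker
jps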

Visit http://localhost:50030 (the JobTracker web UI) and http://localhost:50070 (the NameNode web UI) to see if your configuration is right.
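You can also smoke-test HDFS from the command line before touching Eclipse. A quick sketch, with /test as a hypothetical path:

cd ${HADOOP_HOME}
# Copy a local file into HDFS and list it back
./bin/hadoop fs -mkdir /test
./bin/hadoop fs -put conf/core-site.xml /test
./bin/hadoop fs -ls /test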

Second, begin with the Eclipse side of things.

 Download hadoop-eclipse-plugin-1.0.3.jar and drop it into Eclipse's plugins folder, then restart Eclipse.
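For example, assuming Eclipse is installed in /Applications/eclipse (a hypothetical path, adjust to yours):

# Copy the plugin jar into Eclipse's plugins folder (hypothetical Eclipse path)
cp hadoop-eclipse-plugin-1.0.3.jar /Applications/eclipse/plugins/
# Restart Eclipse so it picks up the plugin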
  • open Preferences in Eclipse and enter your Hadoop installation directory under Hadoop Map/Reduce
 

  • open Window -> Show View -> Other..., select the Map/Reduce Locations view, and press OK

Now that the Map/Reduce Locations view is loaded, create a new Hadoop location.

The Map/Reduce Master port is 9001, as we set before (the JobTracker port).

The DFS Master's port is 9000 (the NameNode port).

OK, success!
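To confirm the whole setup end to end, you can run the word-count example that ships with Hadoop 1.0.3 against the files you put into /test earlier (both paths hypothetical):

cd ${HADOOP_HOME}
# Run the bundled example job: count words in /test, write results to /test-out
./bin/hadoop jar hadoop-examples-1.0.3.jar wordcount /test /test-out
# Print the result
./bin/hadoop fs -cat '/test-out/part-*'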

posted on 2012-12-04 10:59 by 王帅