java.lang.OutOfMemoryError: Java heap space

After installing Hadoop and running the bundled example, the job failed with java.lang.OutOfMemoryError: Java heap space.

 

Below are the fixes I found on Stack Overflow:

1. Don't forget to execute "ssh localhost" first. Believe it or not, a missing ssh connection can also surface as a Java heap space error (this was exactly the cause in my case).
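The ssh point above usually comes down to passwordless login to localhost. A minimal sketch, assuming stock OpenSSH with default key paths (none of these commands are from the original post):

```shell
# Assumed OpenSSH defaults: generate a passphrase-less key and authorize it
# for localhost, so Hadoop's start-up scripts can ssh in without prompting.
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Non-interactive check: fails instead of prompting if the setup is wrong.
ssh -o BatchMode=yes localhost true && echo "passwordless ssh to localhost OK"
```

If the last command prompts for a password or fails, fix that before blaming the heap.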

2. For anyone using RPM or DEB packages, the documentation and common advice are misleading. These packages install the Hadoop configuration files into /etc/hadoop, and those files take priority over other settings.

/etc/hadoop/hadoop-env.sh sets the maximum Java heap size for Hadoop; by default it is:

export HADOOP_CLIENT_OPTS="-Xmx128m $HADOOP_CLIENT_OPTS"

This Xmx setting is too low; simply change it to the following and rerun:

export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"
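One way to apply that edit is a single sed substitution. The sketch below works on a scratch copy of hadoop-env.sh so it can be tried safely; on a real RPM/DEB install, point HADOOP_ENV at /etc/hadoop/hadoop-env.sh (and run the sed with sudo):

```shell
# HADOOP_ENV defaults to a local scratch file for a safe dry run; set it to
# /etc/hadoop/hadoop-env.sh on a packaged install.
HADOOP_ENV="${HADOOP_ENV:-./hadoop-env.sh}"

# Stand-in for the packaged default line (only needed for this dry run).
echo 'export HADOOP_CLIENT_OPTS="-Xmx128m $HADOOP_CLIENT_OPTS"' > "$HADOOP_ENV"

# Bump the client heap from 128m to 2048m, in place.
sed -i 's/-Xmx128m/-Xmx2048m/' "$HADOOP_ENV"

# Confirm the new value.
grep HADOOP_CLIENT_OPTS "$HADOOP_ENV"
```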

 

P.S.

If Hadoop was installed from RPM, the example jar lives at:

/usr/share/hadoop/hadoop-examples-1.0.4.jar

so the command line should be:

hadoop jar /usr/share/hadoop/hadoop-examples-1.0.4.jar grep input output 'dfz[a-z.]+'
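If you would rather not edit a system-wide file, the heap can also be raised for a single run by setting the variable inline. This relies on hadoop-env.sh appending the inherited $HADOOP_CLIENT_OPTS after its own -Xmx128m, with the last -Xmx flag winning in the JVM (behavior assumed for Hadoop 1.x; verify on your version):

```shell
# One-off override: the inline -Xmx2048m ends up after the default -Xmx128m
# on the JVM command line, so it takes effect without touching /etc/hadoop.
HADOOP_CLIENT_OPTS="-Xmx2048m" hadoop jar /usr/share/hadoop/hadoop-examples-1.0.4.jar grep input output 'dfz[a-z.]+'
```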

posted @ 2013-04-01 17:28  mender