While crawling pages with Nutch (crawl depth set to 10), the job ran for more than five hours and then aborted with an out-of-memory error:
java.lang.OutOfMemoryError: Java heap space
fetcher caught:java.lang.OutOfMemoryError: Java heap space
java.lang.OutOfMemoryError: Java heap space
fetcher caught:java.lang.OutOfMemoryError: Java heap space
Exception in thread "main" java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
at org.apache.nutch.fetcher.Fetcher.fetch(Fetcher.java:470)
at org.apache.nutch.crawl.Crawl.main(Crawl.java:124)
Problem analysis:
The OutOfMemoryError is raised while the fetch job is running, so it is the heap of the JVM executing the fetch tasks that is exhausted; the default task heap (reportedly around 200 MB) is too small for a crawl this deep. Note that HADOOP_HEAPSIZE in hadoop-env.sh only sizes the Hadoop daemons and the client JVM, which is why raising it (as tried in the thread quoted below) makes no difference; the task heap is controlled by mapred.child.java.opts. The fixes are therefore to enlarge that heap, or to make each fetch task do less work at once (fewer threads, no parsing during fetching).
Solution 1:
Add the following to hadoop-site.xml. This sets the Java heap size for the spawned child task processes; you can set it to whatever you want. The default is reportedly around 200 MB, which is far too small for a job like this.
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
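For context, a minimal sketch of where this property lives: hadoop-site.xml is the site-specific override file, and its entries sit inside a <configuration> root element (any properties already in your file should be kept alongside this one):

<?xml version="1.0"?>
<!-- conf/hadoop-site.xml: site-specific overrides of hadoop-default.xml -->
<configuration>
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx512m</value>
    <!-- heap for each spawned map/reduce child JVM; raise further if fetch tasks still run out of memory -->
  </property>
</configuration>

The job has to be resubmitted after the change for the new child JVM options to take effect.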
Solution 2:
Increase the heap size available to your Java VM. When running the job locally from an IDE (the "Open Run Dialog" menu here refers to Eclipse), open Run -> "Open Run Dialog", click the Arguments tab, and add -Xmx1024m (or something larger, if need be) to the VM arguments. In this local mode the map/reduce tasks run inside the same JVM as the client, so this -Xmx is the setting that matters rather than mapred.child.java.opts. Obviously, it doesn't really make sense to run jobs over huge datasets on your puny little laptop; you should debug and develop only on a small portion of the actual data.
Related material:
Re: java.lang.OutOfMemoryError: Java heap space
>
>>
>> On Thu, Sep 18, 2008 at 4:19 PM, Edward Quick <edwardquick@...> wrote:
>> >
>> > Hi,
>> >
>> > I'm getting java.lang.OutOfMemoryError: Java heap space errors when running nutch in a hadoop cluster.
>> > I have doubled the heap by setting export HADOOP_HEAPSIZE=2048 in hadoop-env.sh but this doesn't seem to make a difference.
>> >
>> > I'm new to hadoop so appreciate any help.
>> >
>>
>> Are you parsing during fetching? If so try disabling that and run
>> parsing as a separate job. At least, you
>> won't lose the results of fetching :)
>
>
> The threads in nutch-site.xml were set too high (at 50) so I put those down to 10 and it seems ok now.
>
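Putting the two suggestions from the thread above into configuration form, here is a hedged sketch of the corresponding nutch-site.xml overrides (assuming the standard Nutch property names: fetcher.parse turns off parsing during fetching so parsing can run as its own job afterwards, and fetcher.threads.fetch is the fetcher thread count that was lowered from 50 to 10):

<!-- conf/nutch-site.xml: site-specific overrides of nutch-default.xml -->
<configuration>
  <property>
    <name>fetcher.parse</name>
    <value>false</value>
    <!-- do not parse while fetching; run parsing as a separate job afterwards, so a fetcher crash does not lose the fetched data -->
  </property>
  <property>
    <name>fetcher.threads.fetch</name>
    <value>10</value>
    <!-- the thread count reported to work above; 50 threads exhausted the available heap -->
  </property>
</configuration>

With fetcher.parse set to false, the parse step (the ParseSegment job) is run separately once the fetch has finished.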