Maximum number of attempts for failed MapReduce tasks

When a task fails, the TaskTracker reports the failure to the JobTracker, and the JobTracker schedules a new attempt of the task on another node. In this case the job as a whole still runs to completion. However, if the same task keeps failing and the number of failures exceeds the configured maximum number of attempts, the job is terminated without completing.

mapred-site.xml:
Hadoop 1.x:
    mapred.map.max.attempts
    mapred.reduce.max.attempts
Hadoop 2.x:
    <property>
      <name>mapreduce.reduce.maxattempts</name>
      <value>4</value>
      <description>Expert: The maximum number of attempts per reduce task.
      In other words, framework will try to execute a reduce task these many number
      of times before giving up on it.
      </description>
    </property>
    
    <property>
      <name>mapreduce.map.maxattempts</name>
      <value>4</value>
      <description>Expert: The maximum number of attempts per map task.
      In other words, framework will try to execute a map task these many number
      of times before giving up on it.
      </description>
    </property>
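
Besides setting these limits cluster-wide in mapred-site.xml, they can also be overridden per job. Below is a minimal sketch of a Hadoop 2.x driver that does this through Configuration (the class name MaxAttemptsExample and the job name are made up for illustration; only the two property keys come from the config above):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class MaxAttemptsExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Retry each failed map task up to 4 times before failing the job
            conf.setInt("mapreduce.map.maxattempts", 4);
            // Retry each failed reduce task up to 4 times before failing the job
            conf.setInt("mapreduce.reduce.maxattempts", 4);

            // Illustrative job name; set mapper/reducer/input/output as usual
            Job job = Job.getInstance(conf, "max-attempts-example");
            job.waitForCompletion(true);
        }
    }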

 
