Comparing the old and new MapReduce APIs

Scope of the comparison: Hadoop 1.x (new API) versus Hadoop 0.x (old API).

1. Imports for the new API generally come from the mapreduce package, while imports for the old API come from the mapred package.

2. The new API uses Job; the old API uses JobConf.

3. New-API classes live under the org.apache.hadoop.mapreduce package; old-API classes live under org.apache.hadoop.mapred.

4. The new API submits a job with job.waitForCompletion(true); the old API uses JobClient.runJob(job);

5. New API: extends Mapper; old API: extends MapReduceBase implements Mapper.

6. New API: output goes through a Context object (e.g. Context ctx; ctx.write(...)); old API: output goes through an OutputCollector plus a Reporter (e.g. OutputCollector<K, V> collector; collector.collect(...)). A side-by-side sketch follows this list.
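
Point 6 covers more than write() versus collect(): the Reporter that the old API passes in next to the OutputCollector also carries status updates and counters, and in the new API those calls move onto the same Context object. The sketch below is not from the original post; the word-count-style types, the class names and the "stats"/"records" counter are made up purely to show the two call styles side by side.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapreduce.Mapper;

public class ContextVsReporter {

    // New API: one Context object handles output, status and counters.
    static class NewStyleMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            context.setStatus("processing record " + key);           // status, formerly on Reporter
            context.getCounter("stats", "records").increment(1);     // counter, formerly on Reporter
            context.write(value, new LongWritable(1));                // output, formerly on OutputCollector
        }
    }

    // Old API: output goes through OutputCollector, status and counters through Reporter.
    static class OldStyleMapper extends MapReduceBase
            implements org.apache.hadoop.mapred.Mapper<LongWritable, Text, Text, LongWritable> {
        public void map(LongWritable key, Text value,
                OutputCollector<Text, LongWritable> collector, Reporter reporter) throws IOException {
            reporter.setStatus("processing record " + key);
            reporter.incrCounter("stats", "records", 1);
            collector.collect(value, new LongWritable(1));
        }
    }
}

In the same spirit, the new-API Mapper replaces the configure()/close() hooks inherited from MapReduceBase with setup(Context) and cleanup(Context).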

-------------------------

New API:

import java.io.IOException;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// kpiwritable is the custom Writable value type used by this job (definition not shown in the post).
public class kpi extends Configured implements Tool {

    // New-API mapper: extends the Mapper class and writes through a Context.
    static class MyMapper extends Mapper<LongWritable, Text, Text, kpiwritable> {
        @Override
        protected void map(LongWritable k1, Text v1, Context context)
                throws IOException, InterruptedException {
            // Deriving k2 and v2 from the input record is omitted in the original post.
            context.write(k2, v2);
        }
    }

    // New-API reducer: extends the Reducer class; values arrive as an Iterable.
    static class MyReducer extends Reducer<Text, kpiwritable, Text, kpiwritable> {
        @Override
        protected void reduce(Text k2, Iterable<kpiwritable> v2s, Context ctx)
                throws IOException, InterruptedException {
            // Aggregating v2s into v2 is omitted in the original post.
            ctx.write(k2, v2);
        }
    }

    @Override
    public int run(String[] arg0) throws Exception {
        Job job = new Job(getConf(), "kpi");
        // Mapper/reducer classes, output types and I/O paths are not set here;
        // see the expanded run() sketched after this listing.
        job.waitForCompletion(true);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        ToolRunner.run(new kpi(), args);
    }
}
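
For a submittable job, run() would also wire up the mapper, reducer, output types and I/O paths, none of which appear in the original skeleton. Below is a minimal sketch of what that wiring typically looks like in the new API, assuming the input and output paths arrive as the first two command-line arguments; it additionally needs org.apache.hadoop.fs.Path, org.apache.hadoop.mapreduce.lib.input.FileInputFormat and org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.

@Override
public int run(String[] arg0) throws Exception {
    Job job = new Job(getConf(), "kpi");
    job.setJarByClass(kpi.class);
    job.setMapperClass(MyMapper.class);
    job.setReducerClass(MyReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(kpiwritable.class);
    FileInputFormat.addInputPath(job, new Path(arg0[0]));    // input path from the command line
    FileOutputFormat.setOutputPath(job, new Path(arg0[1]));  // output path from the command line
    return job.waitForCompletion(true) ? 0 : 1;              // exit code reflects job success
}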

---------------------------------------------------------

Old API:

import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class old_api {

    // Old-API mapper: extends MapReduceBase, implements the Mapper interface, writes through an OutputCollector.
    static class MyMapper extends MapReduceBase implements Mapper<LongWritable, Text, Text, LongWritable> {
        public void map(LongWritable k1, Text v1,
                OutputCollector<Text, LongWritable> collector, Reporter reporter) throws IOException {
            // Extracting 'word' from v1 is omitted in the original post.
            collector.collect(new Text(word), new LongWritable(1));
        }
    }

    // Old-API reducer: values arrive as an Iterator rather than an Iterable.
    static class MyReducer extends MapReduceBase implements Reducer<Text, LongWritable, Text, LongWritable> {
        public void reduce(Text k2, Iterator<LongWritable> v2s,
                OutputCollector<Text, LongWritable> collector, Reporter reporter) throws IOException {
            // Summing v2s into 'times' is omitted in the original post.
            collector.collect(k2, new LongWritable(times));
        }
    }

    public static void main(String[] args) throws Exception {
        final JobConf job = new JobConf(new Configuration(), old_api.class);
        // Job wiring is not set here; see the expanded main() sketched after this listing.
        JobClient.runJob(job);
    }
}
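
Likewise, the old-API main() would normally configure the JobConf before submitting it. A minimal sketch under the same assumptions (paths taken from the command line); it additionally needs org.apache.hadoop.fs.Path and the old-API org.apache.hadoop.mapred.FileInputFormat and FileOutputFormat.

public static void main(String[] args) throws Exception {
    final JobConf job = new JobConf(new Configuration(), old_api.class);
    job.setJobName("old_api");
    job.setMapperClass(MyMapper.class);
    job.setReducerClass(MyReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);
    FileInputFormat.setInputPaths(job, new Path(args[0]));   // old-API input path helper
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // old-API output path helper
    JobClient.runJob(job);                                   // blocks until the job finishes
}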

posted @ 2015-05-05 15:26 孟想阳光