Spark error: Class org.apache.mapreduce.io.compress.GzipCodec not found

Symptom: the following error is thrown when running a Spark job locally in IDEA.

Error:

Exception in thread "main" java.lang.RuntimeException: Error in configuring object
     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
     at org.apache.spark.rdd.HadoopRDD.getInputFormat(HadoopRDD.scala:185)
     at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:198)
     at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
     at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
     at scala.Option.getOrElse(Option.scala:121)
     at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
     at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:46)
     at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
     at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
     at scala.Option.getOrElse(Option.scala:121)
     at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
     at org.apache.spark.SparkContext.runJob(SparkContext.scala:2094)
     at org.apache.spark.rdd.RDD.count(RDD.scala:1166)
     at spark._core.describe.sparkRunJob$.main(sparkRunJob.scala:20)
     at spark._core.describe.sparkRunJob.main(sparkRunJob.scala)
Caused by: java.lang.reflect.InvocationTargetException
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:498)
     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
     ... 17 more
Caused by: java.lang.IllegalArgumentException: Compression codec org.apache.mapreduce.io.compress.GzipCodec not found.
     at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:139)
     at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:180)
     at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
     ... 22 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.mapreduce.io.compress.GzipCodec not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2171)
     at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:132)
     ... 24 more


Cause: at runtime the program reads the Hadoop configuration files; when running in IDEA this means the core-site.xml placed under the resources directory. If that file contains the following property:

<property>
    <name>io.compression.codecs</name>
    <value>org.apache.mapreduce.io.compress.GzipCodec,org.apache.mapreduce.io.compress.DefaultCodec,org.apache.mapreduce.io.compress.SnappyCodec</value>
</property>

then, if the jars containing these classes cannot be found on the classpath, the program fails with the error above. Note that the standard Hadoop codec implementations live in the org.apache.hadoop.io.compress package (e.g. org.apache.hadoop.io.compress.GzipCodec, shipped in hadoop-common), so the configured names must refer to classes that actually exist. Even when the required jars are on the classpath, the property value must contain no line breaks or extra whitespace, otherwise it can trigger a different error.
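
To make the chain concrete, here is a minimal local sketch (assuming Spark 2.x and SparkSession; the object name and file path are illustrative, not from the original code) that prints the value Spark picked up from core-site.xml and marks the call that triggers the codec lookup:

import org.apache.spark.sql.SparkSession

// Minimal sketch: the Hadoop configuration Spark uses is assembled from
// core-site.xml on the classpath (e.g. src/main/resources in IDEA), which is
// where the faulty codec list comes from.
object ShowCodecConf {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("show-codec-conf")
      .getOrCreate()

    // Populated from core-site.xml if it is on the classpath; null otherwise.
    val codecs = spark.sparkContext.hadoopConfiguration.get("io.compression.codecs")
    println(s"io.compression.codecs = $codecs")

    // Reading a text file goes through TextInputFormat.configure(), which is where
    // CompressionCodecFactory tries to load every class listed in that property
    // (see the stack trace above).
    // spark.sparkContext.textFile("data/input.txt").count()   // hypothetical path

    spark.stop()
  }
}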

Solutions:

Option 1: for local testing, simply comment out this property.
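
For example, for a local run the property in core-site.xml can be wrapped in an XML comment (class names copied from above):

<!--
<property>
    <name>io.compression.codecs</name>
    <value>org.apache.mapreduce.io.compress.GzipCodec,org.apache.mapreduce.io.compress.DefaultCodec,org.apache.mapreduce.io.compress.SnappyCodec</value>
</property>
-->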

Option 2: in production, place the jar that contains the missing class into the corresponding lib directory so that it is on the classpath at runtime.
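
Either way, a quick way to confirm the fix is a small classpath check run in the same environment as the job (a sketch; the object name is made up, and the class list should match whatever core-site.xml actually configures):

object CodecClasspathCheck {
  def main(args: Array[String]): Unit = {
    // Standard Hadoop codec classes shipped in hadoop-common
    // (SnappyCodec additionally needs the Hadoop native library at runtime).
    val codecs = Seq(
      "org.apache.hadoop.io.compress.GzipCodec",
      "org.apache.hadoop.io.compress.DefaultCodec",
      "org.apache.hadoop.io.compress.SnappyCodec"
    )
    codecs.foreach { name =>
      try {
        Class.forName(name)
        println(s"OK      $name")
      } catch {
        case _: ClassNotFoundException => println(s"MISSING $name")
      }
    }
  }
}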
