"Error opening job jar" error when submitting a Hadoop job
The error reported:
Exception in thread "main" java.io.IOException: Error opening job jar: /home/deploy/recsys/workspace/ouyangyewei/recommender-dm-1.0-SNAPSHOT-lib
        at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:127)
        at java.util.jar.JarFile.<init>(JarFile.java:135)
        at java.util.jar.JarFile.<init>(JarFile.java:72)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:88)

-bash-3.2$ hadoop fs -ls /home/deploy/recsys/workspace/ouyangyewei
Warning: $HADOOP_HOME is deprecated.
The command that was submitted:
hadoop jar recommender-dm_fat.jar com.yhd.ml.statistics.category.GenCategoryUserProfileJob --userProfileTable full_user_profile --categoryId 957370 --categoryFile /user/hive/warehouse/category/part-m-00000 --output /home/deploy/recsys/workspace/ouyangyewei/output
There are generally three causes for this error (each can be checked from the shell, as sketched after the list):
1. The jar does not exist at the given path
2. The jar path is written incorrectly
3. The jar file itself is corrupt
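A minimal way to rule out each cause from the shell; the local path and jar name below simply reuse the ones from the command above, so adjust them to your own setup:

-bash-3.2$ ls -l /home/deploy/recsys/workspace/ouyangyewei/recommender-dm_fat.jar           # causes 1 and 2: does the jar exist at this exact local path?
-bash-3.2$ jar tf /home/deploy/recsys/workspace/ouyangyewei/recommender-dm_fat.jar | head   # cause 3: a corrupt jar fails here with "error in opening zip file"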
The actual cause here: the jar was not in that directory on the local machine. Running hadoop fs -ls <directory path> showed the jar, but the ll command in the same directory did not show it.
The difference between the ll command and the hadoop fs -ls <path> command (a side-by-side example follows the list):
1. ll: lists the files on the local server
2. hadoop fs -ls <path>: lists the files on HDFS
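Side by side, against the same path from this post (the commands are real; the outcomes in the comments are what was observed in this case):

-bash-3.2$ ll /home/deploy/recsys/workspace/ouyangyewei             # local filesystem of the submitting server: the jar is missing here
-bash-3.2$ hadoop fs -ls /home/deploy/recsys/workspace/ouyangyewei  # HDFS: the jar is listed here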
The hadoop jar command loads the job jar from the local filesystem of the server you submit from, not from HDFS.
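So if the jar only exists on HDFS, one fix (a sketch that reuses the paths and jar name above) is to copy it down to the local filesystem first and then resubmit:

-bash-3.2$ hadoop fs -get /home/deploy/recsys/workspace/ouyangyewei/recommender-dm_fat.jar .                    # copy the jar from HDFS into the current local directory
-bash-3.2$ hadoop jar ./recommender-dm_fat.jar com.yhd.ml.statistics.category.GenCategoryUserProfileJob ...     # resubmit with the same arguments as in the command above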