problems_maven
1 A dependency deleted from pom.xml still shows up in the Maven tool window
desc:
Moreover, the Maven tool window lists five identical copies of the dependency banboo-0.0.1.jar, two of them with red wavy underlines indicating errors.
RCA:
Undetermined.
solution:
Close the project in IDEA, delete the .idea folder in the workspace root directory, then open IDEA and reopen the project via File - Open.
Re-set all encodings to UTF-8, and reconfigure the Maven home path, the JDK used by Maven, and the project JDK version.
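The cleanup step can be sketched as shell commands. The demo below works on a throwaway directory with made-up file names; in practice, run the same deletions from your real project root with the project closed in IDEA.

```shell
# Demo on a throwaway directory; run the same deletions from your real
# project root (with the project closed in IDEA) to force a clean re-import.
PROJ="$(mktemp -d)"
mkdir -p "$PROJ/.idea"
touch "$PROJ/demo.iml"               # illustrative stale module file
rm -rf "$PROJ/.idea"                 # drop IDEA's cached project model
find "$PROJ" -name "*.iml" -delete   # drop stray module files as well
```

After this, File - Open re-imports the project purely from pom.xml, discarding any stale dependency entries IDEA had cached.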
2 Maven build warning
desc:
maven package warning:
Some problems were encountered while building the effective settings
expected START_TAG or END_TAG not TEXT (position: TEXT seen ...</mirror>\r\n -->\r\n <mirror>\r\n \u3000\u3000<i... @161:9) @ D:\develop\mvnrepo\settingsforsca.xml, line 161, column 9
RCA:
The location flagged by the Maven warning contains irregular whitespace characters (here, full-width \u3000 spaces); this kind of problem often appears when text is copied from web pages into the project.
pom.xml and settings.xml files are especially prone to it.
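One quick way to locate the offending characters is to search for the raw UTF-8 bytes of U+3000 (the full-width space shown in the warning). A sketch against a simulated bad file:

```shell
# Simulate a settings.xml fragment pasted from a web page, containing
# U+3000 IDEOGRAPHIC SPACE (UTF-8 bytes e3 80 80, octal 343 200 200).
f="$(mktemp)"
printf '<mirror>\n\343\200\200\343\200\200<id>x</id>\n</mirror>\n' > "$f"
FWSP="$(printf '\343\200\200')"   # the full-width space as a literal string
grep -n "$FWSP" "$f"              # prints the offending line (line 2 here)
```

Run the same grep against your real settings.xml or pom.xml to find the lines to clean up.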
solution:
Delete the irregular whitespace and normalize the formatting.
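After cleanup, the mirror entry should contain only regular ASCII whitespace. A hypothetical well-formed entry (the id and URL here are made up for illustration):

```xml
<mirror>
  <id>example-mirror</id>
  <mirrorOf>central</mirrorOf>
  <url>https://repo.example.com/maven2</url>
</mirror>
```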
reference: https://www.cnblogs.com/tfxz/p/12662423.html
3 Maven package fails with invalid LOC header (bad signature)
Running mvn package fails with the following error:
Error creating shaded jar: invalid LOC header (bad signature) -> [Help 1]
At first I switched to a newer version of the Maven shade plugin and packaged again. It still failed, but this time the error was more detailed and named the problematic jar: /mnt/d/Develop/mavenRepo/com/alibaba/fastjson/1.2.44/fastjson-1.2.44.jar
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32.235 s
[INFO] Finished at: 2019-11-28T20:46:18+08:00
[INFO] Final Memory: 56M/720M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:3.1.1:shade (default) on project flink-base: Error creating shaded jar: Problem shading JAR /mnt/d/Develop/mavenRepo/com/alibaba/fastjson/1.2.44/fastjson-1.2.44.jar entry META-INF/services/org.glassfish.jersey.internal.spi.AutoDiscoverable: java.util.zip.ZipException: invalid LOC header (bad signature) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Following the hint, you can also run mvn -e package or mvn -X package from the command line in the project directory. I also saw posts online where the detailed output names the problematic jar; that approach re-downloads all the jars, however, which is too time-consuming, so I did not try it.
Instead I went into the directory /mnt/d/Develop/mavenRepo/com/alibaba/fastjson/1.2.44/ and found a file named aether-fb0089ce-dff1-4256-9ccc-a2bb768c48f7-fastjson-1.2.44.jar.sha1-in-progress. I deleted every file in that directory, ran package again, and the build succeeded!
Summary: this problem was once again a corrupted jar download; most of the Maven problems I run into come down to jars that were downloaded incorrectly.
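Hunting for such leftovers can be sketched as below: interrupted downloads leave *-in-progress (and *.lastUpdated) marker files behind, and removing them lets the next build re-fetch the artifacts cleanly. The demo uses a throwaway directory with a made-up file name; point REPO at your real repository (e.g. /mnt/d/Develop/mavenRepo) to use it for real.

```shell
# Throwaway stand-in for a local repository with an interrupted download.
REPO="$(mktemp -d)"
mkdir -p "$REPO/com/alibaba/fastjson/1.2.44"
touch "$REPO/com/alibaba/fastjson/1.2.44/demo-fastjson-1.2.44.jar.sha1-in-progress"
# List leftovers from interrupted downloads, then delete them so the
# next `mvn package` fetches the affected artifacts again from scratch.
find "$REPO" -type f \( -name "*-in-progress" -o -name "*.lastUpdated" \) -print
find "$REPO" -type f \( -name "*-in-progress" -o -name "*.lastUpdated" \) -delete
```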
4 Importing a Spark project's Maven dependencies fails with an error
desc: The Maven dependencies of a Spark project cannot be imported, and an error is reported.
errorlog:
0:23 Unable to import maven project: See logs for details
2019-08-23 00:34:05,140 [ 747292] WARN - #org.jetbrains.idea.maven - Cannot reconnect.
java.lang.RuntimeException: Cannot reconnect.
at org.jetbrains.idea.maven.server.RemoteObjectWrapper.perform(RemoteObjectWrapper.java:111)
at org.jetbrains.idea.maven.server.MavenIndexerWrapper.createIndex(MavenIndexerWrapper.java:61)
at org.jetbrains.idea.maven.indices.MavenIndex.createContext(MavenIndex.java:396)
at org.jetbrains.idea.maven.indices.MavenIndex.access$500(MavenIndex.java:48)
at org.jetbrains.idea.maven.indices.MavenIndex$IndexData.<init>(MavenIndex.java:703)
at org.jetbrains.idea.maven.indices.MavenIndex.doOpen(MavenIndex.java:236)
at org.jetbrains.idea.maven.indices.MavenIndex.open(MavenIndex.java:202)
at org.jetbrains.idea.maven.indices.MavenIndex.<init>(MavenIndex.java:104)
at org.jetbrains.idea.maven.indices.MavenIndices.add(MavenIndices.java:92)
at org.jetbrains.idea.maven.indices.MavenIndicesManager.ensureIndicesExist(MavenIndicesManager.java:174)
at org.jetbrains.idea.maven.indices.MavenProjectIndicesManager$3.run(MavenProjectIndicesManager.java:117)
at com.intellij.util.ui.update.MergingUpdateQueue.execute(MergingUpdateQueue.java:337)
at com.intellij.util.ui.update.MergingUpdateQueue.execute(MergingUpdateQueue.java:327)
at com.intellij.util.ui.update.MergingUpdateQueue.lambda$flush$1(MergingUpdateQueue.java:277)
at com.intellij.util.ui.update.MergingUpdateQueue.flush(MergingUpdateQueue.java:291)
at com.intellij.util.ui.update.MergingUpdateQueue.run(MergingUpdateQueue.java:246)
at com.intellij.util.concurrency.QueueProcessor.runSafely(QueueProcessor.java:246)
at com.intellij.util.Alarm$Request.runSafely(Alarm.java:417)
at com.intellij.util.Alarm$Request.access$700(Alarm.java:344)
at com.intellij.util.Alarm$Request$1.run(Alarm.java:384)
at com.intellij.util.Alarm$Request.run(Alarm.java:395)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at com.intellij.util.concurrency.SchedulingWrapper$MyScheduledFutureTask.run(SchedulingWrapper.java:242)
at com.intellij.util.concurrency.BoundedTaskExecutor$2.run(BoundedTaskExecutor.java:212)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.rmi.UnmarshalException: Error unmarshaling return header; nested exception is:
java.net.SocketException: Connection reset
RCA: a Maven version problem. I had been using Maven 3.6.0, which is incompatible.
The Maven dependencies I need to import are as follows:
<properties>
  <scala.version>2.11.8</scala.version>
  <hadoop.version>2.7.4</hadoop.version>
  <spark.version>2.1.3</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
<build>
  <sourceDirectory>src/main/scala</sourceDirectory>
  <testSourceDirectory>src/test/scala</testSourceDirectory>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
          <configuration>
            <args>
              <arg>-dependencyfile</arg>
              <arg>${project.build.directory}/.scala_dependencies</arg>
            </args>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass></mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
action:
- Switched to an empty repository with a shallower path, suspecting the original repository's path was too deep, or that its contents were corrupted. Did not help.
- Removed some dependencies and plugins from pom.xml and re-added them one by one. Did not help.
solution: switch to the Maven 3.3.9 bundled with IDEA.
5 Spark program fails to compile: error: object apache is not a member of package org
The Spark program fails to compile. errorlog:
[INFO] Compiling 2 source files to E:\Develop\IDEAWorkspace\spark\target\classes at 1567004370534
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:18: error: not found: type RDD
[ERROR] val data: RDD[String] = sc.textFile("E:\\Study\\BigData\\heima\\stage5\\2spark����\\words.txt")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:20: error: not found: type RDD
[ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:22: error: not found: type RDD
[ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:24: error: not found: type RDD
[ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCount.scala:27: error: not found: type RDD
[ERROR] val ascResult: RDD[(String, Int)] = result.sortBy(_._2,false)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:18: error: not found: type RDD
[ERROR] val data: RDD[String] = sc.textFile(args(0))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:20: error: not found: type RDD
[ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:22: error: not found: type RDD
[ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\develop\wordCount\WordCountCluster.scala:24: error: not found: type RDD
[ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR] ^
[ERROR] 21 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
RCA: the local repository setup was the problem. Most likely the original local repository path was too long and too deep; the repository contents themselves were fine, because after I copied the repository to E:\ it worked normally.
solution:
The Spark project's local Maven repository used to be E:\develop\BigData\maven\maven1\maven2\maven3\sparkRepository.
After I changed it to E:\repository, everything worked.
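The local repository location lives in Maven's settings.xml; a minimal sketch of the relevant element, using the short path from the solution above:

```xml
<!-- settings.xml: keep localRepository on a short, shallow path -->
<settings>
  <localRepository>E:\repository</localRepository>
</settings>
```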
6 Maven package fails for a multi-module Spring Boot project
My project uses Spring Boot 2.4.12 and is named java8. In the Maven tool window, I selected the parent module java8 and clicked "package".
desc1: the error output is as follows:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for java8 1.0:
[INFO]
[INFO] java8 .............................................. SUCCESS [ 1.005 s]
[INFO] common ............................................. SUCCESS [ 0.948 s]
[INFO] ch01 ............................................... SUCCESS [ 0.367 s]
[INFO] ch26 ............................................... FAILURE [ 1.345 s]
[INFO] ch16 ............................................... SKIPPED
[INFO] ch23 ............................................... SKIPPED
[INFO] ch14 ............................................... SKIPPED
[INFO] ch10 ............................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.078 s
[INFO] Finished at: 2021-11-08T10:15:09+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile (default-compile) on project ch26: Compilation failure: Compilation failure:
[ERROR] /develop/ideaws/java8/ch26/src/main/java/com/mediocre/concurrency/Baked.java:[3,29] 程序包com.mediocre.constant不存在
[ERROR] /develop/ideaws/java8/ch26/src/main/java/com/mediocre/concurrency/CollectionIntoStream.java:[3,25] 程序包com.mediocre.util不存在
RCA1: the ch26 submodule references/depends on the common submodule. The project was generated with Spring Initializr, and the packaging plugin automatically added to the parent module is spring-boot-maven-plugin, which does not resolve the referenced sibling modules, hence the error:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <excludes>
      <exclude>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
      </exclude>
    </excludes>
  </configuration>
</plugin>
solution1: drop the auto-generated Spring Boot plugin in the parent module and use the following three packaging plugins instead:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <failOnMissingWebXml>false</failOnMissingWebXml>
    <warName>${project.artifactId}</warName>
  </configuration>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
<!-- skip tests when packaging -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.0.0-M5</version>
  <configuration>
    <skip>true</skip>
  </configuration>
</plugin>
desc2: packaging again reports a different error:
[INFO] ch26 ............................................... FAILURE [ 2.126 s]
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:2.4.12:repackage (repackage) on project ch26: Execution repackage of goal org.springframework.boot:spring-boot-maven-plugin:2.4.12:repackage failed: Unable to find a single main class from the following candidates [com.mediocre.concurrency.CachedThreadPool, com.mediocre.concurrency.CachedThreadPool2, com.mediocre.concurrency.CachedThreadPool3, com.mediocre.concurrency.CatchCompletableExceptions, com.mediocre.concurrency.CollectionIntoStream, com.mediocre.concurrency.CompletableApply, com.mediocre.concurrency.CompletableExceptions, com.mediocre.concurrency.CompletableOperations, com.mediocre.concurrency.CompletedMachine, com.mediocre.concurrency.CountingStream, com.mediocre.concurrency.DiningPhilosophers, com.mediocre.concurrency.DualCompletableOperations, com.mediocre.concurrency.FrostedCake, com.mediocre.concurrency.Futures, com.mediocre.concurrency.LambdasAndMethodReferences, com.mediocre.concurrency.MoreTasksAfterShutdown, com.mediocre.concurrency.ParallelPrime, com.mediocre.concurrency.ParallelStreamPuzzle, com.mediocre.concurrency.ParallelStreamPuzzle2, com.mediocre.concurrency.ParallelStreamPuzzle3, com.mediocre.concurrency.QuittingCompletable, com.mediocre.concurrency.QuittingTasks, com.mediocre.concurrency.SingleThreadExecutor, com.mediocre.concurrency.SingleThreadExecutor2, com.mediocre.concurrency.SingleThreadExecutor3, com.mediocre.concurrency.StreamExceptions, com.mediocre.concurrency.Summing, com.mediocre.concurrency.Summing2, com.mediocre.concurrency.Summing3, com.mediocre.concurrency.Summing4]
RCA:
Note the key sentence: Unable to find a single main class from the following candidates.
No main class could be found. During packaging, the ch26 submodule was treated as an executable program, i.e. one that could be started with java -jar after being packaged. To be executable it must contain a main class such as Application.java, but this submodule has none and is not meant to be an executable jar.
But why was it treated as an executable program in the first place? Because the ch26 submodule used the spring-boot-maven-plugin packaging plugin, as follows:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <excludes>
      <exclude>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
      </exclude>
    </excludes>
  </configuration>
</plugin>
As before, this is a plugin added automatically when the Spring Boot project was generated.
solution: remove this packaging plugin; in fact, no plugin is needed here at all.
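An alternative to deleting the plugin entirely is to keep it but disable repackaging for non-executable modules; the plugin exposes a skip flag in its configuration for exactly this (a sketch, not taken from this project's pom):

```xml
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <!-- keep the plugin but skip repackaging for this non-executable module -->
    <skip>true</skip>
  </configuration>
</plugin>
```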
However, if the submodule really does have a main class and should be packaged as an executable jar, then the plugin is required; in that case the following plugin combination can be used:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <version>2.4.12</version>
  <configuration>
    <fork>true</fork> <!-- without this setting, devtools will not take effect -->
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <failOnMissingWebXml>false</failOnMissingWebXml>
    <warName>${project.artifactId}</warName>
  </configuration>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version>
  <configuration>
    <source>8</source>
    <target>8</target>
  </configuration>
</plugin>
desc3: with the two solutions above, packaging the parent module works. However, packaging a single submodule on its own as an executable jar still fails with "cannot find the referenced sibling module dependencies", or with a javac error such as "***.java:[131,25] cannot find symbol".
RCA: unknown.
solution: in the Maven tool window, select the parent module and run install or package; after that, individual submodules can be packaged on their own, which is useful when, for example, you need to clean a submodule's build output and repackage it.
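Another option worth trying from the command line: Maven's reactor can build a single submodule together with the sibling modules it depends on, which sidesteps the missing-sibling errors without a prior install. The module names below are this project's; adjust them to yours.

```shell
# Run from the parent (java8) directory: -pl picks the submodule,
# -am ("also make") builds the sibling modules it depends on (e.g. common)
# in the same reactor run, so their classes resolve.
mvn package -pl ch26 -am -DskipTests
```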
7 mvn install fails with "package org.bouncycastle.jce.provider does not exist"
DESC:
I wanted to install an entire Java utils project into the Maven repository so it could be referenced by other projects.
But running mvn install failed with "package org.bouncycastle.jce.provider does not exist".
RCA:
On inspection, it turned out the jar containing that package had been added by hand: I had downloaded it from the web and added it to the project directly, so it had nothing to do with the Maven repository and was never in the local repository at all.
SOLUTION:
The fix is simple: install that dependency into the local Maven repository, then remove the hand-added jar from the Java utils project and reference the Maven dependency instead:
# install the dependency into the local Maven repository
mvn install:install-file -Dfile=/develop/ideaws/javaUtils/lib/bcprov-jdk16-139.jar -DgroupId=org.bouncycastle -DartifactId=bcprov-jdk16 -Dversion=1.39.0 -Dpackaging=jar -s /develop/mvnrepo2/settings.xml -Dmaven.repo.local=/develop/mvnrepo2 -DskipTests=true
<!-- after the install finishes, reference the newly installed dependency in pom.xml -->
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-jdk16</artifactId>
  <version>1.39.0</version>
</dependency>
With that dependency referenced, I could run mvn install on the Java utils project.
REVIEW:
Later I also found this dependency on the official Maven repository site and referenced it in pom.xml directly, which is both easier and more standard.