Provided Maven Coordinates must be in the form 'groupId:artifactId:version'

spark-shell (and spark-submit) validate every coordinate passed to --packages and abort the launch with an IllegalArgumentException when one is malformed.
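In this case the mistake is a single separator: a hyphen instead of a colon between the artifactId and the version.

    wrong: org.mongodb.spark:mongo-spark-connector_2.10-2.2.1   (hyphen joins the version to the artifactId)
    right: org.mongodb.spark:mongo-spark-connector_2.10:2.2.1   (colon separates all three parts)

The full session below shows the failing invocation, the resulting stack trace, and the corrected run.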
[hadoop@hadoop1 bin]$ ./spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.10-2.2.1
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Provided Maven Coordinates must be in the form 'groupId:artifactId:version'. The coordinate provided is: org.mongodb.spark:mongo-spark-connector_2.10-2.2.1
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.spark.deploy.SparkSubmitUtils$$anonfun$extractMavenCoordinates$1.apply(SparkSubmit.scala:901)
        at org.apache.spark.deploy.SparkSubmitUtils$$anonfun$extractMavenCoordinates$1.apply(SparkSubmit.scala:899)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
        at org.apache.spark.deploy.SparkSubmitUtils$.extractMavenCoordinates(SparkSubmit.scala:899)
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1127)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:298)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Re-running with a colon between the artifactId and the version, Ivy resolves the connector and its driver dependency and downloads both:

[hadoop@hadoop1 bin]$ ./spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.10:2.2.1
Ivy Default Cache set to: /home/hadoop/.ivy2/cache
The jars for the packages stored in: /home/hadoop/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb.spark#mongo-spark-connector_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found org.mongodb.spark#mongo-spark-connector_2.10;2.2.1 in central
        found org.mongodb#mongo-java-driver;3.4.2 in central
downloading https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.10/2.2.1/mongo-spark-connector_2.10-2.2.1.jar ...
        [SUCCESSFUL ] org.mongodb.spark#mongo-spark-connector_2.10;2.2.1!mongo-spark-connector_2.10.jar (1138ms)
downloading https://repo1.maven.org/maven2/org/mongodb/mongo-java-driver/3.4.2/mongo-java-driver-3.4.2.jar ...
        [SUCCESSFUL ] org.mongodb#mongo-java-driver;3.4.2!mongo-java-driver.jar (614ms)
:: resolution report :: resolve 2947ms :: artifacts dl 1756ms
        :: modules in use:
        org.mongodb#mongo-java-driver;3.4.2 from central in [default]
        org.mongodb.spark#mongo-spark-connector_2.10;2.2.1 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   2   |   2   |   2   |   0   ||   2   |   2   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        2 artifacts copied, 0 already retrieved (2300kB/7ms)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.6.5/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/11/24 04:12:12 WARN util.Utils: Your hostname, hadoop1 resolves to a loopback address: 127.0.0.1; using 192.168.2.51 instead (on interface eno1)
17/11/24 04:12:12 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/11/24 04:12:14 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/hadoop/spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar."
17/11/24 04:12:14 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/hadoop/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
17/11/24 04:12:14 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/hadoop/spark/jars/datanucleus-core-3.2.10.jar."
17/11/24 04:12:17 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.2.51:4040
Spark context available as 'sc' (master = local[*], app id = local-1511467933297).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_144)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
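With the shell up and the connector on the classpath, a quick read confirms the package actually loaded. This is a minimal sketch, not part of the original session: the URI, database (test), and collection (myCollection) are assumptions to adjust for your own MongoDB instance. The trailing dots keep the REPL reading the expression across lines.

val df = spark.read.format("com.mongodb.spark.sql.DefaultSource").
  option("uri", "mongodb://127.0.0.1/test.myCollection").   // mongodb://host/database.collection
  load()

df.printSchema()   // schema is inferred by sampling documents from the collection

If the --packages download had not actually put the connector on the classpath, this read would fail at data-source resolution instead, with an error along the lines of "Failed to find data source: com.mongodb.spark.sql.DefaultSource".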
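One caveat grounded in the banner above: this shell runs Scala 2.11.8, while the coordinate pulls the _2.10 build of the connector, which is compiled against Scala 2.10. Mixing Scala binary versions resolves fine at download time but can surface later as NoSuchMethodError or similar binary-incompatibility failures. The connector is also published as a Scala 2.11 build under the same version, so the matching coordinate would be:

./spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.1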