mongo-spark installation troubleshooting: ./sbt check

First run of ./sbt check: the integration tests connect to localhost:27017 and fail because the connection the test harness opens is not authorized to run dropDatabase on the test database mongo-spark-connector-test (MongoDB error 13, Unauthorized):
[error]   at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:114)
[error]   at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:168)
[error]   at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:289)
[error]   at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:176)
[error]   at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:216)
[error]   at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:207)
[error]   at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:146)
[error]   at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:139)
[error]   at com.mongodb.operation.DropDatabaseOperation$1.call(DropDatabaseOperation.java:89)
[error]   at com.mongodb.operation.DropDatabaseOperation$1.call(DropDatabaseOperation.java:86)
[error]   at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:422)
[error]   at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:413)
[error]   at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:86)
[error]   at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:45)
[error]   at com.mongodb.Mongo.execute(Mongo.java:845)
[error]   at com.mongodb.Mongo$2.execute(Mongo.java:828)
[error]   at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:143)
[error]   at com.mongodb.spark.MongoDBDefaults.dropDB(MongoDBDefaults.scala:67)
[error]   at com.mongodb.spark.JavaRequiresMongoDB.tearDown(JavaRequiresMongoDB.java:106)
[error]   ...
[info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf started
[error] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf failed: com.mongodb.MongoCommandException: Command failed with error 13: 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:27017. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : 13, "codeName" : "Unauthorized" }, took 0.001 sec
[error]   at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:115)
(the same dropDatabase stack trace as above follows, this time ending in JavaRequiresMongoDB.setUp(JavaRequiresMongoDB.java:100))
(the identical MongoCommandException is then logged again for the tearDown of this test and for the setUp and tearDown of com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults, each time with the same stack trace)
[info] Test run finished: 6 failed, 0 ignored, 3 total, 0.003s
[info] HelpersSpec:
[info] Exception encountered when attempting to run a suite with class name: UDF.HelpersSpec *** ABORTED ***
[info]   com.mongodb.MongoCommandException: Command failed with error 13: 'not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }' on server localhost:27017. The full response is { "ok" : 0.0, "errmsg" : "not authorized on mongo-spark-connector-test to execute command { dropDatabase: 1 }", "code" : 13, "codeName" : "Unauthorized" }
[info]   at com.mongodb.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:115)
(the same dropDatabase stack trace follows)
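Every failure above is the same MongoCommandException: before and after each test, JavaRequiresMongoDB.setUp/tearDown call MongoDBDefaults.dropDB (MongoDBDefaults.scala:67), which drops the mongo-spark-connector-test database, and the connection the harness opens to localhost:27017 is not authorized to run dropDatabase (error 13, Unauthorized). Either run the tests against a mongod without access control, or authenticate with an account allowed to drop that database (for example one holding dbOwner on mongo-spark-connector-test, or root). A minimal sketch for verifying the permission outside sbt, using the same driver call the harness makes; the user, password and authSource are placeholders for whatever account exists on the server:

    import com.mongodb.{MongoClient, MongoClientURI}

    object DropTestDbCheck extends App {
      // Placeholder credentials: substitute an account that holds dbOwner/root.
      val uri = new MongoClientURI("mongodb://sparkTest:sparkTest@localhost:27017/?authSource=admin")
      val client = new MongoClient(uri)
      // The same call that fails in MongoDBDefaults.dropDB with error 13
      // when the connection lacks the dropDatabase privilege.
      client.getDatabase("mongo-spark-connector-test").drop()
      client.close()
      println("dropDatabase succeeded")
    }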
17/12/01 09:34:10 INFO AuthConnectionSpec: Ended Test: 'AuthConnectionSpec'
[info] AuthConnectionSpec:
[info] MongoRDD
[info] - should be able to connect to an authenticated db *** FAILED ***
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
[info]   org.apache.spark.SparkContext.<init>(SparkContext.scala:76)
[info]   com.mongodb.spark.SparkConfOverrideSpec.<init>(SparkConfOverrideSpec.scala:66)
(reflection and sbt test-runner frames follow)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2472)
[info]   at scala.Option.foreach(Option.scala:257)
[info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2468)
[info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2557)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
[info]   at com.mongodb.spark.AuthConnectionSpec$$anonfun$1.apply$mcV$sp(AuthConnectionSpec.scala:31)
[info]   at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info]   ...
[info] Reading scoverage instrumentation [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] Reading scoverage measurements...
[info] Generating scoverage reports...
[info] Written Cobertura report [/usr/local/mongo-spark/target/scala-2.11/coverage-report/cobertura.xml]
[info] Written XML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml]
[info] Written HTML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/index.html]
[info] Coverage reports completed
[info] All done. Coverage was [24.73%]
[info] ScalaCheck
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] ScalaTest
[info] Run completed in 3 seconds, 56 milliseconds.
[info] Total number of tests run: 23
[info] Suites: completed 4, aborted 18
[info] Tests: succeeded 22, failed 1, canceled 0, ignored 0, pending 0
[info] *** 18 SUITES ABORTED ***
[info] *** 1 TEST FAILED ***
[error] Error: Total 79, Failed 39, Errors 18, Passed 22
[error] Failed tests:
[error]   com.mongodb.spark.AuthConnectionSpec
[error]   com.mongodb.spark.sql.MongoDataFrameReaderTest
[error]   com.mongodb.spark.config.ReadConfigTest
[error]   com.mongodb.spark.MongoSparkTest
[error]   com.mongodb.spark.sql.MongoDataFrameWriterTest
[error]   com.mongodb.spark.config.WriteConfigTest
[error]   com.mongodb.spark.sql.MongoDataFrameTest
[error]   com.mongodb.spark.NoSparkConfTest
[error]   com.mongodb.spark.MongoConnectorTest
[error]   com.mongodb.spark.sql.fieldTypes.api.java.FieldTypesTest
[error] Error during tests:
[error]   com.mongodb.spark.rdd.partitioner.MongoSplitVectorPartitionerSpec
[error]   com.mongodb.spark.connection.DefaultMongoClientFactorySpec
[error]   com.mongodb.spark.rdd.partitioner.MongoShardedPartitionerSpec
[error]   com.mongodb.spark.rdd.partitioner.PartitionerHelperSpec
[error]   com.mongodb.spark.rdd.partitioner.MongoSamplePartitionerSpec
[error]   com.mongodb.spark.MongoConnectorSpec
[error]   com.mongodb.spark.sql.MongoInferSchemaSpec
[error]   com.mongodb.spark.rdd.partitioner.MongoPaginateBySizePartitionerSpec
[error]   com.mongodb.spark.sql.MapFunctionsSpec
[error]   com.mongodb.spark.sql.MongoRelationHelperSpec
[error]   UDF.HelpersSpec
[error]   com.mongodb.spark.connection.MongoClientRefCounterSpec
[error]   com.mongodb.spark.sql.MongoDataFrameSpec
[error]   com.mongodb.spark.MongoRDDSpec
[error]   com.mongodb.spark.NoSparkConfSpec
[error]   com.mongodb.spark.sql.fieldTypes.FieldTypesSpec
[error]   com.mongodb.spark.connection.MongoClientCacheSpec
[error]   com.mongodb.spark.rdd.partitioner.MongoPaginateByCountPartitionerSpec
[error]   com.mongodb.spark.SparkConfOverrideSpec
[error] (mongo-spark-connector/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 189 s, completed 2017-12-1 9:34:11
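The remaining named failure, AuthConnectionSpec, is a different problem: SparkConfOverrideSpec had already created a SparkContext in this JVM (SparkConfOverrideSpec.scala:66), so the context AuthConnectionSpec tries to build at AuthConnectionSpec.scala:31 is rejected. The exception itself names a workaround, spark.driver.allowMultipleContexts. A minimal sketch of setting that flag on a SparkConf; master and application name are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder master/app name; the only setting that matters for the
    // error above is the allowMultipleContexts flag quoted in the message.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("AuthConnectionSpec")
      .set("spark.driver.allowMultipleContexts", "true")
    val sc = new SparkContext(conf)

Reusing the context that is already running is the cleaner option; that is what happens in the second run below (hence the repeated 'Using an existing SparkContext' warnings), and in that run the dropDatabase errors are gone as well, so the suites can set up and drop their test data normally.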
[root@hadoop1 mongo-spark]# ./sbt check
[info] Loading project definition from /usr/local/mongo-spark/project
[info] Set current project to mongo-spark-connector (in build file:/usr/local/mongo-spark/)
[warn] Credentials file /root/.ivy2/.spCredentials does not exist
[success] Total time: 0 s, completed 2017-12-1 9:34:44
[info] scalastyle using config /usr/local/mongo-spark/project/scalastyle-config.xml
[info] Processed 71 file(s)
[info] Found 0 errors
[info] Found 0 warnings
[info] Found 0 infos
[info] Finished in 7 ms
[success] created output: /usr/local/mongo-spark/target
[success] Total time: 2 s, completed 2017-12-1 9:34:46
[success] Total time: 0 s, completed 2017-12-1 9:34:46
[warn] Credentials file /root/.ivy2/.spCredentials does not exist
[info] Updating {file:/usr/local/mongo-spark/}mongo-spark-connector...
[info] Resolving org.scala-lang#scala-library;2.11.8 ...
[info] Formatting 33 Scala sources {file:/usr/local/mongo-spark/}mongo-spark-connector(test) ...
[info] Resolving org.apache#apache;14 ...
[error] Server access Error: Connection timed out url=https://repo1.maven.org/maven2/org/apache/spark/spark-parent_2.11/2.2.0/spark-parent_2.11-2.2.0.jar
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Formatting 71 Scala sources {file:/usr/local/mongo-spark/}mongo-spark-connector(compile) ...
[info] Compiling 71 Scala sources and 15 Java sources to /usr/local/mongo-spark/target/scala-2.11/classes...
[info] Cleaning datadir [/usr/local/mongo-spark/target/scala-2.11/scoverage-data]
[info] Beginning coverage instrumentation
[info] Instrumentation completed [2907 statements]
[info] Wrote instrumentation file [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] Will write measurement data to [/usr/local/mongo-spark/target/scala-2.11/scoverage-data]
[info] Compiling 33 Scala sources and 14 Java sources to /usr/local/mongo-spark/target/scala-2.11/test-classes...
17/12/01 09:37:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/01 09:37:48 WARN Utils: Your hostname, hadoop1 resolves to a loopback address: 127.0.0.1; using 192.168.2.51 instead (on interface eno1)
17/12/01 09:37:48 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/12/01 09:37:49 INFO log: Logging initialized @188125ms
17/12/01 09:37:49 INFO MongoClientCache: Creating MongoClient: [localhost:27017]
17/12/01 09:37:49 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
(this warning, and the matching SparkSession one, recur throughout the run as the suites share one SparkContext/SparkSession)
17/12/01 09:37:51 INFO MongoRelation: requiredColumns: _id, age, name, filters:
[info] Test run started
[info] Test com.mongodb.spark.NoSparkConfTest.shouldBeAbleToUseConfigsWithRDDs started
[info] Test com.mongodb.spark.NoSparkConfTest.shouldBeAbleToUseConfigsWithDataFrames started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 3.87s
17/12/01 09:37:52 INFO MongoPaginateByCountPartitionerSpec: Running Test: 'MongoPaginateByCountPartitionerSpec'
17/12/01 09:37:52 INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoPaginateByCountPartitionerSpec'
17/12/01 09:37:53 INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,00001), LessThan(_id,00031)
17/12/01 09:37:53 INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), GreaterThanOrEqual(_id,00001), LessThan(_id,00031)
(similar MongoRelation requiredColumns/filters lines follow for LessThan(_id,00099) and GreaterThan(_id,09000))
17/12/01 09:37:53 WARN MongoPaginateByCountPartitioner: Inefficient partitioning, creating a partition per document. Decrease the `numberOfPartitions` property.
17/12/01 09:37:53 INFO MongoPaginateByCountPartitioner: Empty collection (MongoPaginateByCountPartitionerSpec), using a single partition
17/12/01 09:37:53 INFO MongoPaginateByCountPartitionerSpec: Ended Test: 'MongoPaginateByCountPartitionerSpec'
[info] MongoPaginateByCountPartitionerSpec:
[info] MongoPaginateByCountPartitioner
[info] - should partition the database as expected
[info] - should partition on an alternative shardkey as expected
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should handle fewer documents than partitions
[info] - should handle no collection
[info] WriteConfigSpec:
[info] WriteConfig
[info] - should have the expected defaults
[info] - should be creatable from SparkConfig
[info] - should round trip options
[info] - should validate the values
17/12/01 09:37:54 INFO MongoRelation: requiredColumns: _id, age, name, filters:
17/12/01 09:37:54 INFO SparkConfOverrideSpec: Ended Test: 'be able to able to override partial configs with options'
[info] SparkConfOverrideSpec:
[info] MongoRDD
[info] - should be able to override partial configs with Read / Write Configs
[info] DataFrame Readers and Writers
[info] - should be able to able to override partial configs with options
[info] Test run started
[info] Test com.mongodb.spark.sql.MongoDataFrameWriterTest.shouldBeEasilyCreatedFromADataFrameAndSaveToMongo started
[info] Test com.mongodb.spark.sql.MongoDataFrameWriterTest.shouldTakeACustomOptions started
[info] Test com.mongodb.spark.sql.MongoDataFrameWriterTest.shouldBeEasilyCreatedFromMongoSpark started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.658s
17/12/01 09:37:55 INFO MongoRelation: requiredColumns: _id, arrayInt, binary, boolean, code, codeWithScope, date, dbPointer, document, double, int32, int64, maxKey, minKey, null, objectId, oldBinary, regex, string, symbol, timestamp, undefined, filters:
[info] Test run started
[info] Test com.mongodb.spark.sql.MongoDataFrameTest.shouldRoundTripAllBsonTypes started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.4s
17/12/01 09:37:55 INFO MongoDataFrameSpec: Running Test: 'MongoDataFrameSpec'
(the MongoDataFrameSpec cases run from 09:37:55 to 09:37:59, interleaving 'Using an existing SparkContext/SparkSession' warnings with MongoRelation lines such as: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,100); requiredColumns: name, age; requiredColumns: a; requiredColumns: _id, a; requiredColumns: , filters: IsNotNull(age); requiredColumns: name, attributes, filters: IsNotNull(name); plus the full BSON-type column lists for the round-trip and cast-to-string cases)
17/12/01 09:37:59 INFO MongoRelation: requiredColumns: _id, count, filters: IsNotNull(_id), IsNotNull(count)
17/12/01 09:37:59 INFO MongoRelation: requiredColumns: _id, name, age, filters:
17/12/01 09:37:59 INFO MongoRelation: requiredColumns: _id, name, filters:
17/12/01 09:37:59 INFO MongoDataFrameSpec: Ended Test: 'MongoDataFrameSpec'
[info] MongoDataFrameSpec:
[info] DataFrameReader
[info] - should should be easily created from the SQLContext and load from Mongo
[info] - should handle selecting out of order columns
[info] - should handle mixed numerics with long precedence
[info] - should handle mixed numerics with double precedence
[info] - should handle array fields with null values
[info] - should handle document fields with null values
[info] - should be easily created with a provided case class
[info] - should include any pipelines when inferring the schema
[info] - should use any pipelines when set via the MongoRDD
[info] - should throw an exception if pipeline is invalid
[info] DataFrameWriter
[info] - should be easily created from a DataFrame and save to Mongo
[info] - should take custom writeConfig
[info] - should support INSERT INTO SELECT statements
[info] - should support INSERT OVERWRITE SELECT statements
[info] DataFrames
[info] - should round trip all bson types
[info] - should be able to cast all types to a string value
[info] - should be able to round trip schemas containing MapTypes
[info] - should be able to upsert and replace data in an existing collection
[info] - should be able to handle optional _id fields when upserting / replacing data in a collection
[info] - should be able to set only the data in the Dataset to the collection
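The MongoRelation lines throughout this run ('requiredColumns: ..., filters: ...') show the connector pushing column projections and Spark SQL filters down into the MongoDB query. A small sketch of the kind of DataFrame code that produces the 'IsNotNull(age), GreaterThan(age,100)' entries seen in the log; the URI and collection are placeholders:

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("pushdown-demo")
      .config("spark.mongodb.input.uri", "mongodb://localhost:27017/test.people") // placeholder URI
      .getOrCreate()
    import spark.implicits._

    val df = MongoSpark.load(spark)
    // Logged by MongoRelation as: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,100)
    df.filter($"age" > 100).select("age").show()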
17/12/01 09:37:59 INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSamplePartitionerSpec'
(the loading line repeats while the suite sets up)
17/12/01 09:38:00 INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), LessThan(_id,02041)
17/12/01 09:38:00 INFO MongoRelation: requiredColumns: _id, pk, s, filters: IsNotNull(_id), LessThan(_id,02041)
17/12/01 09:38:01 INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,00000)
17/12/01 09:38:01 INFO MongoSamplePartitioner: Could not find collection (MongoSamplePartitionerSpec), using a single partition
17/12/01 09:38:01 INFO MongoSamplePartitionerSpec: Ended Test: 'MongoSamplePartitionerSpec'
[info] MongoSamplePartitionerSpec:
[info] MongoSamplePartitioner
[info] - should partition the database as expected
[info] - should partition on an alternative shardkey as expected
[info] - should partition with a composite key
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should have a default bounds of min to max key
[info] - should handle no collection
[info] - should handle an empty collection
[info] ReadConfigSpec:
[info] ReadConfig
[info] - should have the expected defaults
[info] - should be creatable from SparkConfig
[info] - should use the URI for default values
[info] - should override URI values with named values
[info] - should round trip options
[info] - should be able to create a map
[info] - should create the expected ReadPreference and ReadConcern
[info] - should validate the values
17/12/01 09:38:01 INFO MongoClientRefCounterSpec: Ended Test: 'MongoClientRefCounterSpec'
[info] MongoClientRefCounterSpec:
[info] MongoClientRefCounter
[info] - should count references as expected
[info] - should be able to acquire multiple times
[info] - should throw an exception for invalid releases of a MongoClient
17/12/01 09:38:01 INFO MongoDBDefaults: Loading sample Data: ~5MB data into 'MongoSplitVectorPartitionerSpec'
17/12/01 09:38:01 INFO MongoSplitVectorPartitioner: No splitKeys were calculated by the splitVector command, proceeding with a single partition. If this is undesirable try lowering 'partitionSizeMB' property to produce more partitions.
17/12/01 09:38:01 INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoSplitVectorPartitionerSpec'
17/12/01 09:38:02 INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,00001), LessThan(_id,00031)
(similar requiredColumns/filters lines follow for LessThan(_id,00049) and GreaterThanOrEqual(_id,00051))
17/12/01 09:38:02 INFO MongoSplitVectorPartitioner: No splitKeys were calculated by the splitVector command, proceeding with a single partition. If this is undesirable try lowering 'partitionSizeMB' property to produce more partitions.
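Several partitioners log their tuning hints here: MongoSplitVectorPartitioner suggests lowering 'partitionSizeMB' when splitVector returns no split keys, and MongoPaginateByCountPartitioner earlier warned about 'numberOfPartitions' creating a partition per document. A hedged sketch of how such knobs can be set through the connector's spark.mongodb.input.* configuration; the URI and values are placeholders, and the exact partitionerOptions keys should be confirmed against the connector documentation for the version in use:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("partitioner-tuning")
      .set("spark.mongodb.input.uri", "mongodb://localhost:27017/test.coll") // placeholder
      .set("spark.mongodb.input.partitioner", "MongoSplitVectorPartitioner")
      // Smaller value -> more partitions; this is the 'partitionSizeMB' the log suggests lowering.
      .set("spark.mongodb.input.partitionerOptions.partitionSizeMB", "32")
    // For MongoPaginateByCountPartitioner the equivalent knob is
    // spark.mongodb.input.partitionerOptions.numberOfPartitions.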
17/12/01 09:38:02 INFO MongoSplitVectorPartitioner: Could not find collection (MongoSplitVectorPartitionerSpec), using a single partition
17/12/01 09:38:02 INFO MongoSplitVectorPartitionerSpec: Ended Test: 'MongoSplitVectorPartitionerSpec'
[info] MongoSplitVectorPartitionerSpec:
[info] MongoSplitVectorPartitioner
[info] - should partition the database as expected
[info] - should use the provided pipeline for min and max keys
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should have a default bounds of min to max key
[info] - should handle no collection
[info] - should handle an empty collection
17/12/01 09:38:02 INFO MongoRelation: requiredColumns: _id, binary, dbPointer, javaScript, javaScriptWithScope, maxKey, minKey, regularExpression, symbol, timestamp, undefined, filters:
[info] Test run started
[info] Test com.mongodb.spark.sql.fieldTypes.api.java.FieldTypesTest.shouldBeAbleToCreateADatasetBasedOnAJavaBeanRepresentingComplexBsonTypes started
[info] Test com.mongodb.spark.sql.fieldTypes.api.java.FieldTypesTest.shouldAllowTheRoundTrippingOfAJavaBeanRepresentingComplexBsonTypes started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.387s
17/12/01 09:38:02 INFO MongoClientCache: Creating MongoClient: [localhost:27017]
17/12/01 09:38:02 INFO MongoConnectorSpec: Ended Test: 'MongoConnectorSpec'
[info] MongoConnectorSpec:
[info] MongoConnector
[info] - should create a MongoClient
[info] - should set the correct localThreshold
[info] - should Use the cache for MongoClients
[info] - should create a MongoClient with a custom MongoConnectionFactory
17/12/01 09:38:02 INFO MongoDBDefaults: Loading sample Data: ~10MB data into 'MongoPaginateBySizePartitionerSpec'
17/12/01 09:38:02 INFO MongoPaginateBySizePartitioner: Inefficient partitioning, creating a single partition. Decrease the `partitionsizemb` property.
17/12/01 09:38:03 INFO MongoPaginateBySizePartitionerSpec: Running Test: 'MongoPaginateBySizePartitionerSpec'
17/12/01 09:38:03 INFO MongoRelation: requiredColumns: _id, filters: IsNotNull(_id), GreaterThanOrEqual(_id,00001), LessThan(_id,00031)
(similar requiredColumns/filters lines follow for LessThan(_id,00049) and GreaterThan(_id,00050))
17/12/01 09:38:03 INFO MongoPaginateBySizePartitioner: Inefficient partitioning, creating a single partition. Decrease the `partitionsizemb` property.
17/12/01 09:38:03 INFO MongoPaginateBySizePartitioner: Could not find collection (MongoPaginateBySizePartitionerSpec), using a single partition
17/12/01 09:38:03 INFO MongoPaginateBySizePartitioner: Empty collection (MongoPaginateBySizePartitionerSpec), using a single partition
17/12/01 09:38:03 INFO MongoPaginateBySizePartitionerSpec: Ended Test: 'MongoPaginateBySizePartitionerSpec'
[info] MongoPaginateBySizePartitionerSpec:
[info] MongoPaginateBySizePartitioner
[info] - should partition the database as expected
[info] - should partition on an alternative shardkey as expected
[info] - should use the users pipeline when set in a rdd / dataframe
[info] - should have a default bounds of min to max key
[info] - should handle no collection
[info] - should handle an empty collection
17/12/01 09:38:03 INFO MapFunctionsSpec: Ended Test: 'MapFunctionsSpec'
[info] MapFunctionsSpec:
[info] documentToRow
[info] - should convert a Document into a Row with the given schema
[info] - should not prune the schema when given a document with missing values
[info] - should prune the schema when limited by passed required columns
[info] - should ignore any extra data in the document that is not included in the schema
[info] - should handle nested schemas
[info] - should handle schemas containing maps
[info] - should throw an exception when passed maps without string keys
[info] rowToDocument
[info] - should convert a Row into a Document
[info] - should handle nested schemas
[info] - should handle nested schemas within nested arrays
[info] - should handle mixed numerics based on the schema
[info] - should throw a MongoTypeConversionException when casting to an invalid DataType
17/12/01 09:38:03 INFO FieldTypesSpec: Running Test: 'FieldTypesSpec'
17/12/01 09:38:03 INFO MongoRelation: requiredColumns: _id, binary, dbPointer, javaScript, javaScriptWithScope, minKey, maxKey, regularExpression, symbol, timestamp, undefined, filters:
17/12/01 09:38:04 INFO FieldTypesSpec: Ended Test: 'FieldTypesSpec'
[info] FieldTypesSpec:
[info] Fields
[info] - should allow the round tripping of a case class representing complex bson types
[info] - should be able to create a dataset based on a case class representing complex bson types
[info] - should be able to create a Regular Expression from a Pattern
[info] BsonValueOrderingSpec:
[info] The BsonValueOrdering trait
[info] - should order all bson types correctly
[info] - should compare numbers types correctly
[info] - should compare numbers and longs correctly
[info] - should compare string types correctly
[info] - should compare timestamp types correctly
[info] - should compare binary types correctly
[info] - should compare array types correctly
[info] - should compare regex types correctly
[info] - should compare document types correctly
[info] - should have no defined order for undefined types
17/12/01 09:38:04 INFO MongoClientCache: Creating MongoClient: [localhost:27017]
17/12/01 09:38:04 INFO MongoClientCache: Closing MongoClient: [localhost:27017]
(several more create/close cycles follow)
17/12/01 09:38:04 WARN MongoClientCache: Release without acquire for key: Mongo{options=MongoClientOptions{description='null', applicationName='null', readPreference=primary, writeConcern=WriteConcern{w=null, wTimeout=null ms, fsync=null, journal=null, ...}, sslEnabled=false, localThreshold=15, connectionPoolSettings=ConnectionPoolSettings{maxSize=100, ...}, ...}}
(the same warning, with the full MongoClientOptions dump, is logged a second time at 09:38:05)
17/12/01 09:38:05 INFO MongoClientCache: Creating MongoClient: [localhost:27017]
17/12/01 09:38:05 INFO MongoClientCache: Closing MongoClient: [localhost:27017]
17/12/01 09:38:05 INFO MongoClientCacheSpec: Ended Test: 'MongoClientCacheSpec'
[info] MongoClientCacheSpec:
[info] MongoClientCache
[info] - should create a client and then close the client once released
[info] - should create a client and then close the client once released and after the timeout
[info] - should return a different client once released
[info] - should not throw an exception when trying to release unacquired client
[info] - should eventually close all released clients on shutdown
17/12/01 09:38:05 INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,100)
(this line repeats several times between 09:38:05 and 09:38:06 while the next suites spin up)
17/12/01 09:38:06 INFO MongoRelation: requiredColumns: age, filters: IsNotNull(age), GreaterThan(age,100)
[info] Test run started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedViaMongoSpark started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldIncludeAnyPipelinesWhenInferringTheSchema started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedViaMongoSparkAndSQLContext started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldThrowAnExceptionIfPipelineIsInvalid started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedWithMongoSparkAndJavaBean started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedWithAProvidedRDDAndJavaBean started
[info] Test com.mongodb.spark.sql.MongoDataFrameReaderTest.shouldBeEasilyCreatedFromTheSQLContext started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 1.098s
17/12/01 09:38:06 INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
17/12/01 09:38:06 WARN TestPartitioner: Could not find collection (MongoRDDSpec), using single partition
(the MongoRDDSpec cases run from 09:38:06 to 09:38:07, with 'Using an existing SparkSession' warnings between them)
17/12/01 09:38:07 INFO MongoRelation: requiredColumns: counter, filters:
17/12/01 09:38:07 ERROR Executor: Exception in task 0.0 in stage 24.0 (TID 24)
com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'})
        at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:83)
        at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:39)
        at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:37)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
        at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:37)
        at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$2.apply(MongoRelation.scala:45)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:395)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:234)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:108)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
17/12/01 09:38:07 WARN TaskSetManager: Lost task 0.0 in stage 24.0 (TID 24, localhost, executor driver): com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'})
(the same stack trace is printed again for the lost task)
17/12/01 09:38:07 ERROR TaskSetManager: Task 0 in stage 24.0 failed 1 times; aborting job
17/12/01 09:38:07 INFO MongoRDDSpec: Running Test: 'MongoRDDSpec'
17/12/01 09:38:07 INFO MongoRelation: requiredColumns: counter, filters:
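This ERROR is expected rather than a regression: it almost certainly comes from MongoRDDSpec's 'should throw when creating a Dataset with invalid data' case (listed as passing just below), where a document stores the string 'a' in a field whose Dataset schema declares Int, so the scan fails with MongoTypeConversionException and the test asserts on that. A sketch of the pattern, with an illustrative case class and collection rather than the suite's actual fixtures:

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    case class Counter(counter: Int) // the schema declares counter as Int

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("invalid-data-demo")
      .config("spark.mongodb.input.uri", "mongodb://localhost:27017/test.counters") // placeholder
      .getOrCreate()
    import spark.implicits._

    // If the collection contains { counter: "a" }, collecting throws
    // MongoTypeConversionException: Cannot cast STRING into a IntegerType.
    val ds = MongoSpark.load(spark).as[Counter]
    ds.collect()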
17/12/01 09:38:07 INFO MongoRDDSpec: Ended Test: 'MongoRDDSpec' [info] MongoRDDSpec: [info] MongoRDD [info] - should be easily created from the SparkContext [info] - should be able to handle non existent collections [info] - should be able to query via a pipeline [info] - should be able to handle different collection types [info] - should be able to create a DataFrame by inferring the schema [info] - should be able to create a DataFrame when provided a case class [info] - should be able to create a DataFrame with a set schema [info] - should be able to create a Dataset when provided a case class [info] - should not allow Nothing when trying to create a Dataset [info] - should throw when creating a Dataset with invalid data [info] - should use default values when creating a Dataset with missing data [info] - should be easy to use a custom partitioner [info] - should be easy to use a custom partitioner that is an object 17/12/01 09:38:07 INFO MongoRelationHelperSpec: Ended Test: 'MongoRelationHelperSpec' [info] MongoRelationHelperSpec: [info] createPipeline [info] - should create an empty pipeline if no projection or filters [info] - should project the required fields [info] - should explicitly exclude _id from the projection if not required [info] - should handle spark Filters [info] - should and multiple spark Filters 17/12/01 09:38:07 INFO MongoClientCache: Creating MongoClient: [localhost:27017] 17/12/01 09:38:07 INFO MongoClientCache: Creating MongoClient: [localhost:27017] [info] Test run started [info] Test com.mongodb.spark.MongoConnectorTest.shouldCreateMongoConnectorWithCustomMongoClientFactory started [info] Test com.mongodb.spark.MongoConnectorTest.shouldCreateMongoConnector started [info] Test com.mongodb.spark.MongoConnectorTest.shouldUseTheMongoClientCache started [info] Test com.mongodb.spark.MongoConnectorTest.shouldCreateMongoConnectorFromJavaSparkContext started [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.043s 17/12/01 09:38:07 INFO MongoShardedPartitionerSpec: Ended Test: 'MongoShardedPartitionerSpec' [info] MongoShardedPartitionerSpec: [info] MongoShardedPartitioner [info] - should partition the database as expected !!! CANCELED !!! [info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:30) [info] - should have a default bounds of min to max key !!! CANCELED !!! [info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:43) [info] - should handle no collection !!! CANCELED !!! [info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:52) [info] - should handle an empty collection !!! CANCELED !!! 
[info] Not a Sharded MongoDB (MongoShardedPartitionerSpec.scala:58) [info] - should calculate the expected hosts for a single node shard [info] - should calculate the expected hosts for a multi node shard [info] - should return distinct hosts [info] - should calculate the expected Partitions 17/12/01 09:38:07 INFO PartitionerHelperSpec: Ended Test: 'PartitionerHelperSpec' [info] PartitionerHelperSpec: [info] PartitionerHelper [info] - should create the expected partitions query [info] - should create the correct partitions 17/12/01 09:38:07 INFO DefaultMongoClientFactorySpec: Ended Test: 'DefaultMongoClientFactorySpec' [info] DefaultMongoClientFactorySpec: [info] DefaultMongoClientFactory [info] - should create a MongoClient from the connection string [info] - should implement equals based on the prefix less options map [info] - should set the localThreshold correctly [info] - should validate the connection string 17/12/01 09:38:07 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:07 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:07 WARN TestPartitioner: Could not find collection (com.mongodb.spark.MongoSparkTest), using single partition 17/12/01 09:38:07 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:07 INFO MongoRelation: requiredColumns: counter, filters: 17/12/01 09:38:07 INFO MongoClientCache: Closing MongoClient: [localhost:27017] 17/12/01 09:38:07 INFO MongoClientCache: Closing MongoClient: [localhost:27017] 17/12/01 09:38:07 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1) com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'}) at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:83) at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:39) at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:37) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186) at scala.collection.TraversableLike$class.map(TraversableLike.scala:234) at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186) at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:37) at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$2.apply(MongoRelation.scala:45) at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$2.apply(MongoRelation.scala:45) at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source) at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43) at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:395) at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:234) at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:228) at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827) at 
org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827) at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) at org.apache.spark.rdd.RDD.iterator(RDD.scala:287) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) at org.apache.spark.scheduler.Task.run(Task.scala:108) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 17/12/01 09:38:07 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast STRING into a IntegerType (value: BsonString{value='a'}) at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:83) at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:39) at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:37) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186) at scala.collection.TraversableLike$class.map(TraversableLike.scala:234) at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186) at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:37) at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$2.apply(MongoRelation.scala:45) at com.mongodb.spark.sql.MongoRelation$$anonfun$buildScan$2.apply(MongoRelation.scala:45) at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source) at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43) at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:395) at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:234) at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:228) at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827) at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827) at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) at org.apache.spark.rdd.RDD.iterator(RDD.scala:287) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) at org.apache.spark.scheduler.Task.run(Task.scala:108) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 17/12/01 09:38:07 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job 17/12/01 09:38:07 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 
17/12/01 09:38:07 INFO MongoRelation: requiredColumns: counter, filters: 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 INFO MongoRelation: requiredColumns: counter, filters: 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:08 INFO MongoRelation: requiredColumns: counter, filters: [info] Test run started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADataFrameUsingJavaBean started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToHandleNoneExistentCollections started [info] Test com.mongodb.spark.MongoSparkTest.shouldThrowWhenCreatingADatasetWithInvalidData started [info] Test com.mongodb.spark.MongoSparkTest.useDefaultValuesWhenCreatingADatasetWithMissingData started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADataFrameByInferringTheSchemaUsingSparkSession started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADataFrameByInferringTheSchema started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToQueryViaAPipeLine started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADatasetUsingJavaBeanWithSparkSession started [info] Test com.mongodb.spark.MongoSparkTest.useACustomPartitioner started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeCreatableFromTheSparkContextWithAlternativeReadAndWriteConfigs started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeCreatableFromTheSparkContext started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToHandleDifferentCollectionTypes started [info] Test com.mongodb.spark.MongoSparkTest.shouldBeAbleToCreateADatasetUsingJavaBean started [info] Test run finished: 0 failed, 0 ignored, 13 total, 1.207s 17/12/01 09:38:08 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect. 
17/12/01 09:38:09 INFO MongoRelation: requiredColumns: _id, age, name, filters: 17/12/01 09:38:09 INFO NoSparkConfSpec: Ended Test: 'be able to accept just options' [info] NoSparkConfSpec: [info] MongoRDD [info] - should be able to accept just Read / Write Configs [info] DataFrame Readers and Writers [info] - should be able to accept just options [info] Test run started [info] Test com.mongodb.spark.config.WriteConfigTest.shouldBeCreatableFromAJavaMap started [info] Test com.mongodb.spark.config.WriteConfigTest.shouldBeCreatableFromTheSparkConf started [info] Test com.mongodb.spark.config.WriteConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults started [info] Test run finished: 0 failed, 0 ignored, 3 total, 0.002s 17/12/01 09:38:09 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:09 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 
17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:09 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:10 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:10 WARN TaskSetManager: Stage 100 contains a task of very large size (202 KB). The maximum recommended task size is 100 KB. 
17/12/01 09:38:10 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:11 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:11 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 
17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 INFO MongoClientCache: Closing MongoClient: [localhost:27017] 17/12/01 09:38:12 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 INFO MongoInferSchemaSpec: Running Test: 'MongoInferSchemaSpec' 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 17/12/01 09:38:12 WARN MongoInferSchema: Array Field 'a' contains conflicting types converting to StringType 17/12/01 09:38:12 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect. 
17/12/01 09:38:12 WARN MongoInferSchema: Array Field 'a' contains conflicting types converting to StringType 17/12/01 09:38:12 INFO MongoInferSchemaSpec: Ended Test: 'MongoInferSchemaSpec' [info] MongoInferSchemaSpec: [info] MongoSchemaHelper [info] - should be able to infer the schema from simple types [info] - should be able to infer the schema from a flat array [info] - should be able to infer the schema from a flat document [info] - should be able to infer the schema from a nested array [info] - should be able to infer the schema from a multi level document [info] - should be able to infer the schema with custom sampleSize [info] - should ignore empty arrays and null values in arrays [info] - should use any set pipelines on the RDD [info] - should upscale number types based on numeric precedence [info] - should be able to infer the schema from arrays with mixed keys [info] - should be able to infer the schema from arrays with mixed numerics [info] - should be able to infer the schema from nested arrays with mixed keys [info] - should still mark incompatible schemas with a StringType [info] Test run started [info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMap started [info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromTheSparkConf started [info] Test com.mongodb.spark.config.ReadConfigTest.shouldBeCreatableFromAJavaMapAndUseDefaults started [info] Test run finished: 0 failed, 0 ignored, 3 total, 0.003s 17/12/01 09:38:12 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:12 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect. 17/12/01 09:38:12 INFO MongoRelation: requiredColumns: binary, filters: IsNotNull(binary) 17/12/01 09:38:12 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:12 INFO MongoRelation: requiredColumns: oldBinary, filters: IsNotNull(oldBinary) 17/12/01 09:38:12 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:12 INFO MongoRelation: requiredColumns: dbPointer, filters: IsNotNull(dbPointer) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: code, filters: IsNotNull(code) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: codeWithScope, filters: IsNotNull(codeWithScope) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: maxKey, filters: IsNotNull(maxKey) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: minKey, filters: IsNotNull(minKey) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: objectId, filters: IsNotNull(objectId) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: regex, filters: IsNotNull(regex) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: regexWithOptions, filters: IsNotNull(regexWithOptions) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: symbol, filters: IsNotNull(symbol) 17/12/01 09:38:13 INFO HelpersSpec: Running Test: 'HelpersSpec' 17/12/01 09:38:13 INFO MongoRelation: requiredColumns: timestamp, filters: IsNotNull(timestamp) 17/12/01 09:38:13 INFO HelpersSpec: Ended Test: 'HelpersSpec' [info] 
HelpersSpec: [info] the user defined function helpers [info] - should handle Binary values [info] - should handle Binary values with a subtype [info] - should handle DbPointers [info] - should handle JavaScript [info] - should handle JavaScript with scope [info] - should handle maxKeys [info] - should handle minKeys [info] - should handle ObjectIds [info] - should handle Regular Expressions [info] - should handle Regular Expressions with options [info] - should handle Symbols [info] - should handle Timestamps 17/12/01 09:38:13 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect. 17/12/01 09:38:13 WARN TestPartitioner: Could not find collection (beabletoconnecttoanauthenticateddb), using single partition 17/12/01 09:38:13 INFO AuthConnectionSpec: Ended Test: 'AuthConnectionSpec' [info] AuthConnectionSpec: [info] MongoRDD [info] - should be able to connect to an authenticated db [info] Reading scoverage instrumentation [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml] [info] Reading scoverage measurements... [info] Generating scoverage reports... [info] Written Cobertura report [/usr/local/mongo-spark/target/scala-2.11/coverage-report/cobertura.xml] [info] Written XML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml] [info] Written HTML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/index.html] [info] Coverage reports completed [info] All done. Coverage was [80.60%] [info] ScalaCheck [info] Passed: Total 0, Failed 0, Errors 0, Passed 0 [info] ScalaTest [info] Run completed in 27 seconds, 8 milliseconds. [info] Total number of tests run: 151 [info] Suites: completed 23, aborted 0 [info] Tests: succeeded 151, failed 0, canceled 4, ignored 0, pending 0 [info] All tests passed. [info] Passed: Total 189, Failed 0, Errors 0, Passed 189, Canceled 4 [success] Total time: 209 s, completed 2017-12-1 9:38:15 [info] Aggregating coverage from subprojects... [info] Found 1 subproject report files [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml] [info] No subproject data to aggregate, skipping reports [success] Total time: 0 s, completed 2017-12-1 9:38:15 [info] Waiting for measurement data to sync... [info] Reading scoverage instrumentation [/usr/local/mongo-spark/target/scala-2.11/scoverage-data/scoverage.coverage.xml] [info] Reading scoverage measurements... [info] Generating scoverage reports... [info] Written Cobertura report [/usr/local/mongo-spark/target/scala-2.11/coverage-report/cobertura.xml] [info] Written XML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/scoverage.xml] [info] Written HTML coverage report [/usr/local/mongo-spark/target/scala-2.11/scoverage-report/index.html] 17/12/01 09:38:17 INFO MongoClientCache: Closing MongoClient: [localhost:27017] [info] Coverage reports completed [success] Total time: 2 s, completed 2017-12-1 9:38:17 17/12/01 09:38:17 INFO MongoClientCache: Closing MongoClient: [localhost:27017] [root@hadoop1 mongo-spark]#
mongodb/mongo-spark: The MongoDB Spark Connector https://github.com/mongodb/mongo-spark
After the initial ./sbt check run fails,
the console shows a "not authorized" error: the local mongod was started with authentication enabled, so the test suite's database operations are rejected.
Run ps aux | grep mongo to find the PID of the running mongod process.
Kill the mongod instance that requires authentication (it was started as ./mongod --port 27017 --auth) and restart it without --auth as ./mongod --port 27017 (see the command sketch after these steps).
Then re-run ./sbt check.
The check now succeeds; the run pasted above ends with all 151 tests passing.
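
The same fix as a command sketch, assuming the test instance runs locally on the default port and that it is acceptable to disable authentication for it (the <pid> placeholder and the path to the mongod binary depend on your machine):

ps aux | grep mongod          # locate the auth-enabled mongod and note its PID
kill <pid>                    # stop it, substituting the PID found above
./mongod --port 27017         # restart on the same port, this time without --auth
cd /usr/local/mongo-spark     # the connector checkout used in the log above
./sbt check                   # re-run the test suite; it should now pass

Disabling authentication is only reasonable for a throwaway local test instance; on a shared server it would be safer to keep --auth and instead grant the test user the privileges the suite needs on its test database.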