Error when launching the ./spark-shell command
When I ran ./spark-shell, it failed with errors such as:
Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@476fde05, see the next exception for details.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 153 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
--------------------
Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@476fde05, see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
-------------------
Caused by: org.apache.hadoop.ipc.RemoteException: Cannot create directory /tmp/hive/root/5436b1aa-85e3-4512-b505-b0bdc7444e46. Name node is in safe mode.
The reported blocks 0 needs additional 9 blocks to reach the threshold 0.9990 of total blocks 9. The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1327)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3895)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenode
A lot of errors were thrown, and even after quitting spark-shell and launching it again the errors were not the same each time. The last one was "Name node is in safe mode.", so let's deal with that first.
From what I found online, this is the safe mode of the HDFS distributed file system: while safe mode is on, the contents of the file system cannot be modified or deleted until safe mode ends. Safe mode is the phase in which the system checks the validity of the data blocks on each DataNode.
bin/hadoop dfsadmin -safemode leave    // leave safe mode
Safe mode can be controlled with dfsadmin -safemode value; the value argument is described below:
// enter - enter safe mode
// leave - force the NameNode to leave safe mode
[root@node1 sbin]# hdfs dfsadmin -safemode leave
Safe mode is OFF
// get - report whether safe mode is currently on
[root@node1 sbin]# hdfs dfsadmin -safemode get
Safe mode is ON
// wait - block until safe mode ends
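For reference, a minimal sketch of how I would check and clear safe mode in one pass (assuming the Hadoop bin directory is on the PATH):

hdfs dfsadmin -safemode get     # check the current state
hdfs dfsadmin -safemode wait    # block until the NameNode exits safe mode on its own
hdfs dfsadmin -safemode leave   # or force it off immediately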
After leaving safe mode I ran spark-shell again, and it threw another exception:
Caused by: org.apache.derby.iapi.error.StandardException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@476fde05, see the next exception for details.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 153 more
Caused by: org.apache.derby.iapi.error.StandardException: Another instance of Derby may have already booted the database /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
As you can see, the exception says that another Derby database instance has already been booted in /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/. I have always used MySQL and never touched Derby, and then it hit me: when I installed Spark and connected it to Hive, my Hive was not yet configured to use MySQL. Hive falls back to Derby by default, so when Spark connected to Hive it automatically created a Derby instance. I looked inside the 'metastore_db' folder and its contents are indeed Derby database files.
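If you want to confirm this yourself, listing the folder is enough; a Derby database directory typically contains seg0/, log/ and service.properties (just a sketch, the path is the one from my error message):

ls /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db
# typically shows something like: db.lck  dbex.lck  log/  seg0/  service.properties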
Solution:
Delete the /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db folder.
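A minimal sketch of that cleanup (assuming spark-shell was started from the bin/ directory, which is where Derby drops metastore_db and its derby.log):

cd /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/
rm -rf metastore_db
rm -f derby.log    # Derby's log file, created alongside metastore_db; safe to remove as well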
Restart:
./spark-shell
[root@node1 bin]# ./spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 00:13:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 00:13:29 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/07/05 00:13:29 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
17/07/05 00:13:40 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.177.120:4040
Spark context available as 'sc' (master = local[*], app id = local-1499184787668).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.

scala>

scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@178c4480
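As a quick sanity check (just a sketch, not required), you can pipe a statement into the shell to confirm the freshly re-created metastore is usable:

echo 'spark.sql("show databases").show()' | ./spark-shell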
You can also specify the MySQL JDBC connector jar when launching the shell:
[root@master bin]# ./spark-shell --jars usr/local/src/spark-2.0.2-bin-hadoop2.6/examples/jars/mysql-connector-java-5.1.41-bin.jar
This way it can skip looking for the local metastore database.
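For a more permanent setup, the usual approach is to point Spark's embedded Hive at the MySQL metastore by placing a hive-site.xml in the Spark conf directory. A hedged sketch follows; the host master, the database hive_metastore, the user/password and the SPARK_HOME variable are all placeholders, adjust them to your own environment:

# SPARK_HOME is assumed to point at your Spark installation directory
cat > $SPARK_HOME/conf/hive-site.xml <<'EOF'
<configuration>
  <!-- placeholders: point these at your own MySQL instance -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://master:3306/hive_metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive_password</value>
  </property>
</configuration>
EOF
# then start the shell with the MySQL driver on the classpath, as above:
# ./spark-shell --jars /path/to/mysql-connector-java-5.1.41-bin.jar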
########## Today's misery is so that I won't stay this miserable forever! ##########