Abstract: scala> val spark = new org.apache.spark.sql.SQLContext(sc); user.json: {"age":"45","gender":"M","occupation":"7","userID":"4","zipcode":"02460"} {"age":"1", … Read more (see the sketch after this entry)
posted @ 2017-12-05 15:49 信方
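For context, a minimal sketch of what the abstract above walks through, assuming a Spark 1.x shell where sc is the SparkContext and user.json holds one JSON object per line as in the snippet; the registerTempTable call and the final query are illustrative additions, not taken from the post:

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
scala> val users = sqlContext.read.json("user.json")   // Spark infers the schema: age, gender, occupation, userID, zipcode
scala> users.registerTempTable("users")                // expose the DataFrame to SQL (Spark 1.x API)
scala> sqlContext.sql("SELECT userID, age FROM users WHERE gender = 'M'").show()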
Abstract: 1. Copy hive-site.xml into spark/conf, and copy mysql-connector-java-xxx-bin.jar into hive/lib. 2. Start the Hive metastore service: hive --service metastore 3. Start the Hadoop services: sh $HADOOP_HOME/sbin/st… Read more (see the sketch after this entry)
posted @ 2017-12-05 11:10 信方
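As a rough illustration of what those setup steps enable, a sketch of querying Hive tables from the Spark shell once hive-site.xml sits in spark/conf, the MySQL connector jar is on the classpath, and the metastore and Hadoop services are running; the table name below is a hypothetical placeholder, not from the post:

scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
scala> hiveContext.sql("SHOW TABLES").show()                        // lists tables registered in the shared Hive metastore
scala> hiveContext.sql("SELECT * FROM some_table LIMIT 10").show()  // some_table is a placeholder table name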