Summary:
scala> val spark = new org.apache.spark.sql.SQLContext(sc)
user.json: {"age":"45","gender":"M","occupation":"7","userID":"4","zipcode":"02460"} {"age":"1", … (read more)
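A minimal sketch of how that SQLContext might be used to load the user.json file shown above; the file path, the one-JSON-object-per-line layout, and the columns selected are assumptions, not taken from the full post:

scala> val spark = new org.apache.spark.sql.SQLContext(sc)
scala> val users = spark.read.json("user.json")            // assumed path, relative to the shell's working directory; one JSON record per line
scala> users.printSchema()                                 // expected fields from the sample record: age, gender, occupation, userID, zipcode
scala> users.select("userID", "age", "occupation").show()  // quick check of the loaded data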
Summary:
1. Copy hive-site.xml to spark/conf, and copy mysql-connector-java-xxx-bin.jar to hive/lib.
2. Start the Hive metastore service: hive --service metastore
3. Start the Hadoop services: sh $HADOOP_HOME/sbin/st… (read more)
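A hedged sketch of what these setup steps enable: once hive-site.xml is visible to Spark and the metastore service is running, a HiveContext created in spark-shell can query Hive tables directly. The table name below is a placeholder, not from the original post:

scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)  // picks up hive-site.xml from spark/conf
scala> hiveContext.sql("show tables").show()                            // lists tables registered in the Hive metastore
scala> hiveContext.sql("select * from user_table limit 10").show()      // user_table is a placeholder table name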