Using SparkSQL with Hive in Java

Prerequisite for this article: SparkSQL in Java

1. Add the POM dependencies

Note that the Scala suffix of spark-hive (_2.13) must match the suffix of your other Spark dependencies.

        <dependency>
            <groupId>com.mysql</groupId>
            <artifactId>mysql-connector-j</artifactId>
            <version>8.0.33</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.13</artifactId>
            <version>3.5.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>3.1.3</version>
        </dependency>

2. Place Hive's configuration file hive-site.xml in the project's resources directory [omitted]

Note: make sure the configuration file contains the parameters required to access the metastore service:

    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://192.168.58.130:9083</value>
    </property>
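If you prefer not to ship a hive-site.xml with the project, the metastore address can also be passed programmatically when building the session. A minimal sketch, assuming the same thrift address as the example above (this requires a reachable metastore to actually run):

```java
import org.apache.spark.sql.SparkSession;

public class MetastoreConfigExample {
    public static void main(String[] args) {
        // Alternative to hive-site.xml: set hive.metastore.uris directly
        // on the builder. The thrift address is the example value above.
        SparkSession spark = SparkSession
                .builder()
                .master("local[*]")
                .appName("sparkSql")
                .config("hive.metastore.uris", "thrift://192.168.58.130:9083")
                .enableHiveSupport()
                .getOrCreate();

        spark.sql("show databases").show();
        spark.close();
    }
}
```

Settings passed via `config(...)` take effect only when the session is first created; an existing session returned by `getOrCreate()` keeps its original configuration.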

3. Usage

package cn.coreqi;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.*;

public class Main {
    public static void main(String[] args) {
        // Create the SparkConf object
        SparkConf sparkConf = new SparkConf()
                .setMaster("local[*]")
                .setAppName("sparkSql");

        SparkSession spark = SparkSession
                .builder()
                .enableHiveSupport()    // Enable Hive support
                .config(sparkConf)
                .getOrCreate();

        spark.sql("show tables").show();

        // Close the session
        spark.close();
    }
}
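With Hive support enabled, the same session can run full Hive DDL and DML, not just `show tables`. A hedged sketch under the same setup (the table name `coreqi_demo` is invented for illustration, and running it requires a live metastore):

```java
package cn.coreqi;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.*;

public class HiveWriteExample {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                .setMaster("local[*]")
                .setAppName("sparkSqlHiveWrite");

        SparkSession spark = SparkSession
                .builder()
                .enableHiveSupport()
                .config(sparkConf)
                .getOrCreate();

        // Create a Hive table, insert a row, and read it back.
        // The table name "coreqi_demo" is made up for this sketch.
        spark.sql("CREATE TABLE IF NOT EXISTS coreqi_demo (id INT, name STRING)");
        spark.sql("INSERT INTO coreqi_demo VALUES (1, 'coreqi')");
        spark.sql("SELECT * FROM coreqi_demo").show();

        spark.close();
    }
}
```

Results of `spark.sql(...)` are ordinary `Dataset<Row>` objects, so they can also be processed with the DataFrame API instead of `show()`.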
posted @ 2024-01-15 12:21  SpringCore