Kafka Connect: a custom Sink Connector that adds a timestamp column and soft-delete support during data synchronization

1. Scenario

With Debezium and Kafka it is easy to synchronize data between two database tables.

However, the client now has two requirements:

First, when data is synchronized from the source table to the target table, the target table gets an extra timestamp column (the usual last_update_time). For both insert and update operations the current system time is written into this column, so it records when the data last changed.

Second, when the source table performs a delete, the target table must not execute a delete. Instead it gets an extra delete-flag column (the usual soft delete) and executes an update that marks the row as deleted, e.g. is_deleted=true.

2. Analysis

Normally, when synchronizing data, the source table and the target table have the same structure (even if the two databases are of different types).

Here we assume the two structures are identical; otherwise there is little point in synchronizing the tables at all.

Under normal circumstances, Debezium plus Kafka is all we need to synchronize the two tables.

However, Debezium itself does not provide these two features (timestamp: a timestamp field would have to be added to the after structure; delete flag: the delete would have to be turned into an update), so we have to build them ourselves.

We do that by developing a custom Sink Connector.

3. Preparation

The source and target tables originally have the same structure; because of the client's requirements, the target table may now have one extra timestamp column and one extra delete-flag column compared with the source table.

Adding columns to the target table

The timestamp column and the delete-flag column have to be added to the target table manually.

In my view, schema changes should not be handled by the Sink Connector: it should focus on data changes only, not on keeping the two tables' structures in sync. A plain DDL statement, as shown below, is enough.
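
For example, on a MySQL target the two columns could be added with DDL along these lines (table and column names are placeholders; use whatever names you later pass to the connector's sync.field.timestamp and sync.field.delete options):

-- placeholder names; align them with your connector configuration
ALTER TABLE t_target_table
    ADD COLUMN SYN_LAST_UPDATE_TIME datetime NULL COMMENT 'time of the last synchronized change',
    ADD COLUMN SYN_IS_DELETED varchar(10) NULL COMMENT 'soft-delete flag, e.g. true';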

Creating the Maven project

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>code.marydon.kafka</groupId>
    <artifactId>marydon-connector-jdbc</artifactId>
    <version>1.1.Final</version>
    <name>kafka-connect-jdbc</name>
    <description>marydon-cdc-jdbc</description>
    <!-- packaging type -->
    <packaging>jar</packaging>
    <properties>
        <!-- JDK version used for compilation -->
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <!-- Maven plugin versions -->
        <maven.compiler.plugin.version>3.8.1</maven.compiler.plugin.version>
        <maven.source.plugin.version>3.8.1</maven.source.plugin.version>
        <!-- build encoding -->
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <!--<version.debezium>2.3.0.Final</version.debezium>-->
        <!-- must match the Kafka version you are running -->
        <version.kafka>3.5.1</version.kafka>
        <version.debezium>2.5.0.Final</version.debezium>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>connect-api</artifactId>
            <version>${version.kafka}</version>
            <!-- provided scope: needed at compile time only, supplied by the Connect runtime -->
            <scope>provided</scope>
            <exclusions>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <!-- HikariCP connection pool (HikariDataSource is also Spring Boot's default pool) -->
        <dependency>
            <groupId>com.zaxxer</groupId>
            <artifactId>HikariCP</artifactId>
            <version>4.0.3</version>
        </dependency>
        <!-- MySQL driver -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.28</version>
        </dependency>
        <!-- Oracle driver -->
        <dependency>
            <groupId>com.oracle.database.jdbc</groupId>
            <artifactId>ojdbc8</artifactId>
            <version>19.7.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.18.28</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.12.0</version>
        </dependency>
    </dependencies>
    <build>
        <finalName>${project.artifactId}-${project.version}</finalName><!-- name of the packaged artifact -->
        <plugins>
            <!-- compiler plugin -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>${maven.compiler.plugin.version}</version>
                <configuration>
                    <source>${maven.compiler.source}</source>
                    <target>${maven.compiler.target}</target>
                    <encoding>${project.build.sourceEncoding}</encoding>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <executions>
                    <execution>
                        <id>copy-dependencies</id>
                        <phase>package</phase>
                        <goals>
                            <goal>copy-dependencies</goal>
                        </goals>
                        <configuration>
                            <!-- copy dependency jars into the lib directory of the build output -->
                            <outputDirectory>${project.build.directory}/lib</outputDirectory>
                            <!-- also copy transitive dependencies -->
                            <excludeTransitive>false</excludeTransitive>
                            <!-- keep version numbers in the copied jar file names -->
                            <stripVersion>false</stripVersion>
                            <!-- scope to exclude (these jars are not copied) -->
                            <excludeScope>provided</excludeScope>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <!-- assembly plugin (builds a jar with dependencies) -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

4. Solution

The example below uses MySQL as the target database.

Project layout:

JdbcSinkConfig.java

import code.marydon.utils.TimeZoneValidator;
import org.apache.kafka.common.config.AbstractConfig;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.common.config.ConfigDef.Importance;
import org.apache.kafka.common.config.ConfigDef.NonEmptyString;
import org.apache.kafka.common.config.ConfigDef.Type;
import org.apache.kafka.common.config.ConfigDef.Width;

import java.util.Map;
/**
 * Sink Connector configuration.
 * @description: modeled on the configuration of io.confluent.connect.jdbc.JdbcSinkConnector;
 * io.confluent.connect.jdbc.JdbcSinkConnector configuration reference:
 * https://docs.confluent.io/kafka-connectors/jdbc/current/sink-connector/sink_config_options.html
 * @author: Marydon
 * @date: 2023-12-06 11:18
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
public class JdbcSinkConfig extends AbstractConfig {
    public static final String CONNECTION_URL = "connection.url";// required
    public static final String CONNECTION_USER = "connection.user";// required
    public static final String CONNECTION_PASSWORD = "connection.password";// required
    public static final String CONNECTION_CLASS = "connection.class";// required
    public static final String CONNECTION_DRIVER_CLASS_NAME = "connection.driveClassName";// custom parameter
    public static final String TOPICS_REGEX = "topics.regex";// required
    public static final String TABLE_NAME_FORMAT = "table.name.format";// optional
    private static final String TABLE_NAME_FORMAT_DEFAULT = "${topic}";
    public static final String DELETE_ENABLED = "delete.enabled";// not actually used
    private static final String DELETE_ENABLED_DEFAULT = "false";
    public static final String AUTO_CREATE = "auto.create";// not actually used
    private static final String AUTO_CREATE_DEFAULT = "false";
    public static final String AUTO_EVOLVE = "auto.evolve";// not actually used
    private static final String AUTO_EVOLVE_DEFAULT = "false";
    public static final String INSERT_MODE = "insert.mode";// optional
    public static final String INSERT_MODE_DEFAULT = "upsert";
    public static final String PK_FIELDS = "pk.fields";// required (updates and deletes are driven by the primary key)
    public static final String PK_MODE = "pk.mode";// not actually used
    public static final String FIELDS_WHITELIST = "fields.whitelist";// unused
    public static final String DB_TIMEZONE_CONFIG = "db.timezone";// unused
    public static final String DB_TIMEZONE_DEFAULT = "UTC";
    public static final String TASKS_MAX = "tasks.max";// optional
    public static final String COLUMNS_DATE = "columns.date";// custom parameter
    public static final String COLUMNS_MAP = "columns.map";// custom parameter
    public static final String SYNC_FIELD_TIMESTAMP = "sync.field.timestamp";// custom parameter
    public static final String SYNC_FIELD_DELETED = "sync.field.delete";// custom parameter
    public static final String SYNC_FIELD_DELETED_MARK = "sync.field.delete.mark";// custom parameter
    public static final String LOG_LEVEL = "log.level";// custom parameter
    public static final String LOG_LEVEL_DEFAULT = "info";
    public static final String TIMESTAMP_HOURS_DIFFERENCE = "timestamp.hours.difference";// custom parameter

    public static final ConfigDef CONFIG_DEFINITION;

    static {
        CONFIG_DEFINITION = new ConfigDef()
                .define(CONNECTION_CLASS, Type.STRING, null, Importance.HIGH, "JDBC connection class.", "Connection", 1, Width.MEDIUM, "JDBC Class")
                .define(CONNECTION_URL, Type.STRING, ConfigDef.NO_DEFAULT_VALUE, Importance.HIGH, "JDBC connection URL.\nFor example: ``jdbc:oracle:thin:@localhost:1521:orclpdb1``, ``jdbc:mysql://localhost/db_name``, ``jdbc:sqlserver://localhost;instance=SQLEXPRESS;databaseName=db_name``", "Connection", 1, Width.LONG, "JDBC URL")
                .define(CONNECTION_USER, Type.STRING, null, Importance.HIGH, "JDBC connection user.", "Connection", 2, Width.MEDIUM, "JDBC User")
                .define(CONNECTION_PASSWORD, Type.PASSWORD, null, Importance.HIGH, "JDBC connection password.", "Connection", 3, Width.MEDIUM, "JDBC Password")
                .define(CONNECTION_DRIVER_CLASS_NAME, Type.STRING, null, Importance.HIGH, "JDBC connection  driver class.", "Connection", 4, Width.MEDIUM, "JDBC Driver Class")
                .define(INSERT_MODE, Type.STRING, INSERT_MODE_DEFAULT, null, Importance.HIGH, "The insertion mode to use. Supported modes are:\n``insert``\n    Use standard SQL ``INSERT`` statements.\n``upsert``\n    Use the appropriate upsert semantics for the target database if it is supported by the connector, e.g. ``INSERT OR IGNORE``.\n``update``\n    Use the appropriate update semantics for the target database if it is supported by the connector, e.g. ``UPDATE``.", "Writes", 1, Width.MEDIUM, "Insert Mode")
                .define(DELETE_ENABLED, Type.BOOLEAN, DELETE_ENABLED_DEFAULT, Importance.MEDIUM, "Whether to treat ``null`` record values as deletes. Requires ``pk.mode`` to be ``record_key``.", "Writes", 3, Width.SHORT, "Enable deletes")
                .define(TABLE_NAME_FORMAT, Type.STRING, TABLE_NAME_FORMAT_DEFAULT, new NonEmptyString(), Importance.MEDIUM, "A format string for the destination table name, which may contain '${topic}' as a placeholder for the originating topic name.\nFor example, ``kafka_${topic}`` for the topic 'orders' will map to the table name 'kafka_orders'.", "Data Mapping", 1, Width.LONG, "Table Name Format")
                .define(PK_MODE, Type.STRING, "none", null, Importance.HIGH, "The primary key mode, also refer to ``pk.fields`` documentation for interplay. Supported modes are:\n``none``\n    No keys utilized.\n``kafka``\n    Kafka coordinates are used as the PK.\n``record_key``\n    Field(s) from the record key are used, which may be a primitive or a struct.\n``record_value``\n    Field(s) from the record value are used, which must be a struct.", "Data Mapping", 2, Width.MEDIUM, "Primary Key Mode")
                .define(PK_FIELDS, Type.LIST, "", Importance.MEDIUM, "List of comma-separated Source Table primary key field names.\nFor example:pk_column1,pk_column2", "Data Mapping", 3, Width.LONG, "Source Table Primary Key Fields")
                .define(FIELDS_WHITELIST, Type.LIST, "", Importance.MEDIUM, "List of comma-separated record value field names. If empty, all fields from the record value are utilized, otherwise used to filter to the desired fields.\nNote that ``pk.fields`` is applied independently in the context of which field(s) form the primary key columns in the destination database, while this configuration is applicable for the other columns.", "Data Mapping", 4, Width.LONG, "Fields Whitelist")
                .define(DB_TIMEZONE_CONFIG, Type.STRING, DB_TIMEZONE_DEFAULT, TimeZoneValidator.INSTANCE, Importance.MEDIUM, "Name of the JDBC timezone that should be used in the connector when inserting time-based values. Defaults to UTC.", "Data Mapping", 5, Width.MEDIUM, "DB Time Zone")
                .define(AUTO_CREATE, Type.BOOLEAN, AUTO_CREATE_DEFAULT, Importance.MEDIUM, "Whether to automatically create the destination table based on record schema if it is found to be missing by issuing ``CREATE``.", "DDL Support", 1, Width.SHORT, "Auto-Create")
                .define(AUTO_EVOLVE, Type.BOOLEAN, AUTO_EVOLVE_DEFAULT, Importance.MEDIUM, "Whether to automatically add columns in the table schema when found to be missing relative to the record schema by issuing ``ALTER``.", "DDL Support", 2, Width.SHORT, "Auto-Evolve")
                .define(TASKS_MAX, Type.INT, 1, Importance.MEDIUM, "max tasks", "Connection", 5, Width.SHORT, "tasks")
                .define(TOPICS_REGEX, Type.STRING, "", null, Importance.HIGH, "Subscribe topics from kafka", "Data Mapping", 6, Width.MEDIUM, "Subscribe Topics")
                .define(COLUMNS_DATE, Type.STRING, "", null, Importance.HIGH, "date columns in Target Table.\nFor example:date_column1:date_type1,date_column2:date_type2", "Data Mapping", 6, Width.MEDIUM, "date columns")
                .define(COLUMNS_MAP, Type.STRING, "", null, Importance.HIGH, "columns map between Source Table and Target Table.\nFor example:source_table_column1:target_table_column1,source_table_column2:target_table_column2", "Data Mapping", 6, Width.MEDIUM, "columns map")
                .define(SYNC_FIELD_TIMESTAMP, Type.STRING, "", null, Importance.HIGH, "The Target Table synchronize timestamp column name.\nFor example:SYN_LAST_UPDATE_TIME", "Data Mapping", 6, Width.MEDIUM, "synchronize timestamp column name")
                .define(SYNC_FIELD_DELETED, Type.STRING, "", null, Importance.HIGH, "The Target Table synchronize is deleted mark column name.\nFor example:SYN_IS_DELETED", "Data Mapping", 6, Width.MEDIUM, "synchronize delete column name")
                .define(SYNC_FIELD_DELETED_MARK, Type.STRING, "", null, Importance.HIGH, "The Target Table synchronize is deleted mark column value.\nFor example:true or false", "Data Mapping", 6, Width.MEDIUM, "synchronize delete column value")
                .define(LOG_LEVEL, Type.STRING, LOG_LEVEL_DEFAULT, null, Importance.LOW, "Kafka Connect Console log output level", "Log Mapping", 7, Width.MEDIUM, "log level")
                .define(TIMESTAMP_HOURS_DIFFERENCE, Type.STRING, "", null, Importance.MEDIUM, "After debezium transform timestamp field, the difference hours for the real time", "Data Mapping", 6, Width.MEDIUM, "the timestamp fields difference")
        ;
    }

    public JdbcSinkConfig(Map<?, ?> props) {
        super(CONFIG_DEFINITION, props);
    }

    public static void main(String... args) {
        System.out.println(CONFIG_DEFINITION.toEnrichedRst());
    }
}

Note:

TimeZoneValidator.java is not shown here; it does very little, and the validator argument can simply be replaced with null.
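
If you would rather keep the validator, a minimal sketch is shown below. This is only an assumption on my part (the original class is not published): it merely checks that the configured value is a known time zone ID, relying on Kafka's ConfigDef.Validator contract, and it belongs in the code.marydon.utils package referenced by the import above.

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.common.config.ConfigException;

import java.util.Arrays;
import java.util.TimeZone;

// Minimal sketch only -- not the original implementation.
public class TimeZoneValidator implements ConfigDef.Validator {
    public static final TimeZoneValidator INSTANCE = new TimeZoneValidator();

    @Override
    public void ensureValid(String name, Object value) {
        // null falls back to the default (UTC); anything else must be a known time zone ID
        if (value != null && !Arrays.asList(TimeZone.getAvailableIDs()).contains(value.toString())) {
            throw new ConfigException(name, value, "Invalid time zone identifier");
        }
    }
}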

JdbcSinkConnector.java

import code.marydon.configs.JdbcSinkConfig;
import code.marydon.tasks.MysqlSinkTask;
import code.marydon.tasks.OracleSinkTask;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.errors.ConnectException;
import org.apache.kafka.connect.sink.SinkConnector;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
 
/**
 * The sink connector: reads the configuration and performs initialization work such as assigning task configs.
 * @description:
 * @author: Marydon
 * @date: 2023-12-06 10:56
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
@Slf4j
public class JdbcSinkConnector extends SinkConnector {
    private Map<String, String> configProperties;

    // initialization
    @Override
    public void start(Map<String, String> props) {
        this.configProperties = Map.copyOf(props);

        String topics = props.get(JdbcSinkConfig.TOPICS_REGEX);
        if (StringUtils.isEmpty(topics)) {
            throw new ConnectException("JdbcSinkConnector configuration must include '" + JdbcSinkConfig.TOPICS_REGEX + "' setting");
        }
    }

    // specify which Task class to run
    @Override
    public Class<? extends Task> taskClass() {
        String driveClassName = configProperties.get(JdbcSinkConfig.CONNECTION_DRIVER_CLASS_NAME);

        // return JdbcSinkTask.class;
        if (driveClassName != null && driveClassName.contains("mysql"))// target database is MySQL
            return MysqlSinkTask.class;
        else// target database is Oracle
            return OracleSinkTask.class;
    }

    // per-task configuration
    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        log.info("Setting task configurations for {} workers.", maxTasks);
        List<Map<String, String>> configs = new ArrayList<>(maxTasks);

        for(int i = 0; i < maxTasks; i++) {
            configs.add(this.configProperties);
        }

        return configs;
    }

    @Override
    public void stop() {

    }

    // configuration definition
    @Override
    public ConfigDef config() {
        // return the config definition; Connect uses it to validate the connector configuration
        return JdbcSinkConfig.CONFIG_DEFINITION;
    }

    @Override
    public String version() {
        // return AppInfoParser.getVersion();
        return "1.1.Final";
    }
}
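
For reference, once the packaged jar (together with the copied lib directory) is available to the Connect worker, a connector instance could be registered with a configuration along the following lines. This is only a sketch: the connection details, topic regex, primary key and column mappings are placeholders, while the property names come from JdbcSinkConfig above.

# hypothetical example -- adjust every value to your environment
name=marydon-jdbc-sink
connector.class=code.marydon.connectors.JdbcSinkConnector
tasks.max=1
# Debezium topics to consume (placeholder regex)
topics.regex=topic-medi_data_cent-70.*
connection.url=jdbc:mysql://192.168.0.1:3306/test
connection.user=root
connection.password=******
connection.driveClassName=com.mysql.cj.jdbc.Driver
# empty means: use the same table name as the source table
table.name.format=
insert.mode=upsert
# primary key column(s) of the source table (placeholder)
pk.fields=id
# source_column:target_column pairs (placeholder)
columns.map=id:id,create_time:create_time
# target-table date columns and their types (placeholder)
columns.date=create_time:datetime
sync.field.timestamp=SYN_LAST_UPDATE_TIME
sync.field.delete=SYN_IS_DELETED
sync.field.delete.mark=true
log.level=info

Note that the task reads the raw Debezium envelope (before, after, source, op), so the ExtractNewRecordState transform should not be applied to these topics.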

DynamicConnection.java

import lombok.extern.slf4j.Slf4j;

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.sql.DataSource;
 
/**
 * Dynamic database connection.
 * @description: dynamic SQL execution helper
 * @author: Marydon
 * @date: 2023-12-06 16:06
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
@Slf4j
public class DynamicConnection {
    private DataSource ds;

    public DynamicConnection(DataSource ds) {
        this.ds = ds;
    }

    public List<Map<String, Object>> queryForList(String sql) throws SQLException {
        Connection conn = null;
        Statement statement = null;
        ResultSet rs = null;
        try {
            conn = this.ds.getConnection();
            statement = conn.createStatement();
            rs = statement.executeQuery(sql);
            return this.rsToList(rs);
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            closeDB(conn, statement, rs, null);
        }
    }

    public List<Map<String, Object>> queryForList(String sql, Object[] params) throws SQLException {
        Connection conn = null;
        PreparedStatement ps = null;
        ResultSet rs = null;

        try {
            conn = this.ds.getConnection();
            ps = conn.prepareStatement(sql);

            for(int i = 0; i < params.length; ++i) {
                ps.setObject(i + 1, params[i]);
            }

            rs = ps.executeQuery();
            return this.rsToList(rs);
        } finally {
            closeDB(conn, ps, rs, null);
        }
    }

    private List<Map<String, Object>> rsToList(ResultSet rs) throws SQLException {
        List<Map<String, Object>> list = new ArrayList<>();
        ResultSetMetaData rsmd = rs.getMetaData();
        int rowCount = rsmd.getColumnCount();
        Map<String, Object> map;

        while(rs.next()) {
            map = new HashMap<>(rowCount);

            for(int i = 1; i <= rowCount; ++i) {
                map.put(rsmd.getColumnName(i), rs.getObject(i));
            }

            list.add(map);
        }

        return list;
    }

    public boolean insert(String sql) throws SQLException {
        Connection conn = null;
        Statement statement = null;
        try {
            conn = this.ds.getConnection();
            statement = conn.createStatement();
            return statement.execute(sql);
        } finally {
            closeDB(conn, statement, null, null);
        }
    }

    public boolean insert(String sql, Object[] params) throws SQLException {
        Connection conn = null;
        PreparedStatement ps = null;
        try {
            conn = this.ds.getConnection();
            ps = conn.prepareStatement(sql);

            for(int i = 0; i < params.length; ++i) {
                ps.setObject(i + 1, params[i]);
            }

            return ps.execute();
        } finally {
            closeDB(conn, ps, null, null);
        }
    }

    public boolean executeBatch(String sql, List<Object[]> list) throws SQLException {
        Connection conn = null;
        PreparedStatement ps = null;
        try {
            conn = this.ds.getConnection();
            conn.setAutoCommit(false);
            ps = conn.prepareStatement(sql);

            for(int i = 0; i < list.size(); ++i) {
                Object[] params = list.get(i);

                for(int j = 0; j < params.length; ++j) {
                    ps.setObject(j + 1, params[j]);
                }

                ps.addBatch();
                if (i % 300 == 0) {
                    ps.executeBatch();
                    ps.clearBatch();
                }
            }

            ps.executeBatch();
            ps.clearBatch();
            conn.commit();
            ps.close();
            return true;
        } catch (Exception e) {
            log.error("batch execution failed, rolling back", e);
            if (conn != null) {
                conn.rollback();
            }
            return false;
        } finally {
            closeDB(conn, ps, null, null);
        }

    }

    public int update(String sql) throws SQLException {
        Connection conn = null;
        Statement statement = null;
        try {
            conn = this.ds.getConnection();
            statement = conn.createStatement();
            return statement.executeUpdate(sql);
        } finally {
            closeDB(conn, statement, null, null);
        }
    }

    public int update(String sql, Object[] params) throws SQLException {
        Connection conn = null;
        PreparedStatement ps = null;
        try {
            conn = this.ds.getConnection();
            ps = conn.prepareStatement(sql);

            for (int i = 0; i < params.length; ++i) {
                ps.setObject(i + 1, params[i]);
            }

            return ps.executeUpdate();
        } finally {
            closeDB(conn, ps, null, null);
        }
    }

    public boolean execProcedure(String sql) throws SQLException {
        return this.execProcedure(sql, new Object[0]);
    }

    public boolean execProcedure(String sql, Object[] params) throws SQLException {
        Connection conn = null;
        CallableStatement cs = null;

        try {
            conn = this.ds.getConnection();
            cs = conn.prepareCall(sql);

            for(int i = 0; i < params.length; ++i) {
                cs.setObject(i + 1, params[i]);
            }

            return cs.execute();
        } finally {
            closeDB(conn, null, null, cs);
        }
    }

    public List<Map<String, Object>> queryProcedure(String sql) throws SQLException {
        return this.queryProcedure(sql, new Object[0]);
    }

    public List<Map<String, Object>> queryProcedure(String sql, Object[] params) throws SQLException {
        Connection conn = null;
        CallableStatement cs = null;
        ResultSet rs = null;

        try {
            conn = this.ds.getConnection();
            log.debug("connection acquired");
            cs = conn.prepareCall(sql);
            if (params != null) {
                for(int i = 0; i < params.length; ++i) {
                    cs.setObject(i + 1, params[i]);
                }
            }

            rs = cs.executeQuery();
            log.debug("statement executed, sql: " + sql);
            return this.rsToList(rs);
        } finally {
            log.debug("releasing database resources back to the pool");
            closeDB(conn, null, rs, cs);
        }
    }

    public List<Map<String, Object>> queryProcedureMoreRes(String sql) throws SQLException {
        return this.queryProcedure(sql);
    }

    public List<String> getColumnNames(String sql) throws SQLException {
        Connection conn = null;
        PreparedStatement ps = null;
        ResultSet rs = null;
        List<String> columns = new ArrayList<>();

        try {
            conn = this.ds.getConnection();
            ps = conn.prepareStatement(sql);
            rs = ps.executeQuery();
            ResultSetMetaData rsmd = rs.getMetaData();

            for(int i = 1; i <= rsmd.getColumnCount(); ++i) {
                columns.add(rsmd.getColumnName(i));
            }

            return columns;
        } finally {
            closeDB(conn, ps, rs, null);
        }
    }

    /**
     * Close the database resources.
     */
    public void closeDB(Connection conn, Statement ps, ResultSet rs, CallableStatement cs) {
        try {
            if (null != ps && !ps.isClosed()) {
                ps.close();
            }
            if (null != rs && !rs.isClosed()) {
                rs.close();
            }
            if (null != cs && !cs.isClosed()) {
                cs.close();
            }

            if (null != conn && !conn.isClosed()) {
                conn.close();
                log.debug("connection closed");
            }
        } catch (SQLException e) {
            log.error("failed to close database resources", e);
        }
    }
}

DynamicDataSource.java

import com.zaxxer.hikari.HikariDataSource;
import lombok.extern.slf4j.Slf4j;

import java.util.List;
import java.util.concurrent.*;
import javax.sql.DataSource;
/**
 * Dynamically configured data sources.
 * @description:
 * @author: Marydon
 * @date: 2023-12-06 16:03
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
@Slf4j
public class DynamicDataSource {
    private static final ConcurrentHashMap<String, DataSource> dataSources = new ConcurrentHashMap<>();
    private static volatile DynamicDataSource instance = null;
    private static final Object lock = new Object();
    static int index = 1;

    private DynamicDataSource() {
    }

    private static DynamicDataSource getInstance() {
        if (instance == null) {
            synchronized(DynamicDataSource.class) {
                if (instance == null) {
                    instance = new DynamicDataSource();
                }
            }
        }

        return instance;
    }

    private DataSource getDataSource(String url, String userName, String pwd, String driveClassName, int maxActive) {
        String dataSourceString = url + userName + pwd + driveClassName;
        String hashCode = "ds" + dataSourceString.hashCode();
        if (!dataSources.containsKey(hashCode)) {
            synchronized(lock) {
                if (!dataSources.containsKey(hashCode)) {
                    HikariDataSource ds = new HikariDataSource();
                    ds.setJdbcUrl(url);
                    ds.setUsername(userName);
                    ds.setPassword(pwd);
                    ds.setDriverClassName(driveClassName);
                    ds.setMaximumPoolSize(maxActive);
                    dataSources.put(hashCode, ds);
                    log.debug("DataSource Key:" + hashCode);
                }
            }
        }

        return dataSources.get(hashCode);
    }

    public static DynamicConnection getDynamicConnection(String url, String userName, String pwd, String driveClassName) {
        return getDynamicConnection(url, userName, pwd, driveClassName, 50);
    }

    public static DynamicConnection getDynamicConnection(String url, String userName, String pwd, String driveClassName, int maxActive) {
        DynamicDataSource dynamicDataSource = getInstance();
        DataSource ds = dynamicDataSource.getDataSource(url, userName, pwd, driveClassName, maxActive);
        return new DynamicConnection(ds);
    }

    public static void main(String[] args) {
        ExecutorService executorService = Executors.newCachedThreadPool();
        ExecutorCompletionService completionService = new ExecutorCompletionService(executorService);

        for(int i = 0; i < 100; ++i) {
            completionService.submit(() ->{
                DynamicConnection conn = code.marydon.db.DynamicDataSource.getDynamicConnection("jdbc:mysql://192.168.0.1:3306/test60", "root", "test", "com.mysql.cj.jdbc.Driver");
                List result = conn.queryForList("select * from base_ac_user limit 1,5");
                Integer[] params = new Integer[]{1, 7};
                DynamicConnection conn1 = code.marydon.db.DynamicDataSource.getDynamicConnection("jdbc:mysql://192.168.0.1:3306/test", "root", "test", "com.mysql.cj.jdbc.Driver");
                List result1 = conn1.queryForList("select * from base_ac_user limit ?,?", params);
                System.out.println(code.marydon.db.DynamicDataSource.index++ + ":" + result.size() + "," + result1.size());
                return null;
            });
        }

    }

}

Operation.java

/**
 * Enum of database operation keywords.
 * @description:
 * op: the type of the current change event; c = insert, u = update, d = delete, r = snapshot read.
 * @author: Marydon
 * @date: 2023-12-06 17:37
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
public enum Operation {
    // read (r): snapshot read; the payload only contains the after image, before is null
    READ("r"),
    // create (c): insert; the payload only contains the after image, before is null
    CREATE("c"),
    // update (u): the payload contains both the before image (row before the update) and the after image (row after the update)
    UPDATE("u"),
    // delete (d): the payload only contains the before image, after is null
    DELETE("d"),
    // truncate (t): table truncated
    TRUNCATE("t"),

    MESSAGE("m");
    private final String code;

    Operation(String code) {
        this.code = code;
    }

    public String code() {
        return this.code;
    }
}

MysqlSinkTask.java

import code.marydon.configs.JdbcSinkConfig;
import code.marydon.connectors.JdbcSinkConnector;
import code.marydon.db.DynamicConnection;
import code.marydon.db.DynamicDataSource;
import code.marydon.enums.Operation;
import code.marydon.utils.DateUtils;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.connect.data.Field;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
/**
 * Receives the records subscribed from Kafka and synchronizes them into the MySQL target database.
 * @description: the target is a MySQL database
 * @author: Marydon
 * @date: 2023-12-27 11:04
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
@Slf4j
public class MysqlSinkTask extends SinkTask {
    private DynamicConnection connection;
    // name of the table to synchronize
    private String tableName = "";
    // primary key fields
    private String primaryFields = "";
    // column mapping between the target table and the source table
    private String[] targetColumnsMapArray = new String[0];
    // date columns of the target table
    private String[] targetDateColumnsArray = new String[0];
    // name of the timestamp column in the target table
    private String timestampField = "";
    // name of the delete-flag column in the target table
    private String deleteMarkField = "";
    // value written into the delete-flag column
    private String deleteMarkFieldValue = "";
    // insert mode
    private String insertMode = "";
    // log level
    private String logLevel = "";
    // hour offset between the timestamp value produced by Debezium and the real database timestamp
    private Integer timestampDifference = 0;

    private static final String DATE_TIME_FORMAT = "%Y-%m-%d %H:%i:%s";
    private static final String DATE_FORMAT = "%Y-%m-%d";

    @Override
    public String version() {
        return new JdbcSinkConnector().version();
    }

    // task startup
    @Override
    public void start(Map<String, String> map) {
        String url = map.get(JdbcSinkConfig.CONNECTION_URL);
        String userName = map.get(JdbcSinkConfig.CONNECTION_USER);
        String password = map.get(JdbcSinkConfig.CONNECTION_PASSWORD);
        String driveClassName = map.get(JdbcSinkConfig.CONNECTION_DRIVER_CLASS_NAME);
        log.info("config:{}", map);
        this.connection = DynamicDataSource.getDynamicConnection(url, userName, password, driveClassName);

        tableName = map.get(JdbcSinkConfig.TABLE_NAME_FORMAT);
        primaryFields = map.get(JdbcSinkConfig.PK_FIELDS);
        // date columns of the target table
        String targetDateColumns = map.get(JdbcSinkConfig.COLUMNS_DATE);
        // mapping between target-table columns and source-table columns
        String targetColumnsMap = map.get(JdbcSinkConfig.COLUMNS_MAP);

        // guard against missing/blank values so the arrays stay empty instead of splitting null or ""
        if (StringUtils.isNotBlank(targetColumnsMap)) targetColumnsMapArray = targetColumnsMap.split(",");
        if (StringUtils.isNotBlank(targetDateColumns)) targetDateColumnsArray = targetDateColumns.split(",");
        timestampField = map.get(JdbcSinkConfig.SYNC_FIELD_TIMESTAMP);
        deleteMarkField = map.get(JdbcSinkConfig.SYNC_FIELD_DELETED);
        deleteMarkFieldValue = map.get(JdbcSinkConfig.SYNC_FIELD_DELETED_MARK);
        insertMode = map.get(JdbcSinkConfig.INSERT_MODE);
        logLevel = map.get(JdbcSinkConfig.LOG_LEVEL);
        String thd = map.get(JdbcSinkConfig.TIMESTAMP_HOURS_DIFFERENCE);
        if (StringUtils.isNotBlank(thd)) {
            timestampDifference = Integer.parseInt(thd);
        }
    }

    // records delivered by the Connect framework
    @Override
    public void put(Collection<SinkRecord> records) {
        try {
            for (SinkRecord record : records) {
                // process one record
                this.pull(record);
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    /**
     * Synchronize one record pulled from Kafka.
     * @explain
     * 1. When synchronizing between MySQL and Oracle, the synchronized columns must not contain time types.
     * 2. The source and target tables must share the same primary key, because updates and deletes on the target are driven by the primary key.
     * 3. The source table's timestamp columns must match the target table's mapped timestamp columns.
     * 4. When synchronizing MySQL to MySQL, time-type columns are supported.
     * 5. When synchronizing MySQL to MySQL, the date types must match exactly (datetime-->datetime, date-->date, time-->time, timestamp-->timestamp).
     * @param record
     * @throws SQLException
     */
    private void pull(SinkRecord record) throws SQLException {
        // the payload consists of before, after, source, op, ts_ms, etc.
        Struct value = (Struct)record.value();
        if (null == value) return;
        // Struct{after=Struct{patient_id=1, zs_id=2, create_time=1702826679000, update_time=19708, time=2023-12-17T21:24:44Z, id=1}, source=Struct{version=2.2.1.Final, connector=mysql,  name=topic-medi_data_cent-70, ts_ms=1703630644000, snapshot=first, db-me_data_cent, table=t_patient_zs, server_id=0, file=binlog.003268, pos=581446144, row=0}, op=r, ts_ms=1703580582732}
        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info("try synchronize data:{}", value);
        }

        // source: metadata about the event source, such as the connector version and transaction info
        Struct source = value.getStruct("source");
        // String sourceDatabaseName = source.getString("db");
        // when table.name.format is "" or null, assume the target table has the same name as the source table
        if (StringUtils.isEmpty(tableName)) tableName = source.getString("table");

        // type of the source database
        String sourceConnectorName = source.getString("connector");
        String operation = value.getString("op");
        // before: row values before the change
        Struct beforeStruct = value.getStruct("before");
        // after: row values after the change
        Struct afterStruct = value.getStruct("after");

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info("operation:{}", operation);
        }

        // READ/CREATE
        // for now both read and create first run a select: if the primary key exists we update, otherwise we insert
        // the two cases can be split later if needed
        if (operation.equals(Operation.READ.code()) || operation.equals(Operation.CREATE.code())) {// r or c
            // query by primary key (beforeStruct is null)
            List<Map<String, Object>> mapList = this.selectOperation(afterStruct);
            if (mapList == null || mapList.isEmpty()) {// no matching row in the target table: insert
                // insert
                this.insertOperation(afterStruct, sourceConnectorName, false);
                return;
            }

            if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {// the key already exists in the target table: update
                // update
                this.updateOperation(afterStruct, sourceConnectorName);
                return;
            }

        }

        // UPDATE
        if (operation.equals(Operation.UPDATE.code())) {// u
            // if no primary key fields are configured, skip update synchronization
            if (StringUtils.isEmpty(primaryFields)) return;

            // run the update
            int updateRows = this.updateOperation(beforeStruct, afterStruct, sourceConnectorName);

            // no row was updated, i.e. the record does not exist, and the insert mode is upsert
            if (updateRows < 1 && JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {
                // the record does not exist: insert it
                // note: Oracle sources must have supplemental logging enabled
                this.insertOperation(afterStruct, sourceConnectorName, false);
            }
            return;
        }

        // DELETE/TRUNCATE
        if (operation.equals(Operation.DELETE.code()) || operation.equals(Operation.TRUNCATE.code())) {// d or t
            // if no primary key fields are configured, skip delete synchronization
            if (StringUtils.isEmpty(primaryFields)) return;

            // perform the delete
            if (StringUtils.isBlank(deleteMarkField)) {
                // no delete-flag column configured: physical delete
                this.physicalDeleteOperation(beforeStruct);
                return;
            }

            // logical (soft) delete
            int updateRows = this.logicalDeleteOperation(beforeStruct);
            // the soft delete updated no rows, i.e. the record does not exist, and the insert mode is upsert
            if (updateRows < 1 && JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {
                // the record does not exist: insert it with the delete flag already set
                this.insertOperation(beforeStruct, sourceConnectorName, true);
            }

        }

    }

    /**
     * Select operation.
     * @description: look up the record by primary key
     * @date: 2024/3/26 18:04
     * @param afterStruct  row values after the change
     * @return java.util.List<java.util.Map<java.lang.String,java.lang.Object>>
     */
    private List<Map<String, Object>> selectOperation(Struct afterStruct) throws SQLException {
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // WHERE clause: query by primary key
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : afterStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(afterStruct.get(pk));
                    break;
                }
            }

        }

        // check by primary key whether the row already exists
        String sql = "SELECT 1 FROM " + tableName + " WHERE " + String.join(" AND ", fieldParams);

        // if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
        //     log.info(sql);
        // }

        return this.connection.queryForList(sql, fieldValues.toArray());
    }

    /**
     * Insert operation.
     * @description:
     * @date: 2024/3/25 18:29
     * @param afterStruct row values after the change
     * @param sourceConnectorType type of the source database
     * @param addDeleteMark whether to also set the delete-flag column
     * @return boolean insert success or failure
     */
    private boolean insertOperation(Struct afterStruct, String sourceConnectorType, Boolean addDeleteMark) throws SQLException {
        ArrayList<String> fieldNames = new ArrayList<>();
        ArrayList<String> fieldParams = new ArrayList<>();
        ArrayList<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // source column name
        String sourceColumn;
        // target column name
        String targetColumn;
        // target date column name
        String targetDateColumnName;
        // target date column type
        String targetDateColumnType;

        for (Field afterField : afterStruct.schema().fields()) {
            fieldName = afterField.name();
            fieldValue = afterStruct.get(fieldName);
            targetColumn = "";

            // map the source column to its target column (derive the target column name from the source column name)
            if (targetColumnsMapArray.length > 0) {
                for (String tcm : targetColumnsMapArray) {
                    sourceColumn = tcm.split(":")[0];
                    targetColumn = tcm.split(":")[1];// the mapped target column

                    // stop once the column being inserted matches this mapping's source column
                    if (sourceColumn.equalsIgnoreCase(fieldName)) break;
                }
            } else {// columns.map was not configured
                // in that case the target columns must have the same names as the source columns
                targetColumn = fieldName;
            }

            // date column handling
            // reset the date column name/type
            targetDateColumnName = "";
            targetDateColumnType = "";

            if (targetDateColumnsArray.length > 0) {
                for (String ttc : targetDateColumnsArray) {
                    targetDateColumnName = ttc.split(":")[0];
                    targetDateColumnType = ttc.split(":")[1];

                    // stop once the column being inserted is one of the configured date columns
                    if (targetDateColumnName.equals(targetColumn)) {
                        break;
                    }

                    // not a match: reset the date column name/type
                    targetDateColumnName = "";
                    targetDateColumnType = "";
                }
            } else {// columns.date was not configured
                // in that case neither the source nor the target table is expected to contain date columns
                // keep the date column name/type empty
                targetDateColumnName = "";
                targetDateColumnType = "";
            }

            // a non-empty date column name means the column being inserted is a date type
            if (!targetDateColumnName.isEmpty()) {
                fieldNames.add(targetColumn);
                boolean isNull = "".equals(fieldValue) || "null".equals(fieldValue) || null == fieldValue;
                if (sourceConnectorType.equals("mysql")) {
                    if (targetDateColumnType.equalsIgnoreCase("datetime")) {// target column is a MySQL datetime
                        // mysql datetime
                        // io.debezium.time.Timestamp
                        fieldParams.add("STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                        }
                    } else if (targetDateColumnType.equalsIgnoreCase("date")) {// target column is a MySQL date
                        // mysql date
                        // io.debezium.time.Date
                        fieldParams.add("STR_TO_DATE(?, '" + DATE_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.getSomeDay(Integer.parseInt(fieldValue + "")));
                        }
                    } else if (targetDateColumnType.equalsIgnoreCase("timestamp")) {// target column is a MySQL timestamp
                        // mysql timestamp
                        // io.debezium.time.ZonedTimestamp
                        fieldParams.add("STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.fromISO8601((String) fieldValue, timestampDifference));
                        }
                    } else {// mysql time
                        // io.debezium.time.MicroTime
                        fieldParams.add("TIME(?)");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.secondsToTime((Long) fieldValue / 1000000));
                        }
                    }
                } else {// Oracle source
                    if (targetDateColumnType.equalsIgnoreCase("datetime") || targetDateColumnType.equalsIgnoreCase("timestamp")) {// target column is a MySQL datetime or timestamp
                        // oracle timestamp-->io.debezium.time.MicroTimestamp
                        // oracle date (with or without a time part)-->io.debezium.time.Timestamp
                        fieldParams.add("STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                        }
                    } else {// target column is a MySQL date
                        // oracle date
                        // io.debezium.time.Timestamp
                        fieldParams.add("STR_TO_DATE(?, '" + DATE_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT).substring(0, 10));
                        }
                    }
                }
            } else {// non-date column
                // fieldNames.add(fieldName);
                fieldNames.add(targetColumn);
                fieldParams.add("?");
                fieldValues.add(fieldValue);
            }

        }

        if (fieldNames.isEmpty()) {
            return false;
        }

        // added 2024/2/23
        // append the timestamp column
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField);
            fieldParams.add("STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // append the delete-flag column (used when a soft delete finds no existing row and is turned into an insert of an already-deleted record)
        if (addDeleteMark && StringUtils.isNotBlank(deleteMarkField)) {
            fieldNames.add(deleteMarkField);
            fieldParams.add("?");
            fieldValues.add(deleteMarkFieldValue);
        }

        String sql = "insert into " + tableName + "(" + String.join(",", fieldNames) + ")values(" + String.join(",", fieldParams) + ")";

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        return this.connection.insert(sql, fieldValues.toArray());
    }

    /**
     * Update operation.
     * @description:
     * @date: 2024/3/25 18:35
     * @param beforeStruct row values before the change
     * @param afterStruct row values after the change
     * @param sourceConnectorType type of the source database
     * @return int
     */
    private int updateOperation(Struct beforeStruct, Struct afterStruct, String sourceConnectorType) throws SQLException {
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // source column name
        String sourceColumn;
        // target column name
        String targetColumn;
        // target date column name
        String targetDateColumnName;
        // target date column type
        String targetDateColumnType;


        // first compare before/after values to determine which columns need updating
        // note: the primary key must not change
        for (Field afterField : afterStruct.schema().fields()) {
            fieldName = afterField.name();
            fieldValue = afterStruct.get(fieldName);
            // a column whose after value differs from its before value was updated (MySQL emits all columns before and after the update; by default Oracle only emits the changed columns, excluding the PK, unless supplemental logging of all columns is enabled)
            if (!String.valueOf(fieldValue).equals(String.valueOf(beforeStruct.get(fieldName)))) {
                targetColumn = "";
                // map the source column to its target column
                for (String tcm : targetColumnsMapArray) {
                    sourceColumn = tcm.split(":")[0];
                    targetColumn = tcm.split(":")[1];

                    // stop once the column being updated matches this mapping's source column
                    if (sourceColumn.equalsIgnoreCase(fieldName)) break;
                }

                // date column handling
                // reset the date column name/type
                targetDateColumnName = "";
                targetDateColumnType = "";
                for (String ttc : targetDateColumnsArray) {
                    targetDateColumnName = ttc.split(":")[0];
                    targetDateColumnType = ttc.split(":")[1];

                    // stop once the column being updated is one of the configured date columns
                    if (targetDateColumnName.equals(targetColumn)) {
                        break;
                    } else {
                        // not a match: reset the date column name/type
                        targetDateColumnName = "";
                        targetDateColumnType = "";
                    }
                }

                // a non-empty date column name means the column being updated is a date type
                if (!targetDateColumnName.isEmpty()) {
                    boolean isNull = "".equals(fieldValue) || "null".equals(fieldValue) || null == fieldValue;
                    if (sourceConnectorType.equals("mysql")) {
                        if (targetDateColumnType.equalsIgnoreCase("datetime")) {// target column is a MySQL datetime
                            // mysql datetime
                            // io.debezium.time.Timestamp
                            fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                            if (isNull) {
                                fieldValues.add(null);
                            } else {
                                fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                            }
                        } else if (targetDateColumnType.equalsIgnoreCase("date")) {// target column is a MySQL date
                            // mysql date
                            // io.debezium.time.Date
                            fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_FORMAT + "')");
                            if (isNull) {
                                fieldValues.add(null);
                            } else {
                                fieldValues.add(DateUtils.getSomeDay((Integer) fieldValue));
                            }
                        } else if (targetDateColumnType.equalsIgnoreCase("timestamp")) {// target column is a MySQL timestamp
                            // mysql timestamp
                            // io.debezium.time.ZonedTimestamp
                            fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                            if (isNull) {
                                fieldValues.add(null);
                            } else {
                                fieldValues.add(DateUtils.fromISO8601((String) fieldValue, timestampDifference));
                            }
                        } else {
                            // mysql time
                            // io.debezium.time.MicroTime
                            fieldNames.add(targetColumn + "=TIME(?)");
                            if (isNull) {
                                fieldValues.add(null);
                            } else {
                                fieldValues.add(DateUtils.secondsToTime((Long) fieldValue / 1000000));
                            }
                        }
                    } else {// Oracle source
                        if (targetDateColumnType.equalsIgnoreCase("datetime") || targetDateColumnType.equalsIgnoreCase("timestamp")) {// target column is a MySQL datetime or timestamp
                            // oracle timestamp-->io.debezium.time.MicroTimestamp
                            // oracle date (with or without a time part)-->io.debezium.time.Timestamp
                            fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                            if (isNull) {
                                fieldValues.add(null);
                            } else {
                                fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                            }
                        } else {// target column is a MySQL date
                            // oracle date
                            // io.debezium.time.Timestamp
                            fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_FORMAT + "')");
                            if (isNull) {
                                fieldValues.add(null);
                            } else {
                                fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT).substring(0, 10));
                            }
                        }
                    }
                } else {// non-date column
                    // fieldNames.add(fieldName + "=?");
                    fieldNames.add(targetColumn + "=?");
                    fieldValues.add(fieldValue);
                }
            }



        }

        // no column values changed
        if (fieldNames.isEmpty()) {
            return 11;// returning 11 signals that nothing changed and no update needs to be executed
        }

        // append the timestamp column
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // WHERE clause: update by primary key
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : afterStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(afterStruct.get(pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        // 执行更新操作
        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * Update operation.
     * @description: the record comes from an insert/read event, so beforeStruct is null
     * @date: 2024/3/26 18:35
     * @param afterStruct row values after the change
     * @param sourceConnectorType type of the source database
     * @return int
     */
    private int updateOperation(Struct afterStruct, String sourceConnectorType) throws SQLException {
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // source column name
        String sourceColumn;
        // target column name
        String targetColumn;
        // target date column name
        String targetDateColumnName;
        // target date column type
        String targetDateColumnType;

        // note: the primary key must not change
        for (Field afterField : afterStruct.schema().fields()) {
            fieldName = afterField.name();
            fieldValue = afterStruct.get(fieldName);
            targetColumn = "";
            // map the source column to its target column
            for (String tcm : targetColumnsMapArray) {
                sourceColumn = tcm.split(":")[0];
                targetColumn = tcm.split(":")[1];

                // stop once the column being updated matches this mapping's source column
                if (sourceColumn.equalsIgnoreCase(fieldName)) break;
            }

            // date column handling
            // reset the date column name/type
            targetDateColumnName = "";
            targetDateColumnType = "";
            for (String ttc : targetDateColumnsArray) {
                targetDateColumnName = ttc.split(":")[0];
                targetDateColumnType = ttc.split(":")[1];

                // stop once the column being updated is one of the configured date columns
                if (targetDateColumnName.equals(targetColumn)) {
                    break;
                } else {
                    // not a match: reset the date column name/type
                    targetDateColumnName = "";
                    targetDateColumnType = "";
                }
            }

            // a non-empty date column name means the column being updated is a date type
            if (!targetDateColumnName.isEmpty()) {
                boolean isNull = "".equals(fieldValue) || "null".equals(fieldValue) || null == fieldValue;
                if (sourceConnectorType.equals("mysql")) {
                    if (targetDateColumnType.equalsIgnoreCase("datetime")) {// target column is a MySQL datetime
                        // mysql datetime
                        // io.debezium.time.Timestamp
                        fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                        }
                    } else if (targetDateColumnType.equalsIgnoreCase("date")) {// target column is a MySQL date
                        // mysql date
                        // io.debezium.time.Date
                        fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.getSomeDay((Integer) fieldValue));
                        }
                    } else if (targetDateColumnType.equalsIgnoreCase("timestamp")) {// target column is a MySQL timestamp
                        // mysql timestamp
                        // io.debezium.time.ZonedTimestamp
                        fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.fromISO8601((String) fieldValue, timestampDifference));
                        }
                    } else {
                        // mysql time
                        // io.debezium.time.MicroTime
                        fieldNames.add(targetColumn + "=TIME(?)");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.secondsToTime((Long) fieldValue / 1000000));
                        }
                    }
                } else {// 源表oracle
                    if (targetDateColumnType.equalsIgnoreCase("datetime") || targetDateColumnType.equalsIgnoreCase("timestamp")) {// 目标表字段是mysql的datetime字段或者timestamp字段
                        // oracle timestamp-->io.debezium.time.MicroTimestamp
                        // oracle date(包含带时间和不带时间)-->io.debezium.time.Timestamp
                        fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                        }
                    } else {// 目标表字段是mysql的date字段
                        // oracle date
                        // io.debezium.time.Timestamp
                        fieldNames.add(targetColumn + "=STR_TO_DATE(?, '" + DATE_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT).substring(0, 10));
                        }
                    }
                }
            } else {// 非日期字段
                // fieldNames.add(fieldName + "=?");
                fieldNames.add(targetColumn + "=?");
                fieldValues.add(fieldValue);
            }

        }

        // 增加时间戳标识字段
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // where条件,根据主键更新
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : afterStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(afterStruct.get(pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        // 执行更新操作
        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * 删除操作(物理删除)
     * @description:
     * @date: 2024/3/25 18:35
     * @param beforeStruct 变化事件发生之前的值
     * @return int 删除的记录数
     */
    private int physicalDeleteOperation(Struct beforeStruct) throws SQLException {
        ArrayList<String> fieldParams = new ArrayList<>();
        ArrayList<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // where条件,根据主键删除
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : beforeStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(beforeStruct.get(pk));
                    break;
                }
            }

        }

        // 根据主键删除记录
        String sql = "DELETE FROM " + tableName + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * 删除操作(逻辑删除、假删除)
     * @description: 执行更新操作
     * @date: 2024/3/25 18:35
     * @param beforeStruct 变化事件发生之前的值
     * @return int 更新的记录数
     */
    private int logicalDeleteOperation(Struct beforeStruct) throws SQLException {
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // Date:2024/2/23
        // 增加标识是否删除字段
        fieldNames.add(deleteMarkField + "=?");
        fieldValues.add(deleteMarkFieldValue);

        // 增加时间戳标识字段
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // where条件,根据主键更新
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : beforeStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(beforeStruct.get(pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        return this.connection.update(sql, fieldValues.toArray());
    }
    @Override
    public void stop() {

    }
}

DateUtils.java

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Calendar;
import java.util.Date;
import java.util.Optional;
 /**
 * 日期工具类
 * https://www.cnblogs.com/Marydon20170307/p/17921900.html
 * @description: 处理日期间的格式转换
 * @explain:
 * mysql的datetime类型会被插件io.debezium.connector.mysql.MySqlConnector转成时间戳,需调用DateUtils.timestampToString(1702027934000L,"UTC")
 * mysql的date类型会被插件io.debezium.connector.mysql.MySqlConnector转成天数,需调用getSomeDay(19699)
 * mysql的time类型会被插件io.debezium.connector.mysql.MySqlConnector转成微秒数,需调用DateUtils.secondsToTime(34419000000L / 1000000)
 * mysql的timestamp类型会被插件io.debezium.connector.mysql.MySqlConnector转成ISO8601格式的时间戳,需调用DateUtils.fromISO8601("2023-12-08T15:32:19Z")
 * oracle的date类型和timestamp类型会被插件io.debezium.connector.oracle.OracleConnector转成时间戳,需调用timestampToString(1702837490000L,"UTC"),timestampToString(1702826697000000L,"UTC")
 * 从转换结果来看,oracle的timestamp类型被插件转换后只是比date类型多了3个0(微秒与毫秒的差别)
 * @author: Marydon
 * @date: 2023-12-07 11:48
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
public class DateUtils {
    // 年月日(大写M:表示月份,小写m:表示分钟)
    public static final String FORMAT_DATE = "yyyy-MM-dd";
    // 时分秒(大写H:表示24小时制,小写h:表示12小时制)
    public static final String FORMAT_TIME = "HH:mm:ss";
    // 年月日时分秒
    public static final String FORMAT_DATE_TIME = FORMAT_DATE + " " + FORMAT_TIME;
    // 时区
    // 东八区:GMT+8/UTC+8
    public static final String FORMAT_TIME_ZONE = "Asia/Shanghai";

    public static final Integer DEFAULT_DIFFERENCE_HOURS = -6;

    /**
     * 将时间戳转成日期字符串
     * @param timestamp 时间
     * @return 日期格式字符串
     */
    public static String timestampToString(Long timestamp) {
        return toDateTimeString(fromTimeMills(timestamp), FORMAT_DATE_TIME);
    }

    public static String timestampToString(Long timestamp, String timeZone) {
        if (String.valueOf(timestamp).length() == 16) {// 16位:微秒
            return toDateTimeString(fromTimeMills(timestamp / 1000, timeZone), FORMAT_DATE_TIME);
        } else {// 13位:毫秒
            return toDateTimeString(fromTimeMills(timestamp, timeZone), FORMAT_DATE_TIME);
        }
    }

    /**
     * 日期转字符串(日期+时间)
     * @attention: jdk>=1.8
     * @date: 2020年08月31日 0031 17:04
     * @param: dateTime
     * @param: pattern
     * @return: java.lang.String
     */
    public static String toDateTimeString(LocalDateTime dateTime, String pattern) {
        DateTimeFormatter dateTimeFormatter = DateTimeFormatter.ofPattern(pattern);
        return dateTimeFormatter.format(dateTime);
    }

    /**
     * 毫秒数转LocalDateTime
     * @attention: jdk>=1.8
     * @date: 2020年09月10日 0010 11:22
     * @param: timeMills
     * @return: java.time.LocalDateTime
     */
    public static LocalDateTime fromTimeMills(long timeMills){
        return fromTimeMills(timeMills, FORMAT_TIME_ZONE);
    }

    public static LocalDateTime fromTimeMills(long timeMills, String timeZone){
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(timeMills), ZoneId.of(timeZone));
    }

    /**
     * 日期字符串按指定格式转LocalDateTime
     * @attention:
     * @date: 2021/7/28 15:05
     * @param: dateTimeStr 日期字符串
     * 2023-12-07T16:00:00Z
     * "2023-12-07T16:00:00Z"是一种ISO 8601标准的日期时间表示方式
     * 这个字符串表示的是一个特定的时间点:2023年12月7日,下午4点(16点),0分钟,0秒。其中“T”是时间标识符,“Z”表示的是协调世界时(UTC)。
     * 这种格式是可以精确到秒的时间戳
     * @return: java.time.LocalDateTime
     */
    public static LocalDateTime toLocalDateTime(String dateTimeStr) {
        // UTC-->这个位置不知道debezium是怎么处理的,按理说该是相差8个小时才对。但实际上相差了6个小时
        return toLocalDateTime(dateTimeStr, DEFAULT_DIFFERENCE_HOURS);
    }

    public static LocalDateTime toLocalDateTime(String dateTimeStr, Integer hours) {
        // UTC时间
        DateTimeFormatter formatter = DateTimeFormatter.ISO_DATE_TIME;
        ZonedDateTime zonedDateTime = ZonedDateTime.parse(dateTimeStr, formatter);

        // // 使用DateTimeFormatter.ISO_INSTANT解析字符串为Instant对象
        // DateTimeFormatter formatter = DateTimeFormatter.ISO_INSTANT;
        // Instant instant = Instant.from(formatter.parse(dateTimeStr));
        // // 将Instant对象转换为LocalDateTime
        // return instant.atZone(ZoneId.of("UTC")).toLocalDateTime();

        return zonedDateTime.toLocalDateTime().plusHours(hours);
    }

    /**
     * ISO 8601标准日期转成字符串
     * @param dateTimeStr
     * 2023-12-07T16:00:00Z
     * "2023-12-07T16:00:00Z"是一种ISO 8601标准的日期时间表示方式
     * 这个字符串表示的是一个特定的时间点:2023年12月7日,下午4点(16点),0分钟,0秒。其中“T”是时间标识符,“Z”表示的是协调世界时(UTC)。
     * 这种格式是可以精确到秒的时间戳
     * @return
     */
    public static String fromISO8601(String dateTimeStr) {
        return toDateTimeString(toLocalDateTime(dateTimeStr), FORMAT_DATE_TIME);
    }
    public static String fromISO8601(String dateTimeStr, Integer hours) {
        return toDateTimeString(toLocalDateTime(dateTimeStr, hours), FORMAT_DATE_TIME);
    }

    /**
     * 根据天数倒推日期
     * https://www.cnblogs.com/Marydon20170307/p/10672030.html
     * @param days 距离1970-01-01的天数
     * @return yyyy-MM-dd
     * 2023-12-26
     */
    public static String getSomeDay(int days) {
        SimpleDateFormat sdf = new SimpleDateFormat(FORMAT_DATE);
        Date date;
        try {
            date = sdf.parse("1970-01-01");
        } catch (ParseException e) {
            throw new RuntimeException(e);
        }
        Calendar calendar = Calendar.getInstance();
        calendar.setTime(date);
        calendar.add(Calendar.DAY_OF_YEAR, days);
        date = calendar.getTime();
        return sdf.format(date);
    }

    /**
     * 根据秒数倒推时间
     * @param seconds
     * @return HH:mm:ss
     * 09:30:00
     * 21:30:00
     */
    public static String secondsToTime(Long seconds) {
        String dateTime = String.format("%d:%d:%d", seconds / 3600, (seconds % 3600) / 60, seconds % 60);
        SimpleDateFormat sdf = new SimpleDateFormat("HH:mm:ss");
        Date date;
        try {
            date = sdf.parse(dateTime);
        } catch (ParseException e) {
            throw new RuntimeException(e);
        }
        return sdf.format(date);
    }

    /**
     * 获取系统当前时间
     * @return 指定格式的系统时间
     */
    public static String getSysdateStr(String format) {
        // format = format == null || format == "" ? FORMAT_DATE_TIME : format;
        format = Optional.ofNullable(format).filter(str -> !str.isEmpty()).orElse(FORMAT_DATE_TIME);

        LocalDateTime now = LocalDateTime.now();
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern(format);
        return now.format(formatter);
    }

    public static void main(String[] args) {
        // mysql
        // System.out.println(timestampToString(1702027934000L,"UTC"));
        System.out.println(fromISO8601("2023-12-08T15:32:19Z"));
        // System.out.println(getSomeDay(19699));
        // System.out.println(secondsToTime(34419000000L / 1000000));
        // oracle
        // System.out.println(timestampToString(1702837490000L,"UTC"));
        // System.out.println(timestampToString(1702826697000000L,"UTC"));
        // System.out.println(timestampToString(1708330724413L,"UTC+8"));
        // System.out.println(secondsToTime(77400L));

    }
}

Packaging

Package the project with Maven: run clean first, then package.
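For example, from the project root:

mvn clean package

The build writes target/marydon-connector-jdbc-1.1.Final.jar.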

Installation

Copy the generated marydon-connector-jdbc-1.1.Final.jar, together with the lib directory, into the KAFKA_HOME/plugins/marydon-connector-jdbc-1.1 directory.
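Kafka Connect only discovers the plugin if the worker's plugin.path covers that directory's parent, so make sure config/connect-distributed.properties (or connect-standalone.properties) contains something like the line below; the exact path is whatever KAFKA_HOME/plugins resolves to on your machine:

plugin.path=/opt/kafka/plugins

The resulting layout is roughly:

plugins/marydon-connector-jdbc-1.1/
    marydon-connector-jdbc-1.1.Final.jar
    lib/    (HikariCP, the MySQL and Oracle JDBC drivers, commons-lang3)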

5. Testing

Startup

Start ZooKeeper, Kafka, and the Kafka Connect worker.
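For a typical local installation the commands look like this (adjust paths and property files to your environment):

bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
bin/connect-distributed.sh config/connect-distributed.properties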

Create the Debezium source connector

{
    "name": "debezium-connector-source-mysql-128",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "errors.log.include.messages": true,
        "database.user": "username",
        "database.server.id": 128,
        "schema.history.internal.kafka.bootstrap.servers": "localhost:9092",
        "event.processing.failure.handling.mode": "warn",
        "column.include.list": "test.tb_project.CREATE_BY,test.tb_project.CREATE_TIME,test.tb_project.ID,test.tb_project.PROJECT_BEGIN_TIME,test.tb_project.PROJECT_CODE,test.tb_project.PROJECT_COMPANY,test.tb_project.PROJECT_END_TIME,test.tb_project.PROJECT_MANAGER,test.tb_project.PROJECT_NAME,test.tb_project.PROJECT_SUBMIT_TIME,test.tb_project.UPDATE_BY,test.tb_project.UPDATE_TIME",
        "database.port": "3306",
        "schema.history.internal.store.only.captured.tables.ddl": true,
        "schema.history.internal.store.only.captured.databases.ddl": true,
        "topic.prefix": "topic-test-128",
        "schema.history.internal.kafka.topic": "schema-history-test-128",
        "database.hostname": "192.168.0.1",
        "database.connectionTimeZone": "GMT+8",
        "database.password": "123@db",
        "table.include.list": "test.tb_project",
        "skipped.operations": "none",
        "errors.log.enable": true,
        "database.include.list": "test",
        "snapshot.mode": "schema_only"
    }
}
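Assuming the Connect REST interface listens on localhost:8083 and the JSON above is saved as source-mysql-128.json (the file name is arbitrary), the connector is registered with:

curl -s -X POST -H "Content-Type: application/json" \
     --data @source-mysql-128.json \
     http://localhost:8083/connectors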

Create the marydon sink connector

{
    "name": "marydon-connector-sink-mysql-128-tb_project",
    "config": {
        "connector.class": "code.marydon.connectors.JdbcSinkConnector",
        "table.name.format": "tb_project",
        "connection.password": "password",
        "sync.field.timestamp": "SYN_LAST_UPDATE_TIME",
        "tasks.max": 1,
        "sync.field.delete.mark": "true",
        "topics.regex": "topic-test-128.test.tb_project",
        "sync.field.delete": "SYN_IS_DELETED",
        "delete.enabled": true,
        "columns.map": "CREATE_BY:CREATE_BY,CREATE_TIME:CREATE_TIME,ID:ID,PROJECT_BEGIN_TIME:PROJECT_BEGIN_TIME,PROJECT_CODE:PROJECT_CODE,PROJECT_COMPANY:PROJECT_COMPANY,PROJECT_END_TIME:PROJECT_END_TIME,PROJECT_MANAGER:PROJECT_MANAGER,PROJECT_NAME:PROJECT_NAME,PROJECT_SUBMIT_TIME:PROJECT_SUBMIT_TIME,UPDATE_BY:UPDATE_BY,UPDATE_TIME:UPDATE_TIME",
        "connection.user": "username",
        "timestamp.hours.difference": "-6",
        "log.level": "info",
        "connection.url": "jdbc:mysql://192.168.0.1:3306/test?useUnicode=true&characterEncoding=utf8&allowPublicKeyRetrieval=true&useTimezone=true&serverTimezone=Asia/Shanghai",
        "connection.driveClassName": "com.mysql.jdbc.Driver",
        "insert.mode": "upsert",
        "pk.mode": "record_key",
        "pk.fields": "id",
        "columns.date": "CREATE_TIME:datetime,PROJECT_BEGIN_TIME:datetime,PROJECT_END_TIME:datetime,PROJECT_SUBMIT_TIME:datetime,UPDATE_TIME:datetime"
    }
}
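Register it the same way and then check that the task is RUNNING (again assuming the REST interface on localhost:8083 and the JSON saved as sink-mysql-128.json):

curl -s -X POST -H "Content-Type: application/json" \
     --data @sink-mysql-128.json \
     http://localhost:8083/connectors

curl -s http://localhost:8083/connectors/marydon-connector-sink-mysql-128-tb_project/status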

Table structure

Source table

Target table

Insert

Source table

Target table

We can see that:

The row inserted into the source table is not only written into the target table; its SYN_LAST_UPDATE_TIME column is also filled with the current time obtained at the moment of synchronization.
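With log.level=info the task logs each statement it builds. For this insert the logged statement looks roughly like the line below; the column list is abridged and the STR_TO_DATE pattern is only an assumed value of the task's DATE_TIME_FORMAT constant:

insert into tb_project(ID,PROJECT_CODE,PROJECT_NAME,SYN_LAST_UPDATE_TIME)values(?,?,?,STR_TO_DATE(?, '%Y-%m-%d %H:%i:%s'))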

Update

Source table

Target table

We can see that:

Besides applying the changes made in the source table, the target table's SYN_LAST_UPDATE_TIME column is also updated to the current time obtained at the moment of synchronization.
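The logged statement for this path is an UPDATE keyed on the primary key, with the timestamp column appended to the SET list, roughly (abridged, assumed STR_TO_DATE pattern):

UPDATE tb_project SET PROJECT_NAME=?,UPDATE_TIME=STR_TO_DATE(?, '%Y-%m-%d %H:%i:%s'),SYN_LAST_UPDATE_TIME=STR_TO_DATE(?, '%Y-%m-%d %H:%i:%s') WHERE ID=?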

Delete

Source table

Target table

We can see that:

The source table performed a delete, while the target table performed an update.

The SYN_LAST_UPDATE_TIME column was updated to the current time obtained at that moment, and the SYN_IS_DELETED column was set to true.
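This is the logical-delete path (sync.field.delete is configured), so the statement the sink builds is an UPDATE rather than a DELETE, roughly (assumed STR_TO_DATE pattern):

UPDATE tb_project SET SYN_IS_DELETED=?,SYN_LAST_UPDATE_TIME=STR_TO_DATE(?, '%Y-%m-%d %H:%i:%s') WHERE ID=?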

6. The target database is Oracle

OracleSinkTask.java

import code.marydon.configs.JdbcSinkConfig;
import code.marydon.connectors.JdbcSinkConnector;
import code.marydon.db.DynamicConnection;
import code.marydon.db.DynamicDataSource;
import code.marydon.enums.Operation;
import code.marydon.utils.DateUtils;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.connect.data.Field;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

import java.sql.SQLException;
import java.util.*;
import java.util.concurrent.locks.ReentrantLock;
/**
 * 接收从kafka订阅到的数据,并将数据同步到oracle库
 * @description: 目标表是oracle数据库
 * @author: Marydon
 * @date: 2023-12-27 15:04
 * @version: 1.0
 * @email: marydon20170307@163.com
 */
@Slf4j
public class OracleSinkTask extends SinkTask {
    private final ReentrantLock lock = new ReentrantLock();
    private DynamicConnection connection;
    // 待同步的表名
    private String tableName = "";
    // 主键字段
    private String primaryFields = "";
    // 获取目标表与源表的字段映射
    String[] targetColumnsMapArray = new String[0];
    // 获取目标表日期字段列
    String[] targetDateColumnsArray = new String[0];
    // 时间戳字段名称
    private String timestampField = "";
    // 删除标识字段名称
    private String deleteMarkField = "";
    // 删除标识字段的值
    private String deleteMarkFieldValue = "";
    // 插入模式
    private String insertMode = "";
    // 日志级别
    private String logLevel = "";

    private final  String DATE_TIME_FORMAT = "YYYY-MM-DD HH24:MI:SS";

    @Override
    public String version() {
        return new JdbcSinkConnector().version();
    }

    //task启动
    @Override
    public void start(Map<String, String> map) {
        String url = map.get(JdbcSinkConfig.CONNECTION_URL);
        String userName = map.get(JdbcSinkConfig.CONNECTION_USER);
        String password = map.get(JdbcSinkConfig.CONNECTION_PASSWORD);
        String driveClassName = map.get(JdbcSinkConfig.CONNECTION_DRIVER_CLASS_NAME);
        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info("config:{}", map);
        }
        this.connection = DynamicDataSource.getDynamicConnection(url, userName, password, driveClassName);

        tableName = map.get(JdbcSinkConfig.TABLE_NAME_FORMAT);
        primaryFields = map.get(JdbcSinkConfig.PK_FIELDS);
        // 目标表中表示日期的字段
        String targetDateColumns = map.get(JdbcSinkConfig.COLUMNS_DATE);
        // 目标表字段与源表字段的映射关系
        String targetColumnsMap = map.get(JdbcSinkConfig.COLUMNS_MAP);

        targetColumnsMapArray = targetColumnsMap.split(",");
        targetDateColumnsArray = targetDateColumns.split(",");
        timestampField = map.get(JdbcSinkConfig.SYNC_FIELD_TIMESTAMP);
        deleteMarkField = map.get(JdbcSinkConfig.SYNC_FIELD_DELETED);
        deleteMarkFieldValue = map.get(JdbcSinkConfig.SYNC_FIELD_DELETED_MARK);
        insertMode = map.get(JdbcSinkConfig.INSERT_MODE);
        logLevel = map.get(JdbcSinkConfig.LOG_LEVEL);
    }

    //数据put
    @Override
    public void put(Collection<SinkRecord> records) {
        try {
            // 在处理整个records集合之前加锁
            lock.lock();

            try {
                for (SinkRecord record : records) {
                    // 数据处理
                    this.pull(record);
                }
            } finally {
                // 在finally块中确保无论正常执行还是抛出异常都能解锁
                lock.unlock();
            }
        } catch (Exception var4) {
            throw new RuntimeException(var4);
        }
    }


    /**
     * 数据同步(从kafka拉取数据)
     * @explain
     * 1.mysql和Oracle进行互相同步时,必须要保证待同步的字段不包含time类型
     * 2.源表和目标表的主键必须保持一致,因为目标表的数据同步(修改和删除),是按主键走的
     * 3.源表的timestamp类型字段必须与目标表的timestamp类型映射字段保持一致
     * @param record 记录
     * @throws SQLException
     * 如果SQL执行失败,则抛出异常,这将导致任务停止运行
     * 如果不想影响后续的数据同步,可以捕获异常,然后继续执行
     */
    private void pull(SinkRecord record) throws SQLException {
        // payload分为:before、after、source、op和ts_ms等几部分
        Struct value = (Struct)record.value();
        if (null == value) return;
        // Struct{after=Struct{patient_id=1, zs_id=2, create_time=1702826679000, update_time=19708, time=2023-12-17T21:24:44Z, id=1}, source=Struct{version=2.2.1.Final, connector=mysql,  name=topic-medi_data_cent-70, ts_ms=1703630644000, snapshot=first, db-me_data_cent, table=t_patient_zs, server_id=0, file=binlog.003268, pos=581446144, row=0}, op=r, ts_ms=1703580582732}
        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info("try synchronize data:{}", value);
        }

        // Struct key = (Struct)record.key();
        // Schema valueSchema = value.schema();
        // source:事件源的结构信息,包括connector版本、事务ID等
        Struct source = value.getStruct("source");
        // String sourceDatabaseName = source.getString("db");
        // String tableName = source.getString("table");
        // 当入参table.name.format值为""或null时,按目标库的目标表与源库的源表表名一致处理
        if (StringUtils.isEmpty(tableName)) tableName = source.getString("table");
        // 源数据库类型
        String sourceConnectorName = source.getString("connector");
        //  ts_ms:connector处理该事件的本地时间戳
        // String timestamp = String.valueOf(value.getInt64("ts_ms"));
        String operation = value.getString("op");
        // before:变化事件发生之前的值
        Struct beforeStruct = value.getStruct("before");
        // after:变化事件发生之后的值
        Struct afterStruct = value.getStruct("after");

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info("operation:{}", operation);
        }

        // READ
        // operation=r,执行select操作,如果主键存在,执行update操作,主键不存在则执行insert操作
        if (operation.equals(Operation.READ.code())) {// r
            // 根据主键查询数据(beforeStruct为null)
            List<Map<String, Object>> mapList = this.selectOperation(afterStruct);
            // 目标表中没有数据,则执行插入操作
            if (mapList == null || mapList.isEmpty()) {
                // 插入操作
                this.insertOperation(afterStruct, sourceConnectorName, false);
                return;
            }

            // 目标表中该主键已存在,则执行更新操作
            if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {
                // 更新操作
                this.updateOperation(afterStruct, sourceConnectorName);
                return;
            }

        }

        // CREATE
        // operation=c,执行select操作,如果主键存在,执行delete操作和insert操作;主键不存在则执行insert操作
        // 后续,如果看实际需要可以将二者进行拆分
        if (operation.equals(Operation.CREATE.code())) {// c
            // 根据主键查询数据(beforeStruct为null)
            List<Map<String, Object>> mapList = this.selectOperation(afterStruct);
            // 目标表中没有数据,则执行插入操作
            if (mapList == null || mapList.isEmpty()) {
                // 插入操作
                this.insertOperation(afterStruct, sourceConnectorName, false);
                return;
            }

            // 目标表中该主键已存在,则执行删除和插入操作
            if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {
                // 删除操作(物理删除)
                this.physicalDeleteOperation(afterStruct);
                // 插入操作
                this.insertOperation(afterStruct, sourceConnectorName, false);
                // // 更新操作
                // this.updateOperation(afterStruct, sourceConnectorName);
                return;
            }

        }

        // UPDATE
        if (operation.equals(Operation.UPDATE.code())) {// u
            // 当主键字段为空时,禁止更新同步
            if (StringUtils.isEmpty(primaryFields)) return;

            // 根据主键查询数据
            List<Map<String, Object>> mapList = this.selectOperation(afterStruct);
            // 说明该条记录不存在 && 插入模式为upsert
            if ((mapList == null || mapList.isEmpty()) && JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {// 目标表中没有数据,则执行插入操作
                // 不存在,则执行插入操作
                // 说明:oracle必须执行补全日志操作(否则,执行插入操作没有意义,oracle归档日志默认只记录被更新字段)
                this.insertOperation(afterStruct, sourceConnectorName, false);
                return;
            }

            // 执行更新操作
            this.updateOperation(afterStruct, sourceConnectorName);

            return;
        }

        // DELETE/TRUNCATE
        if (operation.equals(Operation.DELETE.code()) || operation.equals(Operation.TRUNCATE.code())) {// d or t
            // 当主键字段为空时,禁止删除同步
            if (StringUtils.isEmpty(primaryFields)) return;

            // 根据主键查询数据(afterStruct为null)
            List<Map<String, Object>> mapList = this.selectOperation(beforeStruct);

            // 不存在
            if (mapList == null || mapList.isEmpty()) {
                // 说明该条记录不存在 && 物理删除
                if (StringUtils.isBlank(deleteMarkField)) return;

                // 说明该条记录不存在 && 逻辑删除 && 插入模式为upsert
                if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equalsIgnoreCase(insertMode)) {
                    // 执行插入操作
                    this.insertOperation(beforeStruct, sourceConnectorName, true);
                    return;
                }
            }

            // 执行删除操作(存在,执行物理删除)
            if (StringUtils.isBlank(deleteMarkField)) {
                // 物理删除
                this.physicalDeleteOperation(beforeStruct);
                return;
            }

            // 执行删除操作(存在,执行逻辑删除)
            // 逻辑删除
            this.logicalDeleteOperation(beforeStruct);

        }

    }

    /**
     * 查询操作
     * @description: 根据主键查询记录
     * @date: 2024/3/26 18:04
     * @param afterStruct  变化事件发生之后的值
     * @return java.util.List<java.util.Map<java.lang.String,java.lang.Object>>
     */
    private List<Map<String, Object>> selectOperation(Struct afterStruct) throws SQLException {
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // where条件,根据主键进行查询
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : afterStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(afterStruct.get(pk));
                    break;
                }
            }

        }

        // 根据主键查询记录
        String sql = "SELECT 1 FROM " + tableName + " WHERE " + String.join(" AND ", fieldParams);

        // if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
        //     log.info(sql);
        // }

        return this.connection.queryForList(sql, fieldValues.toArray());
    }


    /**
     * 插入操作
     * @description:
     * @date: 2024/3/25 18:29
     * @param afterStruct 变化事件发生之后的值
     * @param sourceConnectorType 源数据库类型
     * @param addDeleteMark 是否添加删除标记字段
     * @return boolean 插入成功、失败
     */
    private boolean insertOperation(Struct afterStruct, String sourceConnectorType, Boolean addDeleteMark) throws SQLException {
        ArrayList<String> fieldNames = new ArrayList<>();
        ArrayList<String> fieldParams = new ArrayList<>();
        ArrayList<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // 源字段类型
        String sourceColumn;
        // 目标表字段名称
        String targetColumn;
        // 目标日期字段名称
        String targetDateColumnName;
        // 目标日期字段类型
        String targetDateColumnType;

        for (Field afterField : afterStruct.schema().fields()) {
            fieldName = afterField.name();
            fieldValue = afterStruct.get(fieldName);
            targetColumn = "";

            // 源表与目标表字段映射关系(根据源表字段映射出目标表字段)
            for (String tcm : targetColumnsMapArray) {
                sourceColumn = tcm.split(":")[0];
                targetColumn = tcm.split(":")[1];// 关键

                // 当要新增的列名与此循环的源表列名一致时,结束循环
                if (sourceColumn.equalsIgnoreCase(fieldName)) break;
            }

            // 说明:入参columns.map没有设值
            if (targetColumnsMapArray.length == 0) {
                // 这样的话,要求目标表字段必须和源表表字段保持一致
                targetColumn = fieldName;
            }

            // 日期字段转换
            // 将日期字段名称,置空
            targetDateColumnName = "";
            targetDateColumnType = "";

            for (String ttc : targetDateColumnsArray) {
                targetDateColumnName = ttc.split(":")[0];
                targetDateColumnType = ttc.split(":")[1];

                // 当要新增的列名为日期字段时,结束循环
                if (targetDateColumnName.equals(targetColumn)) break;

                // 将日期字段名称,置空
                targetDateColumnName = "";
                targetDateColumnType = "";
            }

            // 说明:入参columns.date没有设值
            if (targetDateColumnsArray.length == 0) {
                // 这样的话,要求目标表字段和源表表字段都没有日期字段
                // 将日期字段名称,置空
                targetDateColumnName = "";
                targetDateColumnType = "";
            }

            // 目标日期字段不为空,说明当前要插入的字段是日期类型
            if (!targetDateColumnName.isEmpty()) {
                fieldNames.add(targetColumn);
                boolean isNull = "".equals(fieldValue) || "null".equals(fieldValue) || null == fieldValue;

                // 源表mysql
                if (sourceConnectorType.equalsIgnoreCase("mysql")) {
                    // 目标表字段是oracle的TIMESTAMP类型
                    if (StringUtils.startsWithIgnoreCase(targetDateColumnType, "TIMESTAMP")) {
                        // io.debezium.time.ZonedTimestamp
                        fieldParams.add("TO_TIMESTAMP(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.fromISO8601((String) fieldValue));
                        }

                        continue;// 执行下一轮循环
                    }

                    // 目标表字段是oracle的DATE类型
                    if (targetDateColumnType.equalsIgnoreCase("DATE")) {
                        // mysql转oracle也没有关系(已经排除mysql time类型字段)
                        fieldParams.add("TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                            continue;// 执行下一轮循环
                        }

                        if (String.valueOf(fieldValue).length() > 5) {// 源类型是mysql datetime
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                        } else {// 源类型是mysql date
                            fieldValues.add(DateUtils.getSomeDay(Integer.parseInt(fieldValue + "")) + " 00:00:00");
                        }

                        continue;// 执行下一轮循环
                    }
                }

                // 源表oracle
                if (sourceConnectorType.equalsIgnoreCase("oracle")) {
                    // oracle的日期类型被插件io.debezium.connector.oracle.OracleConnector统一转换成了时间戳类型
                    fieldParams.add("TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                    if (isNull) {
                        fieldValues.add(null);
                    } else {
                        fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                    }

                    continue;// 执行下一轮循环
                }
            }

            // 当前要插入的字段是非日期类型
            if (targetDateColumnName.isEmpty()) {// 非日期字段
                // fieldNames.add(fieldName);
                fieldNames.add(targetColumn);
                fieldParams.add("?");
                fieldValues.add(fieldValue);
            }

        }

        if (fieldNames.isEmpty()) return false;

        // Date:2024/2/23
        // 增加时间戳标识字段
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField);
            fieldParams.add("TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // 增加标识是否删除字段(逻辑删除delete不存在,改为insert时,插入已删除标识)
        if (addDeleteMark && StringUtils.isNotBlank(deleteMarkField)) {
            fieldNames.add(deleteMarkField);
            fieldParams.add("?");
            fieldValues.add(deleteMarkFieldValue);
        }

        String sql = "insert into " + tableName + "(" + String.join(",", fieldNames) + ")values(" + String.join(",", fieldParams) + ")";

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        return this.connection.insert(sql, fieldValues.toArray());
    }

    /**
     * 更新操作
     * @description:
     * 当beforeStruct为null时,数据来源于insert,不变化的数据也会被更新(映射字段或同名字段)
     * 当beforeStruct不为null时,表示更新操作,此时,只更新变化字段
     * @date: 2024/3/25 18:35
     * @param afterStruct 变化事件发生之后的值
     * @param sourceConnectorType 源数据库类型
     * @return int
     */
    private int updateOperation(Struct afterStruct, String sourceConnectorType) throws SQLException {
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // 源字段类型
        String sourceColumn;
        // 目标表字段名称
        String targetColumn;
        // 目标日期字段名称
        String targetDateColumnName;
        // 目标日期字段类型
        String targetDateColumnType;

        // 说明:主键不能变
        for (Field afterField : afterStruct.schema().fields()) {

            fieldName = afterField.name();
            fieldValue = afterStruct.get(fieldName);
            targetColumn = "";

            // 源表与目标表字段映射关系(根据源表字段映射出目标表字段)
            for (String tcm : targetColumnsMapArray) {
                sourceColumn = tcm.split(":")[0];
                targetColumn = tcm.split(":")[1];// 关键

                // 当要新增的列名与此循环的源表列名一致时,结束循环
                if (sourceColumn.equalsIgnoreCase(fieldName)) break;
            }

            // 说明:入参columns.map没有设值
            if (targetColumnsMapArray.length == 0) {
                // 这样的话,要求目标表字段必须和源表表字段保持一致
                targetColumn = fieldName;
            }

            // 日期字段转换
            // 将日期字段名称,置空
            targetDateColumnName = "";
            targetDateColumnType = "";

            for (String ttc : targetDateColumnsArray) {
                targetDateColumnName = ttc.split(":")[0];
                targetDateColumnType = ttc.split(":")[1];

                // 当要更新的列名为日期字段时,结束循环
                if (targetDateColumnName.equals(targetColumn)) break;

                // 将日期字段名称,置空
                targetDateColumnName = "";
                targetDateColumnType = "";
            }

            // 说明:入参columns.date没有设值
            if (targetDateColumnsArray.length == 0) {
                // 这样的话,要求目标表字段和源表表字段都没有日期字段
                // 将日期字段名称,置空
                targetDateColumnName = "";
                targetDateColumnType = "";
            }

            // 目标日期字段不为空,说明当前要更新的字段是日期类型
            if (!targetDateColumnName.isEmpty()) {
                boolean isNull = "".equals(fieldValue) || "null".equals(fieldValue) || null == fieldValue;

                // 源表mysql
                if (sourceConnectorType.equalsIgnoreCase("mysql")) {
                    // 目标表字段是oracle的TIMESTAMP类型
                    if (StringUtils.startsWithIgnoreCase(targetDateColumnType, "TIMESTAMP")) {
                        // io.debezium.time.ZonedTimestamp
                        fieldNames.add(targetColumn + "=TO_TIMESTAMP(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                        } else {
                            fieldValues.add(DateUtils.fromISO8601((String) fieldValue));
                        }

                        continue;// 执行下一轮循环
                    }

                    // 目标表字段是oracle的DATE类型
                    if (targetDateColumnType.equalsIgnoreCase("DATE")) {
                        // mysql转oracle也没有关系(已经排除mysql time类型字段)
                        fieldNames.add(targetColumn + "=TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                        if (isNull) {
                            fieldValues.add(null);
                            continue;// 执行下一轮循环
                        }

                        if (String.valueOf(fieldValue).length() > 5) {// 源类型是mysql datetime
                            fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                        } else {// 源类型是mysql date
                            fieldValues.add(DateUtils.getSomeDay(Integer.parseInt(fieldValue + "")) + " 00:00:00");
                        }

                        continue;// 执行下一轮循环
                    }
                }

                // 源表oracle
                if (sourceConnectorType.equalsIgnoreCase("oracle")) {
                    // oracle的日期类型被插件io.debezium.connector.oracle.OracleConnector统一转换成了时间戳类型
                    fieldNames.add(targetColumn + "=TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
                    if (isNull) {
                        fieldValues.add(null);
                    } else {
                        fieldValues.add(DateUtils.timestampToString((Long) fieldValue, JdbcSinkConfig.DB_TIMEZONE_DEFAULT));
                    }

                    continue;// 执行下一轮循环
                }
            }

            // 目标日期字段为空,说明当前要更新的字段是非日期类型
            if (targetDateColumnName.isEmpty()) {
                boolean isPrimaryKey = false;
                // 去除主键字段
                for (String pk : primaryFields.split(",")) {
                    // 主键
                    if (StringUtils.equalsIgnoreCase(pk, targetColumn)) {
                        isPrimaryKey = true;
                        break;
                    }
                }

                // targetColumn是主键字段
                // 目的:忽略主键字段,即确保update table set语句当中不包含主键字段
                if (isPrimaryKey) continue;// 执行下一轮循环

                // fieldNames.add(fieldName + "=?");
                fieldNames.add(targetColumn + "=?");
                fieldValues.add(fieldValue);
            }

        }

        // // operation=u && 没有发生数据变化的字段
        // if (null != beforeStruct && fieldNames.isEmpty()) {
        //     return 11;// 此处返回11,表示没有发生数据变化,无需执行update操作
        // }

        // 增加时间戳标识字段
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField + "=TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // where条件,根据主键更新
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : afterStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(afterStruct.get(pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        // 执行更新操作
        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * 删除操作(物理删除)
     * @description:
     * @date: 2024/3/25 18:35
     * @param beforeStruct 变化事件发生之前的值
     * @return int 删除的记录数
     */
    private int physicalDeleteOperation(Struct beforeStruct) throws SQLException {
        ArrayList<String> fieldParams = new ArrayList<>();
        ArrayList<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // where条件,根据主键删除
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : beforeStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(beforeStruct.get(pk));
                    break;
                }
            }

        }

        // 根据主键删除记录
        String sql = "DELETE FROM " + tableName + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * 删除操作(逻辑删除、假删除)
     * @description: 执行更新操作
     * @date: 2024/3/25 18:35
     * @param beforeStruct 变化事件发生之前的值
     * @return int 更新的记录数
     */
    private int logicalDeleteOperation(Struct beforeStruct) throws SQLException {
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // Date:2024/2/23
        // 增加标识是否删除字段
        fieldNames.add(deleteMarkField + "=?");
        fieldValues.add(deleteMarkFieldValue);

        // 增加时间戳标识字段
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField + "=STR_TO_DATE(?, '" + DATE_TIME_FORMAT + "')");
            fieldValues.add(DateUtils.getSysdateStr(null));
        }

        // where条件,根据主键更新
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : beforeStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    fieldValues.add(beforeStruct.get(pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equalsIgnoreCase(logLevel)) {
            log.info(sql);
        }

        return this.connection.update(sql, fieldValues.toArray());
    }

    @Override
    public void stop() {

    }
}

2024-06-17 11:45:16

7. Code optimization

In fact, we do not need to care about the target database type at all; it is enough to look at the source database.

The sink side is therefore reworked: MysqlSinkTask and OracleSinkTask are merged into a single JdbcSinkTask.
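The key change is that the merged task no longer concatenates dialect-specific functions such as STR_TO_DATE or TO_DATE into the SQL; it converts Debezium's encoded date values into java.time objects and binds them as ordinary JDBC parameters (note the plain "?" placeholders and the LocalDateTime.now() used for the timestamp column in the listing below). The conversion lives in OperationUtils.convertToDateValue, whose source is not reproduced here; the following is only a minimal illustrative sketch of that idea, assuming UTC-based conversion, and not the actual helper:

import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

// Illustrative sketch (assumption): map Debezium's encoded date/time values to java.time
// objects so the JDBC driver handles the dialect; the real OperationUtils may differ.
public class DateValueSketch {

    public static Object convertToDateValue(String sourceDbType, String sourceDateColumnType, Object raw) {
        // Debezium may emit null for nullable date columns
        if (raw == null || "".equals(raw) || "null".equals(raw)) {
            return null;
        }

        if ("mysql".equals(sourceDbType)) {
            switch (sourceDateColumnType.toLowerCase()) {
                case "datetime":   // io.debezium.time.Timestamp: epoch milliseconds
                    return LocalDateTime.ofInstant(Instant.ofEpochMilli((Long) raw), ZoneOffset.UTC);
                case "date":       // io.debezium.time.Date: days since 1970-01-01
                    return LocalDate.ofEpochDay(((Number) raw).longValue());
                case "timestamp":  // io.debezium.time.ZonedTimestamp: ISO-8601 string
                    return OffsetDateTime.parse((String) raw).toLocalDateTime();
                default:           // e.g. mysql time: pass through unchanged
                    return raw;
            }
        }

        // oracle DATE/TIMESTAMP: epoch milliseconds or microseconds
        // (io.debezium.time.Timestamp / io.debezium.time.MicroTimestamp)
        long epoch = (Long) raw;
        if (String.valueOf(epoch).length() == 16) {
            epoch = epoch / 1000; // microseconds -> milliseconds
        }
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(epoch), ZoneOffset.UTC);
    }
}

JDBC 4.2 drivers (mysql-connector-java 8, ojdbc8) accept java.time values via setObject, which is what allows a single task to serve both target databases without per-dialect format strings.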

JdbcSinkTask.java

import code.marydon.configs.JdbcSinkConfig;
import code.marydon.connectors.JdbcSinkConnector;
import code.marydon.db.DynamicConnection;
import code.marydon.db.DynamicDataSource;
import code.marydon.enums.OperationEnum;
import code.marydon.utils.OperationUtils;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.connect.data.Field;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

import java.sql.SQLException;
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

/**
 * 接收从kafka订阅到的数据,并将数据同步到目标库
 * @description 目标表可以是mysql数据库,也可以是oracle数据库
 * @attention 这里的数据同步只是单向的,即:
 * 将源表数据同步至目标表中,不支持从目标表同步到源表
 * 同样地,当源表数据少于目标表数据时,目标表多出的数据不会执行删除操作
 * @author Marydon
 * @date 2023-12-27 11:04
 * @version 1.0
 * @email marydon20170307@163.com
 */
@Slf4j
public class JdbcSinkTask extends SinkTask {
    private final ReentrantLock lock = new ReentrantLock();
    /** 源数据库类型 */
    private String sourceDbType;
    private DynamicConnection connection;
    /** 待同步的表名 */
    private String tableName = "";
    /** 主键字段 */
    private String primaryFields = "";
    /** 目标表与源表的字段映射关系 */
    String[] targetColumnsMapArray = new String[0];
    /** 源表日期字段列 */
    String[] sourceDateColumnsArray = new String[0];
    /** 时间戳字段名称 */
    private String timestampField = "";
    /** 删除标识字段名称 */
    private String deleteMarkField = "";
    /** 删除标识字段的值 */
    private String deleteMarkFieldValue = "";
    /** 插入模式 */
    private String insertMode = "";
    /** 日志级别 */
    // private String logLevel = "";
    /**
     * 时间戳与真实数据库timestamp字段的时间差
     * @attention 仅供mysql使用
     */
    private Integer timestampDifference;

    private OperationUtils oUtils;

    @Override
    public String version() {
        return new JdbcSinkConnector().version();
    }

    /**
     * task启动
     * @attention
     * @date 2024-06-06 16:38
     * @param map 参数配置
     */
    @Override
    public void start(Map<String, String> map) {
        String url = map.get(JdbcSinkConfig.CONNECTION_URL);
        String userName = map.get(JdbcSinkConfig.CONNECTION_USER);
        String password = map.get(JdbcSinkConfig.CONNECTION_PASSWORD);
        String driveClassName = map.get(JdbcSinkConfig.CONNECTION_DRIVER_CLASS_NAME);
        this.connection = DynamicDataSource.getDynamicConnection(url, userName, password, driveClassName);

        tableName = map.get(JdbcSinkConfig.TABLE_NAME_FORMAT);
        primaryFields = map.get(JdbcSinkConfig.PK_FIELDS);
        // 目标表字段与源表字段的映射关系
        String targetColumnsMap = map.get(JdbcSinkConfig.COLUMNS_MAP);
        // 源表中表示日期的字段
        String sourceDateColumns = map.get(JdbcSinkConfig.COLUMNS_DATE);

        targetColumnsMapArray = targetColumnsMap.split(",");
        sourceDateColumnsArray = sourceDateColumns.split(",");
        timestampField = map.get(JdbcSinkConfig.SYNC_FIELD_TIMESTAMP_NAME);
        deleteMarkField = map.get(JdbcSinkConfig.SYNC_FIELD_DELETED_MARK_NAME);
        deleteMarkFieldValue = map.get(JdbcSinkConfig.SYNC_FIELD_DELETED_MARK_VALUE);
        insertMode = map.get(JdbcSinkConfig.INSERT_MODE).toLowerCase();
        /*logLevel = map.get(JdbcSinkConfig.LOG_LEVEL).toLowerCase();
        if (JdbcSinkConfig.LOG_LEVEL_DEFAULT.equals(logLevel)) {
            log.info("config:{}", map);
        }*/
        log.debug("config:\n{}", map);

        /*String thd = map.get(JdbcSinkConfig.MYSQL_TIMESTAMP_HOURS_DIFFERENCE);
        if (StringUtils.isNotBlank(thd)) {
            timestampDifference = Integer.parseInt(thd);
        } else {
            timestampDifference = JdbcSinkConfig.MYSQL_TIMESTAMP_HOURS_DIFFERENCE_DEFAULT;
        }*/

        // 参数初始化
        oUtils = OperationUtils.builder()
                .primaryFields(primaryFields)
                .connection(connection)
                .tableName(tableName)
                .deleteMarkField(deleteMarkField)
                .deleteMarkFieldValue(deleteMarkFieldValue)
                .timestampFieldName(timestampField)
                // .logLevel(logLevel)
                // .timestampDifference(timestampDifference)
                .build()
                ;
    }

    //数据put
    @Override
    public void put(Collection<SinkRecord> records) {
        try {
            // 在处理整个records集合之前加锁
            lock.lock();

            try {
                for (SinkRecord record : records) {
                    // 数据处理
                    this.pull(record);
                }
            } finally {
                // 在finally块中确保无论正常执行还是抛出异常都能解锁
                lock.unlock();
            }
        } catch (Exception var4) {
            throw new RuntimeException(var4);
        }
    }

    /**
     * 数据同步(从kafka拉取数据)
     * @explain
     * 1.mysql和Oracle进行互相同步时,必须要保证待同步的字段不包含time类型
     * 2.源表和目标表的主键必须保持一致,因为目标表的数据同步(修改和删除),是按主键走的
     * 3.源表的timestamp类型字段必须与目标表的timestamp类型映射字段保持一致
     * 4.当mysql之间进行数据同步时,支持time类型字段进行同步
     * @param record 记录
     * @throws SQLException
     * 如果SQL执行失败,则抛出异常,这将导致任务停止运行
     * 如果不想影响后续的数据同步,可以捕获异常,然后继续执行
     */
    private void pull(SinkRecord record) throws SQLException {
        // payload分为:before、after、source、op和ts_ms等几部分
        Struct value = (Struct)record.value();
        if (null == value) {
            return;
        }
        // Struct{after=Struct{patient_id=1, zs_id=2, create_time=1702826679000, update_time=19708, time=2023-12-17T21:24:44Z, id=1}, source=Struct{version=2.2.1.Final, connector=mysql,  name=topic-medi_data_cent-70, ts_ms=1703630644000, snapshot=first, db-me_data_cent, table=t_patient_zs, server_id=0, file=binlog.003268, pos=581446144, row=0}, op=r, ts_ms=1703580582732}
        log.info("try synchronize data:\n{}", value);

        // source:事件源的结构信息,包括connector版本、事务ID等
        Struct source = value.getStruct("source");
        // 当入参table.name.format值为""或null时,按目标库的目标表与源库的源表表名一致处理
        if (StringUtils.isEmpty(tableName)) {
            tableName = source.getString("table");
        }

        // 源数据库类型
        sourceDbType = source.getString("connector").toLowerCase();
        String operation = value.getString("op");
        // before:变化事件发生之前的值
        Struct beforeStruct = value.getStruct("before");
        // after:变化事件发生之后的值
        Struct afterStruct = value.getStruct("after");
        log.info("operation:{}", operation);

        // READ
        // operation=r,执行select操作,如果主键存在,执行update操作,主键不存在则执行insert操作
        if (operation.equals(OperationEnum.READ.code())) {// r
            // 根据主键查询数据(beforeStruct为null)
            List<Map<String, Object>> mapList = oUtils.selectOperation(afterStruct);
            // 目标表中没有数据,则执行插入操作
            if (mapList == null || mapList.isEmpty()) {
                // 插入操作
                this.insertOperation(afterStruct, false);
                return;
            }

            // mode=upserts时,如果目标表中该主键已存在,则执行更新操作
            if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equals(insertMode)) {
                // 更新操作
                this.updateOperation(afterStruct);
                return;
            }

        }

        // CREATE
        // operation=c,执行select操作,如果主键存在,执行delete操作和insert操作;主键不存在则执行insert操作
        // 后续,如果看实际需要可以将二者进行拆分
        if (operation.equals(OperationEnum.CREATE.code())) {// c
            // 根据主键查询数据(beforeStruct为null)
            List<Map<String, Object>> mapList = oUtils.selectOperation(afterStruct);
            // 目标表中没有数据,则执行插入操作
            if (mapList == null || mapList.isEmpty()) {
                // 插入操作
                this.insertOperation(afterStruct, false);
                return;
            }

            // 目标表中该主键已存在,则执行删除和插入操作
            if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equals(insertMode)) {
                // Step1:删除操作(物理删除)
                oUtils.physicalDeleteOperation(afterStruct);
                // Step2:插入操作
                this.insertOperation(afterStruct, false);
                // // 更新操作
                // this.updateOperation(afterStruct, sourceConnectorName);
                return;
            }

        }

        // UPDATE
        if (operation.equals(OperationEnum.UPDATE.code())) {// u
            // 当主键字段为空时,禁止更新同步
            if (StringUtils.isEmpty(primaryFields)) return;

            // 根据主键查询数据
            List<Map<String, Object>> mapList = oUtils.selectOperation(afterStruct);
            // 说明该条记录不存在 && 插入模式为upsert
            if ((mapList == null || mapList.isEmpty()) && JdbcSinkConfig.INSERT_MODE_DEFAULT.equals(insertMode)) {// 目标表中没有数据,则执行插入操作
                // 不存在,则执行插入操作
                // 说明:oracle必须执行补全日志操作(否则,执行插入操作没有意义,oracle归档日志默认只记录被更新字段)
                this.insertOperation(afterStruct, false);
                return;
            }

            // 执行更新操作
            this.updateOperation(afterStruct);
            return;
        }

        // DELETE/TRUNCATE
        if (operation.equals(OperationEnum.DELETE.code()) || operation.equals(OperationEnum.TRUNCATE.code())) {// d or t
            // 当主键字段为空时,禁止删除同步
            if (StringUtils.isEmpty(primaryFields)) return;

            // 根据主键查询数据(afterStruct为null)
            List<Map<String, Object>> mapList = oUtils.selectOperation(beforeStruct);

            // 不存在
            if (mapList == null || mapList.isEmpty()) {
                // 说明该条记录不存在 && 物理删除
                if (StringUtils.isBlank(deleteMarkField)) return;

                // 说明该条记录不存在 && 逻辑删除 && 插入模式为upsert
                if (JdbcSinkConfig.INSERT_MODE_DEFAULT.equals(insertMode)) {
                    // 执行插入操作
                    this.insertOperation(beforeStruct, true);
                    return;
                }
            }

            // 执行删除操作(存在,执行物理删除)
            if (StringUtils.isBlank(deleteMarkField)) {
                // 物理删除
                oUtils.physicalDeleteOperation(beforeStruct);
                return;
            }

            // 执行删除操作(存在,执行逻辑删除)
            // 逻辑删除
            oUtils.logicalDeleteOperation(beforeStruct);
        }

    }

    /**
     * 插入操作
     * @description
     * 1.当执行逻辑删除时,发现主键不存在,则执行插入操作(mode=upsert)
     * 2.当执行更新操作时,发现主键不存在,则执行插入操作(mode=upsert)
     * @date 2024/3/25 18:29
     * @param afterStruct 变化事件发生之后的值
     * @param addDeleteMark 是否添加删除标记字段
     *                      需要确保:当addDeleteMark为true时,deleteMarkField字段必定存在
     * @return boolean 插入成功、失败
     */
    private boolean insertOperation(Struct afterStruct, Boolean addDeleteMark) throws SQLException {
        List<Field> fieldList = afterStruct.schema().fields();
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // 源字段类型
        String sourceColumn;
        // 目标表字段名称
        String targetColumn;
        // 源表日期字段名称
        String sourceDateColumnName;
        // 源表日期字段类型
        String sourceDateColumnType;

        for (Field afterField : fieldList) {
            fieldName = afterField.name();
            // fieldValue = afterStruct.get(fieldName);
            fieldValue = oUtils.getFieldValueByNameIgnoreCase(afterStruct, fieldName);

            sourceColumn = "";
            targetColumn = "";

            // 源表与目标表字段映射关系(根据源表字段映射出目标表字段)
            // 说明:入参columns.map没有设值的话,不进此循环
            for (String tcm : targetColumnsMapArray) {
                sourceColumn = tcm.split(":")[0];
                targetColumn = tcm.split(":")[1];// 关键

                // 当要新增的列名与此循环的源表列名一致时,结束循环
                if (sourceColumn.equalsIgnoreCase(fieldName)) break;

                // 最后一轮,还是没有找到映射关系,置空
                sourceColumn = "";
                targetColumn = "";
            }
            // 说明:入参columns.map没有设值
            if (targetColumnsMapArray.length == 0) {
                // 这样的话,要求目标表字段必须和源表表字段保持一致
                sourceColumn = fieldName;
                targetColumn = fieldName;
            }

            // 情形1:表结构发生了变化,导致:源表的字段已经被删除、重命名、新增
            // 情形2:源表与目标表,只映射了部分字段或者源表表字段数量多于目标表字段或者目标表字段数量多于源表字段
            if (StringUtils.isBlank(sourceColumn)) {
                continue;// 执行下一轮循环
            }

            // 源表日期字段转换
            // 将源表日期字段名称,置空
            sourceDateColumnName = "";
            sourceDateColumnType = "";

            // 说明:入参source.table.columns.date没有设值的话,不进此循环
            for (String ttc : sourceDateColumnsArray) {
                sourceDateColumnName = ttc.split(":")[0];
                sourceDateColumnType = ttc.split(":")[1];

                // 当源表的列名为日期字段时,结束循环
                if (sourceDateColumnName.equals(sourceColumn)) break;

                // 最后一轮,还是没有找到源表日期字段与目标表日期字段的映射关系,置空
                sourceDateColumnName = "";
                sourceDateColumnType = "";
            }

            fieldNames.add(targetColumn);
            fieldParams.add("?");
            // 目标日期字段不为空,说明当前要插入的字段是日期类型
            if (!sourceDateColumnName.isEmpty()) {
                fieldValues.add(oUtils.convertToDateValue(sourceDbType, sourceDateColumnType, fieldValue));
                continue;// 执行下一轮循环
            }

            // 非日期类型字段直接插入,不做任何处理
            fieldValues.add(fieldValue);
        }

        if (fieldNames.isEmpty()) return false;

        // Date:2024/2/23
        // 增加时间戳标识字段
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField);
            fieldParams.add("?");
            fieldValues.add(LocalDateTime.now());
        }

        // 增加标识是否删除字段(逻辑删除delete不存在,改为insert时,插入已删除标识)
        if (addDeleteMark) {
            // 删除标识字段肯定不为空,只有当此字段不为null时,addDeleteMark才为true
            fieldNames.add(deleteMarkField);
            fieldParams.add("?");
            fieldValues.add(deleteMarkFieldValue);
        }

        String sql = "insert into " + tableName + "(" + String.join(",", fieldNames) + ")values(" + String.join(",", fieldParams) + ")";

        log.info(sql);
        log.info(fieldValues.toString());

        return this.connection.insert(sql, fieldValues.toArray());
    }
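
    // Illustration with hypothetical values (not generated output): if columns.map were "ID:id,USER_NAME:user_name"
    // and the timestamp column were LAST_UPDATE_TIME, the statement assembled above would take the form
    //   insert into target_table(id,user_name,LAST_UPDATE_TIME) values(?,?,?)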

    /**
     * Update operation
     * @description
     * When beforeStruct is null the data comes from an insert, and unchanged columns are updated as well (mapped or same-named columns);
     * when beforeStruct is not null the event is an update and only the changed columns are updated.
     * (beforeStruct is no longer a parameter; the current implementation updates every mapped, non-primary-key column present in afterStruct.)
     * @date 2024/3/25 18:35
     * @param afterStruct the row values after the change event
     * @return int the number of updated rows
     */
    private int updateOperation(Struct afterStruct) throws SQLException {
        List<Field> fieldList = afterStruct.schema().fields();
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;
        Object fieldValue;

        // Target table column name
        String targetColumn;
        // Source table column name
        String sourceColumn;
        // Source table date column name
        String sourceDateColumnName;
        // Source table date column type
        String sourceDateColumnType;
        // Whether the current column is a primary key
        boolean isPrimaryKey;

        // Note: primary key values must not change
        for (Field afterField : fieldList) {
            fieldName = afterField.name();
            // fieldValue = afterStruct.get(fieldName);
            fieldValue = oUtils.getFieldValueByNameIgnoreCase(afterStruct, fieldName);


            // Data from operation=u where a column has the same value in after and before: skip this column
            // if (null != beforeStruct && String.valueOf(fieldValue).equals(String.valueOf(beforeStruct.get(fieldName)))) {
            //     break;
            // }

            // Data from operation=u where a column's value differs between after and before means the column was updated (MySQL ships all columns before and after an update; Oracle by default only ships the changed columns, excluding primary key columns, but supplemental logging can be enabled to expose all columns)
            // Or, for data from operation=r, beforeStruct is null and all mapped (or same-named) columns are updated (unmapped, differently named or extra columns keep their current values)
            targetColumn = "";
            sourceColumn = "";
            // Column mapping between source and target table (derive the target column from the source column)
            // Note: if the columns.map parameter is not set, this loop is never entered
            for (String tcm : targetColumnsMapArray) {
                sourceColumn = tcm.split(":")[0];
                targetColumn = tcm.split(":")[1];// the key step

                // When the column to update matches this iteration's source column, stop looping
                if (sourceColumn.equalsIgnoreCase(fieldName)) break;

                // Last iteration and still no mapping found: reset to empty
                sourceColumn = "";
                targetColumn = "";
            }
            // Note: the columns.map parameter is not set
            if (targetColumnsMapArray.length == 0) {
                // In that case the target table columns must be identical to the source table columns
                sourceColumn = fieldName;
                targetColumn = fieldName;
            }

            // Case 1: the table structure changed, so source columns were dropped, renamed or added
            // Case 2: only some columns are mapped, or the source table has more columns than the target table (or the other way around)
            if (StringUtils.isBlank(sourceColumn)) {
                continue;// move on to the next column
            }


            isPrimaryKey = false;
            // Filter out primary key columns
            for (String pk : primaryFields.split(",")) {
                // The current column is a primary key
                if (StringUtils.equalsIgnoreCase(pk, targetColumn)) {
                    isPrimaryKey = true;
                    break;
                }
            }

            // targetColumn is a primary key column
            // Purpose: skip primary key columns so that the UPDATE ... SET clause never contains them
            if (isPrimaryKey) continue;// move on to the next column

            fieldNames.add(targetColumn + "=?");

            // Date column conversion
            // Reset the date column name to empty
            sourceDateColumnName = "";
            sourceDateColumnType = "";
            for (String ttc : sourceDateColumnsArray) {
                sourceDateColumnName = ttc.split(":")[0];
                sourceDateColumnType = ttc.split(":")[1];

                // When the column to update is a date column, stop looping
                if (sourceDateColumnName.equals(sourceColumn)) break;

                // Last iteration and still no date column mapping found: reset to empty
                sourceDateColumnName = "";
                sourceDateColumnType = "";
            }

            // Note: the columns.date parameter is not set
            if (sourceDateColumnsArray.length == 0) {
                // The source table has no date columns
                // Reset the date column name to empty
                sourceDateColumnName = "";
                sourceDateColumnType = "";
            }

            // The source date column name is not empty, so the column being updated is a date type
            if (!sourceDateColumnName.isEmpty()) {
                fieldValues.add(oUtils.convertToDateValue(sourceDbType, sourceDateColumnType, fieldValue));
                continue;// move on to the next column
            }

            // Non-date columns are written as-is, without any conversion
            fieldValues.add(fieldValue);
        }

        // // operation=u && no columns actually changed
        // if (null != beforeStruct && fieldNames.isEmpty()) {
        //     return 11;// returning 11 means no data changed, so there is no need to run the update
        // }

        // Add the timestamp column
        if (StringUtils.isNotBlank(timestampField)) {
            fieldNames.add(timestampField + "=?");
            fieldValues.add(LocalDateTime.now());
        }

        // WHERE clause: update by primary key
        for (String pk : primaryFields.split(",")) {
            // Add the primary key placeholder
            fieldParams.add(pk + "=?");
            for (Field f : fieldList) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    // Fetch the value by primary key name
                    // fieldValues.add(afterStruct.get(pk));
                    fieldValues.add(oUtils.getFieldValueByNameIgnoreCase(afterStruct, pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        log.info(sql);
        log.info(fieldValues.toString());

        // Execute the update
        return this.connection.update(sql, fieldValues.toArray());
    }
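
    // Illustration with hypothetical values (not generated output): with primaryFields = "ID" and the mapping above,
    // the statement assembled here would take the form
    //   UPDATE target_table SET user_name=?,LAST_UPDATE_TIME=? WHERE ID=?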

    @Override
    public void stop() {

    }
}
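
A note on the string formats assumed by the two methods above: each columns.map entry is parsed as a sourceColumn:targetColumn pair, each source.table.columns.date entry as a dateColumn:dateType pair, and the primary key list as a comma-separated string. The sketch below only illustrates that parsing with made-up values (the property values and column names are hypothetical, and splitting the raw property value on "," is an assumption about how the arrays are populated):

public class MappingFormatSketch {
    public static void main(String[] args) {
        // Hypothetical configuration values in the formats expected by the split(":") / split(",") calls above
        String[] targetColumnsMapArray = "ID:id,USER_NAME:user_name,CREATE_DATE:create_date".split(",");
        String[] sourceDateColumnsArray = "CREATE_DATE:datetime".split(",");
        String primaryFields = "ID";

        for (String tcm : targetColumnsMapArray) {
            System.out.println(tcm.split(":")[0] + " -> " + tcm.split(":")[1]);
        }
        for (String ttc : sourceDateColumnsArray) {
            System.out.println("date column " + ttc.split(":")[0] + " (" + ttc.split(":")[1] + ")");
        }
        for (String pk : primaryFields.split(",")) {
            System.out.println("primary key column: " + pk);
        }
    }
}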

OperationUtils.java

import code.marydon.db.DynamicConnection;
import code.marydon.enums.DatabaseEnum;
import lombok.Builder;
import lombok.Setter;
import lombok.experimental.Accessors;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.connect.data.Field;
import org.apache.kafka.connect.data.Struct;

import java.sql.SQLException;
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/**
 * Data operation utility class
 * @description
 * @author Marydon
 * @date 2024-03-27 20:19
 * @version 1.0
 * @email marydon20170307@163.com
 */

@Slf4j
@Setter
@Builder
@Accessors(chain = true)
public class OperationUtils {
    // public static final String TARGET_DB_MYSQL = DatabaseEnum.DB_MYSQL_NEW.getTypeName();
    // public static final String TARGET_DB_ORACLE = DatabaseEnum.DB_ORACLE.getTypeName();
    private DynamicConnection connection;
    private String primaryFields;
    private String tableName;
    private String deleteMarkField;
    private String deleteMarkFieldValue;
    private String timestampFieldName;
    // private String logLevel;
    // The MySQL source connector can introduce a time offset when syncing timestamp columns.
    // Setting the time zone to UTC/GMT when creating the Debezium MySQL connector is enough to avoid it:
    // database.connectionTimeZone=UTC
    // private Integer timestampDifference;

    /**
     * Select operation
     * @description Look up the record by primary key
     * @date 2024/3/26 18:04
     * @param afterStruct  the row values after the change event
     * @return java.util.List<java.util.Map<java.lang.String,java.lang.Object>>
     */
    public List<Map<String, Object>> selectOperation(Struct afterStruct) throws SQLException {
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // WHERE clause: query by primary key
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : afterStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    // fieldValues.add(afterStruct.get(pk));
                    fieldValues.add(this.getFieldValueByNameIgnoreCase(afterStruct, pk));
                    break;
                }
            }

        }

        // Look up the record by primary key
        String sql = "SELECT 1 FROM " + tableName + " WHERE " + String.join(" AND ", fieldParams);

        log.debug(sql);
        log.debug(fieldValues.toString());

        return connection.queryForList(sql, fieldValues.toArray());
    }


    /**
     * Delete operation (physical delete)
     * @description
     * @date 2024/3/25 18:35
     * @param beforeStruct the row values before the change event
     * @return int the number of deleted rows
     */
    public int physicalDeleteOperation(Struct beforeStruct) throws SQLException {
        ArrayList<String> fieldParams = new ArrayList<>();
        ArrayList<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // WHERE clause: delete by primary key
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : beforeStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    // fieldValues.add(beforeStruct.get(pk));
                    fieldValues.add(this.getFieldValueByNameIgnoreCase(beforeStruct, pk));
                    break;
                }
            }

        }

        // Delete the record by primary key
        String sql = "DELETE FROM " + tableName + " WHERE " + String.join(" AND ", fieldParams);

        log.info(sql);
        log.info(fieldValues.toString());

        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * Delete operation (logical delete, i.e. fake delete)
     * @description Implemented as an update
     * @date 2024/3/25 18:35
     * @param beforeStruct the row values before the change event
     * @return int the number of updated rows
     */
    public int logicalDeleteOperation(Struct beforeStruct) throws SQLException {
        List<String> fieldNames = new ArrayList<>();
        List<String> fieldParams = new ArrayList<>();
        List<Object> fieldValues = new ArrayList<>();
        String fieldName;

        // Date:2024/2/23
        // Set the delete-mark column
        fieldNames.add(deleteMarkField + "=?");
        fieldValues.add(deleteMarkFieldValue);

        // Add the timestamp column
        if (StringUtils.isNotBlank(timestampFieldName)) {
            fieldNames.add(timestampFieldName + "=?");
            fieldValues.add(LocalDateTime.now());
        }

        // WHERE clause: update by primary key
        for (String pk : primaryFields.split(",")) {
            fieldParams.add(pk + "=?");
            for (Field f : beforeStruct.schema().fields()) {
                fieldName = f.name();

                if (StringUtils.equalsIgnoreCase(fieldName, pk)) {
                    // fieldValues.add(beforeStruct.get(pk));
                    fieldValues.add(this.getFieldValueByNameIgnoreCase(beforeStruct, pk));
                    break;
                }
            }

        }

        String sql = "UPDATE " + tableName + " SET " + String.join(",", fieldNames) + " WHERE " + String.join(" AND ", fieldParams);

        log.info(sql);
        log.info(fieldValues.toString());

        return this.connection.update(sql, fieldValues.toArray());
    }

    /**
     * Convert to a date value
     * @description Debezium converts the source table's date columns into numeric values, which need to be converted back in Java;
     * the resulting Java date object is then written via JDBC directly into the corresponding date column of the target table
     * @attention
     * @date 2024-06-07 09:25
     * @param sourceDbType source database type
     * @param sourceDateColumnType type of the source table's date column
     * @param fieldValue value of the source table's date column
     * @return java.lang.Object LocalDateTime, LocalDate or LocalTime
     */
    public Object convertToDateValue(String sourceDbType, String sourceDateColumnType, Object fieldValue) {
        // Source database: MySQL
        if (sourceDbType.equals(DatabaseEnum.DB_MYSQL_NEW.getTypeName())) {
            // The source column is a MySQL datetime
            if (sourceDateColumnType.equalsIgnoreCase("datetime")) {
                // mysql datetime
                // io.debezium.time.Timestamp
                return DateUtils.fromTimeMills((Long) fieldValue);
            }

            // The source column is a MySQL date
            if (sourceDateColumnType.equalsIgnoreCase("date")) {
                // mysql date
                // io.debezium.time.Date
                return DateUtils.calculateDateFromEpochDays((Integer) fieldValue);
            }

            // The source column is a MySQL timestamp
            if (sourceDateColumnType.equalsIgnoreCase("timestamp")) {
                // mysql timestamp
                // io.debezium.time.ZonedTimestamp
                return DateUtils.ISO8601ToLocalDateTime((String) fieldValue);
            }

            // The source column is a MySQL time
            if (sourceDateColumnType.equalsIgnoreCase("time")) {
                // io.debezium.time.MicroTime
                // fieldValues.add(DateUtils.secondsToLocalTime((Long) fieldValue / 1000000));
                return DateUtils.secondsToLocalTime((Long) fieldValue);
            }
        }

        // Source database: Oracle
        if (sourceDbType.equals(DatabaseEnum.DB_ORACLE.getTypeName())) {
            // oracle timestamp-->io.debezium.time.MicroTimestamp
            // oracle date (with or without a time part)-->io.debezium.time.Timestamp
            return DateUtils.fromTimeMills((Long) fieldValue);
        }

        // Source database: SQL Server
        if (sourceDbType.equals(DatabaseEnum.DB_SQL_SERVER.getTypeName())) {
            // The source column is a SQL Server datetime
            if (sourceDateColumnType.equalsIgnoreCase("datetime")) {
                // sqlserver datetime
                // io.debezium.time.Timestamp
                return DateUtils.fromTimeMills((Long) fieldValue);
            }
            // The source column is a SQL Server datetime2
            if (sourceDateColumnType.equalsIgnoreCase("datetime2")) {
                // sqlserver datetime2
                // io.debezium.time.Timestamp
                return DateUtils.fromTimeMills((Long) fieldValue);
            }

            // The source column is a SQL Server date
            if (sourceDateColumnType.equalsIgnoreCase("date")) {
                // sqlserver date
                // io.debezium.time.Date
                return DateUtils.calculateDateFromEpochDays((Integer) fieldValue);
            }

            // The source column is a SQL Server smalldatetime
            if (sourceDateColumnType.equalsIgnoreCase("smalldatetime")) {
                // sqlserver smalldatetime
                // io.debezium.time.Timestamp
                return DateUtils.fromTimeMills((Long) fieldValue);
            }

            // The source column is a SQL Server datetimeoffset
            if (sourceDateColumnType.equalsIgnoreCase("datetimeoffset")) {
                // sqlserver datetimeoffset
                // io.debezium.time.ZonedTimestamp
                return DateUtils.ISO8601ToLocalDateTime((String) fieldValue);
            }

            // The source column is a SQL Server time
            if (sourceDateColumnType.equalsIgnoreCase("time")) {
                // io.debezium.time.MicroTime
                return DateUtils.secondsToLocalTime(((Integer) fieldValue).longValue());
            }
        }

        return fieldValue;
    }

    /**
     * Get a field value by field name, ignoring case
     * @attention Struct's native get() lookup is case-sensitive.
     * However, inconsistent naming in the source table leaves column names with mixed casing, so the value cannot always be
     * fetched with the exact name; the lookup therefore ignores the case of the field name.
     * @date 2024-06-07 09:25
     * @param struct the Struct object
     * @param fieldNameIgnoreCase the field name (case-insensitive)
     * @return java.lang.Object the field value
     */
    public Object getFieldValueByNameIgnoreCase(Struct struct, String fieldNameIgnoreCase) {
        String fieldName;
        for (Field field : struct.schema().fields()) {
            fieldName = field.name();
            if (fieldName.equalsIgnoreCase(fieldNameIgnoreCase)) {
                return struct.get(field);
            }
        }

        // No match found: return null
        return null;
    }
}
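
For orientation, here is a small usage sketch of this utility class. The wiring below is an assumption for illustration only (table and column names are hypothetical); in the actual connector the sink task builds OperationUtils itself and decides between insert, update and logical delete:

import code.marydon.db.DynamicConnection;
import org.apache.kafka.connect.data.Struct;

import java.sql.SQLException;

// Usage sketch only: values are hypothetical and the dispatch logic is simplified.
public class OperationUtilsUsageSketch {

    int handleDeleteEvent(DynamicConnection connection, Struct beforeStruct) throws SQLException {
        OperationUtils oUtils = OperationUtils.builder()
                .connection(connection)                 // connection to the target database
                .tableName("target_table")              // hypothetical target table
                .primaryFields("ID")                    // comma-separated primary key columns
                .timestampFieldName("LAST_UPDATE_TIME") // timestamp column in the target table
                .deleteMarkField("IS_DELETED")          // delete-mark column in the target table
                .deleteMarkFieldValue("true")           // value written on a fake delete
                .build();

        // A delete event is replayed as an update: flag the row instead of removing it
        return oUtils.logicalDeleteOperation(beforeStruct);
    }
}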

DateUtils.java

In fact, when writing to the database via JDBC there is no need to convert date values to strings before persisting them: Java's date/time types map seamlessly onto the database's date types.
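
A minimal sketch of that point, assuming a JDBC 4.2+ driver and a hypothetical table demo_table with a date/datetime column last_update_time:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.time.LocalDateTime;

// Sketch: JDBC 4.2 drivers accept java.time values directly via setObject,
// so the value does not need to be formatted as a string first.
public class JdbcDateWriteSketch {
    static void insertRow(Connection conn, long id) throws SQLException {
        String sql = "insert into demo_table(id, last_update_time) values(?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, id);
            ps.setObject(2, LocalDateTime.now()); // the driver maps this to the column's date type
            ps.executeUpdate();
        }
    }
}

The DateUtils class itself follows.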

import code.marydon.configs.JdbcSinkConfig;
import org.apache.commons.lang3.StringUtils;

import java.time.*;
import java.time.format.DateTimeFormatter;

/**
 * Date utility class
 * <a href="https://www.cnblogs.com/Marydon20170307/p/17921900.html">blog post</a>
 * @description Handles conversions between date formats
 * 1 second = 1000 milliseconds
 * 1 millisecond = 1000 microseconds
 * 1 microsecond = 1000 nanoseconds
 * 1 nanosecond = 1000 picoseconds
 * 1 picosecond = 1000 femtoseconds
 * @explain
 * MySQL datetime is converted by io.debezium.connector.mysql.MySqlConnector into epoch milliseconds; call DateUtils.fromTimeMills(1702027934000L)
 * MySQL date is converted by io.debezium.connector.mysql.MySqlConnector into a day count; call DateUtils.calculateDateFromEpochDays(19699)
 * MySQL time is converted by io.debezium.connector.mysql.MySqlConnector into microseconds; call DateUtils.secondsToLocalTime(34419000000L)
 * MySQL timestamp is converted by io.debezium.connector.mysql.MySqlConnector into an ISO 8601 timestamp string; call DateUtils.ISO8601ToLocalDateTime("2023-12-08T15:32:19Z", DateUtils.DEFAULT_DIFFERENCE_HOURS)
 * Oracle date and timestamp are converted by io.debezium.connector.oracle.OracleConnector into timestamps; fromTimeMills(1702837490000L), timestampToString(1702826697000000L)
 * Judging by the converted values, an Oracle timestamp simply carries three more trailing zeros than an Oracle date (microseconds vs. milliseconds)
 * @author Marydon
 * @date 2023-12-07 11:48
 * @version 1.0
 * @email marydon20170307@163.com
 */
public class DateUtils {
    // UTC time zone
    public static final String UTC_TIME_ZONE = JdbcSinkConfig.DB_TIMEZONE_DEFAULT;

    /**
     * Convert epoch milliseconds to LocalDateTime
     * @description the time zone used is UTC_TIME_ZONE (JdbcSinkConfig.DB_TIMEZONE_DEFAULT)
     * @attention jdk>=1.8
     * @date 2020-09-10 11:22
     * @param timeMills epoch milliseconds
     * @return java.time.LocalDateTime
     */
    public static LocalDateTime fromTimeMills(Long timeMills){
        return fromTimeMills(timeMills, UTC_TIME_ZONE);
    }

    public static LocalDateTime fromTimeMills(Long timeMills, String timeZone){
        if (null == timeMills || timeMills <= 0) {
            return null;
        }

        // 16 digits: the value is in microseconds; convert it to milliseconds
        if (String.valueOf(timeMills).length() == 16) {
            timeMills = timeMills / 1000;
        }
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(timeMills), ZoneId.of(timeZone));
    }

    /**
     * Convert an ISO 8601 date string to LocalDateTime
     * @description
     * In MySQL, the timestamp type is converted by io.debezium.connector.mysql.MySqlConnector into ISO 8601 format,
     * and the resulting time comes out 6 hours earlier
     * @param dateTimeStr ISO 8601 date-time string, e.g.
     * 2023-12-07T16:00:00Z
     * "2023-12-07T16:00:00Z" is an ISO 8601 date-time representation.
     * It denotes the instant December 7, 2023, 16:00:00, where "T" separates date and time and "Z" stands for Coordinated Universal Time (UTC).
     * The format is precise to the second.
     * @return java.time.LocalDateTime
     */
    public static LocalDateTime ISO8601ToLocalDateTime(String dateTimeStr) {
        return ISO8601ToLocalDateTime(dateTimeStr, 0);
    }

    /**
     * Convert an ISO 8601 date string to LocalDateTime
     * @description
     * In MySQL, the timestamp type is converted by io.debezium.connector.mysql.MySqlConnector into ISO 8601 format,
     * and the resulting time comes out 6 hours earlier
     * @param dateTimeStr ISO 8601 date-time string, e.g.
     * 2023-12-07T16:00:00Z
     * "2023-12-07T16:00:00Z" is an ISO 8601 date-time representation.
     * It denotes the instant December 7, 2023, 16:00:00, where "T" separates date and time and "Z" stands for Coordinated Universal Time (UTC).
     * The format is precise to the second.
     * @param hours number of hours to shift backwards/forwards
     * @return java.time.LocalDateTime
     */
    public static LocalDateTime ISO8601ToLocalDateTime(String dateTimeStr, Integer hours) {
        if (StringUtils.isBlank(dateTimeStr)) {
            return null;
        }

        // UTC time
        DateTimeFormatter formatter = DateTimeFormatter.ISO_DATE_TIME;
        ZonedDateTime zonedDateTime = ZonedDateTime.parse(dateTimeStr, formatter);
        LocalDateTime ldt = zonedDateTime.toLocalDateTime();
        if (hours != 0) {
            return ldt.plusHours(hours);
        } else {
            return ldt;
        }

    }

    /**
     * Derive a date from a day count
     * @param days number of days since 1970-01-01
     *             e.g. 19699
     * @return java.time.LocalDate
     */
    public static LocalDate calculateDateFromEpochDays(Integer days) {
        if (null == days || days <= 0) {
            return null;
        }
        // LocalDate for January 1, 1970
        LocalDate baseDate = LocalDate.of(1970, 1, 1);

        // Add the day count to obtain the resulting LocalDate
        // LocalDate calculatedDate = baseDate.plusDays(days);
        // To convert the LocalDate into a LocalDateTime (time defaults to 00:00:00):
        // return calculatedDate.atStartOfDay();
        return baseDate.plusDays(days);
    }

    /**
     * Convert a number of seconds to a time of day
     * @param seconds 77400
     * @return java.time.LocalTime 21:30:00
     */
    public static LocalTime secondsToLocalTime(Long seconds) {
        if (null == seconds || seconds <= 0) {
            return null;
        }

        if (String.valueOf(seconds).length() == 8) {
            seconds = seconds / 1000;
        } else if (String.valueOf(seconds).length() == 11) {
            seconds = seconds / 1000000;
        }

        int seconds2 = seconds.intValue();

        // Compute hours, minutes and the remaining seconds
        int hours = seconds2 / 3600;
        int remainingSecondsAfterHours = seconds2 % 3600;
        int minutes = remainingSecondsAfterHours / 60;
        int secondsOnly = remainingSecondsAfterHours % 60;

        // Build the time object with LocalTime.of
        return LocalTime.of(hours, minutes, secondsOnly);
    }
}    
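
As a quick sanity check, here is a usage sketch with the sample values already quoted in the Javadoc above. The datetime result depends on the configured time zone, so only the zone-independent outputs are stated; the class names in the sketch are hypothetical:

import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;

// Usage sketch for DateUtils with the sample values quoted in its Javadoc.
public class DateUtilsDemo {
    public static void main(String[] args) {
        // MySQL datetime arrives as epoch milliseconds; the result depends on the configured time zone
        LocalDateTime datetime = DateUtils.fromTimeMills(1702027934000L);
        System.out.println(datetime);

        // MySQL date arrives as days since 1970-01-01; 19699 -> 2023-12-08
        LocalDate date = DateUtils.calculateDateFromEpochDays(19699);
        System.out.println(date);

        // MySQL time arrives as microseconds; 34419000000 -> 09:33:39
        LocalTime time = DateUtils.secondsToLocalTime(34419000000L);
        System.out.println(time);

        // MySQL timestamp arrives as an ISO 8601 string in UTC
        LocalDateTime ts = DateUtils.ISO8601ToLocalDateTime("2023-12-08T15:32:19Z");
        System.out.println(ts); // 2023-12-08T15:32:19 (the wall-clock part of the UTC value)
    }
}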

 

Closing remarks

  If you spot any mistakes in this article or have anything to add, feel free to leave a comment!
