Flink Unified Batch/Stream API (Connector Examples)

Connectors

Connector reference: https://nightlies.apache.org/flink/flink-docs-release-1.15/zh/docs/connectors/datastream/overview/

Connectors currently supported by Flink

Flink currently supports the following systems:

System                   Used as
Apache Kafka             source/sink
Apache Cassandra         sink
Amazon Kinesis Streams   source/sink
Elasticsearch            sink
FileSystem               sink
RabbitMQ                 source/sink
Google PubSub            source/sink
Hybrid Source            source
Apache NiFi              source/sink
Apache Pulsar            source
JDBC                     sink

Using a connector usually requires an additional third-party component, such as a database server or a message queue. Note that although the connectors listed here are part of the Flink project and are included in the source release, they are not bundled in the binary distribution.
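Because the connectors are not bundled in the binary distribution, each one has to be added as a project dependency. As an illustration, the JDBC connector used in the example below would typically be pulled in via Maven; the versions shown are an assumption for a Flink 1.15 setup and should be matched to your own environment:

```xml
<!-- JDBC connector (not included in the Flink binary distribution) -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc</artifactId>
    <version>1.15.0</version>
</dependency>
<!-- JDBC driver for the target database, here MySQL -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>8.0.28</version>
</dependency>
```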

Flink also provides a number of additional connectors through Apache Bahir, including:

System            Used as
Apache ActiveMQ   source/sink
Apache Flume      sink
Redis             sink
Akka              sink
Netty             source

JDBC Example
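The sink below assumes a MySQL table `t_student` already exists in the `test` database. A matching DDL, sketched from the INSERT statement in the code (column types are an assumption), would be:

```sql
CREATE TABLE `t_student` (
    `id`   INT NOT NULL AUTO_INCREMENT,
    `name` VARCHAR(255),
    `age`  INT,
    PRIMARY KEY (`id`)
);
```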

The code is as follows:

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.apache.flink.connector.jdbc.JdbcConnectionOptions;
import org.apache.flink.connector.jdbc.JdbcSink;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

/**
 * Connectors - JDBC
 */
public class ConnectorsDemo {

    public static void main(String[] args) throws Exception {
        //1.env
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        //2.Source: a single in-memory Student record
        env.fromElements(new Student(null, "tonyma111", 18))
                //3.Transformation: none needed here
                //4.Sink: write each Student to MySQL via the JDBC connector
                .addSink(JdbcSink.sink(
                        "INSERT INTO `t_student` (`id`, `name`, `age`) VALUES (null, ?, ?)",
                        // bind each Student's fields to the statement placeholders
                        (ps, s) -> {
                            ps.setString(1, s.getName());
                            ps.setInt(2, s.getAge());
                        },
                        new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
                                .withUrl("jdbc:mysql://localhost:3306/test?serverTimezone=Asia/Shanghai&useUnicode=true&characterEncoding=UTF-8&useSSL=false")
                                .withUsername("root")
                                .withPassword("root")
                                .withDriverName("com.mysql.cj.jdbc.Driver")
                                .build()));
        //5.execute
        env.execute();
    }

    @Data
    @NoArgsConstructor
    @AllArgsConstructor
    public static class Student {
        private Integer id;
        private String name;
        private Integer age;
    }
}

After the job finishes, one new row appears in t_student.

Kafka Example
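The original post leaves this section empty. A minimal sketch of consuming from Kafka with the unified KafkaSource API available in Flink 1.15 might look like the following; the broker address, topic name, and consumer group are placeholder assumptions, and a Kafka broker must be running for the job to do anything:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

/**
 * Connectors - Kafka (sketch; assumes a broker at localhost:9092 and a topic named "test-topic")
 */
public class KafkaSourceDemo {

    public static void main(String[] args) throws Exception {
        //1.env
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        //2.Source: build a KafkaSource reading string values from the beginning of the topic
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")   // assumed broker address
                .setTopics("test-topic")                 // assumed topic name
                .setGroupId("flink-demo")                // assumed consumer group
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();
        //3.Transformation + 4.Sink: print each record, just for demonstration
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "Kafka Source")
                .print();
        //5.execute
        env.execute();
    }
}
```

This mirrors the structure of the JDBC example above (env, source, transformation, sink, execute); like the JDBC connector, the Kafka connector (flink-connector-kafka) must be added to the project's dependencies.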

 

posted @ 2022-05-28 20:43 by 残城碎梦