Flink read/write Kafka -- reading Kafka

Introduction

This post walks through how to read from, write to, and configure Kafka in Flink in practice.

Flink version: 1.13.2

GitHub repo: https://github.com/dahai1996/mdw-flink-quickstart


Reading from Kafka

Add the dependency

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>

Configure and create the Kafka source

        Properties propertiesSource = new Properties();
        // Kafka broker list; replace "ip" with your own brokers, e.g. "host1:9092,host2:9092"
        propertiesSource.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "ip");
        // Consumer group id, used for offset tracking
        propertiesSource.setProperty(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        propertiesSource.setProperty(ConsumerConfig.REQUEST_TIMEOUT_MS_CONFIG, "60000");
        // Start from the earliest offset when the group has no committed offset yet
        propertiesSource.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        FlinkKafkaConsumer<String> source = new FlinkKafkaConsumer<>(topic, new SimpleStringSchema(), propertiesSource);

Note: SimpleStringSchema is the deserialization schema; it simply reads each record as a String and is what you will use most of the time.
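
If the payload is not plain text, you can implement DeserializationSchema yourself. The sketch below is only an illustration, assuming a hypothetical Event POJO and Jackson on the classpath; it parses each record value as JSON:

    import java.io.IOException;

    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.flink.api.common.serialization.DeserializationSchema;
    import org.apache.flink.api.common.typeinfo.TypeInformation;

    // A minimal sketch of a custom DeserializationSchema, assuming a hypothetical Event POJO
    // and Jackson on the classpath; adapt it to your own message format.
    public class EventDeserializationSchema implements DeserializationSchema<Event> {
        private static final ObjectMapper MAPPER = new ObjectMapper();

        @Override
        public Event deserialize(byte[] message) throws IOException {
            // Each Kafka record value arrives as raw bytes; here we parse it as JSON
            return MAPPER.readValue(message, Event.class);
        }

        @Override
        public boolean isEndOfStream(Event nextElement) {
            // Kafka topics are unbounded, so the stream never ends
            return false;
        }

        @Override
        public TypeInformation<Event> getProducedType() {
            // Tells Flink which type this schema produces
            return TypeInformation.of(Event.class);
        }
    }

Passing an instance of it instead of new SimpleStringSchema() makes the source emit Event objects instead of Strings.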

Wrap it with a builder

    import java.util.Properties;

    import org.apache.flink.api.common.serialization.DeserializationSchema;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
    import org.apache.kafka.clients.consumer.ConsumerConfig;

    public class SourceKafkaBuilder<T> {
        private final Properties properties = new Properties();
        private final String topic;
        private final DeserializationSchema<T> valueDeserializer;

        // RunEnv is a small environment holder from the repo above; it supplies the Kafka broker list.
        public SourceKafkaBuilder(RunEnv runEnv, String topic, String groupId, DeserializationSchema<T> valueDeserializer) {
            properties.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, runEnv.getKafkaHost());
            properties.setProperty(ConsumerConfig.GROUP_ID_CONFIG, groupId);

            this.topic = topic;
            this.valueDeserializer = valueDeserializer;
        }

        public SourceKafkaBuilder<T> setSessionTimeOutMs(String sessionTimeOutMs) {
            properties.setProperty(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, sessionTimeOutMs);
            return this;
        }

        public SourceKafkaBuilder<T> setRequestTimeOutMs(String requestTimeOutMs) {
            properties.setProperty(ConsumerConfig.REQUEST_TIMEOUT_MS_CONFIG, requestTimeOutMs);
            return this;
        }

        public SourceKafkaBuilder<T> setAutoOffsetResetConfig(String autoOffsetResetConfig) {
            properties.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, autoOffsetResetConfig);
            return this;
        }

        // Only read messages from committed Kafka transactions (relevant for exactly-once pipelines).
        public SourceKafkaBuilder<T> setExactlyOnce() {
            properties.setProperty(ConsumerConfig.ISOLATION_LEVEL_CONFIG, "read_committed");
            return this;
        }

        /**
         * @param key   any key defined in ConsumerConfig
         * @param value the value to set
         * @return this builder
         * @see ConsumerConfig
         */
        public SourceKafkaBuilder<T> setPropertyValue(String key, String value) {
            properties.setProperty(key, value);
            return this;
        }

        public FlinkKafkaConsumer<T> build() {
            return new FlinkKafkaConsumer<>(topic, valueDeserializer, properties);
        }
    }

Usage:

    FlinkKafkaConsumerBase<String> sourceKafka =
            new SourceKafkaBuilder<>(uat, topicName, groupId, new SimpleStringSchema())
                    .setRequestTimeOutMs("60000")
                    .setSessionTimeOutMs("60000")
                    .build()
                    .setStartFromGroupOffsets();
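
For completeness, here is a minimal sketch of wiring the built source into a job; the print() sink and the job name are only for illustration:

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // addSource turns the consumer into a DataStream of the deserialized type (String here)
    DataStream<String> stream = env.addSource(sourceKafka);
    stream.print();

    env.execute("read-kafka-demo");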

Note: for any other setting, use the setPropertyValue() method; see the constants in the ConsumerConfig class for the available keys.
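
For example, to cap the number of records fetched per poll (the key is a standard ConsumerConfig constant; the value 500 is just an example):

    FlinkKafkaConsumer<String> capped = new SourceKafkaBuilder<>(uat, topicName, groupId, new SimpleStringSchema())
            .setPropertyValue(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "500")
            .build();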

Note 2: the setExactlyOnce() method is related to end-to-end exactly-once consistency, which will be covered in a later post.
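
As a short preview: read_committed only matters when the upstream producer writes transactionally, and exactly-once on the Flink side additionally requires checkpointing, e.g. (the 60-second interval is just an example):

    // Enable checkpointing on the env from the sketch above; with checkpointing on,
    // the Kafka source commits its offsets back to Kafka when a checkpoint completes.
    env.enableCheckpointing(60_000, CheckpointingMode.EXACTLY_ONCE);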

