Manually Initializing a Kafka Consumer

Background

The project needs to switch topics dynamically, and the topic list must not live in the yml configuration file; I also found no way to pull topics from another file into the @KafkaListener annotation. So instead of using @KafkaListener, the consumer is initialized manually and runs in its own thread.

Approach

In the consumer thread, every loop iteration fetches the latest topic list from the configuration source and compares it with the previous one. If it has changed, either re-subscribe to the new topics or re-initialize the consumer.
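The change-detection step above can be sketched in isolation. This is a minimal, self-contained version: the real class uses Guava's `Splitter`, while this sketch uses `String.split` to avoid external dependencies, and `parseTopics`/`hasChanged` are illustrative names, not part of the original class.

```java
import java.util.Arrays;
import java.util.List;

public class TopicChangeDetector {

    // Split the comma-separated topic string from the config source.
    static List<String> parseTopics(String raw) {
        return Arrays.asList(raw.split(","));
    }

    // The consumer thread re-subscribes only when the parsed list
    // differs from the list it is currently subscribed to.
    static boolean hasChanged(List<String> current, String raw) {
        return !current.equals(parseTopics(raw));
    }

    public static void main(String[] args) {
        List<String> current = parseTopics("orders,payments");
        System.out.println(hasChanged(current, "orders,payments")); // false
        System.out.println(hasChanged(current, "orders,refunds"));  // true
    }
}
```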

Implementation
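The class below loads its topic list from a `topics.json` file on the classpath. Judging from the parsing code (a `"topics"` key holding a comma-separated string), the file would look something like this; the topic names are placeholders:

```json
{
  "topics": "topic-a,topic-b"
}
```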

@Service
@Slf4j
public class KafkaConsumers implements InitializingBean {

    @Value("${spring.kafka.bootstrap-servers}") 
    private String bootstrapServers;

    @Value("${spring.kafka.consumer.group-id}") 
    private String groupId;

    @Value("${spring.kafka.consumer.key-deserializer}") 
    private String keyDeserializer;

    @Value("${spring.kafka.consumer.value-deserializer}") 
    private String valueDeserializer;

    /**
     * the consumer
     */
    private static KafkaConsumer<String, byte[]> consumer;
    /**
     * current topic list
     */
    private List<String> topicList;
 
    public static String getNewTopic() {
        try {
            File file = ResourceUtils.getFile("classpath:topics.json");
            String jsonData = FileUtils.readFileToString(file, StandardCharsets.UTF_8);
            JSONObject jsonObject = JSON.parseObject(jsonData);
            return jsonObject.getString("topics");
        } catch (IOException e) {
            log.error("failed to read topics.json!");
        }
        return null;
    }
 
    /**
     * Initialize the consumer
     *
     * @param topicList topics to subscribe to
     * @return the initialized consumer
     */
    public KafkaConsumer<String, byte[]> getInitConsumer(List<String> topicList) {
        // configuration
        Properties props = new Properties();
        // Kafka broker addresses
        props.put("bootstrap.servers", bootstrapServers);
        // a consumer group must be specified
        props.put("group.id", groupId);
        // deserializer classes for message keys and values
        props.put("key.deserializer", keyDeserializer);
        props.put("value.deserializer", valueDeserializer);
        // create the consumer instance
        KafkaConsumer<String, byte[]> tmpConsumer = new KafkaConsumer<>(props);
        // subscribe on the new instance (not the static field, which is still null here)
        tmpConsumer.subscribe(topicList);
        return tmpConsumer;
    }
 
    /**
     * Start the consumer thread
     */
    @Override
    @Override
    public void afterPropertiesSet() {
        // initialize the topic list
        topicList = Splitter.on(",").splitToList(Objects.requireNonNull(getNewTopic()));
        if (CollectionUtils.isNotEmpty(topicList)) {
            consumer = getInitConsumer(topicList);
            // start a consumer thread
            new Thread(() -> {
                while (true) {
                    // fetch the latest topics from the config source (comma-separated string)
                    final List<String> newTopic = Splitter.on(",").splitToList(Objects.requireNonNull(getNewTopic()));
                    // if the topics have changed
                    if (!topicList.equals(newTopic)) {
                        log.info("topics changed: newTopic:{}, oldTopic:{}", newTopic, topicList);
                        // method one: re-subscribe to the new topics:
                        topicList = newTopic;
                        consumer.subscribe(newTopic);
                        // method two: close the old consumer and initialize a new one
                        //consumer.close();
                        //topicList = newTopic;
                        //consumer = getInitConsumer(newTopic);
                        continue;
                    }
                    // wait up to 100 ms for the broker to return data
                    ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofMillis(100));
                    for (ConsumerRecord<String, byte[]> record : records) {
                        System.out.println("key:" + record.key() + ",value:" + new String(record.value()));
                    }
                }
            }).start();
        }
    }
}
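One caveat with the loop above: `getNewTopic()` returns `null` when `topics.json` cannot be read, so the `Objects.requireNonNull` call will throw inside the consumer thread and kill it on a transient read failure. A sketch of a more forgiving pattern, falling back to the last successfully loaded list (`lastKnown` and `refresh` are illustrative names, not part of the original class):

```java
import java.util.List;

public class TopicFallback {

    // Keep the last successfully loaded topic list; reuse it when the
    // config source returns null instead of throwing NullPointerException.
    static List<String> lastKnown = List.of();

    static List<String> refresh(String rawOrNull) {
        if (rawOrNull == null || rawOrNull.isEmpty()) {
            return lastKnown; // config read failed: keep the old subscription
        }
        lastKnown = List.of(rawOrNull.split(","));
        return lastKnown;
    }

    public static void main(String[] args) {
        System.out.println(refresh("topic-a,topic-b")); // [topic-a, topic-b]
        System.out.println(refresh(null));              // [topic-a, topic-b]
    }
}
```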

Reference

https://blog.csdn.net/weixin_36380516/article/details/119524653

posted @ 2022-04-11 17:11  zjcfrancis