The Worst Kafka Tutorial Ever, Day 4 (Integrating Kafka with Spring)

Import the dependencies

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>2.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-streams</artifactId>
            <version>2.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
            <version>2.4.3.RELEASE</version>
        </dependency>

The configuration can reuse the KafkaConfiguration class from the earlier posts in this series.
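For readers who do not have that class at hand, a minimal sketch of what such a KafkaConfiguration might look like is shown below. The broker address, the group id, and the use of Spring Kafka's JsonSerializer/JsonDeserializer for the User payload are assumptions made for this sketch, not part of the original post; adapt them to your own setup.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;
import org.springframework.kafka.support.serializer.JsonSerializer;

@EnableKafka
@Configuration
public class KafkaConfiguration {

    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class); // User is sent as JSON
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ConsumerFactory<String, Object> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group"); // assumed group id
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true); // switched to false later for manual commits
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "*"); // trust all packages, acceptable for a demo
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Object> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}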

Producer

@RestController
public class KafkaController {

    @Autowired
    private KafkaTemplate<String, Object> kafkaTemplate;

    @PostMapping("/user/save")
    public boolean saveUser(@RequestBody User user) {
        // Publish the posted User object to the userTopic topic
        kafkaTemplate.send("userTopic", user);
        return true;
    }
}
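The User class is not shown in the post; the examples only require it to be a plain POJO that the JSON serializer can handle. A minimal sketch (the field names are assumptions):

public class User {

    private Long id;
    private String name;

    public User() { }   // no-args constructor, needed for JSON deserialization

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    @Override
    public String toString() {
        return "User{id=" + id + ", name='" + name + "'}";
    }
}

The endpoint can then be exercised with something like curl -X POST -H "Content-Type: application/json" -d '{"id":1,"name":"test"}' http://localhost:8080/user/save (port and payload are assumptions).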


Consumer

@Component
public class ConsumerListener {

    @KafkaListener(id = "demo", topics = "userTopic")
    public void onMessage(User user) {
        // insertIntoDb(user);  // this is where the record would be written to the database
        System.out.println(user);
    }
}
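If the record metadata is also of interest, the listener can accept the whole ConsumerRecord instead of just the payload. A small variant of the listener above (class and listener names are made up for this example):

@Component
public class ConsumerRecordListener {

    // The listener id doubles as the consumer group id unless groupId is set explicitly
    @KafkaListener(id = "demoWithMetadata", topics = "userTopic")
    public void onMessage(ConsumerRecord<String, User> record) {
        // Payload plus the partition and offset it was read from
        System.out.println(record.value() + " (partition=" + record.partition()
                + ", offset=" + record.offset() + ")");
    }
}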

Consuming from specific partitions

@Component
public class ConsumerListener {

    // Only listen to partitions 0 and 2 of testTopic
    @KafkaListener(topicPartitions = {
            @TopicPartition(topic = "testTopic", partitions = { "0", "2" })
    })
    public void onMessage(User user) {
        System.out.println(user);
    }
}
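When a listener is pinned to specific partitions like this, spring-kafka also lets you set the starting offset per partition via @PartitionOffset. A sketch under assumed topic and offset values:

@Component
public class ReplayConsumerListener {

    // Hypothetical example: read partition 0 of testTopic from offset 0 on every application start
    @KafkaListener(topicPartitions = @TopicPartition(topic = "testTopic",
            partitionOffsets = @PartitionOffset(partition = "0", initialOffset = "0")))
    public void onMessage(User user) {
        System.out.println(user);
    }
}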


Manually committing offsets (if consumption fails, the current offset is not committed, so the message will be consumed again after the service restarts; alternatively, a scheduled task can restart the listener periodically. A reusable spring-kafka solution for this will be covered in a later post.)

First, in the consumerProps of the KafkaConfiguration class, change the offset auto-commit property to false:

props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
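Disabling auto-commit on its own is not enough for the Acknowledgment parameter below to be injected: the listener container factory also has to use a manual ack mode. A sketch of that change, assuming the kafkaListenerContainerFactory bean from the earlier KafkaConfiguration:

// requires: import org.springframework.kafka.listener.ContainerProperties;
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    // MANUAL_IMMEDIATE commits as soon as Acknowledgment.acknowledge() is called in the listener
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
    return factory;
}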

@Component
public class ConsumerListener {

    private static final Logger log = LoggerFactory.getLogger(ConsumerListener.class);

    @KafkaListener(topics = "userTopic")
    public void onMessage(ConsumerRecord<String, Object> consumerRecord, Acknowledgment ack) {
        Object value = consumerRecord.value();
        if (value == null) {
            log.error("Received an empty message from Kafka");
            return;
        }
        log.info("KafkaConsumer1 received message >>>>>>>> {}", value);
        // Commit the offset only after the message has been processed
        ack.acknowledge();
    }
}
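To get the re-consumption behaviour described above, the listener should only acknowledge when processing succeeds; on failure the offset stays uncommitted and the record is delivered again later. A minimal sketch of that pattern (saveToDb is a hypothetical processing method):

@KafkaListener(topics = "userTopic")
public void onMessage(ConsumerRecord<String, Object> record, Acknowledgment ack) {
    try {
        saveToDb(record.value());   // hypothetical processing step, e.g. the database insert
        ack.acknowledge();          // commit the offset only after processing succeeded
    } catch (Exception e) {
        // No acknowledge() here: the offset stays uncommitted, so the record is
        // consumed again after the listener or the service is restarted
        log.error("Failed to process record at offset {}", record.offset(), e);
    }
}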
posted @ 2020-08-13 16:33 慧剑仙