Logstash/Kafka version compatibility issues

I won't go into the project details here.

The existing pipeline is Kafka + Spark Streaming.

The Kafka cluster runs version 1.0, and the Spark Streaming job was developed against Kafka 1.0.

Call this cluster kfk_1.0.

The cluster resources, including Kafka, are each maintained by their respective owners.

The current situation: we need to read data from another Kafka cluster running version 0.8.1 (call it kfk_0.8.1), compute on it, and write the results into kfk_1.0.

The annoying part:

https://spark.apache.org/docs/latest/streaming-kafka-0-8-integration.html

The minimum Kafka version Spark Streaming supports is 0.8.2.1,

so it is not compatible with kfk_0.8.1.

We are currently asking the maintainers to upgrade the cluster, or to migrate some of the topics.

A stopgap option is an intermediate forwarding program that copies data from the 0.8.1 cluster into kfk_1.0.

From my own experience, Logstash is the first tool to consider for this.

But Logstash itself has Kafka version compatibility constraints:

- Logstash 6.5: "This plugin uses Kafka Client 2.0.0." (https://www.elastic.co/guide/en/logstash/6.5/plugins-inputs-kafka.html)
- Logstash 6.3: "This plugin uses Kafka Client 1.1.0" (https://www.elastic.co/guide/en/logstash/6.3/plugins-inputs-kafka.html)
- Logstash 5.0: "This plugin uses Kafka Client 0.10.0.1" (https://www.elastic.co/guide/en/logstash/5.0/plugins-inputs-kafka.html)
- Logstash 2.4: "This plugin uses Kafka Client 0.8.2.2" (https://www.elastic.co/guide/en/logstash/2.4/plugins-inputs-kafka.html)
- Logstash 2.3 (https://www.elastic.co/guide/en/logstash/2.3/plugins-inputs-kafka.html) documents the compatibility matrix:

| Kafka Client Version | Logstash Version | Plugin Version | Security Features | Why?                         |
|----------------------|------------------|----------------|-------------------|------------------------------|
| 0.8                  | 2.0.0 - 2.x.x    | < 3.0.0        |                   | Legacy, 0.8 is still popular |
| 0.9                  | 2.0.0 - 2.3.x    | 3.x.x          | Basic Auth, SSL   |                              |

Testing confirms that Logstash 2.0 works with Kafka 0.8.1.

It depends on how the migration goes; if migration turns out not to be an option,

the plan is to use Logstash 2.0 to read from Kafka 0.8.1 and write into Kafka 1.0.0.

Writing from Logstash 2.0 into Kafka 1.0.0 has not been tested yet.
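A minimal sketch of such a bridge pipeline, assuming Logstash 2.x with the pre-3.0.0 kafka input/output plugins (which bundle Kafka client 0.8.2.2). Hosts, topics, and the consumer group name are placeholders; the exact option names should be checked against the plugin version actually installed:

```conf
input {
  kafka {
    # The 0.8-era consumer connects via ZooKeeper, not the brokers directly.
    # zk-0.8.example and source_topic are placeholders for the kfk_0.8.1 cluster.
    zk_connect => "zk-0.8.example:2181"
    topic_id   => "source_topic"
    group_id   => "bridge_consumer"
  }
}

output {
  kafka {
    # The 0.8-era producer. Whether this old-protocol client can write to a
    # 1.0 broker is exactly the part that is still untested.
    broker_list => "kfk-1.0.example:9092"
    topic_id    => "target_topic"
  }
}
```

The idea is simply that a 1.0 broker is generally backward compatible with older client protocols, so an all-0.8-client Logstash can read from the old cluster and, in principle, write to the new one.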

posted @ 2019-03-02 14:41 cclient