Flink with RabbitMQ; sources and sinks for MySQL, Redis, and Elasticsearch

flink-docker
https://github.com/melentye/flink-docker

 

https://hub.docker.com/_/flink/?tab=description

 

https://shekharsingh.com/blog/2016/11/12/apache-flink-rabbimq-streams-processor.html

http://www.54tianzhisheng.cn/2019/01/20/Flink-RabbitMQ-sink/
https://github.com/tydhot/Kafka-Flink-Rabbitmq-Demo
https://github.com/rootcss/flink-rabbitmq
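The posts above wire RabbitMQ in as both a Flink source and sink. A minimal sketch of that pipeline, assuming the `flink-connector-rabbitmq` module is on the classpath; the host, credentials, and queue names are placeholders:

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.rabbitmq.RMQSink;
import org.apache.flink.streaming.connectors.rabbitmq.RMQSource;
import org.apache.flink.streaming.connectors.rabbitmq.common.RMQConnectionConfig;

public class RabbitMqJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Broker connection settings; all values below are placeholders.
        RMQConnectionConfig connectionConfig = new RMQConnectionConfig.Builder()
                .setHost("localhost").setPort(5672)
                .setUserName("guest").setPassword("guest")
                .setVirtualHost("/")
                .build();

        env.addSource(new RMQSource<>(
                        connectionConfig,
                        "input-queue",            // source queue (assumed name)
                        true,                     // use correlation IDs for exactly-once
                        new SimpleStringSchema()))
           .map(String::toUpperCase)              // stand-in for real processing
           .addSink(new RMQSink<>(connectionConfig, "output-queue", new SimpleStringSchema()));

        env.execute("RabbitMQ source/sink demo");
    }
}
```

With checkpointing enabled and correlation IDs on the publisher side, the RMQSource can deduplicate redelivered messages; without them the source falls back to at-least-once.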

flink-aggregate (sum, max, keyBy)
https://segmentfault.com/a/1190000017571429
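The aggregation article above walks through `keyBy` plus rolling aggregates. A minimal sketch of `keyBy`/`sum` on word-count-style tuples (the sample data and class name are illustrative):

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyedAggregateDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Tuple2<String, Integer>> words = env.fromElements(
                Tuple2.of("flink", 1), Tuple2.of("rabbitmq", 1), Tuple2.of("flink", 1));

        // keyBy partitions the stream by the word (field f0); sum(1) keeps a
        // rolling sum of the count field per key. Using max(1) instead would
        // keep a rolling per-key maximum.
        words.keyBy(t -> t.f0)
             .sum(1)
             .print();

        env.execute("keyBy/sum demo");
    }
}
```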

flink-to-mysql
http://www.54tianzhisheng.cn/2019/01/15/Flink-MySQL-sink/
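The MySQL-sink post implements the sink as a `RichSinkFunction` holding a JDBC connection opened in `open()` and reused per record. A condensed sketch of that pattern; the `Student` POJO, table name, and credentials are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

// Hypothetical POJO matching the target table.
class Student {
    String name;
    int age;
}

public class SinkToMySql extends RichSinkFunction<Student> {
    private transient Connection connection;
    private transient PreparedStatement ps;

    @Override
    public void open(Configuration parameters) throws Exception {
        // Open the connection once per parallel sink instance, not per record.
        connection = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/test", "root", "secret"); // placeholder DSN
        ps = connection.prepareStatement("INSERT INTO student (name, age) VALUES (?, ?)");
    }

    @Override
    public void invoke(Student value, Context context) throws Exception {
        ps.setString(1, value.name);
        ps.setInt(2, value.age);
        ps.executeUpdate();
    }

    @Override
    public void close() throws Exception {
        if (ps != null) ps.close();
        if (connection != null) connection.close();
    }
}
```

For production use, batching the inserts (or using Flink's JDBC connector) avoids one round-trip per record.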


The table data in MySQL is fixed at the moment the query starts, so reading a MySQL table directly should be a Flink batch job.

If you want to keep reading rows as they arrive, Flink cannot handle this case on its own, because it has no way of knowing about new data unless you monitor the binlog.

You have to use Canal to sync the binlog from MySQL to Kafka, and then run a Flink streaming job that reads from Kafka. This is the best solution.
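Under that Canal -> Kafka -> Flink layout, the streaming job itself is an ordinary Kafka consumer (Canal publishes row changes as JSON strings). A sketch assuming `flink-connector-kafka`; the topic name and broker address are placeholders:

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class BinlogStreamJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("group.id", "binlog-consumer");

        // "canal-topic" is whatever topic Canal is configured to write binlog events to.
        env.addSource(new FlinkKafkaConsumer<>(
                        "canal-topic", new SimpleStringSchema(), props))
           .print(); // replace with parsing of Canal's JSON row-change messages

        env.execute("MySQL binlog via Canal and Kafka");
    }
}
```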

 

flink-to-elasticsearch
http://lxwei.github.io/posts/Flink(5)-Sink-%E4%BB%8B%E7%BB%8D%E4%B8%8E%E5%AE%9E%E8%B7%B5.html
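The Elasticsearch sink described in the linked post boils down to an `ElasticsearchSink.Builder` plus a function that turns each record into an `IndexRequest`. A sketch assuming `flink-connector-elasticsearch6`; the node address and index name are placeholders:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch6.ElasticsearchSink;
import org.apache.http.HttpHost;
import org.elasticsearch.client.Requests;

public class EsSinkDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> lines = env.fromElements("hello", "flink");

        List<HttpHost> hosts = new ArrayList<>();
        hosts.add(new HttpHost("localhost", 9200, "http")); // placeholder node

        ElasticsearchSink.Builder<String> builder = new ElasticsearchSink.Builder<>(
                hosts,
                (ElasticsearchSinkFunction<String>) (element, ctx, indexer) -> {
                    // Wrap each record as a one-field document.
                    Map<String, String> doc = new HashMap<>();
                    doc.put("data", element);
                    indexer.add(Requests.indexRequest().index("demo-index").source(doc));
                });
        builder.setBulkFlushMaxActions(1); // flush every request; raise for throughput

        lines.addSink(builder.build());
        env.execute("Elasticsearch sink demo");
    }
}
```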


flink-redis sink
https://blog.csdn.net/xianpanjia4616/article/details/82534369

flink-redis
https://www.cnblogs.com/jiashengmei/p/9084057.html
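Both Redis posts use the Bahir `flink-connector-redis` connector, where a `RedisMapper` decides the Redis command, key, and value for each record. A sketch with a placeholder host and hash name:

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.redis.RedisSink;
import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisPoolConfig;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommand;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommandDescription;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisMapper;

public class RedisSinkDemo {

    // Writes each (word, count) pair into the Redis hash "word-counts".
    static class WordCountMapper implements RedisMapper<Tuple2<String, String>> {
        @Override
        public RedisCommandDescription getCommandDescription() {
            return new RedisCommandDescription(RedisCommand.HSET, "word-counts");
        }

        @Override
        public String getKeyFromData(Tuple2<String, String> data) {
            return data.f0; // hash field
        }

        @Override
        public String getValueFromData(Tuple2<String, String> data) {
            return data.f1; // hash value
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder()
                .setHost("localhost").setPort(6379) // placeholder Redis node
                .build();

        env.fromElements(Tuple2.of("flink", "3"), Tuple2.of("redis", "1"))
           .addSink(new RedisSink<>(conf, new WordCountMapper()));

        env.execute("Redis sink demo");
    }
}
```

Switching `RedisCommand` (LPUSH, SET, ZADD, and so on) changes the write pattern without touching the rest of the job.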

posted on 2019-04-16 10:27 by szllq2000