Post Category - Kafka Concept Learning Series
Summary: Without further ado, straight to the point! For example, if we name a Kafka topic user_r2p10, the name tells us that the user topic has a replication factor (r) of 2 and a partition count (p) of 10. Later, when writing consumer code, the topic name alone tells you how many partitions there are, which makes it easy to decide how many consumer threads to start. For example, see the sketch below. For the earlier post, see Kafka的3节…
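As a rough illustration of that naming convention (a sketch, not code from the post), the hypothetical helper below parses the partition count out of a user_r2p10-style topic name and sizes a consumer thread pool to match. Kafka itself attaches no meaning to the name; the pattern and the pool sizing are assumptions taken from the convention described above.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TopicNameConvention {

    // Matches names like user_r2p10: replication factor after 'r', partition count after 'p'.
    private static final Pattern NAME = Pattern.compile(".*_r(\\d+)p(\\d+)$");

    static int partitionsFromName(String topic) {
        Matcher m = NAME.matcher(topic);
        if (!m.matches()) {
            throw new IllegalArgumentException("Topic does not follow the _r<N>p<M> convention: " + topic);
        }
        return Integer.parseInt(m.group(2));
    }

    public static void main(String[] args) {
        String topic = "user_r2p10";                // hypothetical topic following the convention
        int partitions = partitionsFromName(topic); // 10
        // One consumer thread per partition is the usual useful upper bound.
        ExecutorService pool = Executors.newFixedThreadPool(partitions);
        System.out.println(topic + " -> " + partitions + " consumer threads");
        pool.shutdown();
    }
}
```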
Summary: See "Compiling and Deploying Kafka-manager, the Web-based Kafka Management Tool: Detailed Installation Guide (supports Kafka 0.8, 0.9, 0.10 and later) (illustrated step by step) (default port or any custom port)".
Summary: Without further ado, straight to the point! Everything here comes from the official website. 3.1 Broker Configs: The essential configurations are the following: broker.id, log.dirs, zookeeper.connect…
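The post goes on to cover the broker settings in detail; as a minimal sketch (not from the post), the Java snippet below writes a server-minimal.properties containing just those three essential keys. The broker id, log directory, and ZooKeeper address are placeholder assumptions, not recommendations.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Properties;

public class MinimalBrokerConfig {
    public static void main(String[] args) throws IOException {
        Properties broker = new Properties();
        broker.setProperty("broker.id", "0");                      // unique id per broker
        broker.setProperty("log.dirs", "/tmp/kafka-logs");         // where partition data is stored (placeholder path)
        broker.setProperty("zookeeper.connect", "localhost:2181"); // ZooKeeper connection string (placeholder)

        try (OutputStream out = new FileOutputStream("server-minimal.properties")) {
            broker.store(out, "Essential broker configs from section 3.1");
        }
    }
}
```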
Summary: Without further ado, straight to the point! Everything here comes from the official website. 3. CONFIGURATION: 3.1 Broker Configs, 3.2 Producer Configs, 3.3 Consumer Configs, 3.3.1 New Consumer Configs, 3.3.2 Old Consumer Configs…
Summary: Without further ado, straight to the point! Everything here comes from the official website. 2.5 Legacy APIs: A more limited legacy producer and consumer api is also included in Kafka. These old Scala APIs are deprecated and…
Summary: Without further ado, straight to the point! Everything here comes from the official website. 2.4 Connect API: The Connect API allows implementing connectors that continually pull from some source data system into Kafka or push…
Summary: Without further ado, straight to the point! Everything here comes from the official website. 2.3 Streams API: The Streams API allows transforming streams of data from input topics to output topics. Examples showing how to use…
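As a minimal illustration of that idea, here is a Kafka Streams sketch that reads from an assumed input-topic, upper-cases each value, and writes to an assumed output-topic. It uses the newer StreamsBuilder API from current clients (the 0.10.x releases discussed in this series shipped an older builder class), and the localhost broker address is a placeholder, so treat this as a sketch rather than the post's own example.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");    // consumer group / state prefix
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("input-topic"); // assumed topic name
        input.mapValues(v -> v.toUpperCase())
             .to("output-topic");                                      // assumed topic name

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```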
Summary: Without further ado, straight to the point! Everything here comes from the official website. 2.2 Consumer API: The Consumer API allows applications to read streams of data from topics in the Kafka cluster. Examples showing how…
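A minimal sketch of that API, assuming a local broker on localhost:9092, a consumer group named demo-group, and the test topic from the quickstart; it is illustrative only, not code from the post, and it targets the current Java client.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "demo-group");              // assumed consumer group name
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("test")); // topic name from the quickstart
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```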
Summary: Without further ado, straight to the point! Everything here comes from the official website. 2.1 Producer API: The Producer API allows applications to send streams of data to topics in the Kafka cluster. Examples showing how…
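Likewise, a minimal producer sketch under the same assumptions (local broker on localhost:9092, the quickstart's test topic); the message keys and values are made up for illustration.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 10; i++) {
                // Fire-and-forget send; the returned Future could be checked for errors.
                producer.send(new ProducerRecord<>("test", Integer.toString(i), "message-" + i));
            }
        }
    }
}
```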
Summary: Without further ado, straight to the point! Everything here comes from the official website. 2. APIS: Kafka includes four core apis: The Producer API allows applications to send streams of data to topics in the Kafka cluster.…
Summary: Without further ado, straight to the point! Everything here comes from the official website. 1.5 Upgrading From Previous Versions: Upgrading from 0.8.x, 0.9.x, 0.10.0.x or 0.10.1.x to 0.10.2.0…
Summary: Without further ado, straight to the point! Everything here comes from the official website. 1.4 Ecosystem: There are a plethora of tools that integrate with Kafka outside the main distribution. The ecosystem page lists many…
Summary: Without further ado, straight to the point! Everything here comes from the official website. Step 8: Use Kafka Streams to process data. Kafka Streams is a client library of Kafka for real-time stream processing and analyzing…
Summary: Without further ado, straight to the point! Everything here comes from the official website. Step 7: Use Kafka Connect to import/export data. Writing data from the console and writing it back to the console is a convenient place…
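For reference, this quickstart step drives Kafka Connect in standalone mode with two small .properties files: a FileStreamSource that reads a text file into a topic, and a FileStreamSink that writes it back out. The sketch below simply generates those two files from Java; the connector names, file paths, and the connect-test topic mirror the official quickstart but are otherwise placeholder assumptions.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Properties;

public class ConnectQuickstartConfigs {
    public static void main(String[] args) throws IOException {
        Properties source = new Properties();
        source.setProperty("name", "local-file-source");
        source.setProperty("connector.class", "FileStreamSource"); // built-in file source connector
        source.setProperty("tasks.max", "1");
        source.setProperty("file", "test.txt");                    // input file (placeholder path)
        source.setProperty("topic", "connect-test");               // topic the file's lines are written to

        Properties sink = new Properties();
        sink.setProperty("name", "local-file-sink");
        sink.setProperty("connector.class", "FileStreamSink");     // built-in file sink connector
        sink.setProperty("tasks.max", "1");
        sink.setProperty("file", "test.sink.txt");                 // output file (placeholder path)
        sink.setProperty("topics", "connect-test");                // note: sink connectors use "topics"

        store(source, "connect-file-source.properties");
        store(sink, "connect-file-sink.properties");
    }

    private static void store(Properties p, String path) throws IOException {
        try (OutputStream out = new FileOutputStream(path)) {
            p.store(out, "Kafka Connect standalone quickstart config");
        }
    }
}
```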
Summary: Without further ado, straight to the point! Everything here comes from the official website. Step 6: Setting up a multi-broker cluster. So far we have been running against a single broker, but that's no fun. For Kafka, a single…
Summary: Without further ado, straight to the point! Everything here comes from the official website. Step 5: Start a consumer. Kafka also has a command line consumer that will dump out messages to standard output. If you have each of…
Summary: Without further ado, straight to the point! Everything here comes from the official website. Step 3: Create a topic. Let's create a topic named "test" with a single partition and only one replica: We can now see that topic if…
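The quickstart itself creates the topic with the kafka-topics.sh script; as an equivalent sketch in Java (using the AdminClient that later client versions provide), the snippet below creates "test" with one partition and replication factor 1 and then lists existing topics. The localhost:9092 address is an assumption.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTestTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address

        try (AdminClient admin = AdminClient.create(props)) {
            // "test" with a single partition and replication factor 1, as in the quickstart.
            NewTopic topic = new NewTopic("test", 1, (short) 1);
            admin.createTopics(Collections.singleton(topic)).all().get();
            System.out.println("Existing topics: " + admin.listTopics().names().get());
        }
    }
}
```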
Summary: Without further ado, straight to the point! Everything here comes from the official website. Step 2: Start the server. Kafka uses ZooKeeper so you need to first start a ZooKeeper server if you don't already have one. You can…
Summary: Without further ado, straight to the point! Everything here comes from the official website. Don't feel limited to this particular version; I'm only using the latest release as a starting point, so that everyone learns how to read the official documentation for any Kafka version on their own. Over.
Summary: Without further ado, straight to the point! Everything here comes from the official website. Commit Log: Kafka can serve as a kind of external commit-log for a distributed system. The log helps replicate data between nodes and…