Importing data into Kafka with Flume

1. Edit the Flume configuration file

a1.sources = r1
a1.channels = c1
 
# Describe/configure the source
a1.sources.r1.type = TAILDIR
a1.sources.r1.filegroups = f1
# Watch every file under the data directory whose name starts with "log"
a1.sources.r1.filegroups.f1 = /workplace/data/log*.*
a1.sources.r1.positionFile = /workplace/data/taildir_position.json
 
# Use a channel which buffers events in memory
a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.c1.kafka.bootstrap.servers=master:9092
a1.channels.c1.kafka.topic = test1
 
# Bind the source to the channel (no sink is defined: the KafkaChannel itself delivers events to Kafka)
a1.sources.r1.channels = c1
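To exercise the TAILDIR source, you can create files matching the configured `log*.*` glob and append lines to them. The directory below is a hypothetical stand-in for `/workplace/data`; adjust it to your environment:

```shell
# Hypothetical demo directory standing in for /workplace/data
mkdir -p /tmp/flume-demo/data

# The file name matches the log*.* glob the TAILDIR source watches
for i in 1 2 3; do
  echo "event-$i $(date +%s)" >> /tmp/flume-demo/data/log1.txt
done

# Show what the source would pick up
cat /tmp/flume-demo/data/log1.txt
```

Each appended line becomes one Flume event, which the KafkaChannel then writes to the `test1` topic.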

2. Start Flume to import data into Kafka

bin/flume-ng agent --conf conf --conf-file ./conf/job/flume_to_kafka2.conf --name a1 -Dflume.root.logger=INFO,console
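Once the agent is running, the TAILDIR source records how far it has read each file in the `positionFile` configured above; inspecting that JSON is a quick way to confirm the source is making progress. The file below is written by hand as an illustration of the rough layout (inode/pos/file entries, one per tailed file); the real file is maintained by Flume, and the values here are assumptions:

```shell
mkdir -p /tmp/flume-demo

# Hand-written sample of what taildir_position.json roughly looks like;
# an illustration, not the output of a real Flume run
cat > /tmp/flume-demo/taildir_position.json <<'EOF'
[{"inode": 123456, "pos": 42, "file": "/workplace/data/log1.txt"}]
EOF

# Print the recorded read offset for each tailed file
python3 -c "
import json
for e in json.load(open('/tmp/flume-demo/taildir_position.json')):
    print(e['file'], e['pos'])
"
```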

3. Start Kafka from the Kafka bin directory. Note that the broker must already be running for Flume to deliver events, so in practice start Kafka before starting the Flume agent in step 2.

/app/kafka/bin/kafka-server-start.sh -daemon /app/kafka/config/server.properties
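The broker settings that matter for this pipeline are the listener port and the log directory. A minimal sketch of the relevant `server.properties` keys, assuming a single broker reachable as `master:9092` (the values here are illustrative defaults, not taken from the original setup):

```properties
# Must be unique per broker in the cluster
broker.id=0
# Listen on 9092, matching kafka.bootstrap.servers=master:9092 in the Flume config
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://master:9092
# Where the broker stores topic partition data on disk (illustrative path)
log.dirs=/app/kafka/kafka-logs
# ZooKeeper connection string (for ZooKeeper-based Kafka versions)
zookeeper.connect=master:2181
```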

4. View the data in Kafka

/app/kafka/bin/kafka-console-consumer.sh --bootstrap-server 192.168.80.128:9092 --from-beginning --topic test1
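With the consumer running, a simple end-to-end check is to append a fresh, uniquely identifiable line to a file matched by the `log*.*` glob and watch it arrive on the `test1` topic. The directory below is a hypothetical stand-in for `/workplace/data`:

```shell
# Hypothetical stand-in directory for /workplace/data
mkdir -p /tmp/flume-demo/data

# Append a uniquely identifiable line; if the pipeline is up, Flume tails
# it and the console consumer above should print it shortly after
echo "e2e-check $(date +%s)" >> /tmp/flume-demo/data/log2.txt
tail -n 1 /tmp/flume-demo/data/log2.txt
```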

  

posted @ 2022-10-28 14:34  欣欣姐