Multi-Source Aggregation Case

1) Requirements:

flume-1 on hadoop103 monitors the file group.log;

flume-2 on hadoop104 monitors the data stream on a port;

flume-1 and flume-2 send their data to flume-3 on hadoop102, and flume-3 prints the merged data to the console.

 

2) Analysis:

 

 

 

 

3) Implementation steps:

 

0. Preparation

 

Distribute Flume:

 

[jason@hadoop102 module]$ xsync flume

 

Create a group2 directory under /opt/module/flume/job on hadoop102, hadoop103, and hadoop104:

 

[jason@hadoop102 job]$ mkdir group2


[jason@hadoop103 job]$ mkdir group2


[jason@hadoop104 job]$ mkdir group2
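The three mkdir commands above can be collapsed into a single loop. This is only a sketch: on the real cluster you would prefix the mkdir with `ssh "$host"` (passwordless ssh is already assumed, since xsync needs it); here the three hosts are stood in by local directories under a temporary base so the loop is runnable anywhere.

```shell
# Sketch: create job/group2 for every node in one loop.
base=$(mktemp -d)
for host in hadoop102 hadoop103 hadoop104; do
  # On the real cluster: ssh "$host" "mkdir -p /opt/module/flume/job/group2"
  mkdir -p "$base/$host/opt/module/flume/job/group2"   # -p makes re-runs safe
done
ls "$base"
```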

1. Create flume1.conf

 

Configure a source to monitor the group.log file, and a sink to forward the data to the next-hop Flume agent.

 

Create the configuration file on hadoop103 and open it:

 

[jason@hadoop103 group2]$ touch flume1.conf


[jason@hadoop103 group2]$ vim flume1.conf

Add the following content:

 

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /opt/module/group.log
a1.sources.r1.shell = /bin/bash -c

# Describe the sink
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = hadoop102
a1.sinks.k1.port = 4141

# Describe the channel
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
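The exec source above simply runs a shell command under the configured shell (`/bin/bash -c`) and turns every line the command emits into an event. A minimal local sketch of that command shape (a temp file stands in for /opt/module/group.log, and `-n 1` replaces `-F` so the command exits instead of following forever):

```shell
# Local sketch of what the exec source runs; the real source uses
# `tail -F /opt/module/group.log`, which follows the file indefinitely.
log=$(mktemp)
echo "first event" >> "$log"
echo "second event" >> "$log"
# Same shape as a1.sources.r1.command, run via the configured shell:
last=$(/bin/bash -c "tail -n 1 $log")
echo "$last"   # → second event
rm -f "$log"
```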

 

 

2. Create flume2.conf

 

Configure a source to monitor the data stream on port 44444, and a sink to forward the data to the next-hop Flume agent.

 

Create the configuration file on hadoop104 and open it:

 

[jason@hadoop104 group2]$ touch flume2.conf


[jason@hadoop104 group2]$ vim flume2.conf

Add the following content:

 

# Name the components on this agent
a2.sources = r1
a2.sinks = k1
a2.channels = c1

# Describe/configure the source
a2.sources.r1.type = netcat
a2.sources.r1.bind = hadoop104
a2.sources.r1.port = 44444

# Describe the sink
a2.sinks.k1.type = avro
a2.sinks.k1.hostname = hadoop102
a2.sinks.k1.port = 4141

# Use a channel which buffers events in memory
a2.channels.c1.type = memory
a2.channels.c1.capacity = 1000
a2.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a2.sources.r1.channels = c1
a2.sinks.k1.channel = c1

 

 

3. Create flume3.conf

 

Configure a source to receive the data streams sent by flume1 and flume2, and sink the merged data to the console.

 

Create the configuration file on hadoop102 and open it:

 

[jason@hadoop102 group2]$ touch flume3.conf


[jason@hadoop102 group2]$ vim flume3.conf

Add the following content:

 

# Name the components on this agent
a3.sources = r1
a3.sinks = k1
a3.channels = c1

# Describe/configure the source
a3.sources.r1.type = avro
a3.sources.r1.bind = hadoop102
a3.sources.r1.port = 4141

# Describe the sink
a3.sinks.k1.type = logger

# Describe the channel
a3.channels.c1.type = memory
a3.channels.c1.capacity = 1000
a3.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a3.sources.r1.channels = c1
a3.sinks.k1.channel = c1

 

 

4. Run the configuration files

 

Start each agent with its configuration file, in this order: flume3.conf, then flume2.conf, then flume1.conf (flume3's avro source must be listening on port 4141 before the upstream avro sinks try to connect).

 

[jason@hadoop102 flume]$ bin/flume-ng agent --conf conf/ --name a3 --conf-file job/group2/flume3.conf -Dflume.root.logger=INFO,console
[jason@hadoop104 flume]$ bin/flume-ng agent --conf conf/ --name a2 --conf-file job/group2/flume2.conf
[jason@hadoop103 flume]$ bin/flume-ng agent --conf conf/ --name a1 --conf-file job/group2/flume1.conf
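To make that start order robust against slow agent startup, a small hypothetical helper (not part of Flume; relies on bash's /dev/tcp pseudo-device) can poll until the downstream avro port accepts connections before the upstream agents are launched:

```shell
# Hypothetical helper: poll until host:port accepts a TCP connection.
wait_for_port() {
  local host=$1 port=$2 tries=${3:-30}
  local i
  for ((i = 0; i < tries; i++)); do
    # Open a TCP connection in a subshell; success means something is listening.
    # The subshell exiting closes the probe socket again.
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      return 0
    fi
    sleep 1
  done
  return 1
}
# e.g. on hadoop103, before starting a1:
#   wait_for_port hadoop102 4141 && bin/flume-ng agent --conf conf/ --name a1 ...
```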

 

 

5. On hadoop103, append content to group.log in the /opt/module directory

 

[jason@hadoop103 module]$ echo 'hello' >> group.log
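Note the redirection operator: `>>` appends, while a single `>` truncates the file first. `tail -F` does recover from truncation, but appending is what this step describes. A quick local demo (a temp file stands in for group.log):

```shell
# Demo of >> (append) vs > (truncate-then-write):
f=$(mktemp)
echo 'hello' >> "$f"   # append: file has 1 line
echo 'world' >> "$f"   # append: file has 2 lines
echo 'reset' >  "$f"   # > truncates first: back to 1 line
n=$(wc -l < "$f")
echo "$n"   # → 1
rm -f "$f"
```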

 

 

6. On hadoop104, send data to port 44444

 

[jason@hadoop104 flume]$ telnet hadoop104 44444

 

 

7. Check the data on hadoop102

The flume3 console should print events from both upstream agents: the 'hello' line written to group.log on hadoop103, and whatever was typed into the telnet session on hadoop104.

 

 

 

 

 

posted on 2020-09-08 19:29 架构艺术