Sending log4j logs to Flume

We need log4j to ship application logs to Flume, and Flume to write those logs either to a local file or to HDFS. The pipeline is: log4j Flume appender → avro source → memory channel → file_roll or HDFS sink.

Configuring the Flume configuration file

  • Sinking logs to a file
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Bind to 0.0.0.0 so that a remote machine can connect to this Flume agent
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444

# Sink the logs to local rolling files
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /home/hadoop/tmp
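# Roll to a new output file every 86400 seconds (once a day)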
a1.sinks.k1.sink.rollInterval = 86400
a1.sinks.k1.sink.batchSize = 100
a1.sinks.k1.sink.serializer = text
a1.sinks.k1.sink.serializer.appendNewline = false

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 1000

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

  • Sinking logs to HDFS
agent1.sources = source1
agent1.sinks = sink1
agent1.channels = channel1

# Bind to 0.0.0.0 so that a remote machine can connect to this Flume agent
agent1.sources.source1.type = avro
agent1.sources.source1.bind = 0.0.0.0
agent1.sources.source1.port = 44444

# Describe sink1
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs://master:9000/log/flume-collection/%y-%m-%d/%H-%M
agent1.sinks.sink1.hdfs.filePrefix = test_log
agent1.sinks.sink1.hdfs.maxOpenFiles = 5000
agent1.sinks.sink1.hdfs.batchSize = 100
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.writeFormat = Text
agent1.sinks.sink1.hdfs.rollSize = 102400
agent1.sinks.sink1.hdfs.rollCount = 1000000
agent1.sinks.sink1.hdfs.rollInterval = 60
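# Round down the time used in hdfs.path so a new %H-%M directory starts every 10 minutes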
agent1.sinks.sink1.hdfs.round = true
agent1.sinks.sink1.hdfs.roundValue = 10
agent1.sinks.sink1.hdfs.roundUnit = minute
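# Use the agent's local time for the escape sequences in hdfs.path (no timestamp header needed from the client)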
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true

# Use a channel which buffers events in memory
agent1.channels.channel1.type = memory
agent1.channels.channel1.keep-alive = 120
agent1.channels.channel1.capacity = 500000
agent1.channels.channel1.transactionCapacity = 600

# Bind the source and sink to the channel
agent1.sources.source1.channels = channel1
agent1.sinks.sink1.channel = channel1
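
Before wiring log4j into the application, you can smoke-test the avro source with Flume's client SDK. The sketch below is an illustration rather than part of the original setup: it assumes flume-ng-sdk (matching the agent version, 1.7.0 here) is on the classpath and that the agent is reachable at localhost:44444; adjust the host and port for your environment.

import java.nio.charset.StandardCharsets;

import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

public class FlumeSmokeTest {
	public static void main(String[] args) throws EventDeliveryException {
		// Connect to the avro source defined above (hostname and port are assumptions)
		RpcClient client = RpcClientFactory.getDefaultInstance("localhost", 44444);
		try {
			// Send a single event; the configured sink decides where the body ends up
			Event event = EventBuilder.withBody("flume smoke test", StandardCharsets.UTF_8);
			client.append(event);
		} finally {
			client.close();
		}
	}
}

If the event shows up in /home/hadoop/tmp (file sink) or under the HDFS path, the agent side works and any remaining problems are on the log4j side.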

Configuring the Maven pom.xml

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.25</version>
</dependency>

<dependency>
    <groupId>org.apache.flume.flume-ng-clients</groupId>
    <artifactId>flume-ng-log4jappender</artifactId>
    <version>1.7.0</version>
</dependency>
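
Note that slf4j-log4j12 1.7.25 pulls in log4j 1.2.17 transitively, so an explicit log4j dependency is optional; if you prefer to pin it yourself, a declaration like the following (not in the original setup) works:

<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>

It is also safest to keep the flume-ng-log4jappender version in step with the Flume agent itself (1.7.0 here).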

Configuring log4j.properties

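# Route logs under the com.xxx package (INFO and above) to both the console and flume appenders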
log4j.category.com.xxx=INFO,console,flume

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %p [%c:%L] - %m%n

log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Threshold=INFO
log4j.appender.flume.Hostname = [flume-ip]
log4j.appender.flume.Port = [flume-port]
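# With UnsafeMode=true, logging calls will not throw if the Flume agent is unreachable (delivery errors are swallowed)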
log4j.appender.flume.UnsafeMode = true
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
# Use log4j pattern conversions (%t, %p, %c, %m); logback-style %thread/%level/%logger/%msg are not recognized by log4j's PatternLayout
log4j.appender.flume.layout.ConversionPattern=%d{HH:mm:ss.SSS} [%t] %-5p %c{1} - %m%n

Writing a test class

import java.util.Date;

import org.apache.log4j.Logger;

public class ProduceLog {
	static Logger logger = Logger.getLogger(ProduceLog.class);

	public static void main(String[] args) throws InterruptedException {
		while (true) {
			// Log the current system timestamp every two seconds
			logger.info(String.valueOf(new Date().getTime()));
			Thread.sleep(2000);
			// Also exercise the ERROR level so both appenders see it
			try {
				throw new Exception("exception msg");
			} catch (Exception e) {
				logger.error("error: " + e.getMessage());
			}
		}
	}
}

Summary of issues

  • Flume and the application are not on the same machine, so logs must be written to the server remotely; bind the avro source to all interfaces:
agent1.sources.source1.bind = 0.0.0.0
  • log4j:ERROR Flume append() failed.
If you use log4j to write logs to Flume, be sure to add the flume-ng-log4jappender dependency when configuring the Maven jars; this error usually means that appender jar is missing from the classpath, and it also shows up when the appender cannot reach the Flume agent.
posted @ 2017-05-15 15:37 0xcafedaddy