Building a log collection system with Elasticsearch + Fluentd + Kibana in a Docker environment
1. Collecting logs with Fluentd
To make it easier to follow how this log collection system is built step by step, I will first show how to use Fluentd to collect Docker container logs and write them to local files.
Fluentd is an open-source log collector. Since v1.8, Docker ships a native Fluentd logging driver; with this driver, container logs can be sent straight to Fluentd. For a detailed introduction to Fluentd, see the official docs: https://docs.fluentd.org/
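For example, any container can be pointed at a Fluentd instance listening on localhost:24224 through the fluentd logging driver. A minimal sketch (the alpine image, the demo tag and the echoed message are just placeholders for illustration):
docker run --rm \
  --log-driver=fluentd \
  --log-opt fluentd-address=localhost:24224 \
  --log-opt tag=demo \
  alpine echo "hello fluentd"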
Example docker-compose.yml:
services:
  fluentd:
    image: fluentd:latest
    container_name: fluentd
    ports:
      - "24224:24224"
      - "24224:24224/udp"
    restart: always
    environment:
      - FLUENTD_CONF=fluentd.conf
    volumes:
      - /var/log:/var/log
      - /var/fluentd.conf:/fluentd/etc/fluentd.conf
  nginx:
    image: nginx:latest
    links:
      - fluentd
    ports:
      - "80:80"
      - "443:443"
    container_name: nginx
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: nginx
Note: the /var/log directory provided by the host must be writable by other users, otherwise the fluentd container will report an error on startup.
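One quick way to satisfy this requirement is to open /var/log for writing by all users (a blunt sketch with an obvious security trade-off; alternatively, chown the directory to the UID the fluentd container runs as):
sudo chmod o+w /var/log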
The contents of /var/fluentd.conf are shown below. The forward source listens on port 24224 for records sent by the Docker logging driver; records whose tag matches nginx are written under /var/log/nginx, and everything else under /var/log/other:
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<match nginx>
  @type file
  path /var/log/nginx
  append true
  flush_interval 60s
  flush_at_shutdown true
</match>
<match **>
  @type file
  path /var/log/other
  append true
  flush_interval 60s
  flush_at_shutdown true
</match>
After starting everything with docker-compose, the generated log files appear under /var/log on the host.
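A quick end-to-end check might look like this (assuming the compose file above, with nginx listening on port 80):
docker-compose up -d                   # start fluentd and nginx
curl -s http://localhost/ > /dev/null  # generate an nginx access log entry
ls /var/log                            # nginx and other log files appear after the 60s flush_interval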
2. Forwarding logs from Fluentd to Elasticsearch
The default fluentd Docker image does not ship with the Elasticsearch output plugin, so it has to be installed. Let's write a Dockerfile for that:
FROM fluent/fluentd:latest
# switch to root to install the plugin, then drop back to the fluent user
USER root
RUN fluent-gem install fluent-plugin-elasticsearch
USER fluent
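Once the fluentd service in docker-compose.yml points at this Dockerfile via build: . (as shown below), the custom image can be (re)built with:
docker-compose build fluentd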
Put the Dockerfile in the same directory as docker-compose.yml, and modify docker-compose.yml as follows:
services:
  elasticsearch:
    image: elasticsearch:7.13.4
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
    volumes:
      - /usr/local/elasticsearch/data:/usr/share/elasticsearch/data
  fluentd:
    build: .
    container_name: fluentd
    ports:
      - "24224:24224"
      - "24224:24224/udp"
    restart: always
    links:
      - elasticsearch
    environment:
      - ESHost=elasticsearch
      - FLUENTD_CONF=fluentd.conf
    volumes:
      - /var/log:/var/log
      - /var/fluentd.conf:/fluentd/etc/fluentd.conf
  nginx:
    image: nginx:latest
    links:
      - fluentd
    ports:
      - "80:80"
      - "443:443"
    container_name: nginx
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: nginx
fluentd.conf is modified as follows:
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<match **>
  @type elasticsearch
  # host is read from the ESHost variable defined in docker-compose.yml
  host "#{ENV['ESHost']}"
  port 9200
  logstash_format true
</match>
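To confirm that records are reaching Elasticsearch, list its indices from inside the elasticsearch container (a sketch assuming curl is available in the official 7.x image; with logstash_format true the indices are named logstash-YYYY.MM.DD):
docker exec elasticsearch curl -s 'http://localhost:9200/_cat/indices?v'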
3. Visualizing and querying Elasticsearch with Kibana
Add a kibana service to docker-compose.yml:
services:
  kibana:
    image: kibana:7.13.4
    links:
      - elasticsearch
    container_name: kibana
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
  elasticsearch:
    image: elasticsearch:7.13.4
    container_name: elasticsearch
    restart: always
    environment:
      - discovery.type=single-node
    volumes:
      - /usr/local/elasticsearch/data:/usr/share/elasticsearch/data
  fluentd:
    build: .
    container_name: fluentd
    ports:
      - "24224:24224"
      - "24224:24224/udp"
    restart: always
    links:
      - elasticsearch
    environment:
      - ESHost=elasticsearch
      - FLUENTD_CONF=fluentd.conf
    volumes:
      - /var/log:/var/log
      - /var/fluentd.conf:/fluentd/etc/fluentd.conf
  nginx:
    image: nginx:latest
    links:
      - fluentd
    ports:
      - "80:80"
      - "443:443"
    container_name: nginx
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: nginx
After starting, open http://[server]:5601 in a browser to reach the Kibana UI.
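Before logs show up in Discover, Kibana needs an index pattern that matches the logstash-* indices. You can create one under Stack Management → Index Patterns, or via the saved objects API (a sketch assuming Kibana 7.x defaults and no authentication; replace [server] with your host):
curl -X POST 'http://[server]:5601/api/saved_objects/index-pattern' \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  -d '{"attributes":{"title":"logstash-*","timeFieldName":"@timestamp"}}'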
References:
https://docs.fluentd.org/container-deployment/docker-compose
https://zhuanlan.zhihu.com/p/257867352
https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html
https://www.elastic.co/guide/cn/kibana/current/docker.html
For securing access, see:
Setting a login password for Kibana: https://blog.csdn.net/qq_40142345/article/details/105487478