Collectd Basic Usage
Basic Usage
Base Environment
Operating System | Hardware Configuration |
---|---|
CentOS 7 Server | Disk: 40GB, Memory: 8GB, NIC: ens3 (external network) |
Network Configuration
# vim /etc/sysconfig/network-scripts/ifcfg-ens3
TYPE=Ethernet
BOOTPROTO=static
DEFROUTE=yes
PEERDNS=yes
PEERROUTES=yes
NAME=ens3
DEVICE=ens3
ONBOOT=yes
IPADDR=192.168.200.11
NETMASK=255.255.255.0
GATEWAY=192.168.200.1
DNS1=114.114.114.114
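### Optionally restart networking and verify that ens3 came up with the expected address (assumes the stock CentOS 7 network service is in use)
# systemctl restart network
# ip addr show ens3
# ping -c 3 114.114.114.114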
Deploy Collectd
Install Collectd
# yum install epel-release
# yum makecache
# yum install collectd -y
# systemctl start collectd
# systemctl enable collectd
# setenforce 0
# vim /etc/selinux/config
SELINUX=disabled
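### Quick sanity check: collectd should be active and SELinux should now report Permissive
# systemctl status collectd
# getenforce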
Configure Collectd
### Collect metrics and write them to a log file
# vim /etc/collectd.conf
LoadPlugin cpu
LoadPlugin interface
LoadPlugin load
LoadPlugin memory
LoadPlugin logfile
<Plugin logfile>
LogLevel info
File "/var/log/collectd.json.log"
Timestamp true
PrintSeverity false
</Plugin>
LoadPlugin write_log
# systemctl restart collectd
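### With write_log loaded, the collected values are emitted through the logfile plugin and can be watched directly (path taken from the configuration above)
# tail -f /var/log/collectd.json.log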
Advanced Usage
Output Metrics to Kafka
Download the collectd Source Package
# yum install yum-utils rpm-build -y
# yumdownloader --source collectd
### Installing the source RPM prints a series of warnings about the root user; they do not affect the installation
# rpm -i collectd-5.7.1-2.el7.src.rpm
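### The source RPM unpacks into ~/rpmbuild; the spec file and source tarball should now be in place
# ls ~/rpmbuild/SPECS ~/rpmbuild/SOURCES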
Modify the collectd Build Configuration
# cd ~/rpmbuild/SPECS
### Edit the spec file and add the following content, then change --disable-write_kafka to --enable-write_kafka
# vim collectd.spec
### 1) Add a subpackage definition next to the other %package blocks:
%package write_kafka
Summary: Kafka output plugin for collectd
Group: System Environment/Daemons
Requires: %{name}%{?_isa} = %{version}-%{release}
BuildRequires: librdkafka-devel
%description write_kafka
This plugin can send data to Kafka.
### 2) In the %configure invocation, replace --disable-write_kafka with:
--enable-write_kafka \
### 3) Add a %files section for the new subpackage:
%files write_kafka
%{_libdir}/collectd/write_kafka.so
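### Install the build dependencies declared in the spec before rebuilding (librdkafka-devel is assumed to come from EPEL, which was enabled earlier); yum-builddep from yum-utils resolves them in one step
# yum-builddep collectd.spec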
Build collectd
# cd ~/rpmbuild/SPECS
# rpmbuild -bb collectd.spec
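### If the build succeeds, the packages land under ~/rpmbuild/RPMS; a quick check that the write_kafka subpackage was produced
# ls ~/rpmbuild/RPMS/x86_64/ | grep -i kafka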
Install collectd
# yum autoremove collectd
# cd ~/rpmbuild/RPMS/x86_64
# rpm -i collectd-5.7.1-2.el7.centos.x86_64.rpm
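### Since write_kafka was declared as its own subpackage, its plugin RPM needs to be installed as well (the exact file name may differ), and the service started again after the reinstall
# rpm -i collectd-write_kafka-5.7.1-2.el7.centos.x86_64.rpm
# systemctl enable collectd
# systemctl start collectd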
Configure Kafka
### Deploy Kafka as described in "Kafka Basic Usage" and create the topic "collectd"
# bin/kafka-topics.sh --create --zookeeper 192.168.200.13:2181 --replication-factor 1 --partitions 1 --topic collectd
# bin/kafka-topics.sh --list --zookeeper 192.168.200.13:2181
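### Optionally describe the topic to confirm its partition and replication settings
# bin/kafka-topics.sh --describe --zookeeper 192.168.200.13:2181 --topic collectd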
Configure Collectd
# vim /etc/collectd.conf
LoadPlugin cpu
LoadPlugin interface
LoadPlugin load
LoadPlugin memory
LoadPlugin write_kafka
<Plugin write_kafka>
Property "metadata.broker.list" "192.168.200.13:32771,192.168.200.13:32770,192.168.200.13:32769"
<Topic "collectd">
Format JSON
</Topic>
</Plugin>
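### Restart collectd so the write_kafka configuration takes effect
# systemctl restart collectd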
Test the Installation
# bin/kafka-console-consumer.sh --bootstrap-server 192.168.200.13:32771 --topic collectd --from-beginning
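### Each consumed message should be a JSON array of metric records, as requested by "Format JSON" above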