【Logstash Series】Using Logstash as a Shipper to Collect IIS Logs

Logstash still has several long-standing problems with log collection on Windows, including:

1. LS holds a read lock: once the process starts, every file under the input path is locked and cannot be renamed or deleted.
2. LS mishandles "*": with a wildcard in path, sincedb stops working and every log is re-read from the start of the file (much like start_position => beginning).
 
The community is full of threads on these long-standing issues; here are the main ones:
[LOGSTASH-429] File Input - .sincedb file is broken on Windows
[LOGSTASH-578] Logstash Agent lock file inputs on Windows - Python LOG rotate failed
[LOGSTASH-986] File Locking in Windows(Rolling logs, etc.)
[LOGSTASH-1587] Windows - file input - sincedb / start_position is ignored
[LOGSTASH-1801] Input File Rotation in Logstash
 
Of course, the developers have kept working on fixes, and the last two months have apparently brought real progress; Jordan even claims LS 1.5 will fix this bug, so let's cheer and wait :)
Release a gem with windows support #39
File Locking in Windows(Rolling logs, etc.) #1557

 
So, the question: on the current LS 1.4.2, how do you collect these logs in production?
 
My approach is a scheduled task that runs every Sunday at 3 AM: stop the LS service, delete every IIS log older than the current day, then start the service again. This guarantees the IIS logs never become undeletable, and after a restart even a full re-read covers so few files that performance does not suffer.
Deploying this is still fairly costly, though, since every machine has to be set up and tweaked by hand; if that sounds like too much trouble, consider collecting with NXLOG instead. I will cover how to use nxlog to collect IIS logs in a later post.
 
*** a little divider ***
 
The configuration files are attached below (installation and deployment are out of scope for this post; message me if you need them):
  • LS client configuration file
The input section uses the file and eventlog plugins to collect IIS logs and the Windows Event Log respectively; the filter section splits the lines up with grok; the output section ships the filtered events to redis.
input {
  file {
    type => "IIS"
    # forward slashes glob reliably on Windows; backslashes may not
    path => "D:/iislog/xxx.xxx.com/W3SVC663037409/*.log"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}
 
input {
  eventlog {
    type => 'Win32-EventLog'
    logfile => ["Application", "System", "Security"]
  }
}
 
filter {
  # drop the comment lines IIS writes at the top of each log (they start with #)
  if [message] =~ "^#" {
    drop {}
  }
 
  grok {
    # check that fields match your IIS log settings
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} (%{WORD:s-sitename}|-) (%{IPORHOST:s-ip}|-) (%{WORD:cs-method}|-) %{NOTSPACE:cs-uri-stem} %{NOTSPACE:cs-uri-query} (%{NUMBER:s-port}|-) (%{IPORHOST:c-ip}|-) %{NOTSPACE:cs-useragent} %{NOTSPACE:cs-referer} %{NOTSPACE:cs-host} (%{NUMBER:sc-status}|-) (%{NUMBER:sc-bytes}|-) (%{NUMBER:cs-bytes}|-) (%{NUMBER:time-taken}|-)"]
  }
  # set the event timestamp from the log
  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UTC"
  }
  mutate {
    remove_field => [ "log_timestamp" ]
    convert => [ "sc-bytes", "float" ]
    convert => [ "cs-bytes", "float" ]
    convert => [ "time-taken", "float" ]
  }
}
 
output {
  # stdout { codec => rubydebug }
  redis {
    host => "192.168.xx.xxx"
    data_type => "list"
    key => "test:redis"
  }
}
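For reference, a typical way to run LS 1.4.x in the foreground against this file (a sketch; it assumes the config is saved as logstash.conf in the Logstash install directory):

bin\logstash.bat agent -f logstash.conf

Running it once by hand like this, before installing it as a service, surfaces config errors straight on the console.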
Notes:
1. charset works around Logstash's character-set handling: with the default "UTF-8", some characters still fail to decode, but after switching to "ISO-8859-1" the log is read in correctly. Before the change you would see warnings like:
Received an event that has a different character encoding than you configured. {:text=>"2014-12-22 14:22:52 /newDict/jp/improve_new.aspx sourcedict=1&jid=322316&wtype=jc&w=\\xCD?\\xFC 192.168.31.190 HTTP/1.1 Mozilla/4.0 - dict.hjenglish.com 200 5043 342\\r", :expected_charset=>"UTF-8", :level=>:warn}
Interrupt received. Shutting down the pipeline. {:level=>:warn}

2. To use the eventlog plugin you must install the "contrib plugins" package, otherwise the service will not start:
Error: "LoadError: no such file to load -- jruby-win32ole"
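On LS 1.4.x the contrib package installs with the bundled plugin script (a sketch; run from the Logstash install directory with internet access — bin/plugin install contrib on *nix):

bin\plugin.bat install contrib

Restart the service afterwards so the eventlog plugin is picked up.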

3. Numeric fields must be force-cast with mutate's convert; otherwise they stay strings, and ES cannot run aggregations on them later (mean/total/max/etc.).
The resulting error in the ELK logs:

[2015-01-07 11:43:02,464][DEBUG][action.search.type       ] [Prester John] [logstash-2015.01.07][0], node[wL6TfFyxQI2fmsDWHI-bdA], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@61fc4bf9] lastShard [true]
org.elasticsearch.search.SearchParseException: [logstash-2015.01.07][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"0":{"date_histogram":{"key_field":"@timestamp","value_field":"time-taken","interval":"1s"},"global":true,"facet_filter":{"fquery":{"query":{"filtered":{"query":{"query_string":{"query":"cs-host:(dict.*)"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"from":1420601882243,"to":1420602182243}}},{"terms":{"s-ip.raw":["192.168.33.31"]}}]}}}}}}}},"size":0}]]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:681)
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:537)
        at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:509)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:264)
        at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:231)
        at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:228)
        at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:559)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData
        at org.elasticsearch.search.facet.datehistogram.DateHistogramFacetParser.parse(DateHistogramFacetParser.java:199)
        at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:93)
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:665)
        ... 9 more

Before convert: (screenshot)
After the type conversion: (screenshot)
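A quick way to verify the resulting field types is to inspect the index mapping directly (host and index name follow this setup; yours will differ):

curl -XGET "http://192.168.xx.xxx:9200/logstash-2015.01.07/_mapping?pretty"

After the conversion, sc-bytes, cs-bytes, and time-taken should come back as a numeric type (double) instead of string.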

4. If you keep getting the address of a front-end HAProxy, F5, or CDN instead of the real client IP, you need "X-Forwarded-For"; see the document F5 provides:
https://devcentral.f5.com/articles/x-forwarded-for-log-filter-for-windows-servers
 
5. As for debugging grok expressions, 三斗室 covered that long ago too (and there is an easter egg in the first comment):
http://chenlinux.com/2014/10/19/grokdebug-commandline/
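In the same spirit, a quick local test loop (a sketch, assuming an LS 1.4.x install): save the snippet below as groktest.conf, run bin\logstash.bat agent -f groktest.conf, and paste sample IIS lines into the console.

input { stdin {} }
filter {
  grok {
    # swap in the pattern under test
    match => [ "message", "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:rest}" ]
  }
}
output { stdout { codec => rubydebug } }

If an event comes back tagged _grokparsefailure, the pattern did not match.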
 
6. The IIS log fields must exactly match the grok configuration; otherwise grok fails to match and events come back tagged "_grokparsefailure": (screenshots)
One more thing: I spent an afternoon trying to batch-change the IIS log fields with PowerShell and got nowhere; if anyone has experience with this, please share~ (one untested route is sketched below)
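For what it's worth, one route that may work (untested here; it assumes IIS 7+, where appcmd lives under %windir%\system32\inetsrv and the logExtFileFlags attribute controls the W3C log fields) is setting the site-default fields in one shot:

%windir%\system32\inetsrv\appcmd.exe set config -section:system.applicationHost/sites "-siteDefaults.logFile.logExtFileFlags:Date,Time,SiteName,ServerIP,Method,UriStem,UriQuery,ServerPort,ClientIP,UserAgent,Referer,Host,HttpStatus,BytesSent,BytesRecv,TimeTaken" /commit:apphost

The flag list is chosen to line up with the grok pattern above; if your fields differ, adjust both together.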
 
  • LS server-side configuration
The input uses the redis plugin to pull logs off the queue; the output uses the elasticsearch plugin to push events on to ES.
input {
  redis {
    host => '192.168.xx.xxx'
    data_type => 'list'
    port => "6379"
    key => 'test:redis'
    type => 'redis-input'
  }
}
output {
  elasticsearch {
    host => "192.168.xx.xxx"
    port => "9300"
  }
}
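Before starting the indexer, it is easy to confirm that events really are queuing up in redis (redis-cli ships with redis; host and key as configured above):

redis-cli -h 192.168.xx.xxx llen test:redis

A steadily growing list length means the shippers are delivering; once the indexer runs it should drain back down.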
 
  • Batch script to register the scheduled task
addTask.bat
schtasks /create /tn "LSRestartIISDel" /tr D:\scripts\logstash\LSRestartIISDel.bat /sc weekly /mo 1 /d SUN /st 03:00 /ru system
pause
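To confirm the task registered, query it back:

schtasks /query /tn "LSRestartIISDel"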
Note: Windows Server 2003 needs a patch installed first (reboot required); otherwise the task has to be added manually.
 
  • Batch script to delete IIS logs weekly
LSRestartIISDel.bat
:: Due to the LogStash bug above, the LS process locks IIS log files until it stops, yet the IIS logs still need regular deletion, so we stop LS, delete the old logs, and start LS again.
:: Create Date#2015-01-07#zhoufeng
:: Ver1.0
 
@echo off
 
:: stop LS Service
NET STOP logstash
 
:: delete IIS logs older than 1 day
:: CHANGE THE IIS LOG PATH BEFORE USING!!
forfiles /p "D:\iislog\xxx.xxx.com\W3SVC663037409" /s /m *.log /c "cmd /c del @path" /d -1
 
:: start service
NET START logstash
Note: the deletion path must be edited by hand!
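Before trusting the task with real deletions, a dry run that only prints what would be removed is a cheap safety net (same parameters, del swapped for echo):

forfiles /p "D:\iislog\xxx.xxx.com\W3SVC663037409" /s /m *.log /c "cmd /c echo @path" /d -1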
 
*** mysterious divider ***
 
References:
[Link] Using Logstash to Analyse IIS Log Files with Kibana
[Link] Logstash: Received an event that has a different character encoding
[Link] Win32-EventLog input throwing "LoadError: no such file to load -- jruby-win32ole"
[Link] Kibana 中文指南 - histogram by 三斗室
[Link] chenryn/logstash-best-practice-cn/input/file.md by 三斗室