Linux Learning 29: Extracting Log Data with awk and Ranking IPs by Access Count

Preface

Given a segment of access log, we need to analyze it, rank the top 10 IPs by access count, and check whether any traffic looks like an attack.

Extracting fields from the log

Each line of the log below carries a lot of information. We want to extract the IP, access time, request method, request path (without query parameters), and status code.

123.125.72.61 - - [05/Dec/2018:00:00:02 +0000] "GET /yoyo/artical?locale=en HTTP/1.1" 200 12164 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.032 0.032 .
123.125.72.61 - - [05/Dec/2018:00:00:02 +0000] "GET /index?page=1 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .
141.1.142.111 - - [05/Dec/2018:00:00:02 +0000] "GET /index?page=61 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .
141.1.142.131 - - [05/Dec/2018:00:00:02 +0000] "GET /yoyoketang?page=62 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .
141.1.142.131 - - [05/Dec/2018:00:00:02 +0000] "GET /blog?page=3 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .
142.22.12.132 - - [05/Dec/2018:00:00:02 +0000] "GET /blog?page=1 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .
142.22.12.132 - - [05/Dec/2018:00:00:02 +0000] "POST /blog?page=1 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .
142.22.12.132 - - [05/Dec/2018:00:00:02 +0000] "POST /blog?page=3 HTTP/1.1" 200 16739 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)" 0.120 0.120 .

We can use awk to reformat the log output. By default awk splits each line on whitespace: the first field is the IP, i.e. '{print $1}', and the other columns follow in order.

awk '{print $1,$4,$6,$7,$9}' log.txt

[root@VM_0_2_centos ~]# awk '{print $1,$4,$6,$7,$9}' log.txt
123.125.72.61 [05/Dec/2018:00:00:02 "GET /yoyo/artical?locale=en 200
123.125.72.61 [05/Dec/2018:00:00:02 "GET /index?page=1 200
141.1.142.111 [05/Dec/2018:00:00:02 "GET /index?page=61 200
141.1.142.131 [05/Dec/2018:00:00:02 "GET /yoyoketang?page=62 200
141.1.142.131 [05/Dec/2018:00:00:02 "GET /blog?page=3 200
142.22.12.132 [05/Dec/2018:00:00:02 "GET /blog?page=1 200
142.22.12.132 [05/Dec/2018:00:00:02 "POST /blog?page=1 200
142.22.12.132 [05/Dec/2018:00:00:02 "POST /blog?page=3 200

Next we need to strip the extra [ and " characters and the query string after ?. We can add these symbols as field separators: -F '[[, ",?]' defines a character class, so [, comma, space, " and ? are all treated as delimiters (which shifts the field numbers accordingly).

awk -F '[[, ",?]' '{print $1,$5,$8,$9,$13}' log.txt

[root@VM_0_2_centos ~]# awk -F '[[, ",?]'  '{print $1,$5,$8,$9,$13}' log.txt
123.125.72.61 05/Dec/2018:00:00:02 GET /yoyo/artical 200
123.125.72.61 05/Dec/2018:00:00:02 GET /index 200
141.1.142.111 05/Dec/2018:00:00:02 GET /index 200
141.1.142.131 05/Dec/2018:00:00:02 GET /yoyoketang 200
141.1.142.131 05/Dec/2018:00:00:02 GET /blog 200
142.22.12.132 05/Dec/2018:00:00:02 GET /blog 200
142.22.12.132 05/Dec/2018:00:00:02 POST /blog 200
142.22.12.132 05/Dec/2018:00:00:02 POST /blog 200
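An alternative to juggling a multi-character field separator is to strip the unwanted characters with awk's gsub()/sub() before printing; modifying $0 makes awk re-split the fields, so the plain whitespace-based field numbers work again. A minimal self-contained sketch (one sample line piped in instead of reading log.txt):

```shell
# Delete every [ and " from the line, then drop the query string (?...)
# up to the next space. After editing $0, awk recomputes $1..$NF,
# so $1,$4,$6,$7,$9 are ip, time, method, path, status again.
printf '%s\n' '123.125.72.61 - - [05/Dec/2018:00:00:02 +0000] "GET /yoyo/artical?locale=en HTTP/1.1" 200 12164 "-" "Mozilla/5.0" 0.032 0.032 .' |
  awk '{ gsub(/[\["]/, ""); sub(/\?[^ ]*/, ""); print $1, $4, $6, $7, $9 }'
# 123.125.72.61 05/Dec/2018:00:00:02 GET /yoyo/artical 200
```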

Counting IP hits

To rank the top 10 IPs by access count: sort orders the lines (plain sort is lexicographic by default), uniq -c counts adjacent duplicates, sort -k 1 -n -r re-sorts numerically in descending order on the count column, and head -10 keeps the first ten lines.

[root@VM_0_2_centos ~]# awk -F '[[, ",?]'  '{print $1,$5,$8,$9,$13}' log.txt  | sort | uniq -c | sort -k 1 -n -r |head -10
      2 142.22.12.132 05/Dec/2018:00:00:02 POST /blog 200
      1 142.22.12.132 05/Dec/2018:00:00:02 GET /blog 200
      1 141.1.142.131 05/Dec/2018:00:00:02 GET /yoyoketang 200
      1 141.1.142.131 05/Dec/2018:00:00:02 GET /blog 200
      1 141.1.142.111 05/Dec/2018:00:00:02 GET /index 200
      1 123.125.72.61 05/Dec/2018:00:00:02 GET /yoyo/artical 200
      1 123.125.72.61 05/Dec/2018:00:00:02 GET /index 200
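Note that the pipeline above ranks whole lines, so the same IP hitting different URLs is counted separately. If the goal is strictly "top 10 IPs by hit count", print only $1 before counting. A sketch that recreates a small log.txt from the sample above and then ranks IPs:

```shell
# Recreate a small log.txt (first field = client IP, as in the sample above).
cat > log.txt <<'EOF'
123.125.72.61 - - [05/Dec/2018:00:00:02 +0000] "GET /yoyo/artical?locale=en HTTP/1.1" 200
123.125.72.61 - - [05/Dec/2018:00:00:02 +0000] "GET /index?page=1 HTTP/1.1" 200
141.1.142.111 - - [05/Dec/2018:00:00:02 +0000] "GET /index?page=61 HTTP/1.1" 200
141.1.142.131 - - [05/Dec/2018:00:00:02 +0000] "GET /yoyoketang?page=62 HTTP/1.1" 200
141.1.142.131 - - [05/Dec/2018:00:00:02 +0000] "GET /blog?page=3 HTTP/1.1" 200
142.22.12.132 - - [05/Dec/2018:00:00:02 +0000] "GET /blog?page=1 HTTP/1.1" 200
142.22.12.132 - - [05/Dec/2018:00:00:02 +0000] "POST /blog?page=1 HTTP/1.1" 200
142.22.12.132 - - [05/Dec/2018:00:00:02 +0000] "POST /blog?page=3 HTTP/1.1" 200
EOF

# Top 10 IPs by hit count: extract $1, sort so duplicates become adjacent,
# count with uniq -c, rank numerically in reverse, keep the first ten lines.
awk '{print $1}' log.txt | sort | uniq -c | sort -rn | head -10
```

On the sample data, 142.22.12.132 comes out on top with 3 hits.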

The uniq command removes duplicates, but it only collapses identical lines that are adjacent, so it is normally combined with sort: sort first, then deduplicate.
uniq -u prints only lines that occur exactly once; uniq -c prefixes each line with its occurrence count. For sort -k 1 -n -r, see the detailed sort option reference below.
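A quick illustration of why sort must come first (a minimal sketch, not from the original log):

```shell
# uniq only collapses *adjacent* duplicate lines:
printf 'a\nb\na\n' | uniq           # prints a, b, a -- the two a's are not adjacent
printf 'a\nb\na\n' | sort | uniq    # prints a, b   -- sorted first, so duplicates collapse
```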
sort options and parameters:

  • -f : ignore case differences, e.g. A and a compare as equal;
  • -b : ignore leading whitespace;
  • -M : sort by month name, e.g. JAN, DEC, and so on;
  • -n : sort numerically (the default is lexicographic, i.e. as text);
  • -r : reverse the sort order;
  • -u : like uniq: print only one representative of each group of equal lines;
  • -t : field separator; the default is the tab character;
  • -k : which field (column) to sort on
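For example, combining -t, -k and -n to sort colon-separated records by their numeric second field (a hypothetical sample, not from the log above):

```shell
# Sort name:score records numerically by the second colon-separated field.
printf 'bob:3\nann:10\ncarl:2\n' | sort -t: -k2,2 -n
# carl:2
# bob:3
# ann:10
```

Without -n the scores would sort as text and 10 would come before 2 and 3.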