MongoDB Log Analysis

Fields in the Result file:

Ip: 106.39.41.166 (city)

Date: 10/Nov/2016:00:01:02 +0800 (date)

Day: 10 (day of month)

Traffic: 54 (traffic)

Type: video (type: video or article)

Id: 8701 (id of the video or article)

Requirements:

1. Data cleaning: clean the data as required and load the cleaned data into the MongoDB database.

Two-stage data cleaning:

1) Stage 1: extract the required fields from the raw log

ip:    199.30.25.88

time:  10/Nov/2016:00:01:03 +0800

traffic:  62

article: article/11325

video: video/3235

2) Stage 2: refine the extracted fields

ip ---> city (cityIP)

date --> time: 2016-11-10 00:01:03

day: 10

traffic: 62

type: article/video

id: 11325
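The stage-2 refinement above can be sketched in Java (a minimal sketch: the class and method names are hypothetical, and the ip-to-city lookup is omitted because no mapping source is specified in the requirements):

```java
public class Refine {
    // Split "article/11325" or "video/3235" into {type, id}
    public static String[] splitTypeId(String path) {
        int slash = path.indexOf('/');
        return new String[]{path.substring(0, slash), path.substring(slash + 1)};
    }

    // Day of month from a raw date such as "10/Nov/2016:00:01:03 +0800"
    public static String day(String rawDate) {
        return rawDate.substring(0, rawDate.indexOf('/'));
    }
}
```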

3) MongoDB table structure:

create table data(ip string, time string, day string, traffic bigint, type string, id string)

2. Data processing:

· Top 10 most-visited videos/articles by access count (video/article)

· Top 10 most popular courses by city (ip)

· Top 10 most popular courses by traffic (traffic)

3. Data visualization: export the statistics into a MySQL database and present them as charts.

1. Data cleaning

Clean the data with a Java program, converting the time format:
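A minimal sketch of the time-format conversion (the class name DateClean is hypothetical; SimpleDateFormat with an English locale parses the log's month abbreviation, and the output zone is pinned to GMT+8 so the log's wall-clock time is preserved):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Locale;
import java.util.TimeZone;

public class DateClean {
    // "10/Nov/2016:00:01:02 +0800" -> "2016-11-10 00:01:02"
    private static final SimpleDateFormat IN =
            new SimpleDateFormat("dd/MMM/yyyy:HH:mm:ss Z", Locale.ENGLISH);
    private static final SimpleDateFormat OUT =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    static {
        // keep the output in the log's own +0800 zone, independent of the JVM default
        OUT.setTimeZone(TimeZone.getTimeZone("GMT+8"));
    }

    public static String convert(String raw) throws ParseException {
        return OUT.format(IN.parse(raw));
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(convert("10/Nov/2016:00:01:02 +0800"));
        // prints 2016-11-10 00:01:02
    }
}
```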


The cleaned data is then loaded into Hive:

First, create the table result3:

create table result3(
  Ip string,
  Dates string,
  Day string,
  Traffic string,
  Type string,
  Id string
)
row format delimited fields terminated by ',';

load data local inpath '/opt/software/result2.txt' into table result3;

2. Statistics

(1) Top 10 most-visited videos/articles by access count (video/article)

Create table ksh_1:

create table ksh_1 as
select id, count(*) total
from result3
where type='video'
group by id
order by total desc
limit 10;

Create table ksh_2:

create table ksh_2 as
select id, count(*) total
from result3
where type='article'
group by id
order by total desc
limit 10;

Export to MySQL:

bin/sqoop export \
--connect jdbc:mysql://Hadoop102:3306/company \
--username root \
--password 1229 \
--table ksh_1 \
--num-mappers 1 \
--export-dir /user/hive/warehouse/test03.db/ksh_1 \
--input-fields-terminated-by "\001"

bin/sqoop export \
--connect jdbc:mysql://Hadoop102:3306/company \
--username root \
--password 1229 \
--table ksh_2 \
--num-mappers 1 \
--export-dir /user/hive/warehouse/test03.db/ksh_2 \
--input-fields-terminated-by "\001"

(2) Top 10 most popular courses by city (ip)

create table ksh_4 as
select ip, id, count(*) total
from result3
group by ip, id
order by total desc
limit 10;

bin/sqoop export \
--connect jdbc:mysql://Hadoop102:3306/company \
--username root \
--password 1229 \
--table ksh_4 \
--num-mappers 1 \
--export-dir /user/hive/warehouse/test03.db/ksh_4 \
--input-fields-terminated-by "\001"

(3) Top 10 most popular courses by traffic (traffic)

Create table ksh_5:

create table ksh_5 as
select id, sum(traffic) total
from result3
group by id
order by total desc
limit 10;

Export to MySQL:

bin/sqoop export \
--connect jdbc:mysql://Hadoop102:3306/company \
--username root \
--password 1229 \
--table ksh_5 \
--num-mappers 1 \
--export-dir /user/hive/warehouse/test03.db/ksh_5 \
--input-fields-terminated-by "\001"

3. Visualization

The visualization of the by-city statistics is not yet complete; the visualizations for the first and third statistics are done.

posted @ 2021-11-04 11:17 哦心有