Importing data into Hive, data cleaning, exporting to MySQL, displaying with ECharts

1  Import data into Hive

  First create a table in Hive; the table's column names must match the data that will be imported.

create table test333(ip string, itime string, day string, traffic bigint, type string, id string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

      Load the local data into Hive:

load data local inpath '/opt/software/result2.csv' overwrite into table test333;

2  Clean the data in Hive (alternatively, the data can be cleaned first and then imported)

insert overwrite table data
select ip,
       from_unixtime(to_unix_timestamp(`time`,'dd/MMM/yyyy:HH:mm:ss Z')) as `time`,
       day,
       traffic,
       type,
       id
from data;
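The pattern dd/MMM/yyyy:HH:mm:ss Z is the Java SimpleDateFormat-style syntax that Hive's to_unix_timestamp accepts; it matches access-log style timestamps such as 10/Nov/2016:00:01:02 +0800 (that sample value is made up for illustration). A minimal Java sketch to check the pattern locally:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class TimeFormatCheck {
    public static void main(String[] args) throws Exception {
        // Same pattern as in the Hive query; Locale.ENGLISH is needed for the English month abbreviation
        SimpleDateFormat in = new SimpleDateFormat("dd/MMM/yyyy:HH:mm:ss Z", Locale.ENGLISH);
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        Date d = in.parse("10/Nov/2016:00:01:02 +0800");   // sample timestamp for illustration only
        System.out.println(out.format(d));                 // printed in the local time zone
    }
}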

3  Export the Hive data to MySQL with Sqoop

bin/sqoop export \
--connect jdbc:mysql://hadoop102:3306/company \
--username root \
--password 001224 \
--table business \
--num-mappers 1 \
--export-dir /user/hive/warehouse/csv3 \
--input-fields-terminated-by "," \
--driver com.mysql.jdbc.Driver
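Sqoop export does not create the target table: the business table has to exist in MySQL before running the command above, with columns matching the exported fields. A minimal sketch of creating it over JDBC, assuming the columns mirror the Hive table (the VARCHAR lengths are guesses):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateBusinessTable {
    public static void main(String[] args) throws Exception {
        Class.forName("com.mysql.jdbc.Driver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://hadoop102:3306/company", "root", "001224");
             Statement st = conn.createStatement()) {
            // Columns mirror the Hive table above; adjust types and lengths to the real data
            st.executeUpdate("CREATE TABLE IF NOT EXISTS business ("
                    + "`ip` VARCHAR(64), `time` VARCHAR(64), `day` VARCHAR(32), "
                    + "`traffic` BIGINT, `type` VARCHAR(64), `id` VARCHAR(64))");
        }
    }
}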

4  Connect to MySQL with JDBC and return the query results to ECharts

    // doPost of the servlet: query MySQL through the DAO and return the rows as JSON
    // (JSON.toJSONString comes from fastjson; city and select are the bean and DAO classes)
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html;charset=utf-8");
        request.setCharacterEncoding("utf-8");

        // Collect the query results into a list of city beans
        ArrayList<city> book = new ArrayList<city>();
        select dao = new select();
        try {
            dao.mostPopularIP(book);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        } catch (SQLException e) {
            e.printStackTrace();
        }

        // Serialize the list to JSON and write it back; the ECharts page reads this response
        String json = JSON.toJSONString(book);
        response.getWriter().write(json);
    }
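The select DAO and the city bean used in the servlet are not shown above; a minimal sketch of what they might look like, assuming mostPopularIP counts visits per IP in the exported business table (the SQL, column names, and bean fields are guesses, not the original code):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;

// Bean serialized by fastjson; public getters are required for the JSON output
class city {
    private String ip;
    private long count;
    public city(String ip, long count) { this.ip = ip; this.count = count; }
    public String getIp() { return ip; }
    public long getCount() { return count; }
}

// DAO that queries the table exported by Sqoop
class select {
    public void mostPopularIP(ArrayList<city> book) throws ClassNotFoundException, SQLException {
        Class.forName("com.mysql.jdbc.Driver");
        String sql = "SELECT ip, COUNT(*) AS cnt FROM business GROUP BY ip ORDER BY cnt DESC LIMIT 10";
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://hadoop102:3306/company", "root", "001224");
             PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                book.add(new city(rs.getString("ip"), rs.getLong("cnt")));
            }
        }
    }
}

On the front end, the ECharts page can request this servlet with Ajax and use the ip/count fields of the returned JSON array as the chart's axis labels and series data.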

 
