Big Data Spark Real-Time Processing -- ECharts Data Visualization

1. Spring Data

  • Official site: https://spring.io/projects/spring-data
  • A data access technology for relational and non-relational databases, map-reduce frameworks, and cloud-based data services.

2. Spring Data with MySQL: Environment Preparation and Entity Class Development (Spring Boot)

  • Create the log-spark-web submodule, following the same steps as log-web. (C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web)
  • Update application.yml (C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\resources\application.yml)
    • Change port to 9526
    • Change path to /log-spark-web
  • Verify the module is reachable (C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\LogSparkWebApplication.java)
  • Under C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb, create 4 packages: controller, domain, repository, service
  • Create User.java
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\domain\User.java
    • That is, Spring Data can save the entity to MySQL and read it back, covering the usual MySQL operations
  • Add the dependencies
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\pom.xml
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.47</version>
        </dependency>
  • Configure the data source and JPA in application.yml. Note that mysql-connector-java 5.x uses com.mysql.jdbc.Driver; the 8.x driver renamed the class to com.mysql.cj.jdbc.Driver.
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\resources\application.yml
server:
  port: 9526
  servlet:
    context-path: /log-spark-web


spring:
  jpa:
    hibernate:
      ddl-auto: update
    show-sql: true
  datasource:
    username: root
    password: root
    url: jdbc:mysql://spark000:3306/jieqiong
    driver-class-name: com.mysql.jdbc.Driver
  • Develop the entity class User.java
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\domain\User.java
package com.example.logsparkweb.domain;

import javax.persistence.*;

// JPA entity mapped to the t_user table; with ddl-auto: update,
// Hibernate creates the table on startup if it does not exist
@Entity
@Table(name = "t_user")
public class User {

    // Auto-increment primary key
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Integer id;

    // Mapped to the last_name column, VARCHAR(50)
    @Column(name = "last_name", length = 50)
    private String lastName;

    // Column name defaults to the field name (email)
    @Column
    private String email;

    public User() {
    }

    @Override
    public String toString() {
        return "User{" +
                "id=" + id +
                ", lastName='" + lastName + '\'' +
                ", email='" + email + '\'' +
                '}';
    }

    public User(Integer id, String lastName, String email) {
        this.id = id;
        this.lastName = lastName;
        this.email = email;
    }

    public User(String lastName, String email) {
        this.lastName = lastName;
        this.email = email;
    }

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public String getLastName() {
        return lastName;
    }

    public void setLastName(String lastName) {
        this.lastName = lastName;
    }

    public String getEmail() {
        return email;
    }

    public void setEmail(String email) {
        this.email = email;
    }
}
  • Run the entry-point class LogSparkWebApplication.java
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\LogSparkWebApplication.java
  • Expected result: the t_user table appears in the jieqiong MySQL database.
use jieqiong;
show tables;
desc t_user;
select * from t_user;
  • How to interact with the database quickly
    • MVC: M (Model), V (View), C (Controller)
    • Create an interface
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\repository\UserRepository.java
package com.example.logsparkweb.repository;

import com.example.logsparkweb.domain.User;
import org.springframework.data.jpa.repository.JpaRepository;

public interface UserRepository extends JpaRepository<User, Integer> {
}
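  • Note: JpaRepository already supplies save, findAll, findById, delete, and paging/sorting methods. As a hedged extension (the findByLastName finder below is hypothetical, not part of the original project), Spring Data can also derive queries from method names, so no implementation is needed:
package com.example.logsparkweb.repository;

import com.example.logsparkweb.domain.User;
import org.springframework.data.jpa.repository.JpaRepository;

import java.util.List;

public interface UserRepository extends JpaRepository<User, Integer> {

    // Derived query: Spring Data parses the method name and generates
    // SELECT ... WHERE last_name = ? automatically.
    List<User> findByLastName(String lastName);
}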
  • UserService.java
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\service\UserService.java
package com.example.logsparkweb.service;

import com.example.logsparkweb.domain.User;
import com.example.logsparkweb.repository.UserRepository;
import org.springframework.stereotype.Service;

import javax.annotation.Resource;
import java.util.List;

@Service
public class UserService {

    @Resource
    UserRepository userRepository;

    public void save(User user) {
        userRepository.save(user);
    }

    public List<User> query(){
        return userRepository.findAll();
    }

}
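  • The service is a thin pass-through to the repository. If more operations were needed, the inherited CrudRepository methods cover them; the two methods below are a sketch (hypothetical, not in the original project) of how UserService could be extended:
// Hypothetical additions inside UserService
// (requires: import java.util.Optional;)

public Optional<User> findById(Integer id) {
    return userRepository.findById(id);   // inherited from CrudRepository
}

public long count() {
    return userRepository.count();        // row count of t_user
}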
  • Unit testing
  • Create UserServiceTest.java
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\test\java\com\example\logsparkweb\service\UserServiceTest.java
package com.example.logsparkweb.service;

import com.example.logsparkweb.domain.User;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import javax.annotation.Resource;
import java.util.List;

@RunWith(SpringRunner.class)
@SpringBootTest
public class UserServiceTest {

    @Resource
    UserService userService;

    @Test
    public void testSave(){
        for(int i=0; i<10; i++) {
            User user = new User("pk" + i, "pk" + i + "@gmail.com");
            userService.save(user);
        }
    }

    @Test
    public void testQuery(){
        List<User> users = userService.query();
        for(User user : users) {
            System.out.println(user);
        }
    }
}
  • Update pom.xml
    • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\pom.xml
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.junit.jupiter</groupId>
                    <artifactId>junit-jupiter-api</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <scope>test</scope>
        </dependency>
  • Run the first @Test: the rows are written to MySQL. Run the second @Test: the rows print to the console.
mysql> use jieqiong;
mysql> show tables;
mysql> desc t_user;
mysql> select * from t_user;
  • Controller layer development and testing: UserController.java
    • localhost:9526/log-spark-web/query
    • Create C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\controller\UserController.java
    • Run C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\LogSparkWebApplication.java
    • Expected result: the data is displayed in the browser
package com.example.logsparkweb.controller;


import com.example.logsparkweb.domain.User;
import com.example.logsparkweb.service.UserService;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.bind.annotation.GetMapping;
import java.util.List;
import javax.annotation.Resource;

@RestController
public class UserController {

    @Resource
    UserService userService;

    @GetMapping("/query")
    public List<User> query() {
        return userService.query();
    }

}
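  • Because the class is annotated with @RestController, the returned list is serialized to JSON automatically. As a hedged extension (the /query/{id} endpoint is hypothetical, not in the original project), a single user could be looked up with a path variable:
// Hypothetical addition inside UserController
// (requires: import org.springframework.web.bind.annotation.PathVariable;)

@GetMapping("/query/{id}")
public User queryById(@PathVariable Integer id) {
    // Reuses the existing service method; a repository findById would be more direct.
    return userService.query().stream()
            .filter(u -> id.equals(u.getId()))
            .findFirst()
            .orElse(null);
}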


3. Visualization Frameworks

4. Using ECharts

  • Download ECharts from the official site (see the Apache ECharts Handbook)
    • Place file 1 at: C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\resources\static\js\echarts.js
    • Place file 2 at: C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\resources\static\js\jquery-3.5.1.min.js
  • All web requests go through a controller
    • Static assets live under the static subdirectory
    • Create C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\controller\EchartsController.java
    • The request resolves to a page showing a chart
package com.example.logsparkweb.controller;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;

// @Controller (not @RestController): the return value is a view name,
// which Thymeleaf resolves to src/main/resources/templates/demo.html
@Controller
public class EchartsController {
    //@GetMapping("/echarts")
    @RequestMapping("/echarts")
    public String echarts() {
        return "demo";
    }
}
  • Front-end page
    • Create C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\resources\templates\demo.html
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.thymeleaf.org">
<head>
    <meta charset="UTF-8">
    <title>ECharts Quick Start</title>
    <!-- Include the ECharts library -->
    <script type="application/javascript" th:src="@{js/echarts.js}"></script>
</head>
<body>

<!-- Prepare a DOM container with width and height for ECharts -->
<div id="main" style="width: 600px;height:400px;"></div>

<script type="text/javascript">
    // Initialize an ECharts instance on the prepared DOM element
    var myChart = echarts.init(document.getElementById('main'));

    // Specify the chart configuration and data
    var option = {
        title: {
            text: 'ECharts Quick Start Example'
        },
        tooltip: {},
        legend: {
            data: ['Sales']
        },
        xAxis: {
            data: ["Shirt", "Cardigan", "Chiffon", "Trousers", "Heels", "Socks"]
        },
        yAxis: {},
        series: [{
            name: 'Sales',
            type: 'bar',
            data: [5, 20, 36, 10, 10, 20]
        }]
    };

    // Render the chart using the configuration and data just specified.
    myChart.setOption(option);
</script>
</body>
</html>
  • Add the dependency
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
  • Update application.yml (the thymeleaf block goes under the existing spring: key)
    thymeleaf:
      cache: false
  • Run the main entry-point class
  • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\example\logsparkweb\LogSparkWebApplication.java
  • Result: the "ECharts Quick Start Example" bar chart renders at localhost:9526/log-spark-web/echarts.

5. Spring Data with Redis

  • Add the dependency
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-redis</artifactId>
            <version>2.6.6</version>
        </dependency>
  • Update the configuration (the redis block goes under the existing spring: key)
  • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\resources\application.yml
    redis:
      host: spark000
      port: 6379
      database: 0
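  • With just these three properties, Spring Boot auto-configures the Redis connection factory and a RedisTemplate bean; no Java configuration is needed. A connectivity smoke test (hypothetical, not in the original project) could look like this:
// Hypothetical test method inside a @SpringBootTest class
// (uses the same imports as RedisService/RedisServiceTest below)

@Resource
RedisTemplate redisTemplate;

@Test
public void testPing() {
    String pong = (String) redisTemplate.execute(
            (RedisCallback<String>) con -> con.ping());
    System.out.println(pong);  // expect: PONG
}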
  •  C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\imooc\bigdata\logsparkweb\domain\ProvinceCntDay.java
package com.imooc.bigdata.logsparkweb.domain;

public class ProvinceCntDay {

    private String day;
    private String province;
    private Long cnt;

    public String getDay() {
        return day;
    }

    public String getProvince() {
        return province;
    }

    public Long getCnt() {
        return cnt;
    }

    public void setDay(String day) {
        this.day = day;
    }

    public void setProvince(String province) {
        this.province = province;
    }

    public void setCnt(Long cnt) {
        this.cnt = cnt;
    }

    @Override
    public String toString() {
        return "ProvinceCntDay{" +
                "day='" + day + '\'' +
                ", province='" + province + '\'' +
                ", cnt=" + cnt +
                '}';
    }
}
  • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\main\java\com\imooc\bigdata\logsparkweb\service\RedisService.java
package com.imooc.bigdata.logsparkweb.service;

import com.imooc.bigdata.logsparkweb.domain.ProvinceCntDay;
import org.springframework.data.redis.core.RedisCallback;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Service;

import javax.annotation.Resource;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

@Service
public class RedisService {

    @Resource
    RedisTemplate redisTemplate;

    public List<ProvinceCntDay> query(String day) {

        List<ProvinceCntDay> results = new ArrayList<>();

        // One Redis hash per day: field = province, value = count
        Map<String, String> map = hgetall("day-province-cnts-" + day);

        for(Map.Entry<String,String> entry : map.entrySet()) {
            ProvinceCntDay bean = new ProvinceCntDay();
            bean.setDay(day);
            bean.setProvince(entry.getKey());
            bean.setCnt(Long.parseLong(entry.getValue()));
            results.add(bean);
        }

        return results;
    }

    // HGETALL through a low-level RedisCallback, working with raw byte[]
    // so the default RedisTemplate serializers are bypassed
    private Map<String, String> hgetall(String key) {
        return (Map<String, String>) redisTemplate.execute((RedisCallback<Map<String, String>>) con -> {

            // Fetch every field/value of the hash as raw bytes
            Map<byte[], byte[]> result = con.hGetAll(key.getBytes());

            // Decode the bytes into Strings
            Map<String, String> map = new HashMap<>(result.size());
            for (Map.Entry<byte[], byte[]> entry : result.entrySet()) {
                map.put(new String(entry.getKey()), new String(entry.getValue()));
            }
            return map;
        });
    }
}
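  • A hedged aside (not in the original project): injecting a StringRedisTemplate instead applies String serializers automatically, and the manual byte[] decoding disappears:
// Hypothetical alternative using StringRedisTemplate
// (requires: import org.springframework.data.redis.core.StringRedisTemplate;)

@Resource
StringRedisTemplate stringRedisTemplate;

private Map<Object, Object> hgetallSimple(String key) {
    // entries() issues HGETALL and decodes fields and values as Strings
    return stringRedisTemplate.opsForHash().entries(key);
}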
  • C:\Users\jieqiong\IdeaProjects\log-time\log-spark-web\src\test\java\com\imooc\bigdata\logsparkweb\service\RedisServiceTest.java
package com.imooc.bigdata.logsparkweb.service;

import com.imooc.bigdata.logsparkweb.domain.ProvinceCntDay;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

import javax.annotation.Resource;
import java.util.List;

@RunWith(SpringRunner.class)
@SpringBootTest
public class RedisServiceTest {

    @Resource
    RedisService redisService;

    @Test
    public void testQuery(){
        List<ProvinceCntDay> values = redisService.query("20220405");

        for(ProvinceCntDay value : values) {
            System.out.println(value);
        }
    }

}
  • Run testQuery; the results print to the console. If the hash is empty, seed sample data first, as in the sketch below.
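  • The query expects data written by the Spark streaming job under the key day-province-cnts-20220405. For local testing, a seeding sketch (an assumption, not part of the original pipeline) can populate the hash by hand:
// Hypothetical seeding test inside RedisServiceTest: writes sample province
// counts so that testQuery("20220405") has something to read.
// (requires: import org.springframework.data.redis.core.StringRedisTemplate;)

@Resource
StringRedisTemplate stringRedisTemplate;

@Test
public void testSeed() {
    String key = "day-province-cnts-20220405";
    stringRedisTemplate.opsForHash().put(key, "beijing", "100");
    stringRedisTemplate.opsForHash().put(key, "shanghai", "80");
    stringRedisTemplate.opsForHash().put(key, "guangdong", "120");
}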

6. Deploying the Visualization Project

  • Get everything running end to end first; refine afterwards.
  • The front end is built with React, which wraps ECharts.
  • Download the matching Node.js build from nodejs.org (node-v12.12.0-linux-x64.tar.gz) and place it in /software on the VM
[hadoop@spark000 software]$ tar -zxvf node-v12.12.0-linux-x64.tar.gz -C ~/app/
[hadoop@spark000 app]$ ll
drwxr-xr-x  6 hadoop hadoop  108 Oct 12  2019 node-v12.12.0-linux-x64

[hadoop@spark000 ~]$ vi ~/.bash_profile

export NODEJS_HOME=/home/hadoop/app/node-v12.12.0-linux-x64
export PATH=$NODEJS_HOME/bin:$PATH

[hadoop@spark000 ~]$ source ~/.bash_profile

  • sudo npm i yarn -g
[hadoop@spark000 ~]$ cd source/
[hadoop@spark000 source]$ mkdir user-charts
[hadoop@spark000 source]$ ll
drwxrwxr-x  2 hadoop hadoop        6 Apr 11 15:11 user-charts
[hadoop@spark000 source]$ cd user-charts/
[hadoop@spark000 user-charts]$ sudo npm i yarn -g

> yarn@1.22.18 preinstall /home/hadoop/app/node-v12.12.0-linux-x64/lib/node_modules/yarn
> :; (node ./preinstall.js > /dev/null 2>&1 || true)

/home/hadoop/app/node-v12.12.0-linux-x64/bin/yarn -> /home/hadoop/app/node-v12.12.0-linux-x64/lib/node_modules/yarn/bin/yarn.js
/home/hadoop/app/node-v12.12.0-linux-x64/bin/yarnpkg -> /home/hadoop/app/node-v12.12.0-linux-x64/lib/node_modules/yarn/bin/yarn.js
+ yarn@1.22.18
added 1 package in 2.692s

[hadoop@spark000 user-charts]$ /home/hadoop/app/node-v12.12.0-linux-x64/bin/yarn global add umi

[hadoop@spark000 user-charts]$ npm install

[hadoop@spark000 user-charts]$ npm start


7. Visualizing the Statistical Analysis

  • Front-end and back-end deployment
    • Front end:
      • Project: user-charts
      • Default address after startup: spark000:8000
    • Back end:
      • jar: imooc-spark-web-0.0.1.jar
      • Start command: nohup java -jar imooc-spark-web-0.0.1.jar &
      • Redis address: hadoop000:6379
      • Only the ip-to-hostname mapping file needs changing:
      • add an entry mapping the ip to hadoop000.
      • Front-end / back-end interaction:
        • http://hadoop000:9528/spark-web
  • Start the front end first: spark000:8000
  • Then start the back end: nohup java -jar imooc-spark-web-0.0.1.jar &
  • Make sure Redis is running at hadoop000:6379
  • http://hadoop000:9528/spark-web is the Spring Boot + Spring Data + Redis service; a hedged sketch of what its data endpoint might look like follows below.
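  • The imooc-spark-web source is not shown in this post, so the controller below is only a sketch of what its data endpoint could look like, reusing RedisService and ProvinceCntDay from section 5; the /provinceCnt path and the day parameter are assumptions.
package com.imooc.bigdata.logsparkweb.controller;

import com.imooc.bigdata.logsparkweb.domain.ProvinceCntDay;
import com.imooc.bigdata.logsparkweb.service.RedisService;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import javax.annotation.Resource;
import java.util.List;

// Hypothetical endpoint for the React front end (a sketch only; the shipped
// imooc-spark-web-0.0.1.jar may expose a different API).
@RestController
public class ProvinceStatController {

    @Resource
    RedisService redisService;

    // e.g. GET http://hadoop000:9528/spark-web/provinceCnt?day=20220405
    @GetMapping("/provinceCnt")
    public List<ProvinceCntDay> provinceCnt(@RequestParam String day) {
        return redisService.query(day);
    }
}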