|NO.Z.00014|——————————|BigDataEnd|——|Hadoop & Real-time Data Warehouse.V14|——|Project.v14|ODS Layer Processing|Writing Dimension Tables from Kafka to the DIM Layer.V2|

一、Implementing the case classes: writing the dimension tables from Kafka to the DIM layer
### --- Case class 1: TableObject

package ods

/**
 * Case class holding a record from the MySQL binlog.
 * Canal converts the binlog to JSON and publishes it to Kafka;
 * the Flink job reads that JSON from Kafka and stores it in this
 * TableObject format.
 */
case class TableObject (database:String, tableName:String, typeInfo: String, dataInfo: String) extends Serializable
### --- Case class 2: AreaInfo

package ods

case class AreaInfo(
                     id: String,
                     name: String,
                     pid: String,
                     sname: String,
                     level: String,
                     citycode: String,
                     yzcode: String,
                     mername: String,
                     Lng: String,
                     Lat: String,
                     pinyin: String
                   )
### --- Case class 3: DataInfo

package ods

case class DataInfo(
                     modifiedTime: String,
                     orderNo: String,
                     isPay: String,
                     orderId: String,
                     tradeSrc: String,
                     payTime: String,
                     productMoney: String,
                     totalMoney: String,
                     dataFlag: String,
                     userId: String,
                     areaId: String,
                     createTime: String,
                     payMethod: String,
                     isRefund: String,
                     tradeType: String,
                     status: String
                   )
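To make the mapping concrete, here is a minimal, self-contained sketch of how the top-level fields of a Canal JSON message line up with `TableObject`. The `extractValue` regex helper and the sample message are illustrative assumptions only; a real job would parse the message with a JSON library such as FastJSON, and the `data` payload would then be mapped onto `AreaInfo` or `DataInfo` the same way.

```scala
object CanalMessageDemo {
  // Redeclared locally so this snippet is self-contained; mirrors the
  // case class defined above.
  case class TableObject(database: String, tableName: String,
                         typeInfo: String, dataInfo: String) extends Serializable

  // Hypothetical helper: pull one top-level string field out of a flat
  // JSON object with a regex. For illustration only -- not a JSON parser.
  def extractValue(json: String, key: String): String = {
    val pattern = ("\"" + key + "\"\\s*:\\s*\"([^\"]*)\"").r
    pattern.findFirstMatchIn(json).map(_.group(1)).getOrElse("")
  }

  def main(args: Array[String]): Unit = {
    // Sample Canal-style message; database/table names are made up.
    val msg = """{"database":"dwshow","table":"lagou_area","type":"INSERT","data":"..."}"""
    val obj = TableObject(
      extractValue(msg, "database"),
      extractValue(msg, "table"),
      extractValue(msg, "type"),
      extractValue(msg, "data"))
    println(obj.database + " " + obj.tableName + " " + obj.typeInfo)
  }
}
```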
二、Implementing the utility classes
### --- Utility class 1: SourceKafka — use Kafka as the source, with Flink acting as the consumer that reads data from it.

package ods

import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

class SourceKafka {
  def getKafkaSource(topicName: String): FlinkKafkaConsumer[String] = {
    val props = new Properties()
    props.setProperty("bootstrap.servers", "hadoop01:9092,hadoop02:9092,hadoop03:9092")
    props.setProperty("group.id", "consumer-group")
    // The key/value deserializer properties are redundant here -- the
    // FlinkKafkaConsumer deserializes through the SimpleStringSchema passed
    // below -- but they are harmless to keep.
    props.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("auto.offset.reset", "latest")

    new FlinkKafkaConsumer[String](topicName, new SimpleStringSchema(), props)
  }
}
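A minimal sketch of wiring this source into a Flink streaming job. The topic name `dim_topic` and the job name are assumptions; substitute the topic that Canal actually publishes to. This requires a running Kafka cluster reachable at the brokers configured above.

```scala
package ods

// Flink's Scala API import also brings in the implicit TypeInformation
// instances needed by addSource.
import org.apache.flink.streaming.api.scala._

object SourceKafkaDemo {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // "dim_topic" is a hypothetical topic name for this sketch.
    val kafkaConsumer = new SourceKafka().getKafkaSource("dim_topic")
    // Re-read history when bootstrapping the DIM layer for the first time.
    kafkaConsumer.setStartFromEarliest()

    val stream: DataStream[String] = env.addSource(kafkaConsumer)
    stream.print()

    env.execute("kafka-source-demo")
  }
}
```

Note that `setStartFromEarliest()` overrides the `auto.offset.reset` property for this consumer; use `setStartFromLatest()` once the DIM layer has been backfilled.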
### --- Utility class 2: ConnHBase

package ods

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory}
import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants}

class ConnHBase {
  def connToHbase: Connection = {
    val conf: Configuration = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "hadoop01,hadoop02,hadoop03")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    // Fail fast (30 s) instead of hanging when HBase is unreachable.
    conf.setInt(HConstants.HBASE_CLIENT_OPERATION_TIMEOUT, 30000)
    conf.setInt(HConstants.HBASE_CLIENT_SCANNER_TIMEOUT_PERIOD, 30000)
    ConnectionFactory.createConnection(conf)
  }
}
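A sketch of using the connection to write one dimension row into HBase. The table name `dim_lagou_area`, the column family `f`, and the sample row are assumptions for illustration; the table must exist first (e.g. `create 'dim_lagou_area', 'f'` in the HBase shell), and a reachable HBase cluster is required.

```scala
package ods

import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.{Connection, Put, Table}
import org.apache.hadoop.hbase.util.Bytes

object ConnHBaseDemo {
  def main(args: Array[String]): Unit = {
    val conn: Connection = new ConnHBase().connToHbase
    // Hypothetical DIM-layer table; create it in the HBase shell beforehand.
    val table: Table = conn.getTable(TableName.valueOf("dim_lagou_area"))
    try {
      // Row key: the area id; column family "f" holds the dimension fields.
      val put = new Put(Bytes.toBytes("370203"))
      put.addColumn(Bytes.toBytes("f"), Bytes.toBytes("name"), Bytes.toBytes("Shibei District"))
      table.put(put)
    } finally {
      table.close()
      conn.close()
    }
  }
}
```

Using the area id as the row key makes later point lookups from the fact stream a single `Get`, which is the usual access pattern for a DIM table in HBase.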
posted on yanqi_vip