|NO.Z.00014|——————————|BigDataEnd|——|Hadoop & Real-Time Data Warehouse.V14|——|Project.v14|ODS Layer Processing|Writing the Kafka Dimension Tables into the DIM Layer.V2|
I. Implementation: case classes for writing the Kafka dimension tables into the DIM layer
### --- Case class 1: TableObject
package ods
/**
 * Case class that holds MySQL binlog (log_bin) change records.
 * Canal parses the binlog into JSON and sends it to Kafka;
 * the Flink job reads that JSON from Kafka and stores it in this TableObject format.
 */
case class TableObject (database:String, tableName:String, typeInfo: String, dataInfo: String) extends Serializable
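As a hedged illustration (the database, table name, and JSON payload below are made-up values, not from the post), downstream code typically branches on `typeInfo` to decide how to apply a change to the DIM layer:

```scala
// Repeated from the post so this snippet runs on its own.
case class TableObject(database: String, tableName: String, typeInfo: String, dataInfo: String) extends Serializable

// Hypothetical Canal event: an insert into an area dimension table.
val event = TableObject("dwshow", "area_info", "insert", """{"id":"110000","name":"Beijing"}""")

// Canal emits INSERT / UPDATE / DELETE event types; route the change accordingly.
val action = event.typeInfo.toUpperCase match {
  case "INSERT" | "UPDATE" => s"upsert ${event.tableName} row into DIM"
  case "DELETE"            => s"delete ${event.tableName} row from DIM"
  case other               => s"skip $other"
}
println(action) // prints: upsert area_info row into DIM
```

Matching on the upper-cased event type keeps the routing robust to whichever case Canal was configured to emit.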
### --- Case class 2: AreaInfo
package ods
case class AreaInfo(
  id: String,
  name: String,
  pid: String,
  sname: String,
  level: String,
  citycode: String,
  yzcode: String,
  mername: String,
  Lng: String,
  Lat: String,
  pinyin: String
)
### --- Case class 3: DataInfo
package ods
case class DataInfo(
  modifiedTime: String,
  orderNo: String,
  isPay: String,
  orderId: String,
  tradeSrc: String,
  payTime: String,
  productMoney: String,
  totalMoney: String,
  dataFlag: String,
  userId: String,
  areaId: String,
  createTime: String,
  payMethod: String,
  isRefund: String,
  tradeType: String,
  status: String
)
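When such a record is written to HBase, a common convention (an assumption here, not something the post has stated yet) is to use the record's primary key as the rowkey and store each field as a column under one column family. A minimal pure-Scala sketch of that mapping for AreaInfo:

```scala
// Repeated from the post so this snippet runs on its own.
case class AreaInfo(id: String, name: String, pid: String, sname: String,
                    level: String, citycode: String, yzcode: String,
                    mername: String, Lng: String, Lat: String, pinyin: String)

// Hypothetical area record; all values are made up for illustration.
val area = AreaInfo("110000", "Beijing", "0", "Beijing", "1",
                    "010", "100000", "China,Beijing", "116.40", "39.90", "beijing")

// Assumed convention: rowkey = the dim record's primary key.
val rowKey = area.id
// Columns to write under a single column family (e.g. "f1"); a few shown here.
val columns = Map("name" -> area.name, "pid" -> area.pid, "pinyin" -> area.pinyin)
println(s"$rowKey -> $columns")
```

Using the primary key as the rowkey makes later point lookups by id a single HBase `get`.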
II. Implementation: utility classes
### --- Utility class 1: SourceKafka — uses Kafka as the source; Flink acts as the consumer that reads data from Kafka.
package ods
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
class SourceKafka {
  def getKafkaSource(topicName: String): FlinkKafkaConsumer[String] = {
    val props = new Properties()
    props.setProperty("bootstrap.servers", "hadoop01:9092,hadoop02:9092,hadoop03:9092")
    props.setProperty("group.id", "consumer-group")
    props.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("auto.offset.reset", "latest")
    new FlinkKafkaConsumer[String](topicName, new SimpleStringSchema(), props)
  }
}
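SourceKafka can then be wired into a Flink streaming job. A minimal sketch (the topic name `canal_dim_topic` and job name are placeholders, not from the post; this assumes a running Flink environment and Kafka cluster, so it is not standalone-runnable):

```scala
import org.apache.flink.streaming.api.scala._

object DimSourceDemo {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // Topic name is a placeholder; use the topic Canal actually writes to.
    val consumer = new SourceKafka().getKafkaSource("canal_dim_topic")
    consumer.setStartFromEarliest() // replay the full dim table on first start
    val stream: DataStream[String] = env.addSource(consumer)
    stream.print()
    env.execute("dim-source-demo")
  }
}
```

Note that `setStartFromEarliest()` overrides the `auto.offset.reset` property set inside SourceKafka; for a dimension table you usually want to replay all history, while `latest` suits fact streams.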
### --- Utility class 2: ConnHBase
package ods
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory}
import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants}
class ConnHBase {
  def connToHbase: Connection = {
    val conf: Configuration = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "hadoop01,hadoop02,hadoop03")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.setInt(HConstants.HBASE_CLIENT_OPERATION_TIMEOUT, 30000)
    conf.setInt(HConstants.HBASE_CLIENT_SCANNER_TIMEOUT_PERIOD, 30000)
    ConnectionFactory.createConnection(conf)
  }
}
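A hedged sketch of how ConnHBase might be used to upsert one dimension row (the table name `dim_area`, column family `f1`, rowkey, and values are assumptions for illustration; this needs a live HBase cluster and the ConnHBase class above on the classpath, so it is not standalone-runnable):

```scala
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.{Put, Table}
import org.apache.hadoop.hbase.util.Bytes

val conn = new ConnHBase().connToHbase
val table: Table = conn.getTable(TableName.valueOf("dim_area"))
try {
  val put = new Put(Bytes.toBytes("110000")) // rowkey = area id (assumed convention)
  put.addColumn(Bytes.toBytes("f1"), Bytes.toBytes("name"), Bytes.toBytes("Beijing"))
  table.put(put) // HBase put is an upsert: it overwrites the existing cell version
} finally {
  table.close()
  conn.close()
}
```

Closing the `Table` and `Connection` in a `finally` block avoids leaking ZooKeeper sessions when the put fails.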
Walter Savage Landor: "I strove with none, for none was worth my strife. Nature I loved and, next to Nature, Art; I warm'd both hands before the fire of life; It sinks, and I am ready to depart."
——W. S. Landor