| Parameter | Required | Default | Description |
| --- | --- | --- | --- |
| hoodie.datasource.hive_sync.mode | Y | N/A | |
| hoodie.datasource.write.partitionpath.field | Y | N/A | |
| hoodie.datasource.write.precombine.field | N | ts | |
| hoodie.datasource.write.recordkey.field | Y | N/A | |
| hoodie.datasource.write.table.type | N | COPY_ON_WRITE | |
| hoodie.datasource.write.insert.drop.duplicates | N | false | If set to true, all duplicate records are filtered out on insert |
| hoodie.sql.insert.mode | N | upsert | |
| hoodie.sql.bulk.insert.enable | N | false | |
| hoodie.datasource.write.table.name | Y | N/A | |
| hoodie.datasource.write.operation | N | upsert | |
| hoodie.datasource.write.payload.class | N | org.apache.hudi.common.model.OverwriteWithLatestAvroPayload for Spark; org.apache.hudi.common.model.EventTimeAvroPayload for Flink | |
| hoodie.datasource.write.partitionpath.urlencode | N | false | |
| hoodie.datasource.hive_sync.partition_fields | N | N/A | |
| hoodie.datasource.hive_sync.auto_create_database | N | true | Automatically create the database if it does not exist |
| hoodie.datasource.hive_sync.database | N | default | |
| hoodie.datasource.hive_sync.table | N | unknown | |
| hoodie.datasource.hive_sync.username | N | hive | |
| hoodie.datasource.hive_sync.password | N | hive | |
| hoodie.datasource.hive_sync.enable | N | false | |
| hoodie.datasource.hive_sync.ignore_exceptions | N | false | |
| hoodie.datasource.hive_sync.use_jdbc | N | true | |
| hoodie.datasource.hive_sync.jdbcurl | N | jdbc:hive2://localhost:10000 | Hive JDBC URL |
| hoodie.datasource.hive_sync.metastore.uris | N | thrift://localhost:9083 | Hive metastore URI |
| hoodie.datasource.hive_sync.base_file_format | N | PARQUET | |
| hoodie.datasource.hive_sync.support_timestamp | N | false | |
| hoodie.datasource.meta.sync.enable | N | false | |
| hoodie.clustering.inline | N | false | |
| hoodie.datasource.write.partitions.to.delete | Y | N/A | Comma-separated list of partitions to delete; supports the * wildcard |
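The options above are typically passed as string key/value pairs to the Spark datasource writer. A minimal sketch of how they might be assembled is shown below; the table name, record key, partition field, and base path are hypothetical placeholders, not values from this document.

```python
# Sketch: assembling Hudi write options for a Spark datasource write.
# "orders", "order_id", "dt", and the save path are hypothetical examples.
hudi_options = {
    # Required options (no defaults)
    "hoodie.datasource.write.table.name": "orders",         # hypothetical table name
    "hoodie.datasource.write.recordkey.field": "order_id",  # hypothetical record key
    "hoodie.datasource.write.partitionpath.field": "dt",    # hypothetical partition field
    # Optional options, shown here at their documented defaults
    "hoodie.datasource.write.precombine.field": "ts",
    "hoodie.datasource.write.table.type": "COPY_ON_WRITE",
    "hoodie.datasource.write.operation": "upsert",
    "hoodie.datasource.hive_sync.enable": "false",
}

# With a SparkSession and a DataFrame `df` in scope, the write itself
# would look roughly like this (not executed here):
# (df.write.format("hudi")
#    .options(**hudi_options)
#    .mode("append")
#    .save("/tmp/hudi/orders"))
```

Options are plain strings on both sides, so booleans and class names are passed as their string forms exactly as listed in the defaults column.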