Elasticsearch -> Kibana: index operations (create, delete, update, query)

I. Create an index
PUT /test_001
{
  "settings": {
    "index": {
      "max_result_window": 1000000
    },
    "analysis": {
      "analyzer": {
        "ik_max_word": {
          "tokenizer": "ik_max_word",
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        },
        "comma": {
          "type": "pattern",
          "pattern": ","
        },
        "ngram": {
          "tokenizer": "ngram",
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        }
      },
      "tokenizer": {
        "ngram": {
          "type": "ngram",
          "token_chars": [
            "letter",
            "digit",
            "punctuation",
            "symbol"
          ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "id": {
        "type": "integer",
        "fields": {
          "keyword": {
            "type": "text",
            "analyzer": "ngram"
          }
        }
      },
      "goodsName": {
        "type": "text",
        "analyzer": "ik_max_word",
        "fields": {
          "keyword": {
            "type": "text",
            "analyzer": "ngram"
          }
        }
      },
      "shopId": {
        "type": "integer"
      },
      "putSaleTime": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      },
      "isDelete": {
        "type": "boolean"
      },
      "couponIds": {
        "type": "text",
        "analyzer": "comma"
      }
    }
  }
}
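The custom analyzers defined above (comma, ngram) exist only inside this index, so they can be tried out with the index-scoped _analyze API. A quick sanity check, using illustrative sample text:

GET /test_001/_analyze
{
  "analyzer": "comma",
  "text": "101,102,103"
}

GET /test_001/_analyze
{
  "analyzer": "ngram",
  "text": "abc123"
}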



1. Specifying an analyzer
standard: built-in analyzer; splits Chinese text into single characters
ik_smart: IK plugin; coarse-grained segmentation (fewest, longest tokens)
ik_max_word: IK plugin; fine-grained segmentation (produces every word the dictionary can match)

POST _analyze
{
  "analyzer": "standard",
  "text": "没毛病"
}

POST _analyze
{
  "analyzer": "ik_smart",
  "text": "没毛病"
}

POST _analyze
{
  "analyzer": "ik_max_word",
  "text": "没毛病"
}
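_analyze can also be pointed at a mapped field of an index instead of a named analyzer; it then uses whatever analyzer the mapping assigns to that field (goodsName is mapped to ik_max_word above):

GET /test_001/_analyze
{
  "field": "goodsName",
  "text": "没毛病"
}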
II. Modify an index
1. Add a new field to the index
PUT /test_001/_mapping
{
  "properties": {
    "actArea": {
      "type": "double"
    }
  }
}
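Only new fields can be added this way; the type of an existing field cannot be changed in place (that requires creating a new index and reindexing). To confirm the actArea field was added:

GET /test_001/_mapping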


III. View indexes
1. List all indexes (status)
GET _cat/indices
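Appending v to any _cat endpoint prints column headers, which makes the status listing easier to read:

GET _cat/indices?v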


2. View all indexes (details)
GET _all


3. View a specified index
GET test_001
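If only part of the definition is needed, the settings (including the analysis section) can be fetched on their own:

GET test_001/_settings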


IV. Delete an index
DELETE test_001
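Deletion is irreversible. To verify the index is gone, or to guard against deleting one that does not exist, an existence check returns 200 while the index exists and 404 afterwards:

HEAD test_001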

Copyright notice: this is an original article by the CSDN blogger 「小仙。」, licensed under CC 4.0 BY-SA. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/weixin_43453386/article/details/108670186
