Integrating the IK Analyzer with Elasticsearch
Upload the IK analyzer archive to the plugins directory of your Elasticsearch installation.

Unzip it and rename the resulting directory:

unzip elasticsearch-analysis-ik-6.3.0.zip -d ik-analyzer
Then restart Elasticsearch so the plugin is loaded.
In the Kibana console, run the following request:

POST _analyze
{
  "analyzer": "ik_max_word",
  "text": "我是中国人"
}

Running it returns:

{
  "tokens": [
    { "token": "我",     "start_offset": 0, "end_offset": 1, "type": "CN_CHAR", "position": 0 },
    { "token": "是",     "start_offset": 1, "end_offset": 2, "type": "CN_CHAR", "position": 1 },
    { "token": "中国人", "start_offset": 2, "end_offset": 5, "type": "CN_WORD", "position": 2 },
    { "token": "中国",   "start_offset": 2, "end_offset": 4, "type": "CN_WORD", "position": 3 },
    { "token": "国人",   "start_offset": 3, "end_offset": 5, "type": "CN_WORD", "position": 4 }
  ]
}

As the output shows, ik_max_word produces a fine-grained segmentation: it emits every plausible word it can find in the text, including overlapping ones such as 中国人, 中国, and 国人.
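Client code usually only needs the token values out of this response. A minimal Python sketch of extracting them, using the sample response above as a literal (no live Elasticsearch is required; in real use the JSON would come from an HTTP call to the _analyze endpoint):

```python
import json

# Sample _analyze response, copied from the Kibana output above.
raw = """
{
  "tokens": [
    { "token": "我",     "start_offset": 0, "end_offset": 1, "type": "CN_CHAR", "position": 0 },
    { "token": "是",     "start_offset": 1, "end_offset": 2, "type": "CN_CHAR", "position": 1 },
    { "token": "中国人", "start_offset": 2, "end_offset": 5, "type": "CN_WORD", "position": 2 },
    { "token": "中国",   "start_offset": 2, "end_offset": 4, "type": "CN_WORD", "position": 3 },
    { "token": "国人",   "start_offset": 3, "end_offset": 5, "type": "CN_WORD", "position": 4 }
  ]
}
"""

response = json.loads(raw)

# Pull out just the surface forms, in position order.
tokens = [t["token"] for t in response["tokens"]]
print(tokens)  # ['我', '是', '中国人', '中国', '国人']
```

The overlapping entries (中国人 vs. 中国/国人) are expected with ik_max_word; the plugin's ik_smart analyzer instead returns a single coarse-grained segmentation.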