ES custom analyzer
index:
  analysis:
    analyzer:
      descAnalyzer:
        type: custom
        char_filter: [html_strip]
        tokenizer: standard
        filter: [lowercase, stop, porter_stem, word_delimiter]

Notes: a custom analyzer takes exactly ONE tokenizer; standard, letter, whitespace, and uax_url_email are alternatives to choose from, not a list. Filter names corrected: stop (not "stop words"), porter_stem (not porterStem), word_delimiter. A synonym filter can also be added, but it must first be defined as its own filter with a synonyms list (or synonyms file).
Append this to the bottom of elasticsearch.yml. Note that defining index analysis in elasticsearch.yml only works in old Elasticsearch versions; since 5.x, index-level settings like this must be supplied per index at creation time.
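For modern Elasticsearch, the same analyzer can be defined through the create-index API. A minimal sketch, assuming a hypothetical index name `my_index` (the analyzer, filter, and char_filter names mirror the config above):

```json
PUT /my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "descAnalyzer": {
          "type": "custom",
          "char_filter": ["html_strip"],
          "tokenizer": "standard",
          "filter": ["lowercase", "stop", "porter_stem", "word_delimiter"]
        }
      }
    }
  }
}
```

Once the index exists, the analyzer can be sanity-checked with the _analyze API, e.g. `GET /my_index/_analyze` with a body of `{"analyzer": "descAnalyzer", "text": "some <b>sample</b> text"}`, which returns the token stream produced by the chain.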
定, 精, 简, 俭 (steady, precise, concise, frugal)