095c34359f
The `edge_ngram` tokenizer limits tokens to the `max_gram` character length, so autocomplete searches for terms longer than this limit return no results. To prevent this, you can apply the `truncate` token filter to the search analyzer to shorten search terms to the `max_gram` length. However, this can return irrelevant results. This commit adds advisory text to make users aware of this limitation and to outline the tradeoffs of each approach. Closes #48956.
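For context, here is a minimal sketch of the kind of setup the advisory text describes: an index-time `edge_ngram` tokenizer capped at `max_gram`, paired with a search analyzer that uses a `truncate` filter of the same length. The index, analyzer, and filter names (`my_index`, `autocomplete`, `autocomplete_search`, `truncate_10`) are illustrative, not part of this commit.

```json
PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "autocomplete": {
          "tokenizer": "autocomplete_tokenizer",
          "filter": ["lowercase"]
        },
        "autocomplete_search": {
          "tokenizer": "lowercase",
          "filter": ["truncate_10"]
        }
      },
      "tokenizer": {
        "autocomplete_tokenizer": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 10,
          "token_chars": ["letter"]
        }
      },
      "filter": {
        "truncate_10": {
          "type": "truncate",
          "length": 10
        }
      }
    }
  }
}
```

Without the `truncate_10` filter, a query term longer than 10 characters cannot match any indexed edge n-gram; with it, the term is cut to 10 characters before matching, which may surface looser, less relevant hits. That tradeoff is what the added documentation text explains.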