diff --git a/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc b/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc
index c8b405bf820..42dbe5a864a 100644
--- a/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc
+++ b/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc
@@ -13,6 +13,6 @@ type:
 |=======================================================================
 |Setting |Description
 |`max_token_length` |The maximum token length. If a token is seen that
-exceeds this length then it is discarded. Defaults to `255`.
+exceeds this length then it is split at `max_token_length` intervals. Defaults to `255`.
 |=======================================================================
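The corrected wording above says an over-long token is split at `max_token_length` intervals rather than discarded. The following is a minimal illustrative sketch of that splitting behavior in Python; it is not the actual Lucene `StandardTokenizer` implementation, and `split_long_token` is a hypothetical helper name chosen for this example.

```python
def split_long_token(token: str, max_token_length: int = 255) -> list[str]:
    """Illustrative only (not Lucene's implementation): emit an over-long
    token as consecutive chunks of at most max_token_length characters,
    instead of discarding it."""
    return [token[i:i + max_token_length]
            for i in range(0, len(token), max_token_length)]

# A 12-character token with max_token_length=5 is emitted as three tokens.
print(split_long_token("abcdefghijkl", 5))  # ['abcde', 'fghij', 'kl']
```

With the default of `255`, a 600-character token would be emitted as tokens of 255, 255, and 90 characters.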