From dc21ab75768ac9259ba8bf72d2d878e4e476de5a Mon Sep 17 00:00:00 2001
From: Clinton Gormley
Date: Fri, 18 Mar 2016 10:57:16 +0100
Subject: [PATCH] Docs: Corrected behaviour of max_token_length in standard
 tokenizer

---
 docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc b/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc
index c8b405bf820..42dbe5a864a 100644
--- a/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc
+++ b/docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc
@@ -13,6 +13,6 @@ type:
 |=======================================================================
 |Setting |Description
 |`max_token_length` |The maximum token length. If a token is seen that
-exceeds this length then it is discarded. Defaults to `255`.
+exceeds this length then it is split at `max_token_length` intervals. Defaults to `255`.
 |=======================================================================
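Not part of the patch itself, but a minimal sketch of the corrected behaviour, in the `_analyze`-request style the Elasticsearch docs use. The index name `my_index`, the tokenizer/analyzer names, and the limit of `5` are illustrative choices, not taken from the patch:

[source,console]
----
PUT my_index
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "my_tokenizer": {
          "type": "standard",
          "max_token_length": 5   <-- illustrative low limit to make the split visible
        }
      },
      "analyzer": {
        "my_analyzer": {
          "type": "custom",
          "tokenizer": "my_tokenizer"
        }
      }
    }
  }
}

POST my_index/_analyze
{
  "analyzer": "my_analyzer",
  "text": "jumped"
}
----

With the limit set to `5`, the six-character token `jumped` comes back as the two tokens `jumpe` and `d`, i.e. it is split at `max_token_length` intervals rather than discarded, which is exactly what the one-line doc fix above describes.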