OpenSearch/docs/reference/analysis/tokenizers
Christoph Büscher 3827918417 Add configurable `maxTokenLength` parameter to whitespace tokenizer (#26749)
Other tokenizers, such as the standard tokenizer, allow overriding the default
maximum token length of 255 via the `max_token_length` parameter. This change
makes the same parameter available on the whitespace tokenizer. The currently
allowed range is 0 to StandardTokenizer.MAX_TOKEN_LENGTH_LIMIT,
which is 1024 * 1024 = 1048576 characters.

Closes #26643
2017-09-25 17:21:19 +02:00
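
A minimal sketch of how the parameter added by this commit might be set on a custom whitespace tokenizer; the index name `my_index`, the analyzer and tokenizer names, and the value 10 are illustrative, not taken from the commit:

    PUT my_index
    {
      "settings": {
        "analysis": {
          "analyzer": {
            "my_analyzer": {
              "tokenizer": "my_tokenizer"
            }
          },
          "tokenizer": {
            "my_tokenizer": {
              "type": "whitespace",
              "max_token_length": 10
            }
          }
        }
      }
    }

With a setting like this, a token longer than `max_token_length` is split at that length, so a 25-character run of non-whitespace characters would be emitted as three tokens rather than one.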
classic-tokenizer.asciidoc Remove wait_for_status=yellow from the docs 2016-07-15 16:02:07 -04:00
edgengram-tokenizer.asciidoc Add a shard filter search phase to pre-filter shards based on query rewriting (#25658) 2017-07-12 22:19:20 +02:00
keyword-tokenizer.asciidoc Docs: Improved tokenizer docs (#18356) 2016-05-19 19:42:23 +02:00
letter-tokenizer.asciidoc Docs: Improved tokenizer docs (#18356) 2016-05-19 19:42:23 +02:00
lowercase-tokenizer.asciidoc Update lowercase-tokenizer.asciidoc (#21896) 2016-12-02 10:49:51 -05:00
ngram-tokenizer.asciidoc Remove wait_for_status=yellow from the docs 2016-07-15 16:02:07 -04:00
pathhierarchy-tokenizer.asciidoc Remove wait_for_status=yellow from the docs 2016-07-15 16:02:07 -04:00
pattern-tokenizer.asciidoc [Docs] Fix typo in pattern-tokenizer.asciidoc (#25626) 2017-07-13 18:43:48 +02:00
simplepattern-tokenizer.asciidoc Update experimental labels in the docs (#25727) 2017-07-18 14:06:22 +02:00
simplepatternsplit-tokenizer.asciidoc Update experimental labels in the docs (#25727) 2017-07-18 14:06:22 +02:00
standard-tokenizer.asciidoc Remove wait_for_status=yellow from the docs 2016-07-15 16:02:07 -04:00
thai-tokenizer.asciidoc Docs: Improved tokenizer docs (#18356) 2016-05-19 19:42:23 +02:00
uaxurlemail-tokenizer.asciidoc Remove wait_for_status=yellow from the docs 2016-07-15 16:02:07 -04:00
whitespace-tokenizer.asciidoc Add configurable `maxTokenLength` parameter to whitespace tokenizer (#26749) 2017-09-25 17:21:19 +02:00