mirror of
https://github.com/honeymoose/OpenSearch.git
synced 2025-02-10 06:55:32 +00:00
Other tokenizers, like the standard tokenizer, allow overriding the default maximum token length of 255 using the `max_token_length` parameter. This change makes the parameter available for the whitespace tokenizer as well. The currently allowed range is from 0 to `StandardTokenizer.MAX_TOKEN_LENGTH_LIMIT`, which is 1024 * 1024 = 1048576 characters.

Closes #26643
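A sketch of how the new parameter might be used in index settings (the index and tokenizer names here are illustrative, not from this change):

```json
PUT my-index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "my_whitespace_tokenizer"
        }
      },
      "tokenizer": {
        "my_whitespace_tokenizer": {
          "type": "whitespace",
          "max_token_length": 10
        }
      }
    }
  }
}
```

With this configuration, tokens produced by splitting on whitespace that exceed 10 characters are split at the 10-character limit, mirroring the behavior the standard tokenizer already provides.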