mirror of
https://github.com/honeymoose/OpenSearch.git
synced 2025-03-05 18:39:14 +00:00
Docs: Corrected behaviour of max_token_length in standard tokenizer
This commit is contained in:
parent
2dffad9ec3
commit
dc21ab7576
@@ -13,6 +13,6 @@ type:
 |=======================================================================
 |Setting |Description
 |`max_token_length` |The maximum token length. If a token is seen that
-exceeds this length then it is discarded. Defaults to `255`.
+exceeds this length then it is split at `max_token_length` intervals. Defaults to `255`.
 |=======================================================================
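The corrected line says an over-long token is split at `max_token_length` intervals rather than discarded. A minimal sketch of that splitting rule (the helper name and inputs are hypothetical, not OpenSearch/Lucene code):

```python
def split_long_tokens(tokens, max_token_length=255):
    """Split any token longer than max_token_length into
    chunks of at most max_token_length characters, mirroring
    the documented behaviour of the standard tokenizer."""
    out = []
    for token in tokens:
        # range(...) steps through the token in fixed-size intervals,
        # so an over-long token yields several shorter tokens.
        for i in range(0, len(token), max_token_length):
            out.append(token[i:i + max_token_length])
    return out

# With max_token_length=5, the 9-character token "tokenizer"
# is emitted as "token" followed by "izer".
print(split_long_tokens(["tokenizer"], 5))
```

Tokens at or under the limit pass through unchanged; only the over-long ones are chunked.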