Docs: Corrected behaviour of max_token_length in standard tokenizer
parent 2dffad9ec3
commit dc21ab7576
@@ -13,6 +13,6 @@ type:
 |=======================================================================
 |Setting |Description
 |`max_token_length` |The maximum token length. If a token is seen that
-exceeds this length then it is discarded. Defaults to `255`.
+exceeds this length then it is split at `max_token_length` intervals. Defaults to `255`.
 |=======================================================================
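The corrected behaviour can be illustrated with a small sketch (hypothetical Python, not from the Elasticsearch codebase): a token longer than `max_token_length` is split at `max_token_length` intervals rather than discarded.

```python
def split_long_token(token: str, max_token_length: int = 255) -> list:
    """Split an over-long token at max_token_length intervals,
    mirroring the documented standard-tokenizer behaviour.
    (Illustrative only; the real tokenizer operates on a character stream.)"""
    return [token[i:i + max_token_length]
            for i in range(0, len(token), max_token_length)]

# A 12-character token with max_token_length=5 is split, not dropped:
print(split_long_token("abcdefghijkl", 5))  # → ['abcde', 'fghij', 'kl']
```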