Docs: Corrected behaviour of max_token_length in standard tokenizer

This commit is contained in:
Clinton Gormley 2016-03-18 10:57:16 +01:00
parent 2dffad9ec3
commit dc21ab7576
1 changed file with 1 addition and 1 deletion


@@ -13,6 +13,6 @@ type:
|=======================================================================
|Setting |Description
|`max_token_length` |The maximum token length. If a token is seen that
-exceeds this length then it is discarded. Defaults to `255`.
+exceeds this length then it is split at `max_token_length` intervals. Defaults to `255`.
|=======================================================================
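The corrected behaviour can be illustrated with a small sketch. This is not Elasticsearch code, just a hypothetical Python model of what the docs now describe: an overlong token is chopped at `max_token_length` intervals rather than discarded.

```python
def split_long_token(token, max_token_length=255):
    # Model of the documented standard-tokenizer behaviour:
    # a token exceeding max_token_length is split at
    # max_token_length intervals, not discarded.
    return [token[i:i + max_token_length]
            for i in range(0, len(token), max_token_length)]

# With max_token_length=5, a 16-character token yields four pieces:
print(split_long_token("thequickbrownfox", 5))
# ['thequ', 'ickbr', 'ownfo', 'x']
```

Under the old (incorrect) wording, the entire token would simply have been dropped; the fixed description matches the splitting behaviour shown above.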