[[breaking_70_analysis_changes]]
=== Analysis changes

==== The `delimited_payload_filter` is renamed

The `delimited_payload_filter` has been renamed to `delimited_payload`. The old name is
deprecated and will be removed in a future release, so it should be replaced with
`delimited_payload`.
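
For example, a custom analyzer that previously listed `delimited_payload_filter` in its
filter chain would now reference the new filter name. The index and analyzer names below
are illustrative only:

[source,js]
--------------------------------------------------
PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "payload_analyzer": {
          "type": "custom",
          "tokenizer": "whitespace",
          "filter": [ "delimited_payload" ]
        }
      }
    }
  }
}
--------------------------------------------------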

==== Limiting the number of tokens produced by `_analyze`

To safeguard against out-of-memory errors, the number of tokens that can be produced
using the `_analyze` endpoint has been limited to 10000. This default limit can be changed
for a particular index with the index setting `index.analyze.max_token_count`.
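
The limit can be raised for indices where longer inputs are legitimately expected. A
minimal sketch, assuming a hypothetical index named `my_index` and an illustrative limit
of 20000 tokens; a subsequent `_analyze` request against that index may then produce up
to 20000 tokens before being rejected:

[source,js]
--------------------------------------------------
PUT my_index
{
  "settings": {
    "index.analyze.max_token_count": 20000
  }
}

GET my_index/_analyze
{
  "analyzer": "standard",
  "text": "some text to analyze"
}
--------------------------------------------------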

==== Limiting the length of an analyzed text during highlighting

Highlighting a text that was indexed without offsets or term vectors
requires this text to be analyzed in memory at search time.
For large texts, this analysis may take a substantial amount of time and memory.
To protect against this, the maximum number of characters that will be analyzed has been
limited to 10000. This default limit can be changed
for a particular index with the index setting `index.highlight.max_analyzed_offset`.
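
Requests that would need to analyze a longer text for highlighting fail with an error
unless the limit is raised. A minimal sketch of raising it at index creation time,
assuming a hypothetical index named `my_index` and an illustrative limit of
60000 characters:

[source,js]
--------------------------------------------------
PUT my_index
{
  "settings": {
    "index.highlight.max_analyzed_offset": 60000
  }
}
--------------------------------------------------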