Remove `delimited_payload_filter` (#27705)

From 7.0 on, using `delimited_payload_filter` should throw an error. 
It was deprecated in 6.2 in favour of `delimited_payload` (#26625).

Relates to #27704
Christoph Büscher 2018-04-05 18:41:04 +02:00 committed by GitHub
parent fb1aba9389
commit 231fd4eb18
GPG Key ID: 4AEE18F83AFDEB23
3 changed files with 17 additions and 30 deletions


@@ -1,20 +1,12 @@
 [[breaking_70_analysis_changes]]
 === Analysis changes
 
-==== The `delimited_payload_filter` is renamed
-
-The `delimited_payload_filter` is renamed to `delimited_payload`, the old name is
-deprecated and will be removed at some point, so it should be replaced by
-`delimited_payload`.
-
 ==== Limiting the number of tokens produced by _analyze
 
 To safeguard against out of memory errors, the number of tokens that can be produced
 using the `_analyze` endpoint has been limited to 10000. This default limit can be changed
 for a particular index with the index setting `index.analyze.max_token_count`.
 
 ==== Limiting the length of an analyzed text during highlighting
 
 Highlighting a text that was indexed without offsets or term vectors,
@@ -22,4 +14,11 @@ requires analysis of this text in memory real time during the search request.
 For large texts this analysis may take substantial amount of time and memory.
 To protect against this, the maximum number of characters that will be analyzed has been
 limited to 1000000. This default limit can be changed
 for a particular index with the index setting `index.highlight.max_analyzed_offset`.
+
+==== `delimited_payload_filter` renaming
+
+The `delimited_payload_filter` was deprecated and renamed to `delimited_payload` in 6.2.
+Using it in indices created before 7.0 will issue deprecation warnings. Using the old
+name in new indices created in 7.0 will throw an error. Use the new name `delimited_payload`
+instead.
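Since the rename leaves the filter's other parameters untouched, migrating existing analysis settings amounts to swapping the `type` value. A minimal sketch of that migration (hypothetical helper, not part of this change; the `delimiter`/`encoding` parameters and filter name are taken from the test below):

```python
def migrate_delimited_payload(analysis_settings):
    """Rewrite custom filter definitions that still use the deprecated
    `delimited_payload_filter` type to the new `delimited_payload` name.
    Other parameters (e.g. `delimiter`, `encoding`) are unchanged by the rename."""
    for filter_def in analysis_settings.get("filter", {}).values():
        if filter_def.get("type") == "delimited_payload_filter":
            filter_def["type"] = "delimited_payload"
    return analysis_settings

# Example: a settings fragment using the deprecated name.
settings = {
    "filter": {
        "my_delimited_payload_filter": {
            "type": "delimited_payload_filter",
            "delimiter": "^",
            "encoding": "identity",
        }
    }
}
migrated = migrate_delimited_payload(settings)
```

After migration, `my_delimited_payload_filter` uses `delimited_payload` and can be created on 7.0 indices without error.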


@@ -32,6 +32,10 @@ public class LegacyDelimitedPayloadTokenFilterFactory extends DelimitedPayloadTo
 
     LegacyDelimitedPayloadTokenFilterFactory(IndexSettings indexSettings, Environment env, String name, Settings settings) {
         super(indexSettings, env, name, settings);
+        if (indexSettings.getIndexVersionCreated().onOrAfter(Version.V_7_0_0_alpha1)) {
+            throw new IllegalArgumentException(
+                "[delimited_payload_filter] is not supported for new indices, use [delimited_payload] instead");
+        }
         if (indexSettings.getIndexVersionCreated().onOrAfter(Version.V_6_2_0)) {
             DEPRECATION_LOGGER.deprecated("Deprecated [delimited_payload_filter] used, replaced by [delimited_payload]");
         }
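The constructor above now gates on the index-creation version twice: indices created on 7.0.0-alpha1 or later get a hard error, indices created on 6.2.0 or later only get a deprecation warning, and older indices are untouched. A sketch of that logic (hypothetical `check_legacy_filter` helper with versions as tuples, not the actual Java code):

```python
def check_legacy_filter(index_created_version):
    """Mirror the version gate in LegacyDelimitedPayloadTokenFilterFactory:
    raise for indices created on 7.0+, warn for 6.2+, accept silently before that."""
    if index_created_version >= (7, 0, 0):
        # Corresponds to the IllegalArgumentException added in this commit.
        raise ValueError(
            "[delimited_payload_filter] is not supported for new indices, "
            "use [delimited_payload] instead")
    if index_created_version >= (6, 2, 0):
        # Corresponds to the DEPRECATION_LOGGER.deprecated(...) call.
        return "deprecation: replaced by [delimited_payload]"
    return "ok"
```

Ordering matters: the error check runs first, so 7.0 indices never fall through to the deprecation branch.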


@@ -1026,15 +1026,13 @@
     - match: { tokens.10.token: ちた }
 
 ---
-"delimited_payload_filter":
+"delimited_payload_filter_error":
     - skip:
-        version: " - 6.1.99"
-        reason: delimited_payload_filter deprecated in 6.2, replaced by delimited_payload
-        features: "warnings"
+        version: " - 6.99.99"
+        reason: using delimited_payload_filter throws error from 7.0 on
     - do:
-        warnings:
-            - "Deprecated [delimited_payload_filter] used, replaced by [delimited_payload]"
+        catch: /\[delimited_payload_filter\] is not supported for new indices, use \[delimited_payload\] instead/
         indices.create:
             index: test
             body:
@@ -1045,29 +1043,15 @@
                         type: delimited_payload_filter
                         delimiter: ^
                         encoding: identity
-    - do:
-        warnings:
-            - "Deprecated [delimited_payload_filter] used, replaced by [delimited_payload]"
-        indices.analyze:
-            index: test
-            body:
-                text: foo^bar
-                tokenizer: keyword
-                filter: [my_delimited_payload_filter]
-    - length: { tokens: 1 }
-    - match: { tokens.0.token: foo }
 
     # Test pre-configured token filter too:
     - do:
-        warnings:
-            - "Deprecated [delimited_payload_filter] used, replaced by [delimited_payload]"
+        catch: /\[delimited_payload_filter\] is not supported for new indices, use \[delimited_payload\] instead/
         indices.analyze:
             body:
                 text: foo|5
                 tokenizer: keyword
                 filter: [delimited_payload_filter]
-    - length: { tokens: 1 }
-    - match: { tokens.0.token: foo }
 
 ---
 "delimited_payload":