Mirror of https://github.com/honeymoose/OpenSearch.git (synced 2025-03-25 01:19:02 +00:00)
Add a new cluster setting `search.allow_expensive_queries` which by default is `true`. If set to `false`, certain queries that usually have slow performance cannot be executed and an error message is returned.

- Queries that need to do linear scans to identify matches:
  - Script queries
- Queries that have a high up-front cost:
  - Fuzzy queries
  - Regexp queries
  - Prefix queries (without index_prefixes enabled)
  - Wildcard queries
  - Range queries on text and keyword fields
- Joining queries
  - HasParent queries
  - HasChild queries
  - ParentId queries
  - Nested queries
- Queries on deprecated 6.x geo shapes (using the PrefixTree implementation)
- Queries that may have a high per-document cost:
  - Script score queries
  - Percolate queries

Closes: #29050
(cherry picked from commit a8b39ed842c7770bd9275958c9f747502fd9a3ea)
This commit is contained in:
parent 40b58e612d
commit dac720d7a1
@@ -252,6 +252,10 @@ between index size and a reasonable level of precision of 50m at the
equator. This allows for indexing tens of millions of shapes without
overly bloating the resulting index too much relative to the input size.

[NOTE]
Geo-shape queries on geo-shapes implemented with PrefixTrees will not be executed if
<<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>> is set to false.

[[input-structure]]
[float]
==== Input Structure
@@ -25,6 +25,27 @@ or to alter their behaviour (such as the
Query clauses behave differently depending on whether they are used in
<<query-filter-context,query context or filter context>>.

[[query-dsl-allow-expensive-queries]]
Allow expensive queries::
Certain types of queries will generally execute slowly due to the way they are implemented, which can affect
the stability of the cluster. Those queries can be categorised as follows:
* Queries that need to do linear scans to identify matches:
** <<query-dsl-script-query, `script queries`>>
* Queries that have a high up-front cost:
** <<query-dsl-fuzzy-query,`fuzzy queries`>>
** <<query-dsl-regexp-query,`regexp queries`>>
** <<query-dsl-prefix-query,`prefix queries`>> without <<index-prefixes, `index_prefixes`>>
** <<query-dsl-wildcard-query, `wildcard queries`>>
** <<query-dsl-range-query, `range queries`>> on <<text, `text`>> and <<keyword, `keyword`>> fields
* <<joining-queries, `Joining queries`>>
* Queries on <<prefix-trees, deprecated geo shapes>>
* Queries that may have a high per-document cost:
** <<query-dsl-script-score-query, `script score queries`>>
** <<query-dsl-percolate-query, `percolate queries`>>

The execution of such queries can be prevented by setting the value of the `search.allow_expensive_queries`
setting to `false` (defaults to `true`).
--

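For illustration, the setting can be changed dynamically through the cluster settings API. The sketch below mirrors the YAML REST tests added later in this commit; clearing the transient value restores the default of `true`:

  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: "false"

  # Setting the transient value back to null restores the default (true).
  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: null
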
include::query-dsl/query_filter_context.asciidoc[]
@@ -51,4 +72,4 @@ include::query-dsl/minimum-should-match.asciidoc[]

include::query-dsl/multi-term-rewrite.asciidoc[]

include::query-dsl/regexp-syntax.asciidoc[]
@@ -97,4 +97,8 @@ adjacent characters (ab → ba). Defaults to `true`.

`rewrite`::
(Optional, string) Method used to rewrite the query. For valid values and more
information, see the <<query-dsl-multi-term-rewrite, `rewrite` parameter>>.

==== Notes
Fuzzy queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.
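As a sketch of the resulting behaviour, the YAML REST test added in this commit (which assumes an index `test` with a `text` field named `text`) expects a fuzzy query to be rejected once the setting is `false`:

  - do:
      catch: /\[fuzzy\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            fuzzy:
              text:
                value: outwide
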
@@ -161,3 +161,7 @@ and will not match any documents for this query. This can be useful when
querying multiple indexes which might have different mappings. When set to
`false` (the default value) the query will throw an exception if the field
is not mapped.

==== Notes
Geo-shape queries on geo-shapes implemented with <<prefix-trees, `PrefixTrees`>> will not be executed if
<<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>> is set to false.
@@ -29,4 +29,7 @@ include::has-parent-query.asciidoc[]

include::parent-id-query.asciidoc[]


=== Notes
==== Allow expensive queries
Joining queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.
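For example, the REST test added in this commit expects a `has_child` query to fail with a `[joining]` error while the setting is `false`:

  - do:
      catch: /\[joining\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        body: { "query": { "has_child": { "type": "child", "query": { "match_all": {} }, "inner_hits": {} } } }
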
@@ -693,3 +693,8 @@ being percolated, as opposed to a single index as we do in examples. There are a
allows for fields to be stored in a denser, more efficient way.
- Percolate queries do not scale in the same way as other queries, so percolation performance may benefit from using
a different index configuration, like the number of primary shards.

=== Notes
==== Allow expensive queries
Percolate queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.
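A hedged sketch of the behaviour, assuming an index whose mapping contains a `percolator` field named `query` and a keyword field `field1` (as in the integration test later in this commit); the error text matches the one asserted there:

  - do:
      catch: /\[percolate\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            percolate:
              field: query
              document:
                field1: value
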
@@ -64,4 +64,10 @@ GET /_search
You can speed up prefix queries using the <<index-prefixes,`index_prefixes`>>
mapping parameter. If enabled, {es} indexes prefixes between 2 and 5
characters in a separate field. This lets {es} run prefix queries more
efficiently at the cost of a larger index.

[[prefix-query-allow-expensive-queries]]
===== Allow expensive queries
Prefix queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false. However, if <<index-prefixes, `index_prefixes`>> are enabled, an optimised query is built which
is not considered slow, and will be executed in spite of this setting.
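As a sketch of the optimised path (the index name `test_prefixed` is illustrative and not taken from this commit), a `text` field mapped with `index_prefixes` keeps accepting prefix queries even while the setting is `false`:

  - do:
      indices.create:
        index: test_prefixed
        body:
          mappings:
            properties:
              text:
                type: text
                index_prefixes: {}   # prefixes of 2-5 characters indexed in a separate field

  # Executed even when search.allow_expensive_queries is false, because the
  # prefix query is rewritten against the dedicated prefix field.
  - do:
      search:
        index: test_prefixed
        body:
          query:
            prefix:
              text:
                value: out
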
@@ -537,3 +537,9 @@ The example above creates a boolean query:
`(blended(terms:[field2:this, field1:this]) blended(terms:[field2:that, field1:that]) blended(terms:[field2:thus, field1:thus]))~2`

that matches documents with at least two of the three per-term blended queries.

==== Notes
===== Allow expensive queries
Query string query can internally be transformed to a <<query-dsl-prefix-query, `prefix query`>> which means
that if the prefix queries are disabled as explained <<prefix-query-allow-expensive-queries, here>> the query will not be
executed and an exception will be thrown.
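As an illustrative sketch (the field name `text` and the trailing-wildcard pattern are assumptions, not taken from this commit), a `query_string` query whose term is rewritten to a prefix query is rejected with the prefix-query error:

  - do:
      catch: /\[prefix\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false. For optimised prefix queries on text fields please enable \[index_prefixes\]./
      search:
        index: test
        body:
          query:
            query_string:
              query: "out*"
              default_field: text
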
@@ -134,6 +134,11 @@ increases the relevance score.
[[range-query-notes]]
==== Notes

[[ranges-on-text-and-keyword]]
===== Using the `range` query with `text` and `keyword` fields
Range queries on <<text, `text`>> or <<keyword, `keyword`>> fields will not be executed if
<<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>> is set to false.
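For example, the REST test added in this commit expects a `range` query on a `keyword` sub-field to be rejected while the setting is `false`; only `text` and `keyword` fields are affected by this restriction:

  - do:
      catch: /\[range\] queries on \[text\] or \[keyword\] fields cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            range:
              text.raw:
                gte: "Outside it is cold and wet"
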

[[ranges-on-dates]]
===== Using the `range` query with `date` fields
@@ -86,3 +86,8 @@ regular expressions.
`rewrite`::
(Optional, string) Method used to rewrite the query. For valid values and more
information, see the <<query-dsl-multi-term-rewrite, `rewrite` parameter>>.

==== Notes
===== Allow expensive queries
Regexp queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.
@@ -69,3 +69,7 @@ GET /_search
    }
}
----

===== Allow expensive queries
Script queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.
@@ -221,6 +221,10 @@ and default time zone. Also calculations with `now` are not supported.
<<vector-functions, Functions for vector fields>> are accessible through
`script_score` query.

===== Allow expensive queries
Script score queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.

[[script-score-faster-alt]]
===== Faster alternatives
The `script_score` query calculates the score for
@@ -67,4 +67,9 @@ increases the relevance score.

`rewrite`::
(Optional, string) Method used to rewrite the query. For valid values and more information, see the
<<query-dsl-multi-term-rewrite, `rewrite` parameter>>.

==== Notes
===== Allow expensive queries
Wildcard queries will not be executed if <<query-dsl-allow-expensive-queries, `search.allow_expensive_queries`>>
is set to false.
@@ -26,6 +26,7 @@ import org.apache.lucene.search.PrefixQuery;
import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.index.mapper.SearchAsYouTypeFieldMapper.Defaults;
import org.elasticsearch.index.mapper.SearchAsYouTypeFieldMapper.PrefixFieldType;
import org.elasticsearch.index.mapper.SearchAsYouTypeFieldMapper.SearchAsYouTypeFieldType;
@@ -100,14 +101,19 @@ public class SearchAsYouTypeFieldTypeTests extends FieldTypeTestCase {

        // this term should be a length that can be rewriteable to a term query on the prefix field
        final String withinBoundsTerm = "foo";
        assertThat(fieldType.prefixQuery(withinBoundsTerm, CONSTANT_SCORE_REWRITE, null),
        assertThat(fieldType.prefixQuery(withinBoundsTerm, CONSTANT_SCORE_REWRITE, randomMockShardContext()),
            equalTo(new ConstantScoreQuery(new TermQuery(new Term(PREFIX_NAME, withinBoundsTerm)))));

        // our defaults don't allow a situation where a term can be too small

        // this term should be too long to be rewriteable to a term query on the prefix field
        final String longTerm = "toolongforourprefixfieldthistermis";
        assertThat(fieldType.prefixQuery(longTerm, CONSTANT_SCORE_REWRITE, null),
        assertThat(fieldType.prefixQuery(longTerm, CONSTANT_SCORE_REWRITE, MOCK_QSC),
            equalTo(new PrefixQuery(new Term(NAME, longTerm))));

        ElasticsearchException ee = expectThrows(ElasticsearchException.class,
            () -> fieldType.prefixQuery(longTerm, CONSTANT_SCORE_REWRITE, MOCK_QSC_DISALLOW_EXPENSIVE));
        assertEquals("[prefix] queries cannot be executed when 'search.allow_expensive_queries' is set to false. " +
            "For optimised prefix queries on text fields please enable [index_prefixes].", ee.getMessage());
    }
}
@@ -27,6 +27,7 @@ import org.apache.lucene.search.Query;
import org.apache.lucene.search.join.JoinUtil;
import org.apache.lucene.search.join.ScoreMode;
import org.apache.lucene.search.similarities.Similarity;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.ParsingException;
import org.elasticsearch.common.io.stream.StreamInput;
@@ -55,6 +56,8 @@ import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

/**
 * A query builder for {@code has_child} query.
 */
@@ -302,6 +305,11 @@ public class HasChildQueryBuilder extends AbstractQueryBuilder<HasChildQueryBuil

    @Override
    protected Query doToQuery(QueryShardContext context) throws IOException {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[joining] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }

        ParentJoinFieldMapper joinFieldMapper = ParentJoinFieldMapper.getMapper(context.getMapperService());
        if (joinFieldMapper == null) {
            if (ignoreUnmapped) {
@@ -21,6 +21,7 @@ package org.elasticsearch.join.query;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.ParsingException;
import org.elasticsearch.common.io.stream.StreamInput;
@@ -45,6 +46,8 @@ import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

/**
 * Builder for the 'has_parent' query.
 */
@@ -158,6 +161,11 @@ public class HasParentQueryBuilder extends AbstractQueryBuilder<HasParentQueryBu

    @Override
    protected Query doToQuery(QueryShardContext context) throws IOException {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[joining] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }

        ParentJoinFieldMapper joinFieldMapper = ParentJoinFieldMapper.getMapper(context.getMapperService());
        if (joinFieldMapper == null) {
            if (ignoreUnmapped) {
@@ -23,6 +23,7 @@ import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.ParsingException;
import org.elasticsearch.common.io.stream.StreamInput;
@@ -38,6 +39,8 @@ import org.elasticsearch.join.mapper.ParentJoinFieldMapper;
import java.io.IOException;
import java.util.Objects;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

public final class ParentIdQueryBuilder extends AbstractQueryBuilder<ParentIdQueryBuilder> {
    public static final String NAME = "parent_id";

@@ -153,6 +156,11 @@ public final class ParentIdQueryBuilder extends AbstractQueryBuilder<ParentIdQue

    @Override
    protected Query doToQuery(QueryShardContext context) throws IOException {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[joining] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }

        ParentJoinFieldMapper joinFieldMapper = ParentJoinFieldMapper.getMapper(context.getMapperService());
        if (joinFieldMapper == null) {
            if (ignoreUnmapped) {
@@ -31,6 +31,7 @@ import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.join.ScoreMode;
import org.apache.lucene.search.similarities.PerFieldSimilarityWrapper;
import org.apache.lucene.search.similarities.Similarity;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.Version;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.Strings;
@@ -69,6 +70,8 @@ import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class HasChildQueryBuilderTests extends AbstractQueryTestCase<HasChildQueryBuilder> {

@@ -371,5 +374,18 @@ public class HasChildQueryBuilderTests extends AbstractQueryTestCase<HasChildQue
        queryBuilder.innerHit(new InnerHitBuilder("some_name"));
        IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
            () -> InnerHitContextBuilder.extractInnerHits(queryBuilder, Collections.singletonMap("some_name", null)));
        assertEquals("[inner_hits] already contains an entry for key [some_name]", e.getMessage());
    }

    public void testDisallowExpensiveQueries() {
        QueryShardContext queryShardContext = mock(QueryShardContext.class);
        when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

        HasChildQueryBuilder queryBuilder =
            hasChildQuery(CHILD_DOC, new TermQueryBuilder("custom_string", "value"), ScoreMode.None);
        ElasticsearchException e = expectThrows(ElasticsearchException.class,
            () -> queryBuilder.toQuery(queryShardContext));
        assertEquals("[joining] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
            e.getMessage());
    }
}
@@ -22,6 +22,7 @@ package org.elasticsearch.join.query;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.Version;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.Strings;
@@ -57,6 +58,8 @@ import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class HasParentQueryBuilderTests extends AbstractQueryTestCase<HasParentQueryBuilder> {
    private static final String TYPE = "_doc";
@@ -265,5 +268,18 @@ public class HasParentQueryBuilderTests extends AbstractQueryTestCase<HasParentQ
        queryBuilder.innerHit(new InnerHitBuilder("some_name"));
        IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
            () -> InnerHitContextBuilder.extractInnerHits(queryBuilder, Collections.singletonMap("some_name", null)));
        assertEquals("[inner_hits] already contains an entry for key [some_name]", e.getMessage());
    }

    public void testDisallowExpensiveQueries() {
        QueryShardContext queryShardContext = mock(QueryShardContext.class);
        when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

        HasParentQueryBuilder queryBuilder = new HasParentQueryBuilder(
            CHILD_DOC, new WrapperQueryBuilder(new MatchAllQueryBuilder().toString()), false);
        ElasticsearchException e = expectThrows(ElasticsearchException.class,
            () -> queryBuilder.toQuery(queryShardContext));
        assertEquals("[joining] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
            e.getMessage());
    }
}
@@ -25,6 +25,7 @@ import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.TermQuery;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.Version;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.Strings;
@@ -48,6 +49,8 @@ import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class ParentIdQueryBuilderTests extends AbstractQueryTestCase<ParentIdQueryBuilder> {

@@ -154,4 +157,14 @@ public class ParentIdQueryBuilderTests extends AbstractQueryTestCase<ParentIdQue
        assertThat(e.getMessage(), containsString("[" + ParentIdQueryBuilder.NAME + "] no relation found for child [unmapped]"));
    }

    public void testDisallowExpensiveQueries() {
        QueryShardContext queryShardContext = mock(QueryShardContext.class);
        when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

        ParentIdQueryBuilder queryBuilder = doCreateTestQueryBuilder();
        ElasticsearchException e = expectThrows(ElasticsearchException.class,
            () -> queryBuilder.toQuery(queryShardContext));
        assertEquals("[joining] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
            e.getMessage());
    }
}
@@ -26,6 +26,18 @@ setup:
  - do:
      indices.refresh: {}

---
teardown:
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: null

---
"Parent/child inner hits":
  - do:
@@ -53,3 +65,24 @@ setup:
  - is_false: hits.hits.0.inner_hits.child.hits.hits.0._nested
  - gte: { hits.hits.0.inner_hits.child.hits.hits.0._seq_no: 0 }
  - gte: { hits.hits.0.inner_hits.child.hits.hits.0._primary_term: 1 }

---
"HasChild disallow expensive queries":
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  ### Update setting to false
  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: "false"
        flat_settings: true

  - match: {transient: {search.allow_expensive_queries: "false"}}

  - do:
      catch: /\[joining\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        body: { "query": { "has_child": { "type": "child", "query": { "match_all": {} }, "inner_hits": {} } } }
@@ -51,6 +51,18 @@ setup:
  - do:
      indices.refresh: {}

---
teardown:
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: null

---
"Test basic":
  - do:
@@ -116,3 +128,29 @@ setup:
  - match: { hits.hits.1._id: "4" }
  - match: { hits.hits.1._source.join_field.name: "child" }
  - match: { hits.hits.1._source.join_field.parent: "1" }

---
"HasChild disallow expensive queries":
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  ### Update setting to false
  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: "false"
        flat_settings: true

  - match: {transient: {search.allow_expensive_queries: "false"}}

  - do:
      catch: /\[joining\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        body:
          sort: [ "id" ]
          query:
            parent_id:
              type: child
              id: 1
@@ -1,3 +1,61 @@
---
setup:
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  - do:
      indices.create:
        index: test
        body:
          mappings:
            properties:
              entity_type: { "type": "keyword" }
              join_field: { "type": "join", "relations": { "question": "answer", "person": "address" } }
          settings:
            number_of_shards: 1

  - do:
      index:
        index: test
        id: 1
        body: { "join_field": { "name": "question" }, "entity_type": "question" }

  - do:
      index:
        index: test
        id: 2
        routing: 1
        body: { "join_field": { "name": "answer", "parent": 1} , "entity_type": "answer" }

  - do:
      index:
        index: test
        id: 3
        body: { "join_field": { "name": "person" }, "entity_type": "person" }

  - do:
      index:
        index: test
        routing: 3
        id: 4
        body: { "join_field": { "name": "address", "parent": 3 }, "entity_type": "address" }

  - do:
      indices.refresh: {}

---
teardown:
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: null

---
"Test two sub-queries with only one having inner_hits":
  - skip:
@@ -66,3 +124,35 @@
  - match: { hits.hits.1._id: "2" }
  - match: { hits.hits.1.inner_hits.question.hits.total.value: 1}
  - match: { hits.hits.1.inner_hits.question.hits.hits.0._id: "1"}

---
"HasParent disallow expensive queries":
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  ### Update setting to false
  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: "false"
        flat_settings: true

  - match: {transient: {search.allow_expensive_queries: "false"}}

  - do:
      catch: /\[joining\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            bool:
              should:
                - term:
                    entity_type: person
                - has_parent:
                    parent_type: question
                    query:
                      match_all: {}
                    inner_hits: {}
@@ -91,6 +91,7 @@ import java.util.Objects;
import java.util.function.Supplier;

import static org.elasticsearch.percolator.PercolatorFieldMapper.parseQuery;
import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

public class PercolateQueryBuilder extends AbstractQueryBuilder<PercolateQueryBuilder> {
    public static final String NAME = "percolate";
@@ -569,6 +570,11 @@ public class PercolateQueryBuilder extends AbstractQueryBuilder<PercolateQueryBu

    @Override
    protected Query doToQuery(QueryShardContext context) throws IOException {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[percolate] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }

        // Call nowInMillis() so that this query becomes un-cacheable since we
        // can't be sure that it doesn't use now or scripts
        context.nowInMillis();
@@ -20,6 +20,7 @@
package org.elasticsearch.percolator;

import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.ResourceNotFoundException;
import org.elasticsearch.action.admin.indices.mapping.put.PutMappingRequest;
import org.elasticsearch.action.get.GetRequest;
@@ -57,6 +58,8 @@ import java.util.function.Supplier;

import static org.elasticsearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO;
import static org.hamcrest.Matchers.equalTo;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class PercolateQueryBuilderTests extends AbstractQueryTestCase<PercolateQueryBuilder> {

@@ -364,4 +367,14 @@ public class PercolateQueryBuilderTests extends AbstractQueryTestCase<PercolateQ
        assertNotEquals(rewrittenQueryBuilder, percolateQueryBuilder);
    }

    public void testDisallowExpensiveQueries() {
        QueryShardContext queryShardContext = mock(QueryShardContext.class);
        when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

        PercolateQueryBuilder queryBuilder = doCreateTestQueryBuilder(true);
        ElasticsearchException e = expectThrows(ElasticsearchException.class,
            () -> queryBuilder.toQuery(queryShardContext));
        assertEquals("[percolate] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
            e.getMessage());
    }
}
@@ -19,11 +19,14 @@
package org.elasticsearch.percolator;

import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest;
import org.elasticsearch.action.search.MultiSearchResponse;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.geo.GeoPoint;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
@@ -37,6 +40,7 @@ import org.elasticsearch.search.fetch.subphase.highlight.HighlightBuilder;
import org.elasticsearch.search.sort.SortOrder;
import org.elasticsearch.test.ESIntegTestCase;

import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;

@@ -886,4 +890,55 @@ public class PercolatorQuerySearchIT extends ESIntegTestCase {
        assertThat(item.getFailureMessage(), containsString("[test/type/6] couldn't be found"));
    }

    public void testDisallowExpensiveQueries() throws IOException {
        try {
            assertAcked(client().admin().indices().prepareCreate("test")
                .addMapping("_doc", "id", "type=keyword", "field1", "type=keyword", "query", "type=percolator")
            );

            client().prepareIndex("test", "_doc").setId("1")
                .setSource(jsonBuilder().startObject()
                    .field("id", "1")
                    .field("query", matchQuery("field1", "value")).endObject())
                .get();
            refresh();

            // Execute with search.allow_expensive_queries = null => default value = true => success
            BytesReference source = BytesReference.bytes(jsonBuilder().startObject().field("field1", "value").endObject());
            SearchResponse response = client().prepareSearch()
                .setQuery(new PercolateQueryBuilder("query", source, XContentType.JSON))
                .get();
            assertHitCount(response, 1);
            assertThat(response.getHits().getAt(0).getId(), equalTo("1"));
            assertThat(response.getHits().getAt(0).getFields().get("_percolator_document_slot").getValue(), equalTo(0));

            // Set search.allow_expensive_queries to "false" => assert failure
            ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
            updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", false));
            assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());

            ElasticsearchException e = expectThrows(ElasticsearchException.class,
                () -> client().prepareSearch()
                    .setQuery(new PercolateQueryBuilder("query", source, XContentType.JSON))
                    .get());
            assertEquals("[percolate] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
                e.getCause().getMessage());

            // Set search.allow_expensive_queries setting to "true" ==> success
            updateSettingsRequest = new ClusterUpdateSettingsRequest();
            updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", true));
            assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());

            response = client().prepareSearch()
                .setQuery(new PercolateQueryBuilder("query", source, XContentType.JSON))
                .get();
            assertHitCount(response, 1);
            assertThat(response.getHits().getAt(0).getId(), equalTo("1"));
            assertThat(response.getHits().getAt(0).getFields().get("_percolator_document_slot").getValue(), equalTo(0));
        } finally {
            ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
            updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", (String) null));
            assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
        }
    }
}
@@ -158,7 +158,7 @@ public class ICUCollationKeywordFieldMapper extends FieldMapper {

    @Override
    public Query fuzzyQuery(Object value, Fuzziness fuzziness, int prefixLength, int maxExpansions,
                            boolean transpositions) {
                            boolean transpositions, QueryShardContext context) {
        throw new UnsupportedOperationException("[fuzzy] queries are not supported on [" + CONTENT_TYPE + "] fields.");
    }
@@ -28,6 +28,7 @@ import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TermRangeQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.index.mapper.ICUCollationKeywordFieldMapper.CollationFieldType;
import org.elasticsearch.index.mapper.MappedFieldType.Relation;
@@ -101,32 +102,36 @@ public class CollationFieldTypeTests extends FieldTypeTestCase {
        MappedFieldType ft = createDefaultFieldType();
        ft.setName("field");
        ft.setIndexOptions(IndexOptions.DOCS);
        expectThrows(UnsupportedOperationException.class,
            () -> ft.regexpQuery("foo.*", 0, 10, null, null));
        UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
            () -> ft.regexpQuery("foo.*", 0, 10, null, randomMockShardContext()));
        assertEquals("[regexp] queries are not supported on [icu_collation_keyword] fields.", e.getMessage());
    }

    public void testFuzzyQuery() {
        MappedFieldType ft = createDefaultFieldType();
        ft.setName("field");
        ft.setIndexOptions(IndexOptions.DOCS);
        expectThrows(UnsupportedOperationException.class,
            () -> ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true));
        UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
            () -> ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true, randomMockShardContext()));
        assertEquals("[fuzzy] queries are not supported on [icu_collation_keyword] fields.", e.getMessage());
    }

    public void testPrefixQuery() {
        MappedFieldType ft = createDefaultFieldType();
        ft.setName("field");
        ft.setIndexOptions(IndexOptions.DOCS);
        expectThrows(UnsupportedOperationException.class,
            () -> ft.prefixQuery("prefix", null, null));
        UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
            () -> ft.prefixQuery("prefix", null, randomMockShardContext()));
        assertEquals("[prefix] queries are not supported on [icu_collation_keyword] fields.", e.getMessage());
    }

    public void testWildcardQuery() {
        MappedFieldType ft = createDefaultFieldType();
        ft.setName("field");
        ft.setIndexOptions(IndexOptions.DOCS);
        expectThrows(UnsupportedOperationException.class,
            () -> ft.wildcardQuery("foo*", null, null));
        UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
            () -> ft.wildcardQuery("foo*", null, randomMockShardContext()));
        assertEquals("[wildcard] queries are not supported on [icu_collation_keyword] fields.", e.getMessage());
    }

    public void testRangeQuery() {
@@ -143,11 +148,16 @@ public class CollationFieldTypeTests extends FieldTypeTestCase {
        TermRangeQuery expected = new TermRangeQuery("field", new BytesRef(aKey.bytes, 0, aKey.size),
            new BytesRef(bKey.bytes, 0, bKey.size), false, false);

        assertEquals(expected, ft.rangeQuery("a", "b", false, false, null, null, null, null));
        assertEquals(expected, ft.rangeQuery("a", "b", false, false, null, null, null, MOCK_QSC));

        ElasticsearchException ee = expectThrows(ElasticsearchException.class,
            () -> ft.rangeQuery("a", "b", true, true, null, null, null, MOCK_QSC_DISALLOW_EXPENSIVE));
        assertEquals("[range] queries on [text] or [keyword] fields cannot be executed when " +
            "'search.allow_expensive_queries' is set to false.", ee.getMessage());

        ft.setIndexOptions(IndexOptions.NONE);
        IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
            () -> ft.rangeQuery("a", "b", false, false, null, null, null, null));
            () -> ft.rangeQuery("a", "b", false, false, null, null, null, MOCK_QSC));
        assertEquals("Cannot search on field [field] since it is not indexed.", e.getMessage());
    }
}
@@ -0,0 +1,330 @@
---
setup:
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  - do:
      indices.create:
        index: test
        body:
          mappings:
            properties:
              text:
                type: text
                analyzer: standard
                fields:
                  raw:
                    type: keyword
              nested1:
                type: nested

  - do:
      bulk:
        refresh: true
        body:
          - '{"index": {"_index": "test", "_id": "1"}}'
          - '{"text" : "Some like it hot, some like it cold", "nested1": [{"foo": "bar1"}]}'
          - '{"index": {"_index": "test", "_id": "2"}}'
          - '{"text" : "Its cold outside, theres no kind of atmosphere", "nested1": [{"foo": "bar2"}]}'
          - '{"index": {"_index": "test", "_id": "3"}}'
          - '{"text" : "Baby its cold there outside", "nested1": [{"foo": "bar3"}]}'
          - '{"index": {"_index": "test", "_id": "4"}}'
          - '{"text" : "Outside it is cold and wet", "nested1": [{"foo": "bar4"}]}'

---
teardown:
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: null

---
"Test disallow expensive queries":
  - skip:
      version: " - 7.6.99"
      reason: "implemented in 7.7.0"

  ### Check for initial setting = null -> false
  - do:
      cluster.get_settings:
        flat_settings: true

  - match: {search.allow_expensive_queries: null}

  ### Prefix
  - do:
      search:
        index: test
        body:
          query:
            prefix:
              text:
                value: out

  - match: { hits.total.value: 3 }

  ### Fuzzy
  - do:
      search:
        index: test
        body:
          query:
            fuzzy:
              text:
                value: outwide

  - match: { hits.total.value: 3 }

  ### Regexp
  - do:
      search:
        index: test
        body:
          query:
            regexp:
              text:
                value: .*ou.*id.*

  - match: { hits.total.value: 3 }

  ### Wildcard
  - do:
      search:
        index: test
        body:
          query:
            wildcard:
              text:
                value: out?ide

  - match: { hits.total.value: 3 }

  ### Range on text
  - do:
      search:
        index: test
        body:
          query:
            range:
              text:
                gte: "theres"

  - match: { hits.total.value: 2 }

  ### Range on keyword
  - do:
      search:
        index: test
        body:
          query:
            range:
              text.raw:
                gte: "Outside it is cold and wet"

  - match: { hits.total.value: 2 }

  ### Nested
  - do:
      search:
        index: test
        body:
          query:
            nested:
              path: "nested1"
              query:
                bool:
                  must: [{"match": {"nested1.foo": "bar2"}}]

  - match: { hits.total.value: 1 }

  ### Update setting to false
  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: "false"
        flat_settings: true

  - match: {transient: {search.allow_expensive_queries: "false"}}

  ### Prefix
  - do:
      catch: /\[prefix\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false. For optimised prefix queries on text fields please enable \[index_prefixes\]./
      search:
        index: test
        body:
          query:
            prefix:
              text:
                value: out

  ### Fuzzy
  - do:
      catch: /\[fuzzy\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            fuzzy:
              text:
                value: outwide

  ### Regexp
  - do:
      catch: /\[regexp\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            regexp:
              text:
                value: .*ou.*id.*

  ### Wildcard
  - do:
      catch: /\[wildcard\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            wildcard:
              text:
                value: out?ide

  ### Range on text
  - do:
      catch: /\[range\] queries on \[text\] or \[keyword\] fields cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            range:
              text:
                gte: "theres"

  ### Range on keyword
  - do:
      catch: /\[range\] queries on \[text\] or \[keyword\] fields cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            range:
              text.raw:
                gte: "Outside it is cold and wet"

  ### Nested
  - do:
      catch: /\[joining\] queries cannot be executed when \'search.allow_expensive_queries\' is set to false./
      search:
        index: test
        body:
          query:
            nested:
              path: "nested1"
              query:
                bool:
                  must: [{"match" : {"nested1.foo" : "bar2"}}]

  ### Revert setting to true
  - do:
      cluster.put_settings:
        body:
          transient:
            search.allow_expensive_queries: "true"
        flat_settings: true

  - match: {transient: {search.allow_expensive_queries: "true"}}

  ### Prefix
  - do:
      search:
        index: test
        body:
          query:
            prefix:
              text:
                value: out

  - match: { hits.total.value: 3 }

  ### Fuzzy
  - do:
      search:
        index: test
        body:
          query:
            fuzzy:
              text:
                value: outwide

  - match: { hits.total.value: 3 }

  ### Regexp
  - do:
      search:
        index: test
        body:
          query:
            regexp:
              text:
                value: .*ou.*id.*

  - match: { hits.total.value: 3 }

  ### Wildcard
  - do:
      search:
        index: test
        body:
          query:
            wildcard:
              text:
                value: out?ide

  - match: { hits.total.value: 3 }

  ### Range on text
  - do:
      search:
        index: test
        body:
          query:
            range:
              text:
                gte: "theres"

  - match: { hits.total.value: 2 }

  ### Range on keyword
  - do:
      search:
        index: test
        body:
          query:
            range:
              text.raw:
                gte: "Outside it is cold and wet"

  - match: { hits.total.value: 2 }

  ### Nested
  - do:
      search:
        index: test
        body:
          query:
            nested:
              path: "nested1"
              query:
                bool:
                  must: [{"match": {"nested1.foo": "bar2"}}]

  - match: { hits.total.value: 1 }
@@ -450,6 +450,7 @@ public final class ClusterSettings extends AbstractScopedSettings {
            SearchService.DEFAULT_KEEPALIVE_SETTING,
            SearchService.KEEPALIVE_INTERVAL_SETTING,
            SearchService.MAX_KEEPALIVE_SETTING,
            SearchService.ALLOW_EXPENSIVE_QUERIES,
            MultiBucketConsumerService.MAX_BUCKET_SETTING,
            SearchService.LOW_LEVEL_CANCELLATION_SETTING,
            SearchService.MAX_OPEN_SCROLL_CONTEXT,
@@ -130,6 +130,7 @@ public final class IndexModule {
    private final List<SearchOperationListener> searchOperationListeners = new ArrayList<>();
    private final List<IndexingOperationListener> indexOperationListeners = new ArrayList<>();
    private final AtomicBoolean frozen = new AtomicBoolean(false);
    private final BooleanSupplier allowExpensiveQueries;

    /**
     * Construct the index module for the index with the specified index settings. The index module contains extension points for plugins
@@ -144,13 +145,15 @@ public final class IndexModule {
            final IndexSettings indexSettings,
            final AnalysisRegistry analysisRegistry,
            final EngineFactory engineFactory,
            final Map<String, IndexStorePlugin.DirectoryFactory> directoryFactories) {
            final Map<String, IndexStorePlugin.DirectoryFactory> directoryFactories,
            final BooleanSupplier allowExpensiveQueries) {
        this.indexSettings = indexSettings;
        this.analysisRegistry = analysisRegistry;
        this.engineFactory = Objects.requireNonNull(engineFactory);
        this.searchOperationListeners.add(new SearchSlowLog(indexSettings));
        this.indexOperationListeners.add(new IndexingSlowLog(indexSettings));
        this.directoryFactories = Collections.unmodifiableMap(directoryFactories);
        this.allowExpensiveQueries = allowExpensiveQueries;
    }

    /**
@@ -424,7 +427,7 @@ public final class IndexModule {
                new SimilarityService(indexSettings, scriptService, similarities), shardStoreDeleter, indexAnalyzers,
                engineFactory, circuitBreakerService, bigArrays, threadPool, scriptService, clusterService, client, queryCache,
                directoryFactory, eventListener, readerWrapperFactory, mapperRegistry, indicesFieldDataCache, searchOperationListeners,
                indexOperationListeners, namedWriteableRegistry, idFieldDataEnabled);
                indexOperationListeners, namedWriteableRegistry, idFieldDataEnabled, allowExpensiveQueries);
            success = true;
            return indexService;
        } finally {
@@ -59,8 +59,8 @@ import org.elasticsearch.index.engine.EngineFactory;
import org.elasticsearch.index.fielddata.IndexFieldDataCache;
import org.elasticsearch.index.fielddata.IndexFieldDataService;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.query.SearchIndexNameMatcher;
import org.elasticsearch.index.query.QueryShardContext;
import org.elasticsearch.index.query.SearchIndexNameMatcher;
import org.elasticsearch.index.seqno.RetentionLeaseSyncer;
import org.elasticsearch.index.shard.IndexEventListener;
import org.elasticsearch.index.shard.IndexShard;
@@ -127,6 +127,7 @@ public class IndexService extends AbstractIndexComponent implements IndicesClust
    private final IndexSettings indexSettings;
    private final List<SearchOperationListener> searchOperationListeners;
    private final List<IndexingOperationListener> indexingOperationListeners;
    private final BooleanSupplier allowExpensiveQueries;
    private volatile AsyncRefreshTask refreshTask;
    private volatile AsyncTranslogFSync fsyncTask;
    private volatile AsyncGlobalCheckpointTask globalCheckpointTask;
@@ -167,8 +168,10 @@ public class IndexService extends AbstractIndexComponent implements IndicesClust
            List<SearchOperationListener> searchOperationListeners,
            List<IndexingOperationListener> indexingOperationListeners,
            NamedWriteableRegistry namedWriteableRegistry,
            BooleanSupplier idFieldDataEnabled) {
            BooleanSupplier idFieldDataEnabled,
            BooleanSupplier allowExpensiveQueries) {
        super(indexSettings);
        this.allowExpensiveQueries = allowExpensiveQueries;
        this.indexSettings = indexSettings;
        this.xContentRegistry = xContentRegistry;
        this.similarityService = similarityService;
@@ -570,7 +573,7 @@ public class IndexService extends AbstractIndexComponent implements IndicesClust
        return new QueryShardContext(
            shardId, indexSettings, bigArrays, indexCache.bitsetFilterCache(), indexFieldData::getForField, mapperService(),
            similarityService(), scriptService, xContentRegistry, namedWriteableRegistry, client, searcher, nowInMillis, clusterAlias,
            indexNameMatcher);
            indexNameMatcher, allowExpensiveQueries);
    }

    /**
@@ -26,6 +26,7 @@ import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.PrefixCodedTerms;
import org.apache.lucene.index.PrefixCodedTerms.TermIterator;
import org.apache.lucene.index.Term;
import org.apache.lucene.queries.intervals.IntervalsSource;
import org.apache.lucene.search.BooleanClause.Occur;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.BoostQuery;
@@ -34,7 +35,6 @@ import org.apache.lucene.search.MultiTermQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.queries.intervals.IntervalsSource;
import org.apache.lucene.search.spans.SpanMultiTermQueryWrapper;
import org.apache.lucene.search.spans.SpanQuery;
import org.apache.lucene.util.BytesRef;
@@ -351,7 +351,8 @@ public abstract class MappedFieldType extends FieldType {
        throw new IllegalArgumentException("Field [" + name + "] of type [" + typeName() + "] does not support range queries");
    }

    public Query fuzzyQuery(Object value, Fuzziness fuzziness, int prefixLength, int maxExpansions, boolean transpositions) {
    public Query fuzzyQuery(Object value, Fuzziness fuzziness, int prefixLength, int maxExpansions, boolean transpositions,
                            QueryShardContext context) {
        throw new IllegalArgumentException("Can only use fuzzy queries on keyword and text fields - not on [" + name
            + "] which is of type [" + typeName() + "]");
    }
@@ -31,6 +31,7 @@ import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermRangeQuery;
import org.apache.lucene.search.WildcardQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.lucene.BytesRefs;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.index.query.QueryShardContext;
@@ -38,6 +39,8 @@ import org.elasticsearch.index.query.support.QueryParsers;

import java.util.List;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

/** Base class for {@link MappedFieldType} implementations that use the same
 * representation for internal index terms as the external representation so
 * that partial matching queries such as prefix, wildcard and fuzzy queries
@@ -62,7 +65,11 @@ public abstract class StringFieldType extends TermBasedFieldType {

    @Override
    public Query fuzzyQuery(Object value, Fuzziness fuzziness, int prefixLength, int maxExpansions,
                            boolean transpositions) {
                            boolean transpositions, QueryShardContext context) {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[fuzzy] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }
        failIfNotIndexed();
        return new FuzzyQuery(new Term(name(), indexedValueForSearch(value)),
            fuzziness.asDistance(BytesRefs.toString(value)), prefixLength, maxExpansions, transpositions);
@@ -70,6 +77,11 @@ public abstract class StringFieldType extends TermBasedFieldType {

    @Override
    public Query prefixQuery(String value, MultiTermQuery.RewriteMethod method, QueryShardContext context) {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[prefix] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false. For optimised prefix queries on text " +
                "fields please enable [index_prefixes].");
        }
        failIfNotIndexed();
        PrefixQuery query = new PrefixQuery(new Term(name(), indexedValueForSearch(value)));
        if (method != null) {
@@ -84,6 +96,11 @@ public abstract class StringFieldType extends TermBasedFieldType {
        if (termQuery instanceof MatchNoDocsQuery || termQuery instanceof MatchAllDocsQuery) {
            return termQuery;
        }

        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[wildcard] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }
        Term term = MappedFieldType.extractTerm(termQuery);

        WildcardQuery query = new WildcardQuery(term);
@@ -94,6 +111,10 @@ public abstract class StringFieldType extends TermBasedFieldType {
    @Override
    public Query regexpQuery(String value, int flags, int maxDeterminizedStates,
                             MultiTermQuery.RewriteMethod method, QueryShardContext context) {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[regexp] queries cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }
        failIfNotIndexed();
        RegexpQuery query = new RegexpQuery(new Term(name(), indexedValueForSearch(value)), flags, maxDeterminizedStates);
        if (method != null) {
@@ -104,6 +125,10 @@ public abstract class StringFieldType extends TermBasedFieldType {

    @Override
    public Query rangeQuery(Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, QueryShardContext context) {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[range] queries on [text] or [keyword] fields cannot be executed when '" +
                ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }
        failIfNotIndexed();
        return new TermRangeQuery(name(),
            lowerTerm == null ? null : indexedValueForSearch(lowerTerm),
@@ -328,7 +328,7 @@ public class FuzzyQueryBuilder extends AbstractQueryBuilder<FuzzyQueryBuilder> i
        String rewrite = this.rewrite;
        MappedFieldType fieldType = context.fieldMapper(fieldName);
        if (fieldType != null) {
            query = fieldType.fuzzyQuery(value, fuzziness, prefixLength, maxExpansions, transpositions);
            query = fieldType.fuzzyQuery(value, fuzziness, prefixLength, maxExpansions, transpositions, context);
        }
        if (query == null) {
            int maxEdits = fuzziness.asDistance(BytesRefs.toString(value));
@@ -26,6 +26,7 @@ import org.apache.lucene.spatial.prefix.PrefixTreeStrategy;
import org.apache.lucene.spatial.prefix.RecursivePrefixTreeStrategy;
import org.apache.lucene.spatial.query.SpatialArgs;
import org.apache.lucene.spatial.query.SpatialOperation;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.geo.ShapeRelation;
import org.elasticsearch.common.geo.SpatialStrategy;
import org.elasticsearch.common.geo.builders.CircleBuilder;
@@ -59,6 +60,8 @@ import org.locationtech.spatial4j.shape.Shape;
import java.util.ArrayList;
import java.util.List;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

public class LegacyGeoShapeQueryProcessor implements AbstractGeometryFieldMapper.QueryProcessor {

    private AbstractGeometryFieldMapper.AbstractGeometryFieldType ft;
@@ -74,6 +77,11 @@ public class LegacyGeoShapeQueryProcessor implements AbstractGeometryFieldMapper

    @Override
    public Query process(Geometry shape, String fieldName, SpatialStrategy strategy, ShapeRelation relation, QueryShardContext context) {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[geo-shape] queries on [PrefixTree geo shapes] cannot be executed when '"
                + ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
        }

        LegacyGeoShapeFieldMapper.GeoShapeFieldType shapeFieldType = (LegacyGeoShapeFieldMapper.GeoShapeFieldType) ft;
        SpatialStrategy spatialStrategy = shapeFieldType.strategy();
        if (strategy != null) {

@ -34,6 +34,7 @@ import org.apache.lucene.search.Weight;
import org.apache.lucene.search.join.BitSetProducer;
import org.apache.lucene.search.join.ParentChildrenBlockJoinQuery;
import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.search.MaxScoreCollector;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.ParsingException;
@ -57,6 +58,7 @@ import java.util.Locale;
import java.util.Map;
import java.util.Objects;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;
import static org.elasticsearch.search.fetch.subphase.InnerHitsContext.intersect;

public class NestedQueryBuilder extends AbstractQueryBuilder<NestedQueryBuilder> {
@ -266,6 +268,11 @@ public class NestedQueryBuilder extends AbstractQueryBuilder<NestedQueryBuilder>

@Override
protected Query doToQuery(QueryShardContext context) throws IOException {
if (context.allowExpensiveQueries() == false) {
throw new ElasticsearchException("[joining] queries cannot be executed when '" +
ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
}

ObjectMapper nestedObjectMapper = context.getObjectMapper(path);
if (nestedObjectMapper == null) {
if (ignoreUnmapped) {

@ -71,6 +71,7 @@ import java.util.Map;
import java.util.Set;
import java.util.function.BiConsumer;
import java.util.function.BiFunction;
import java.util.function.BooleanSupplier;
import java.util.function.LongSupplier;
import java.util.function.Predicate;

@ -100,6 +101,7 @@ public class QueryShardContext extends QueryRewriteContext {

private final Index fullyQualifiedIndex;
private final Predicate<String> indexNameMatcher;
private final BooleanSupplier allowExpensiveQueries;

public void setTypes(String... types) {
this.types = types;
@ -128,18 +130,19 @@ public class QueryShardContext extends QueryRewriteContext {
IndexSearcher searcher,
LongSupplier nowInMillis,
String clusterAlias,
Predicate<String> indexNameMatcher) {
Predicate<String> indexNameMatcher,
BooleanSupplier allowExpensiveQueries) {
this(shardId, indexSettings, bigArrays, bitsetFilterCache, indexFieldDataLookup, mapperService, similarityService,
scriptService, xContentRegistry, namedWriteableRegistry, client, searcher, nowInMillis, indexNameMatcher,
new Index(RemoteClusterAware.buildRemoteIndexName(clusterAlias, indexSettings.getIndex().getName()),
indexSettings.getIndex().getUUID()));
scriptService, xContentRegistry, namedWriteableRegistry, client, searcher, nowInMillis, indexNameMatcher,
new Index(RemoteClusterAware.buildRemoteIndexName(clusterAlias, indexSettings.getIndex().getName()),
indexSettings.getIndex().getUUID()), allowExpensiveQueries);
}

public QueryShardContext(QueryShardContext source) {
this(source.shardId, source.indexSettings, source.bigArrays, source.bitsetFilterCache, source.indexFieldDataService,
source.mapperService, source.similarityService, source.scriptService, source.getXContentRegistry(),
source.getWriteableRegistry(), source.client, source.searcher, source.nowInMillis, source.indexNameMatcher,
source.fullyQualifiedIndex);
source.fullyQualifiedIndex, source.allowExpensiveQueries);
}

private QueryShardContext(int shardId,
@ -156,7 +159,8 @@ public class QueryShardContext extends QueryRewriteContext {
IndexSearcher searcher,
LongSupplier nowInMillis,
Predicate<String> indexNameMatcher,
Index fullyQualifiedIndex) {
Index fullyQualifiedIndex,
BooleanSupplier allowExpensiveQueries) {
super(xContentRegistry, namedWriteableRegistry, client, nowInMillis);
this.shardId = shardId;
this.similarityService = similarityService;
@ -171,6 +175,7 @@ public class QueryShardContext extends QueryRewriteContext {
this.searcher = searcher;
this.indexNameMatcher = indexNameMatcher;
this.fullyQualifiedIndex = fullyQualifiedIndex;
this.allowExpensiveQueries = allowExpensiveQueries;
}

private void reset() {
@ -208,6 +213,10 @@ public class QueryShardContext extends QueryRewriteContext {
return bitsetFilterCache.getBitSetProducer(filter);
}

public boolean allowExpensiveQueries() {
return allowExpensiveQueries.getAsBoolean();
}

public <IFD extends IndexFieldData<?>> IFD getForField(MappedFieldType fieldType) {
return (IFD) indexFieldDataService.apply(fieldType, fullyQualifiedIndex.getName());
}
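
The flag is handed to QueryShardContext as a BooleanSupplier rather than a plain boolean, so every call to allowExpensiveQueries() re-reads the current value and a dynamic update to search.allow_expensive_queries is visible to query parsing without rebuilding existing contexts. A small self-contained illustration of that property (names are illustrative only, not taken from the codebase):

[source,java]
----
import java.util.function.BooleanSupplier;

// Illustrative only: a supplier over a volatile field always observes the latest setting value.
final class AllowExpensiveFlag {

    private volatile boolean allowExpensiveQueries = true; // matches the setting's default

    // The supplier is what long-lived objects (IndexModule, QueryShardContext) capture.
    BooleanSupplier asSupplier() {
        return () -> allowExpensiveQueries;
    }

    // Called by the settings-update consumer when the cluster setting changes.
    void update(boolean newValue) {
        this.allowExpensiveQueries = newValue;
    }

    public static void main(String[] args) {
        AllowExpensiveFlag flag = new AllowExpensiveFlag();
        BooleanSupplier supplier = flag.asSupplier();
        System.out.println(supplier.getAsBoolean()); // true
        flag.update(false);
        System.out.println(supplier.getAsBoolean()); // false, without creating a new supplier
    }
}
----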

@ -29,6 +29,7 @@ import org.apache.lucene.search.ScoreMode;
import org.apache.lucene.search.Scorer;
import org.apache.lucene.search.TwoPhaseIterator;
import org.apache.lucene.search.Weight;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.ParsingException;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
@ -40,6 +41,8 @@ import org.elasticsearch.script.Script;
import java.io.IOException;
import java.util.Objects;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

public class ScriptQueryBuilder extends AbstractQueryBuilder<ScriptQueryBuilder> {
public static final String NAME = "script";

@ -130,6 +133,10 @@ public class ScriptQueryBuilder extends AbstractQueryBuilder<ScriptQueryBuilder>

@Override
protected Query doToQuery(QueryShardContext context) throws IOException {
if (context.allowExpensiveQueries() == false) {
throw new ElasticsearchException("[script] queries cannot be executed when '" +
ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
}
FilterScript.Factory factory = context.compile(script, FilterScript.CONTEXT);
FilterScript.LeafFactory filterScript = factory.newFactory(script.getParams(), context.lookup());
return new ScriptQuery(script, filterScript);

@ -20,6 +20,7 @@
package org.elasticsearch.index.query.functionscore;

import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.Version;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.io.stream.StreamInput;
@ -42,6 +43,7 @@ import java.util.Objects;

import static org.elasticsearch.common.xcontent.ConstructingObjectParser.constructorArg;
import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg;
import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

/**
* A query that computes a document score based on the provided script
@ -170,6 +172,10 @@ public class ScriptScoreQueryBuilder extends AbstractQueryBuilder<ScriptScoreQue

@Override
protected Query doToQuery(QueryShardContext context) throws IOException {
if (context.allowExpensiveQueries() == false) {
throw new ElasticsearchException("[script score] queries cannot be executed when '"
+ ALLOW_EXPENSIVE_QUERIES.getKey() + "' is set to false.");
}
ScoreScript.Factory factory = context.compile(script, ScoreScript.CONTEXT);
ScoreScript.LeafFactory scoreScriptFactory = factory.newFactory(script.getParams(), context.lookup());
Query query = this.query.toQuery(context);

@ -566,7 +566,8 @@ public class MatchQuery {
Supplier<Query> querySupplier;
if (fuzziness != null) {
querySupplier = () -> {
Query query = fieldType.fuzzyQuery(term.text(), fuzziness, fuzzyPrefixLength, maxExpansions, transpositions);
Query query = fieldType.fuzzyQuery(term.text(), fuzziness, fuzzyPrefixLength, maxExpansions,
transpositions, context);
if (query instanceof FuzzyQuery) {
QueryParsers.setRewriteMethod((FuzzyQuery) query, fuzzyRewriteMethod);
}

@ -463,7 +463,7 @@ public class QueryStringQueryParser extends XQueryParser {
Analyzer normalizer = forceAnalyzer == null ? queryBuilder.context.getSearchAnalyzer(currentFieldType) : forceAnalyzer;
BytesRef term = termStr == null ? null : normalizer.normalize(field, termStr);
return currentFieldType.fuzzyQuery(term, Fuzziness.fromEdits((int) minSimilarity),
getFuzzyPrefixLength(), fuzzyMaxExpansions, fuzzyTranspositions);
getFuzzyPrefixLength(), fuzzyMaxExpansions, fuzzyTranspositions, context);
} catch (RuntimeException e) {
if (lenient) {
return newLenientFieldQuery(field, e);

@ -42,10 +42,10 @@ import org.elasticsearch.index.query.QueryShardContext;
import org.elasticsearch.index.query.SimpleQueryStringBuilder;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.List;
import java.util.ArrayList;

import static org.elasticsearch.common.lucene.search.Queries.newUnmappedFieldQuery;

@ -134,7 +134,7 @@ public class SimpleQueryStringQueryParser extends SimpleQueryParser {
try {
final BytesRef term = getAnalyzer(ft).normalize(fieldName, text);
Query query = ft.fuzzyQuery(term, Fuzziness.fromEdits(fuzziness), settings.fuzzyPrefixLength,
settings.fuzzyMaxExpansions, settings.fuzzyTranspositions);
settings.fuzzyMaxExpansions, settings.fuzzyTranspositions, context);
disjuncts.add(wrapWithBoost(query, entry.getValue()));
} catch (RuntimeException e) {
disjuncts.add(rethrowUnlessLenient(e));
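
The three parser changes above exist only because the fuzzyQuery method on the field type now takes the QueryShardContext as an extra argument, which is what lets the field type run the guard. Any other caller would need the same mechanical change; a sketch, assuming fieldType, value, fuzziness, the expansion parameters and context are already in scope:

[source,java]
----
// Before this change (sketch): the shard context was not available to the field type.
// Query query = fieldType.fuzzyQuery(value, fuzziness, prefixLength, maxExpansions, transpositions);

// After this change (sketch): the context is threaded through so the field type can consult
// search.allow_expensive_queries before building the FuzzyQuery.
Query query = fieldType.fuzzyQuery(value, fuzziness, prefixLength, maxExpansions, transpositions, context);
----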

@ -169,6 +169,7 @@ import static org.elasticsearch.common.util.concurrent.EsExecutors.daemonThreadF
import static org.elasticsearch.index.IndexService.IndexCreationContext.CREATE_INDEX;
import static org.elasticsearch.index.IndexService.IndexCreationContext.META_DATA_VERIFICATION;
import static org.elasticsearch.index.query.AbstractQueryBuilder.parseInnerQueryBuilder;
import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

public class IndicesService extends AbstractLifecycleComponent
implements IndicesClusterStateService.AllocatedIndices<IndexShard, IndexService>, IndexService.ShardStoreDeleter {
@ -221,6 +222,7 @@ public class IndicesService extends AbstractLifecycleComponent
final AbstractRefCounted indicesRefCount; // pkg-private for testing
private final CountDownLatch closeLatch = new CountDownLatch(1);
private volatile boolean idFieldDataEnabled;
private volatile boolean allowExpensiveQueries;

@Nullable
private final EsThreadPoolExecutor danglingIndicesThreadPoolExecutor;
@ -317,6 +319,9 @@ public class IndicesService extends AbstractLifecycleComponent
0, TimeUnit.MILLISECONDS,
daemonThreadFactory(nodeName, DANGLING_INDICES_UPDATE_THREAD_NAME),
threadPool.getThreadContext()) : null;

this.allowExpensiveQueries = ALLOW_EXPENSIVE_QUERIES.get(clusterService.getSettings());
clusterService.getClusterSettings().addSettingsUpdateConsumer(ALLOW_EXPENSIVE_QUERIES, this::setAllowExpensiveQueries);
}

private static final String DANGLING_INDICES_UPDATE_THREAD_NAME = "DanglingIndices#updateTask";
@ -592,7 +597,8 @@ public class IndicesService extends AbstractLifecycleComponent
idxSettings.getNumberOfReplicas(),
indexCreationContext);

final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings), directoryFactories);
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings),
directoryFactories, () -> allowExpensiveQueries);
for (IndexingOperationListener operationListener : indexingOperationListeners) {
indexModule.addIndexOperationListener(operationListener);
}
@ -661,7 +667,8 @@ public class IndicesService extends AbstractLifecycleComponent
*/
public synchronized MapperService createIndexMapperService(IndexMetaData indexMetaData) throws IOException {
final IndexSettings idxSettings = new IndexSettings(indexMetaData, this.settings, indexScopedSettings);
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings), directoryFactories);
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings),
directoryFactories, () -> allowExpensiveQueries);
pluginsService.onIndexModule(indexModule);
return indexModule.newIndexMapperService(xContentRegistry, mapperRegistry, scriptService);
}
@ -1574,6 +1581,10 @@ public class IndicesService extends AbstractLifecycleComponent
}
}

private void setAllowExpensiveQueries(Boolean allowExpensiveQueries) {
this.allowExpensiveQueries = allowExpensiveQueries;
}

// visible for testing
public boolean allPendingDanglingIndicesWritten() {
return nodeWriteDanglingIndicesInfo == false ||
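
The constructor wiring above follows the usual dynamic-setting pattern: read the initial value from the node's settings, then register a consumer with the cluster settings so later updates flip the volatile field that the `() -> allowExpensiveQueries` supplier reads. A condensed sketch of that wiring, assuming a ClusterService is available (field and class names are illustrative):

[source,java]
----
import org.elasticsearch.cluster.service.ClusterService;

import static org.elasticsearch.search.SearchService.ALLOW_EXPENSIVE_QUERIES;

// Condensed sketch of the IndicesService wiring shown above; not the actual class.
final class AllowExpensiveQueriesTracker {

    private volatile boolean allowExpensiveQueries;

    AllowExpensiveQueriesTracker(ClusterService clusterService) {
        // Initial value comes from the node settings; the consumer keeps it in sync afterwards.
        this.allowExpensiveQueries = ALLOW_EXPENSIVE_QUERIES.get(clusterService.getSettings());
        clusterService.getClusterSettings()
                .addSettingsUpdateConsumer(ALLOW_EXPENSIVE_QUERIES, value -> this.allowExpensiveQueries = value);
    }

    boolean allowExpensiveQueries() {
        return allowExpensiveQueries;
    }
}
----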

@ -136,6 +136,8 @@ public class SearchService extends AbstractLifecycleComponent implements IndexEv
Setting.positiveTimeSetting("search.max_keep_alive", timeValueHours(24), Property.NodeScope, Property.Dynamic);
public static final Setting<TimeValue> KEEPALIVE_INTERVAL_SETTING =
Setting.positiveTimeSetting("search.keep_alive_interval", timeValueMinutes(1), Property.NodeScope);
public static final Setting<Boolean> ALLOW_EXPENSIVE_QUERIES =
Setting.boolSetting("search.allow_expensive_queries", true, Property.NodeScope, Property.Dynamic);

/**
* Enables low-level, frequent search cancellation checks. Enabling low-level checks will make long running searches to react

@ -133,7 +133,7 @@ public class MetaDataCreateIndexServiceTests extends ESTestCase {
queryShardContext = new QueryShardContext(0,
new IndexSettings(IndexMetaData.builder("test").settings(indexSettings).build(), indexSettings),
BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null, xContentRegistry(), writableRegistry(),
null, null, () -> randomNonNegativeLong(), null, null);
null, null, () -> randomNonNegativeLong(), null, null, () -> true);
}

private ClusterState createClusterState(String name, int numShards, int numReplicas, Settings settings) {

@ -166,7 +166,8 @@ public class IndexModuleTests extends ESTestCase {

public void testWrapperIsBound() throws IOException {
final MockEngineFactory engineFactory = new MockEngineFactory(AssertingDirectoryReader.class);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, engineFactory, Collections.emptyMap());
IndexModule module = new IndexModule(
indexSettings, emptyAnalysisRegistry, engineFactory, Collections.emptyMap(), () -> true);
module.setReaderWrapper(s -> new Wrapper());

IndexService indexService = newIndexService(module);
@ -186,7 +187,8 @@ public class IndexModuleTests extends ESTestCase {
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
final Map<String, IndexStorePlugin.DirectoryFactory> indexStoreFactories = singletonMap(
"foo_store", new FooFunction());
final IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), indexStoreFactories);
final IndexModule module = new IndexModule(
indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), indexStoreFactories, () -> true);

final IndexService indexService = newIndexService(module);
assertThat(indexService.getDirectoryFactory(), instanceOf(FooFunction.class));
@ -203,7 +205,7 @@ public class IndexModuleTests extends ESTestCase {
}
};
IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
module.addIndexEventListener(eventListener);
IndexService indexService = newIndexService(module);
IndexSettings x = indexService.getIndexSettings();
@ -218,7 +220,7 @@ public class IndexModuleTests extends ESTestCase {
public void testListener() throws IOException {
Setting<Boolean> booleanSetting = Setting.boolSetting("index.foo.bar", false, Property.Dynamic, Property.IndexScope);
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings, booleanSetting);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
Setting<Boolean> booleanSetting2 = Setting.boolSetting("index.foo.bar.baz", false, Property.Dynamic, Property.IndexScope);
AtomicBoolean atomicBoolean = new AtomicBoolean(false);
module.addSettingsUpdateConsumer(booleanSetting, atomicBoolean::set);
@ -238,7 +240,7 @@ public class IndexModuleTests extends ESTestCase {

public void testAddIndexOperationListener() throws IOException {
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
AtomicBoolean executed = new AtomicBoolean(false);
IndexingOperationListener listener = new IndexingOperationListener() {
@Override
@ -269,7 +271,7 @@ public class IndexModuleTests extends ESTestCase {

public void testAddSearchOperationListener() throws IOException {
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
AtomicBoolean executed = new AtomicBoolean(false);
SearchOperationListener listener = new SearchOperationListener() {

@ -304,7 +306,7 @@ public class IndexModuleTests extends ESTestCase {
.build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module =
new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
createIndexModule(indexSettings, emptyAnalysisRegistry);
module.addSimilarity("test_similarity",
(providerSettings, indexCreatedVersion, scriptService) -> new TestSimilarity(providerSettings.get("key")));

@ -320,9 +322,11 @@ public class IndexModuleTests extends ESTestCase {
indexService.close("simon says", false);
}



public void testFrozen() {
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
module.freeze();
String msg = "Can't modify IndexModule once the index service has been created";
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.addSearchOperationListener(null)).getMessage());
@ -333,7 +337,7 @@ public class IndexModuleTests extends ESTestCase {
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.forceQueryCacheProvider(null)).getMessage());
}

public void testSetupUnknownSimilarity() throws IOException {
public void testSetupUnknownSimilarity() {
Settings settings = Settings.builder()
.put("index.similarity.my_similarity.type", "test_similarity")
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT)
@ -341,19 +345,19 @@ public class IndexModuleTests extends ESTestCase {
.build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module =
new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
createIndexModule(indexSettings, emptyAnalysisRegistry);
Exception ex = expectThrows(IllegalArgumentException.class, () -> newIndexService(module));
assertEquals("Unknown Similarity type [test_similarity] for [my_similarity]", ex.getMessage());
}

public void testSetupWithoutType() throws IOException {
public void testSetupWithoutType() {
Settings settings = Settings.builder()
.put("index.similarity.my_similarity.foo", "bar")
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT)
.build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
Exception ex = expectThrows(IllegalArgumentException.class, () -> newIndexService(module));
assertEquals("Similarity [my_similarity] must have an associated type", ex.getMessage());
}
@ -363,7 +367,7 @@ public class IndexModuleTests extends ESTestCase {
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
final Set<CustomQueryCache> liveQueryCaches = new HashSet<>();
module.forceQueryCacheProvider((a, b) -> {
final CustomQueryCache customQueryCache = new CustomQueryCache(liveQueryCaches);
@ -384,7 +388,7 @@ public class IndexModuleTests extends ESTestCase {
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
IndexService indexService = newIndexService(module);
assertTrue(indexService.cache().query() instanceof IndexQueryCache);
indexService.close("simon says", false);
@ -396,7 +400,7 @@ public class IndexModuleTests extends ESTestCase {
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
module.forceQueryCacheProvider((a, b) -> new CustomQueryCache(null));
IndexService indexService = newIndexService(module);
assertTrue(indexService.cache().query() instanceof DisabledQueryCache);
@ -408,7 +412,7 @@ public class IndexModuleTests extends ESTestCase {
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, emptyAnalysisRegistry);
final Set<CustomQueryCache> liveQueryCaches = new HashSet<>();
module.forceQueryCacheProvider((a, b) -> {
final CustomQueryCache customQueryCache = new CustomQueryCache(liveQueryCaches);
@ -458,7 +462,7 @@ public class IndexModuleTests extends ESTestCase {
};
final AnalysisRegistry analysisRegistry = new AnalysisRegistry(environment, emptyMap(), emptyMap(), emptyMap(),
singletonMap("test", analysisProvider), emptyMap(), emptyMap(), emptyMap(), emptyMap(), emptyMap());
IndexModule module = new IndexModule(indexSettings, analysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule module = createIndexModule(indexSettings, analysisRegistry);
threadPool.shutdown(); // causes index service creation to fail
expectThrows(EsRejectedExecutionException.class, () -> newIndexService(module));
assertThat(openAnalyzers, empty());
@ -475,11 +479,16 @@ public class IndexModuleTests extends ESTestCase {
.build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(new Index("foo", "_na_"), settings, nodeSettings);
final IndexModule module =
new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
createIndexModule(indexSettings, emptyAnalysisRegistry);
final IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> newIndexService(module));
assertThat(e, hasToString(containsString("store type [" + storeType + "] is not allowed")));
}

private static IndexModule createIndexModule(IndexSettings indexSettings, AnalysisRegistry emptyAnalysisRegistry) {
return new IndexModule(
indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap(), () -> true);
}

class CustomQueryCache implements QueryCache {

private final Set<CustomQueryCache> liveQueryCaches;
@ -545,5 +554,4 @@ public class IndexModuleTests extends ESTestCase {
return null;
}
}

}

@ -179,7 +179,7 @@ public class DateFieldTypeTests extends FieldTypeTestCase {
QueryShardContext context = new QueryShardContext(0,
new IndexSettings(IndexMetaData.builder("foo").settings(indexSettings).build(), indexSettings),
BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null,
xContentRegistry(), writableRegistry(), null, null, () -> nowInMillis, null, null);
xContentRegistry(), writableRegistry(), null, null, () -> nowInMillis, null, null, () -> true);
MappedFieldType ft = createDefaultFieldType();
ft.setName("field");
String date = "2015-10-12T14:10:55";
@ -202,7 +202,7 @@ public class DateFieldTypeTests extends FieldTypeTestCase {
QueryShardContext context = new QueryShardContext(0,
new IndexSettings(IndexMetaData.builder("foo").settings(indexSettings).build(), indexSettings),
BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null, xContentRegistry(), writableRegistry(),
null, null, () -> nowInMillis, null, null);
null, null, () -> nowInMillis, null, null, () -> true);
MappedFieldType ft = createDefaultFieldType();
ft.setName("field");
String date1 = "2015-10-12T14:10:55";

@ -68,7 +68,7 @@ public class FieldNamesFieldTypeTests extends FieldTypeTestCase {

QueryShardContext queryShardContext = new QueryShardContext(0,
indexSettings, BigArrays.NON_RECYCLING_INSTANCE, null, null, mapperService,
null, null, null, null, null, null, () -> 0L, null, null);
null, null, null, null, null, null, () -> 0L, null, null, () -> true);
fieldNamesFieldType.setEnabled(true);
Query termQuery = fieldNamesFieldType.termQuery("field_name", queryShardContext);
assertEquals(new TermQuery(new Term(FieldNamesFieldMapper.CONTENT_TYPE, "field_name")), termQuery);

@ -26,6 +26,7 @@ import org.apache.lucene.search.Query;
import org.apache.lucene.search.RegexpQuery;
import org.apache.lucene.search.WildcardQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;

public class IgnoredFieldTypeTests extends FieldTypeTestCase {

@ -40,7 +41,12 @@ public class IgnoredFieldTypeTests extends FieldTypeTestCase {
ft.setIndexOptions(IndexOptions.DOCS);

Query expected = new PrefixQuery(new Term("field", new BytesRef("foo*")));
assertEquals(expected, ft.prefixQuery("foo*", null, null));
assertEquals(expected, ft.prefixQuery("foo*", null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.prefixQuery("foo*", null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[prefix] queries cannot be executed when 'search.allow_expensive_queries' is set to false. " +
"For optimised prefix queries on text fields please enable [index_prefixes].", ee.getMessage());
}

public void testRegexpQuery() {
@ -49,7 +55,12 @@ public class IgnoredFieldTypeTests extends FieldTypeTestCase {
ft.setIndexOptions(IndexOptions.DOCS);

Query expected = new RegexpQuery(new Term("field", new BytesRef("foo?")));
assertEquals(expected, ft.regexpQuery("foo?", 0, 10, null, null));
assertEquals(expected, ft.regexpQuery("foo?", 0, 10, null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.regexpQuery("foo?", randomInt(10), randomInt(10) + 1, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[regexp] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testWildcardQuery() {
@ -58,6 +69,11 @@ public class IgnoredFieldTypeTests extends FieldTypeTestCase {
ft.setIndexOptions(IndexOptions.DOCS);

Query expected = new WildcardQuery(new Term("field", new BytesRef("foo*")));
assertEquals(expected, ft.wildcardQuery("foo*", null, null));
assertEquals(expected, ft.wildcardQuery("foo*", null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.wildcardQuery("valu*", null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[wildcard] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}
}
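
These field type tests lean on two shared constants, MOCK_QSC and MOCK_QSC_DISALLOW_EXPENSIVE, whose definitions are outside this excerpt; presumably they are Mockito-backed QueryShardContext stubs in FieldTypeTestCase along these lines (an assumption, mirroring the mock/when pattern used elsewhere in this commit):

[source,java]
----
import org.elasticsearch.index.query.QueryShardContext;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

// Assumed shape of the shared test helpers; their real definitions are not part of this excerpt.
final class MockShardContexts {

    static QueryShardContext mockContext(boolean allowExpensive) {
        QueryShardContext context = mock(QueryShardContext.class);
        when(context.allowExpensiveQueries()).thenReturn(allowExpensive);
        return context;
    }

    static final QueryShardContext MOCK_QSC = mockContext(true);
    static final QueryShardContext MOCK_QSC_DISALLOW_EXPENSIVE = mockContext(false);
}
----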

@ -78,6 +78,6 @@ public class IndexFieldTypeTests extends FieldTypeTestCase {

Predicate<String> indexNameMatcher = pattern -> Regex.simpleMatch(pattern, "index");
return new QueryShardContext(0, indexSettings, null, null, null, null, null, null, xContentRegistry(), writableRegistry(),
null, null, System::currentTimeMillis, null, indexNameMatcher);
null, null, System::currentTimeMillis, null, indexNameMatcher, () -> true);
}
}

@ -33,7 +33,10 @@ import org.apache.lucene.search.NormsFieldExistsQuery;
import org.apache.lucene.search.RegexpQuery;
import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TermRangeQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.lucene.BytesRefs;
import org.elasticsearch.common.lucene.Lucene;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.index.analysis.AnalyzerScope;
@ -150,17 +153,35 @@ public class KeywordFieldTypeTests extends FieldTypeTestCase {
assertEquals(new TermQuery(new Term(FieldNamesFieldMapper.NAME, "field")), ft.existsQuery(null));
}

public void testRangeQuery() {
MappedFieldType ft = createDefaultFieldType();
ft.setName("field");
ft.setIndexOptions(IndexOptions.DOCS);
assertEquals(new TermRangeQuery("field", BytesRefs.toBytesRef("foo"), BytesRefs.toBytesRef("bar"), true, false),
ft.rangeQuery("foo", "bar", true, false, null, null, null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.rangeQuery("foo", "bar", true, false, null, null, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[range] queries on [text] or [keyword] fields cannot be executed when " +
"'search.allow_expensive_queries' is set to false.", ee.getMessage());
}

public void testRegexpQuery() {
MappedFieldType ft = createDefaultFieldType();
ft.setName("field");
ft.setIndexOptions(IndexOptions.DOCS);
assertEquals(new RegexpQuery(new Term("field","foo.*")),
ft.regexpQuery("foo.*", 0, 10, null, null));
ft.regexpQuery("foo.*", 0, 10, null, MOCK_QSC));

ft.setIndexOptions(IndexOptions.NONE);
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> ft.regexpQuery("foo.*", 0, 10, null, null));
() -> ft.regexpQuery("foo.*", 0, 10, null, MOCK_QSC));
assertEquals("Cannot search on field [field] since it is not indexed.", e.getMessage());

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.regexpQuery("foo.*", randomInt(10), randomInt(10) + 1, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[regexp] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testFuzzyQuery() {
@ -168,12 +189,18 @@ public class KeywordFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");
ft.setIndexOptions(IndexOptions.DOCS);
assertEquals(new FuzzyQuery(new Term("field","foo"), 2, 1, 50, true),
ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true));
ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true, MOCK_QSC));

ft.setIndexOptions(IndexOptions.NONE);
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true));
() -> ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true, MOCK_QSC));
assertEquals("Cannot search on field [field] since it is not indexed.", e.getMessage());

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.fuzzyQuery("foo", Fuzziness.AUTO, randomInt(10) + 1, randomInt(10) + 1,
randomBoolean(), MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[fuzzy] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testNormalizeQueries() {

@ -22,15 +22,20 @@ import org.apache.lucene.spatial.prefix.PrefixTreeStrategy;
import org.apache.lucene.spatial.prefix.RecursivePrefixTreeStrategy;
import org.apache.lucene.spatial.prefix.tree.GeohashPrefixTree;
import org.apache.lucene.spatial.prefix.tree.QuadPrefixTree;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.ElasticsearchParseException;
import org.elasticsearch.common.Explicit;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.compress.CompressedXContent;
import org.elasticsearch.common.geo.GeoUtils;
import org.elasticsearch.common.geo.ShapeRelation;
import org.elasticsearch.common.geo.SpatialStrategy;
import org.elasticsearch.common.geo.builders.ShapeBuilder;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.geometry.Point;
import org.elasticsearch.index.query.QueryShardContext;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.test.ESSingleNodeTestCase;
import org.elasticsearch.test.InternalSettingsPlugin;
@ -44,6 +49,8 @@ import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.instanceOf;
import static org.hamcrest.Matchers.not;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class LegacyGeoShapeFieldMapperTests extends ESSingleNodeTestCase {

@ -695,6 +702,31 @@ public class LegacyGeoShapeFieldMapperTests extends ESSingleNodeTestCase {
assertFieldWarnings("tree", "precision", "strategy", "points_only");
}

public void testDisallowExpensiveQueries() throws IOException {
QueryShardContext queryShardContext = mock(QueryShardContext.class);
when(queryShardContext.allowExpensiveQueries()).thenReturn(false);
String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("location")
.field("type", "geo_shape")
.field("tree", "quadtree")
.endObject().endObject()
.endObject().endObject());

DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser()
.parse("type1", new CompressedXContent(mapping));
Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(LegacyGeoShapeFieldMapper.class));
LegacyGeoShapeFieldMapper geoShapeFieldMapper = (LegacyGeoShapeFieldMapper) fieldMapper;


ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> geoShapeFieldMapper.fieldType().geometryQueryBuilder().process(
new Point(-10, 10), "location", SpatialStrategy.TERM, ShapeRelation.INTERSECTS, queryShardContext));
assertEquals("[geo-shape] queries on [PrefixTree geo shapes] cannot be executed when " +
"'search.allow_expensive_queries' is set to false.", e.getMessage());
assertFieldWarnings("tree");
}

public String toXContentString(LegacyGeoShapeFieldMapper mapper, boolean includeDefaults) throws IOException {
XContentBuilder builder = XContentFactory.jsonBuilder().startObject();
ToXContent.Params params;
@ -710,5 +742,4 @@ public class LegacyGeoShapeFieldMapperTests extends ESSingleNodeTestCase {
public String toXContentString(LegacyGeoShapeFieldMapper mapper) throws IOException {
return toXContentString(mapper, true);
}

}

@ -230,7 +230,7 @@ public class RangeFieldTypeTests extends FieldTypeTestCase {
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
IndexSettings idxSettings = IndexSettingsModule.newIndexSettings(randomAlphaOfLengthBetween(1, 10), indexSettings);
return new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null,
xContentRegistry(), writableRegistry(), null, null, () -> nowInMillis, null, null);
xContentRegistry(), writableRegistry(), null, null, () -> nowInMillis, null, null, () -> true);
}

public void testDateRangeQueryUsingMappingFormat() {

@ -25,8 +25,10 @@ import org.apache.lucene.search.Query;
import org.apache.lucene.search.RegexpQuery;
import org.apache.lucene.search.WildcardQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;

public class RoutingFieldTypeTests extends FieldTypeTestCase {

@Override
protected MappedFieldType createDefaultFieldType() {
return new RoutingFieldMapper.RoutingFieldType();
@ -38,7 +40,12 @@ public class RoutingFieldTypeTests extends FieldTypeTestCase {
ft.setIndexOptions(IndexOptions.DOCS);

Query expected = new PrefixQuery(new Term("field", new BytesRef("foo*")));
assertEquals(expected, ft.prefixQuery("foo*", null, null));
assertEquals(expected, ft.prefixQuery("foo*", null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.prefixQuery("foo*", null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[prefix] queries cannot be executed when 'search.allow_expensive_queries' is set to false. " +
"For optimised prefix queries on text fields please enable [index_prefixes].", ee.getMessage());
}

public void testRegexpQuery() {
@ -47,7 +54,12 @@ public class RoutingFieldTypeTests extends FieldTypeTestCase {
ft.setIndexOptions(IndexOptions.DOCS);

Query expected = new RegexpQuery(new Term("field", new BytesRef("foo?")));
assertEquals(expected, ft.regexpQuery("foo?", 0, 10, null, null));
assertEquals(expected, ft.regexpQuery("foo?", 0, 10, null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.regexpQuery("foo?", randomInt(10), randomInt(10) + 1, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[regexp] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testWildcardQuery() {
@ -56,6 +68,11 @@ public class RoutingFieldTypeTests extends FieldTypeTestCase {
ft.setIndexOptions(IndexOptions.DOCS);

Query expected = new WildcardQuery(new Term("field", new BytesRef("foo*")));
assertEquals(expected, ft.wildcardQuery("foo*", null, null));
assertEquals(expected, ft.wildcardQuery("foo*", null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.wildcardQuery("valu*", null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[wildcard] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}
}

@ -30,10 +30,13 @@ import org.apache.lucene.search.Query;
import org.apache.lucene.search.RegexpQuery;
import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TermRangeQuery;
import org.apache.lucene.util.BytesRef;
import org.apache.lucene.util.automaton.Automata;
import org.apache.lucene.util.automaton.Automaton;
import org.apache.lucene.util.automaton.Operations;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.lucene.BytesRefs;
import org.elasticsearch.common.unit.Fuzziness;
import org.junit.Before;

@ -45,6 +48,7 @@ import static org.apache.lucene.search.MultiTermQuery.CONSTANT_SCORE_REWRITE;
import static org.hamcrest.Matchers.equalTo;

public class TextFieldTypeTests extends FieldTypeTestCase {

@Override
protected MappedFieldType createDefaultFieldType() {
return new TextFieldMapper.TextFieldType();
@ -130,17 +134,35 @@ public class TextFieldTypeTests extends FieldTypeTestCase {
assertEquals("Cannot search on field [field] since it is not indexed.", e.getMessage());
}

public void testRangeQuery() {
MappedFieldType ft = createDefaultFieldType();
ft.setName("field");
ft.setIndexOptions(IndexOptions.DOCS);
assertEquals(new TermRangeQuery("field", BytesRefs.toBytesRef("foo"), BytesRefs.toBytesRef("bar"), true, false),
ft.rangeQuery("foo", "bar", true, false, null, null, null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.rangeQuery("foo", "bar", true, false, null, null, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[range] queries on [text] or [keyword] fields cannot be executed when " +
"'search.allow_expensive_queries' is set to false.", ee.getMessage());
}

public void testRegexpQuery() {
MappedFieldType ft = createDefaultFieldType();
ft.setName("field");
ft.setIndexOptions(IndexOptions.DOCS);
assertEquals(new RegexpQuery(new Term("field","foo.*")),
ft.regexpQuery("foo.*", 0, 10, null, null));
ft.regexpQuery("foo.*", 0, 10, null, MOCK_QSC));

ft.setIndexOptions(IndexOptions.NONE);
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> ft.regexpQuery("foo.*", 0, 10, null, null));
() -> ft.regexpQuery("foo.*", 0, 10, null, MOCK_QSC));
assertEquals("Cannot search on field [field] since it is not indexed.", e.getMessage());

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.regexpQuery("foo.*", randomInt(10), randomInt(10) + 1, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[regexp] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testFuzzyQuery() {
@ -148,12 +170,18 @@ public class TextFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");
ft.setIndexOptions(IndexOptions.DOCS);
assertEquals(new FuzzyQuery(new Term("field","foo"), 2, 1, 50, true),
ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true));
ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true, MOCK_QSC));

ft.setIndexOptions(IndexOptions.NONE);
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true));
() -> ft.fuzzyQuery("foo", Fuzziness.fromEdits(2), 1, 50, true, MOCK_QSC));
assertEquals("Cannot search on field [field] since it is not indexed.", e.getMessage());

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.fuzzyQuery("foo", Fuzziness.AUTO, randomInt(10) + 1, randomInt(10) + 1,
randomBoolean(), MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[fuzzy] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testIndexPrefixes() {
@ -161,13 +189,18 @@ public class TextFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");
ft.setPrefixFieldType(new TextFieldMapper.PrefixFieldType("field", "field._index_prefix", 2, 10));

Query q = ft.prefixQuery("goin", CONSTANT_SCORE_REWRITE, null);
Query q = ft.prefixQuery("goin", CONSTANT_SCORE_REWRITE, randomMockShardContext());
assertEquals(new ConstantScoreQuery(new TermQuery(new Term("field._index_prefix", "goin"))), q);

q = ft.prefixQuery("internationalisatio", CONSTANT_SCORE_REWRITE, null);
q = ft.prefixQuery("internationalisatio", CONSTANT_SCORE_REWRITE, MOCK_QSC);
assertEquals(new PrefixQuery(new Term("field", "internationalisatio")), q);

q = ft.prefixQuery("g", CONSTANT_SCORE_REWRITE, null);
ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.prefixQuery("internationalisatio", null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[prefix] queries cannot be executed when 'search.allow_expensive_queries' is set to false. " +
"For optimised prefix queries on text fields please enable [index_prefixes].", ee.getMessage());

q = ft.prefixQuery("g", CONSTANT_SCORE_REWRITE, randomMockShardContext());
Automaton automaton
= Operations.concatenate(Arrays.asList(Automata.makeChar('g'), Automata.makeAnyChar()));

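
One nuance worth noting from testIndexPrefixes above: when index_prefixes is configured, prefixes within the configured length range are answered from the dedicated field._index_prefix subfield and are not treated as expensive, while terms that fall back to a plain PrefixQuery are rejected once the setting is false. A sketch of the two outcomes, reusing the field setup and the assumed mock contexts from the tests:

[source,java]
----
// Within the 2-10 prefix length range: served by the "field._index_prefix" subfield,
// so it succeeds even when expensive queries are disallowed.
Query cheap = ft.prefixQuery("goin", CONSTANT_SCORE_REWRITE, MOCK_QSC_DISALLOW_EXPENSIVE);

// Longer than the configured prefix length: falls back to a plain PrefixQuery on "field"
// and is rejected while 'search.allow_expensive_queries' is false.
expectThrows(ElasticsearchException.class,
        () -> ft.prefixQuery("internationalisatio", CONSTANT_SCORE_REWRITE, MOCK_QSC_DISALLOW_EXPENSIVE));
----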

@ -425,7 +425,7 @@ public class IntervalQueryBuilderTests extends AbstractQueryTestCase<IntervalQue
QueryShardContext baseContext = createShardContext();
QueryShardContext context = new QueryShardContext(baseContext.getShardId(), baseContext.getIndexSettings(),
BigArrays.NON_RECYCLING_INSTANCE, null, null, baseContext.getMapperService(),
null, scriptService, null, null, null, null, null, null, null);
null, scriptService, null, null, null, null, null, null, null, () -> true);

String json = "{ \"intervals\" : { \"" + STRING_FIELD_NAME + "\": { " +
"\"match\" : { " +

@ -23,6 +23,7 @@ import com.carrotsearch.randomizedtesting.generators.RandomPicks;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.Version;
import org.elasticsearch.action.admin.indices.mapping.put.PutMappingRequest;
import org.elasticsearch.common.Strings;
@ -355,5 +356,17 @@ public class NestedQueryBuilderTests extends AbstractQueryTestCase<NestedQueryBu
queryBuilder.innerHit(new InnerHitBuilder("some_name"));
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> InnerHitContextBuilder.extractInnerHits(queryBuilder,Collections.singletonMap("some_name", null)));
assertEquals("[inner_hits] already contains an entry for key [some_name]", e.getMessage());
}

public void testDisallowExpensiveQueries() {
QueryShardContext queryShardContext = mock(QueryShardContext.class);
when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

NestedQueryBuilder queryBuilder = new NestedQueryBuilder("path", new MatchAllQueryBuilder(), ScoreMode.None);
ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> queryBuilder.toQuery(queryShardContext));
assertEquals("[joining] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
e.getMessage());
}
}

@ -153,6 +153,6 @@ public class QueryShardContextTests extends ESTestCase {
(mappedFieldType, idxName) ->
mappedFieldType.fielddataBuilder(idxName).build(indexSettings, mappedFieldType, null, null, null),
mapperService, null, null, NamedXContentRegistry.EMPTY, new NamedWriteableRegistry(Collections.emptyList()),
null, null, () -> nowInMillis, clusterAlias, null);
null, null, () -> nowInMillis, clusterAlias, null, () -> true);
}
}

@ -41,7 +41,7 @@ public class RangeQueryRewriteTests extends ESSingleNodeTestCase {
IndexReader reader = new MultiReader();
QueryRewriteContext context = new QueryShardContext(0, indexService.getIndexSettings(), BigArrays.NON_RECYCLING_INSTANCE,
null, null, indexService.mapperService(), null, null, xContentRegistry(), writableRegistry(),
null, new IndexSearcher(reader), null, null, null);
null, new IndexSearcher(reader), null, null, null, () -> true);
RangeQueryBuilder range = new RangeQueryBuilder("foo");
assertEquals(Relation.DISJOINT, range.getRelation(context));
}
@ -58,7 +58,8 @@ public class RangeQueryRewriteTests extends ESSingleNodeTestCase {
indexService.mapperService().merge("type",
new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
QueryRewriteContext context = new QueryShardContext(0, indexService.getIndexSettings(), null, null, null,
indexService.mapperService(), null, null, xContentRegistry(), writableRegistry(), null, null, null, null, null);
indexService.mapperService(), null, null, xContentRegistry(), writableRegistry(),
null, null, null, null, null, () -> true);
RangeQueryBuilder range = new RangeQueryBuilder("foo");
// can't make assumptions on a missing reader, so it must return INTERSECT
assertEquals(Relation.INTERSECTS, range.getRelation(context));
@ -78,7 +79,7 @@ public class RangeQueryRewriteTests extends ESSingleNodeTestCase {
IndexReader reader = new MultiReader();
QueryRewriteContext context = new QueryShardContext(0, indexService.getIndexSettings(), BigArrays.NON_RECYCLING_INSTANCE,
null, null, indexService.mapperService(), null, null, xContentRegistry(), writableRegistry(),
null, new IndexSearcher(reader), null, null, null);
null, new IndexSearcher(reader), null, null, null, () -> true);
RangeQueryBuilder range = new RangeQueryBuilder("foo");
// no values -> DISJOINT
assertEquals(Relation.DISJOINT, range.getRelation(context));

@ -20,6 +20,7 @@
package org.elasticsearch.index.query;

import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.ParsingException;
import org.elasticsearch.script.MockScriptEngine;
import org.elasticsearch.script.Script;
@ -33,6 +34,8 @@ import java.util.Set;

import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.instanceOf;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class ScriptQueryBuilderTests extends AbstractQueryTestCase<ScriptQueryBuilder> {
@Override
@ -53,7 +56,9 @@ public class ScriptQueryBuilderTests extends AbstractQueryTestCase<ScriptQueryBu
}

public void testIllegalConstructorArg() {
expectThrows(IllegalArgumentException.class, () -> new ScriptQueryBuilder((Script) null));
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> new ScriptQueryBuilder((Script) null));
assertEquals("script cannot be null", e.getMessage());
}

public void testFromJsonVerbose() throws IOException {
@ -126,4 +131,15 @@ public class ScriptQueryBuilderTests extends AbstractQueryTestCase<ScriptQueryBu
assertNotNull(rewriteQuery.toQuery(context));
assertFalse("query should not be cacheable: " + queryBuilder.toString(), context.isCacheable());
}

public void testDisallowExpensiveQueries() {
QueryShardContext queryShardContext = mock(QueryShardContext.class);
when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

ScriptQueryBuilder queryBuilder = doCreateTestQueryBuilder();
ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> queryBuilder.toQuery(queryShardContext));
assertEquals("[script] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
e.getMessage());
}
}

@ -20,6 +20,7 @@
package org.elasticsearch.index.query;

import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.lucene.search.function.ScriptScoreQuery;
import org.elasticsearch.index.query.functionscore.ScriptScoreQueryBuilder;
import org.elasticsearch.script.MockScriptEngine;
@ -32,6 +33,8 @@ import java.util.Collections;

import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.hamcrest.CoreMatchers.instanceOf;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class ScriptScoreQueryBuilderTests extends AbstractQueryTestCase<ScriptScoreQueryBuilder> {

@ -71,15 +74,17 @@ public class ScriptScoreQueryBuilderTests extends AbstractQueryTestCase<ScriptSc
String scriptStr = "1";
Script script = new Script(ScriptType.INLINE, MockScriptEngine.NAME, scriptStr, Collections.emptyMap());

expectThrows(
IllegalArgumentException e = expectThrows(
IllegalArgumentException.class,
() -> new ScriptScoreQueryBuilder(matchAllQuery(), null)
);
assertEquals("script_score: script must not be null" , e.getMessage());

expectThrows(
e = expectThrows(
IllegalArgumentException.class,
() -> new ScriptScoreQueryBuilder(null, script)
);
assertEquals("script_score: query must not be null" , e.getMessage());
}

/**
@ -93,4 +98,15 @@ public class ScriptScoreQueryBuilderTests extends AbstractQueryTestCase<ScriptSc
assertNotNull(rewriteQuery.toQuery(context));
assertFalse("query should not be cacheable: " + queryBuilder.toString(), context.isCacheable());
}

public void testDisallowExpensiveQueries() {
QueryShardContext queryShardContext = mock(QueryShardContext.class);
when(queryShardContext.allowExpensiveQueries()).thenReturn(false);

ScriptScoreQueryBuilder queryBuilder = doCreateTestQueryBuilder();
ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> queryBuilder.toQuery(queryShardContext));
assertEquals("[script score] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
e.getMessage());
}
}
@ -97,7 +97,7 @@ public class ExtendedBoundsTests extends ESTestCase {
QueryShardContext qsc = new QueryShardContext(0,
new IndexSettings(IndexMetaData.builder("foo").settings(indexSettings).build(), indexSettings),
BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null, xContentRegistry(), writableRegistry(),
null, null, () -> now, null, null);
null, null, () -> now, null, null, () -> true);
DateFormatter formatter = DateFormatter.forPattern("dateOptionalTime");
DocValueFormat format = new DocValueFormat.DateTime(formatter, ZoneOffset.UTC, DateFieldMapper.Resolution.MILLISECONDS);

@ -426,6 +426,6 @@ public class ScriptedMetricAggregatorTests extends AggregatorTestCase {
Map<String, ScriptEngine> engines = Collections.singletonMap(scriptEngine.getType(), scriptEngine);
ScriptService scriptService = new ScriptService(Settings.EMPTY, engines, ScriptModule.CORE_CONTEXTS);
return new QueryShardContext(0, indexSettings, BigArrays.NON_RECYCLING_INSTANCE, null, null, mapperService, null, scriptService,
xContentRegistry(), writableRegistry(), null, null, System::currentTimeMillis, null, null);
xContentRegistry(), writableRegistry(), null, null, System::currentTimeMillis, null, null, () -> true);
}
}

@ -280,7 +280,7 @@ public class HighlightBuilderTests extends ESTestCase {
// shard context will only need indicesQueriesRegistry for building Query objects nested in highlighter
QueryShardContext mockShardContext = new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE,
null, null, null, null, null, xContentRegistry(), namedWriteableRegistry,
null, null, System::currentTimeMillis, null, null) {
null, null, System::currentTimeMillis, null, null, () -> true) {
@Override
public MappedFieldType fieldMapper(String name) {
TextFieldMapper.Builder builder = new TextFieldMapper.Builder(name);

@ -18,12 +18,14 @@
*/
package org.elasticsearch.search.geo;

import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.routing.IndexShardRoutingTable;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.geo.builders.PointBuilder;
import org.elasticsearch.common.geo.builders.ShapeBuilder;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.IndexService;
@ -41,6 +43,16 @@ import static org.hamcrest.Matchers.instanceOf;

public class GeoShapeIntegrationIT extends ESIntegTestCase {

@Override
protected Settings nodeSettings(int nodeOrdinal) {
return Settings.builder()
// Check that only geo-shape queries on legacy PrefixTree based
// geo shapes are disallowed.
.put("search.allow_expensive_queries", false)
.put(super.nodeSettings(nodeOrdinal))
.build();
}

/**
* Test that orientation parameter correctly persists across cluster restart
*/
@ -183,21 +195,21 @@ public class GeoShapeIntegrationIT extends ESIntegTestCase {

public void testIndexPolygonDateLine() throws Exception {
String mappingVector = "{\n" +
" \"properties\": {\n" +
" \"shape\": {\n" +
" \"type\": \"geo_shape\"\n" +
" }\n" +
" }\n" +
" }";
" \"properties\": {\n" +
" \"shape\": {\n" +
" \"type\": \"geo_shape\"\n" +
" }\n" +
" }\n" +
" }";

String mappingQuad = "{\n" +
" \"properties\": {\n" +
" \"shape\": {\n" +
" \"type\": \"geo_shape\",\n" +
" \"tree\": \"quadtree\"\n" +
" }\n" +
" }\n" +
" }";
" \"properties\": {\n" +
" \"shape\": {\n" +
" \"type\": \"geo_shape\",\n" +
" \"tree\": \"quadtree\"\n" +
" }\n" +
" }\n" +
" }";


// create index
@ -208,37 +220,47 @@ public class GeoShapeIntegrationIT extends ESIntegTestCase {
ensureGreen();

String source = "{\n" +
" \"shape\" : \"POLYGON((179 0, -179 0, -179 2, 179 2, 179 0))\""+
"}";
" \"shape\" : \"POLYGON((179 0, -179 0, -179 2, 179 2, 179 0))\""+
"}";

indexRandom(true, client().prepareIndex("quad", "doc", "0").setSource(source, XContentType.JSON));
indexRandom(true, client().prepareIndex("vector", "doc", "0").setSource(source, XContentType.JSON));

SearchResponse searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(-179.75, 1))
).get();
try {
ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", true));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());

SearchResponse searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(-179.75, 1))
).get();

assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));

searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(90, 1))
).get();
searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(90, 1))
).get();

assertThat(searchResponse.getHits().getTotalHits().value, equalTo(0L));
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(0L));

searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(-180, 1))
).get();
searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(-180, 1))
).get();

assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));
searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(180, 1))
).get();
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));

assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));
searchResponse = client().prepareSearch("quad").setQuery(
geoShapeQuery("shape", new PointBuilder(180, 1))
).get();

searchResponse = client().prepareSearch("vector").setQuery(
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));
} finally {
ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", (String) null));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
}

SearchResponse searchResponse = client().prepareSearch("vector").setQuery(
geoShapeQuery("shape", new PointBuilder(90, 1))
).get();

@ -18,11 +18,14 @@
*/
package org.elasticsearch.search.geo;

import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.routing.IndexShardRoutingTable;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.geo.builders.ShapeBuilder;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentType;
@ -33,6 +36,8 @@ import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.indices.IndicesService;
import org.elasticsearch.test.ESIntegTestCase;

import java.io.IOException;

import static org.elasticsearch.index.query.QueryBuilders.geoShapeQuery;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
@ -186,6 +191,53 @@ public class LegacyGeoShapeIntegrationIT extends ESIntegTestCase {
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));
}

public void testDisallowExpensiveQueries() throws InterruptedException, IOException {
try {
// create index
assertAcked(client().admin().indices().prepareCreate("test")
.addMapping("_doc", "shape", "type=geo_shape,strategy=recursive,tree=geohash").get());
ensureGreen();

indexRandom(true, client().prepareIndex("test", "_doc").setId("0").setSource(
"shape", (ToXContent) (builder, params) -> {
builder.startObject().field("type", "circle")
.startArray("coordinates").value(30).value(50).endArray()
.field("radius", "77km")
.endObject();
return builder;
}));
refresh();

// Execute with search.allow_expensive_queries = null => default value = true => success
SearchResponse searchResponse = client().prepareSearch("test").setQuery(geoShapeQuery("shape",
new Circle(30, 50, 77000))).get();
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));

ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", false));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());

// Set search.allow_expensive_queries to "false" => assert failure
ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> client().prepareSearch("test").setQuery(geoShapeQuery("shape",
new Circle(30, 50, 77000))).get());
assertEquals("[geo-shape] queries on [PrefixTree geo shapes] cannot be executed when " +
"'search.allow_expensive_queries' is set to false.", e.getCause().getMessage());

// Set search.allow_expensive_queries to "true" => success
updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", true));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
searchResponse = client().prepareSearch("test").setQuery(geoShapeQuery("shape",
new Circle(30, 50, 77000))).get();
assertThat(searchResponse.getHits().getTotalHits().value, equalTo(1L));
} finally {
ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", (String) null));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
}
}

private String findNodeName(String index) {
ClusterState state = client().admin().cluster().prepareState().get().getState();
IndexShardRoutingTable shard = state.getRoutingTable().index(index).shard(0);

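These integration tests drive the switch through the cluster update-settings API and always reset it to null in a finally block so later tests fall back to the default of true. A hedged sketch of how such a dynamic boolean setting is typically declared; the holder class and registration point are assumptions, while the key and default come from the commit:

import org.elasticsearch.common.settings.Setting;
import org.elasticsearch.common.settings.Setting.Property;

// Hypothetical declaration matching the behaviour exercised above: node-scoped,
// dynamically updatable, defaulting to true so existing clusters keep working.
public final class ExpensiveQuerySettingsSketch {

    public static final Setting<Boolean> ALLOW_EXPENSIVE_QUERIES =
            Setting.boolSetting("search.allow_expensive_queries", true, Property.Dynamic, Property.NodeScope);

    private ExpensiveQuerySettingsSketch() {}
}

Registering such a setting with the cluster settings registry is what lets the persistentSettings(...) updates in these tests take effect without a node restart.
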
@ -19,6 +19,8 @@

package org.elasticsearch.search.query;

import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.fielddata.ScriptDocValues;
@ -156,4 +158,57 @@ public class ScriptScoreQueryIT extends ESIntegTestCase {
assertNoFailures(resp);
assertOrderedSearchHits(resp, "3", "2", "1");
}

public void testDisallowExpensiveQueries() {
try {
assertAcked(
prepareCreate("test-index").addMapping("_doc", "field1", "type=text", "field2", "type=double")
);
int docCount = 10;
for (int i = 1; i <= docCount; i++) {
client().prepareIndex("test-index", "_doc").setId("" + i)
.setSource("field1", "text" + (i % 2), "field2", i)
.get();
}
refresh();

Map<String, Object> params = new HashMap<>();
params.put("param1", 0.1);

// Execute with search.allow_expensive_queries = null => default value = true => success
Script script = new Script(ScriptType.INLINE, CustomScriptPlugin.NAME, "doc['field2'].value * param1", params);
SearchResponse resp = client()
.prepareSearch("test-index")
.setQuery(scriptScoreQuery(matchQuery("field1", "text0"), script))
.get();
assertNoFailures(resp);

// Set search.allow_expensive_queries to "false" => assert failure
ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", false));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());

ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> client()
.prepareSearch("test-index")
.setQuery(scriptScoreQuery(matchQuery("field1", "text0"), script))
.get());
assertEquals("[script score] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
e.getCause().getMessage());

// Set search.allow_expensive_queries to "true" => success
updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", true));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
resp = client()
.prepareSearch("test-index")
.setQuery(scriptScoreQuery(matchQuery("field1", "text0"), script))
.get();
assertNoFailures(resp);
} finally {
ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", (String) null));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
}
}
}

@ -144,7 +144,7 @@ public class QueryRescorerBuilderTests extends ESTestCase {
// shard context will only need indicesQueriesRegistry for building Query objects nested in query rescorer
QueryShardContext mockShardContext = new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE,
null, null, null, null, null,
xContentRegistry(), namedWriteableRegistry, null, null, () -> nowInMillis, null, null) {
xContentRegistry(), namedWriteableRegistry, null, null, () -> nowInMillis, null, null, () -> true) {
@Override
public MappedFieldType fieldMapper(String name) {
TextFieldMapper.Builder builder = new TextFieldMapper.Builder(name);
@ -188,7 +188,7 @@ public class QueryRescorerBuilderTests extends ESTestCase {
// shard context will only need indicesQueriesRegistry for building Query objects nested in query rescorer
QueryShardContext mockShardContext = new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE,
null, null, null, null, null,
xContentRegistry(), namedWriteableRegistry, null, null, () -> nowInMillis, null, null) {
xContentRegistry(), namedWriteableRegistry, null, null, () -> nowInMillis, null, null, () -> true) {
@Override
public MappedFieldType fieldMapper(String name) {
TextFieldMapper.Builder builder = new TextFieldMapper.Builder(name);

@ -19,6 +19,8 @@

package org.elasticsearch.search.scriptfilter;

import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder;
@ -47,6 +49,7 @@ import static java.util.Collections.emptyMap;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.scriptQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.hamcrest.Matchers.equalTo;

@ESIntegTestCase.ClusterScope(scope = ESIntegTestCase.Scope.SUITE)
@ -222,6 +225,55 @@ public class ScriptQuerySearchIT extends ESIntegTestCase {
assertThat(response.getHits().getAt(2).getFields().get("sNum1").getValues().get(0), equalTo(3.0));
}

public void testDisallowExpensiveQueries() {
try {
assertAcked(
prepareCreate("test-index").addMapping("_doc", "num1", "type=double")
);
int docCount = 10;
for (int i = 1; i <= docCount; i++) {
client().prepareIndex("test-index", "_doc").setId("" + i)
.setSource("num1", i)
.get();
}
refresh();

// Execute with search.allow_expensive_queries = null => default value = true => success
Script script = new Script(ScriptType.INLINE, CustomScriptPlugin.NAME, "doc['num1'].value > 1",
Collections.emptyMap());
SearchResponse resp = client().prepareSearch("test-index")
.setQuery(scriptQuery(script))
.get();
assertNoFailures(resp);

ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", false));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());

// Set search.allow_expensive_queries to "false" => assert failure
ElasticsearchException e = expectThrows(ElasticsearchException.class,
() -> client()
.prepareSearch("test-index")
.setQuery(scriptQuery(script))
.get());
assertEquals("[script] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
e.getCause().getMessage());

// Set search.allow_expensive_queries to "true" => success
updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", true));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
resp = client().prepareSearch("test-index")
.setQuery(scriptQuery(script))
.get();
assertNoFailures(resp);
} finally {
ClusterUpdateSettingsRequest updateSettingsRequest = new ClusterUpdateSettingsRequest();
updateSettingsRequest.persistentSettings(Settings.builder().put("search.allow_expensive_queries", (String) null));
assertAcked(client().admin().cluster().updateSettings(updateSettingsRequest).actionGet());
}
}

private static AtomicInteger scriptCounter = new AtomicInteger(0);

public static int incrementScriptCounter() {

@ -198,7 +198,7 @@ public abstract class AbstractSortTestCase<T extends SortBuilder<T>> extends EST
};
return new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE, bitsetFilterCache, indexFieldDataLookup,
null, null, scriptService, xContentRegistry(), namedWriteableRegistry, null, searcher,
() -> randomNonNegativeLong(), null, null) {
() -> randomNonNegativeLong(), null, null, () -> true) {

@Override
public MappedFieldType fieldMapper(String name) {

@ -181,7 +181,7 @@ public abstract class AbstractSuggestionBuilderTestCase<SB extends SuggestionBui
((Script) invocation.getArguments()[0]).getIdOrCode()));
QueryShardContext mockShardContext = new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE, null,
null, mapperService, null, scriptService, xContentRegistry(), namedWriteableRegistry, null, null,
System::currentTimeMillis, null, null);
System::currentTimeMillis, null, null, () -> true);

SuggestionContext suggestionContext = suggestionBuilder.build(mockShardContext);
assertEquals(toBytesRef(suggestionBuilder.text()), suggestionContext.getText());

@ -31,9 +31,15 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

/** Base test case for subclasses of MappedFieldType */
public abstract class FieldTypeTestCase extends ESTestCase {

public static final QueryShardContext MOCK_QSC = createMockQueryShardContext(true);
public static final QueryShardContext MOCK_QSC_DISALLOW_EXPENSIVE = createMockQueryShardContext(false);

/** Abstraction for mutating a property of a MappedFieldType */
public abstract static class Modifier {
/** The name of the property that is being modified. Used in test failure messages. */
@ -243,6 +249,16 @@ public abstract class FieldTypeTestCase extends ESTestCase {
"} " + super.toString();
}

protected QueryShardContext randomMockShardContext() {
return randomFrom(MOCK_QSC, MOCK_QSC_DISALLOW_EXPENSIVE);
}

static QueryShardContext createMockQueryShardContext(boolean allowExpensiveQueries) {
QueryShardContext queryShardContext = mock(QueryShardContext.class);
when(queryShardContext.allowExpensiveQueries()).thenReturn(allowExpensiveQueries);
return queryShardContext;
}

public void testClone() {
MappedFieldType fieldType = createNamedDefaultFieldType();
MappedFieldType clone = fieldType.clone();

@ -280,7 +280,7 @@ public abstract class AggregatorTestCase extends ESTestCase {
return new QueryShardContext(0, indexSettings, BigArrays.NON_RECYCLING_INSTANCE, null,
getIndexFieldDataLookup(mapperService, circuitBreakerService),
mapperService, null, getMockScriptService(), xContentRegistry(),
writableRegistry(), null, searcher, System::currentTimeMillis, null, null);
writableRegistry(), null, searcher, System::currentTimeMillis, null, null, () -> true);
}

/**

@ -414,7 +414,7 @@ public abstract class AbstractBuilderTestCase extends ESTestCase {
QueryShardContext createShardContext(IndexSearcher searcher) {
return new QueryShardContext(0, idxSettings, BigArrays.NON_RECYCLING_INSTANCE, bitsetFilterCache,
indexFieldDataService::getForField, mapperService, similarityService, scriptService, xContentRegistry,
namedWriteableRegistry, this.client, searcher, () -> nowInMillis, null, indexNameMatcher());
namedWriteableRegistry, this.client, searcher, () -> nowInMillis, null, indexNameMatcher(), () -> true);
}

ScriptModule createScriptModule(List<ScriptPlugin> scriptPlugins) {

@ -43,7 +43,7 @@ public class MockSearchServiceTests extends ESTestCase {
final long nowInMillis = randomNonNegativeLong();
SearchContext s = new TestSearchContext(new QueryShardContext(0,
new IndexSettings(EMPTY_INDEX_METADATA, Settings.EMPTY), BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null,
xContentRegistry(), writableRegistry(), null, null, () -> nowInMillis, null, null)) {
xContentRegistry(), writableRegistry(), null, null, () -> nowInMillis, null, null, () -> true)) {

@Override
public SearchShardTarget shardTarget() {

@ -24,8 +24,8 @@ import org.apache.lucene.store.Directory;
import org.apache.lucene.util.BitSet;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.CheckedBiConsumer;
import org.elasticsearch.common.logging.Loggers;
import org.elasticsearch.common.CheckedConsumer;
import org.elasticsearch.common.logging.Loggers;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.util.BigArrays;
import org.elasticsearch.index.IndexSettings;
@ -522,7 +522,7 @@ public class DocumentSubsetBitsetCacheTests extends ESTestCase {

final QueryShardContext shardContext = new QueryShardContext(shardId.id(), indexSettings, BigArrays.NON_RECYCLING_INSTANCE,
null, null, mapperService, null, null, xContentRegistry(), writableRegistry(),
client, new IndexSearcher(directoryReader), () -> nowInMillis, null, null);
client, new IndexSearcher(directoryReader), () -> nowInMillis, null, null, () -> true);

context = new TestIndexContext(directory, iw, directoryReader, shardContext, leaf);
return context;

@ -84,7 +84,7 @@ public class SecurityIndexReaderWrapperIntegrationTests extends AbstractBuilderT
final long nowInMillis = randomNonNegativeLong();
QueryShardContext realQueryShardContext = new QueryShardContext(shardId.id(), indexSettings, BigArrays.NON_RECYCLING_INSTANCE,
null, null, mapperService, null, null, xContentRegistry(), writableRegistry(),
client, null, () -> nowInMillis, null, null);
client, null, () -> nowInMillis, null, null, () -> true);
QueryShardContext queryShardContext = spy(realQueryShardContext);
DocumentSubsetBitsetCache bitsetCache = new DocumentSubsetBitsetCache(Settings.EMPTY, Executors.newSingleThreadExecutor());
XPackLicenseState licenseState = mock(XPackLicenseState.class);
@ -209,7 +209,7 @@ public class SecurityIndexReaderWrapperIntegrationTests extends AbstractBuilderT
final long nowInMillis = randomNonNegativeLong();
QueryShardContext realQueryShardContext = new QueryShardContext(shardId.id(), indexSettings, BigArrays.NON_RECYCLING_INSTANCE,
null, null, mapperService, null, null, xContentRegistry(), writableRegistry(),
client, null, () -> nowInMillis, null, null);
client, null, () -> nowInMillis, null, null, () -> true);
QueryShardContext queryShardContext = spy(realQueryShardContext);
DocumentSubsetBitsetCache bitsetCache = new DocumentSubsetBitsetCache(Settings.EMPTY, Executors.newSingleThreadExecutor());

@ -306,7 +306,7 @@ public final class FlatObjectFieldMapper extends DynamicKeyFieldMapper {

@Override
public Query fuzzyQuery(Object value, Fuzziness fuzziness, int prefixLength, int maxExpansions,
boolean transpositions) {
boolean transpositions, QueryShardContext context) {
throw new UnsupportedOperationException("[fuzzy] queries are not currently supported on keyed " +
"[" + CONTENT_TYPE + "] fields.");
}

@ -15,6 +15,7 @@ import org.apache.lucene.search.TermInSetQuery;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TermRangeQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.index.mapper.FieldTypeTestCase;
import org.elasticsearch.index.mapper.MappedFieldType;
@ -98,7 +99,12 @@ public class KeyedFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

Query expected = new PrefixQuery(new Term("field", "key\0val"));
assertEquals(expected, ft.prefixQuery("val", MultiTermQuery.CONSTANT_SCORE_REWRITE, null));
assertEquals(expected, ft.prefixQuery("val", MultiTermQuery.CONSTANT_SCORE_REWRITE, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.prefixQuery("val", MultiTermQuery.CONSTANT_SCORE_REWRITE, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[prefix] queries cannot be executed when 'search.allow_expensive_queries' is set to false. " +
"For optimised prefix queries on text fields please enable [index_prefixes].", ee.getMessage());
}

public void testFuzzyQuery() {
@ -106,7 +112,7 @@ public class KeyedFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
() -> ft.fuzzyQuery("valuee", Fuzziness.fromEdits(2), 1, 50, true));
() -> ft.fuzzyQuery("value", Fuzziness.fromEdits(2), 1, 50, true, randomMockShardContext()));
assertEquals("[fuzzy] queries are not currently supported on keyed [flattened] fields.", e.getMessage());
}

@ -117,12 +123,12 @@ public class KeyedFlatObjectFieldTypeTests extends FieldTypeTestCase {
TermRangeQuery expected = new TermRangeQuery("field",
new BytesRef("key\0lower"),
new BytesRef("key\0upper"), false, false);
assertEquals(expected, ft.rangeQuery("lower", "upper", false, false, null));
assertEquals(expected, ft.rangeQuery("lower", "upper", false, false, MOCK_QSC));

expected = new TermRangeQuery("field",
new BytesRef("key\0lower"),
new BytesRef("key\0upper"), true, true);
assertEquals(expected, ft.rangeQuery("lower", "upper", true, true, null));
assertEquals(expected, ft.rangeQuery("lower", "upper", true, true, MOCK_QSC));

IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () ->
ft.rangeQuery("lower", null, false, false, null));
@ -130,9 +136,14 @@ public class KeyedFlatObjectFieldTypeTests extends FieldTypeTestCase {
e.getMessage());

e = expectThrows(IllegalArgumentException.class, () ->
ft.rangeQuery(null, "upper", false, false, null));
ft.rangeQuery(null, "upper", false, false, MOCK_QSC));
assertEquals("[range] queries on keyed [flattened] fields must include both an upper and a lower bound.",
e.getMessage());

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.rangeQuery("lower", "upper", false, false, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[range] queries on [text] or [keyword] fields cannot be executed when " +
"'search.allow_expensive_queries' is set to false.", ee.getMessage());
}

public void testRegexpQuery() {
@ -140,7 +151,7 @@ public class KeyedFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
() -> ft.regexpQuery("valu*", 0, 10, null, null));
() -> ft.regexpQuery("valu*", 0, 10, null, randomMockShardContext()));
assertEquals("[regexp] queries are not currently supported on keyed [flattened] fields.", e.getMessage());
}

@ -149,7 +160,7 @@ public class KeyedFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

UnsupportedOperationException e = expectThrows(UnsupportedOperationException.class,
() -> ft.wildcardQuery("valu*", null, null));
() -> ft.wildcardQuery("valu*", null, randomMockShardContext()));
assertEquals("[wildcard] queries are not currently supported on keyed [flattened] fields.", e.getMessage());
}
}

@ -16,6 +16,7 @@ import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TermRangeQuery;
import org.apache.lucene.search.WildcardQuery;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.index.mapper.FieldNamesFieldMapper;
import org.elasticsearch.index.mapper.FieldTypeTestCase;
@ -78,8 +79,14 @@ public class RootFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

Query expected = new FuzzyQuery(new Term("field", "value"), 2, 1, 50, true);
Query actual = ft.fuzzyQuery("value", Fuzziness.fromEdits(2), 1, 50, true);
Query actual = ft.fuzzyQuery("value", Fuzziness.fromEdits(2), 1, 50, true, MOCK_QSC);
assertEquals(expected, actual);

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.fuzzyQuery("value", Fuzziness.AUTO, randomInt(10) + 1, randomInt(10) + 1,
randomBoolean(), MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[fuzzy] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testRangeQuery() {
@ -89,12 +96,17 @@ public class RootFlatObjectFieldTypeTests extends FieldTypeTestCase {
TermRangeQuery expected = new TermRangeQuery("field",
new BytesRef("lower"),
new BytesRef("upper"), false, false);
assertEquals(expected, ft.rangeQuery("lower", "upper", false, false, null));
assertEquals(expected, ft.rangeQuery("lower", "upper", false, false, MOCK_QSC));

expected = new TermRangeQuery("field",
new BytesRef("lower"),
new BytesRef("upper"), true, true);
assertEquals(expected, ft.rangeQuery("lower", "upper", true, true, null));
assertEquals(expected, ft.rangeQuery("lower", "upper", true, true, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.rangeQuery("lower", "upper", true, true, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[range] queries on [text] or [keyword] fields cannot be executed when " +
"'search.allow_expensive_queries' is set to false.", ee.getMessage());
}

public void testRegexpQuery() {
@ -102,8 +114,13 @@ public class RootFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

Query expected = new RegexpQuery(new Term("field", "val.*"));
Query actual = ft.regexpQuery("val.*", 0, 10, null, null);
Query actual = ft.regexpQuery("val.*", 0, 10, null, MOCK_QSC);
assertEquals(expected, actual);

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.regexpQuery("val.*", randomInt(10), randomInt(10) + 1, null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[regexp] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}

public void testWildcardQuery() {
@ -111,6 +128,11 @@ public class RootFlatObjectFieldTypeTests extends FieldTypeTestCase {
ft.setName("field");

Query expected = new WildcardQuery(new Term("field", new BytesRef("valu*")));
assertEquals(expected, ft.wildcardQuery("valu*", null, null));
assertEquals(expected, ft.wildcardQuery("valu*", null, MOCK_QSC));

ElasticsearchException ee = expectThrows(ElasticsearchException.class,
() -> ft.wildcardQuery("valu*", null, MOCK_QSC_DISALLOW_EXPENSIVE));
assertEquals("[wildcard] queries cannot be executed when 'search.allow_expensive_queries' is set to false.",
ee.getMessage());
}
}

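The two field type test classes above show the other half of the change: term-level methods such as prefixQuery, fuzzyQuery, regexpQuery, wildcardQuery and rangeQuery now take the QueryShardContext, so each implementation can refuse to build an expensive query before any automaton or range scan is constructed. A sketch of that pattern using the wildcard message asserted above; the class and exact signature are illustrative, not the real MappedFieldType code:

import org.apache.lucene.index.Term;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.WildcardQuery;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.index.query.QueryShardContext;

// Hypothetical field type fragment: the added QueryShardContext parameter lets the
// implementation veto the query when expensive queries are disabled, otherwise it
// builds the Lucene query as before.
final class WildcardFieldTypeSketch {

    private final String fieldName;

    WildcardFieldTypeSketch(String fieldName) {
        this.fieldName = fieldName;
    }

    Query wildcardQuery(String value, QueryShardContext context) {
        if (context.allowExpensiveQueries() == false) {
            throw new ElasticsearchException("[wildcard] queries cannot be executed when "
                    + "'search.allow_expensive_queries' is set to false.");
        }
        return new WildcardQuery(new Term(fieldName, value));
    }
}
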
@ -93,7 +93,7 @@ public class RollupIndexerIndexingTests extends AggregatorTestCase {
settings = createIndexSettings();
queryShardContext = new QueryShardContext(0, settings,
BigArrays.NON_RECYCLING_INSTANCE, null, null, null, null, null,
null, null, null, null, () -> 0L, null, null);
null, null, null, null, () -> 0L, null, null, () -> true);
}

public void testSimpleDateHisto() throws Exception {

@ -75,7 +75,8 @@ public class WatcherPluginTests extends ESTestCase {
IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(Watch.INDEX, settings);
AnalysisRegistry registry = new AnalysisRegistry(TestEnvironment.newEnvironment(settings), emptyMap(), emptyMap(), emptyMap(),
emptyMap(), emptyMap(), emptyMap(), emptyMap(), emptyMap(), emptyMap());
IndexModule indexModule = new IndexModule(indexSettings, registry, new InternalEngineFactory(), Collections.emptyMap());
IndexModule indexModule = new IndexModule(indexSettings, registry, new InternalEngineFactory(), Collections.emptyMap(),
() -> true);
// this will trip an assertion if the watcher indexing operation listener is null (which it is) but we try to add it
watcher.onIndexModule(indexModule);