Refactor field expansion for match, multi_match and query_string query (#25726)

This commit changes the way we handle field expansion in the `match`, `multi_match` and `query_string` queries. The main changes are:

- For an exact field name, the new behavior is to rewrite to a `MatchNoDocsQuery` when the field name is not found in the mapping.

- For partial field names (with a `*` suffix), the expansion is done only on `keyword`, `text`, `date`, `ip` and `number` field types; other field types are simply ignored.

- For all fields (`*`), the expansion is done on the accepted field types only (see above), and metadata fields are also filtered out.

- The `*` notation can also be used to set the `default_field` option of the `query_string` query. This replaces the need for the extra `use_all_fields` option, which is deprecated in this change.
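Taken together, the rules above amount to: expand patterns against the mapping, keep only queryable field types, and drop metadata fields. The following is a standalone sketch of that resolution logic — the type names, metadata list, and `expand` helper are simplified stand-ins for the real mapping lookup, not the Elasticsearch code:

```java
import java.util.*;

// Standalone illustration of the field-expansion rules described above.
// Mappings are modeled as a simple name -> type map.
public class FieldExpansionSketch {
    // Field types that wildcard and all-field expansion may select
    // ("number" is represented here by two concrete types for brevity).
    static final Set<String> ALLOWED_TYPES =
        new HashSet<>(Arrays.asList("keyword", "text", "date", "ip", "long", "double"));
    // Hypothetical metadata fields, filtered out of all-field expansion.
    static final Set<String> METADATA_FIELDS =
        new HashSet<>(Arrays.asList("_id", "_type", "_routing"));

    static List<String> expand(Map<String, String> mappings, String pattern) {
        List<String> result = new ArrayList<>();
        if (pattern.equals("*")) {
            // All fields: accepted types only, metadata fields filtered.
            for (Map.Entry<String, String> e : mappings.entrySet()) {
                if (METADATA_FIELDS.contains(e.getKey()) == false
                        && ALLOWED_TYPES.contains(e.getValue())) {
                    result.add(e.getKey());
                }
            }
        } else if (pattern.endsWith("*")) {
            // Partial field name: accepted types only, others ignored.
            String prefix = pattern.substring(0, pattern.length() - 1);
            for (Map.Entry<String, String> e : mappings.entrySet()) {
                if (e.getKey().startsWith(prefix) && ALLOWED_TYPES.contains(e.getValue())) {
                    result.add(e.getKey());
                }
            }
        } else if (mappings.containsKey(pattern)) {
            result.add(pattern);
        }
        // An exact name missing from the mapping yields an empty result,
        // which the query builders rewrite to a MatchNoDocsQuery.
        Collections.sort(result);
        return result;
    }
}
```

An exact name that resolves to nothing is what the builders then rewrite to a `MatchNoDocsQuery` rather than an error.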

This commit also rewrites a simple `*` query to a `MatchAllDocsQuery` when all fields are requested (Fixes #25556).

The same change should be done on `simple_query_string` for completeness.

The `use_all_fields` option in `query_string` is also deprecated in this change; `default_field` should be set to `*` instead.
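The deprecation can be pictured as a thin shim: the boolean flag is folded into `default_field` and is no longer stored or serialized on its own. A hypothetical standalone model (the class and method names only mirror the PR, this is not the Elasticsearch builder):

```java
// Standalone model of the use_all_fields -> default_field deprecation shim.
public class UseAllFieldsShim {
    private String defaultField;

    public UseAllFieldsShim defaultField(String field) {
        this.defaultField = field;
        return this;
    }

    @Deprecated
    public UseAllFieldsShim useAllFields(boolean useAllFields) {
        if (useAllFields) {
            // The deprecated flag degenerates to default_field: "*".
            this.defaultField = "*";
        }
        return this;
    }

    @Deprecated
    public Boolean useAllFields() {
        // Derived from default_field rather than stored separately;
        // null means "not set", matching the old tri-state Boolean.
        return defaultField == null ? null : "*".equals(defaultField);
    }
}
```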

Relates #25551
Jim Ferenczi 2017-07-21 16:52:57 +02:00 committed by GitHub
parent 47f92d7c62
commit c3784326eb
16 changed files with 408 additions and 440 deletions


@@ -30,6 +30,7 @@ import org.apache.lucene.search.MatchNoDocsQuery;
 import org.apache.lucene.search.PrefixQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.util.BytesRef;
+import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.index.mapper.TypeFieldMapper;
@@ -47,6 +48,16 @@ public class Queries {
         return new MatchNoDocsQuery(reason);
     }
+
+    public static Query newUnmappedFieldQuery(String field) {
+        return Queries.newMatchNoDocsQuery("unmapped field [" + (field != null ? field : "null") + "]");
+    }
+
+    public static Query newLenientFieldQuery(String field, RuntimeException e) {
+        String message = ElasticsearchException.getExceptionName(e) + ":[" + e.getMessage() + "]";
+        return Queries.newMatchNoDocsQuery("failed [" + field + "] query, caused by " + message);
+    }
+
     public static Query newNestedFilter() {
         return new PrefixQuery(new Term(TypeFieldMapper.NAME, new BytesRef("__")));
     }


@@ -36,6 +36,7 @@ import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.query.support.QueryParsers;
 import org.elasticsearch.index.search.MatchQuery;
 import org.elasticsearch.index.search.MultiMatchQuery;
+import org.elasticsearch.index.search.QueryStringQueryParser;

 import java.io.IOException;
 import java.util.HashMap;
@@ -739,27 +740,10 @@ public class MultiMatchQueryBuilder extends AbstractQueryBuilder<MultiMatchQuery
             }
         }
-        Map<String, Float> newFieldsBoosts = handleFieldsMatchPattern(context.getMapperService(), fieldsBoosts);
+        Map<String, Float> newFieldsBoosts = QueryStringQueryParser.resolveMappingFields(context, fieldsBoosts);
         return multiMatchQuery.parse(type, newFieldsBoosts, value, minimumShouldMatch);
     }

-    private static Map<String, Float> handleFieldsMatchPattern(MapperService mapperService, Map<String, Float> fieldsBoosts) {
-        Map<String, Float> newFieldsBoosts = new TreeMap<>();
-        for (Map.Entry<String, Float> fieldBoost : fieldsBoosts.entrySet()) {
-            String fField = fieldBoost.getKey();
-            Float fBoost = fieldBoost.getValue();
-            if (Regex.isSimpleMatchPattern(fField)) {
-                for (String field : mapperService.simpleMatchToIndexNames(fField)) {
-                    newFieldsBoosts.put(field, fBoost);
-                }
-            } else {
-                newFieldsBoosts.put(fField, fBoost);
-            }
-        }
-        return newFieldsBoosts;
-    }
-
     @Override
     protected int doHashCode() {
         return Objects.hash(value, fieldsBoosts, type, operator, analyzer, slop, fuzziness,


@@ -34,28 +34,17 @@ import org.elasticsearch.common.unit.Fuzziness;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.index.analysis.NamedAnalyzer;
-import org.elasticsearch.index.mapper.DateFieldMapper;
-import org.elasticsearch.index.mapper.IpFieldMapper;
-import org.elasticsearch.index.mapper.KeywordFieldMapper;
-import org.elasticsearch.index.mapper.MappedFieldType;
-import org.elasticsearch.index.mapper.MapperService;
-import org.elasticsearch.index.mapper.NumberFieldMapper;
-import org.elasticsearch.index.mapper.ScaledFloatFieldMapper;
-import org.elasticsearch.index.mapper.TextFieldMapper;
 import org.elasticsearch.index.query.support.QueryParsers;
 import org.elasticsearch.index.search.QueryStringQueryParser;
 import org.joda.time.DateTimeZone;

 import java.io.IOException;
 import java.util.ArrayList;
-import java.util.Collection;
 import java.util.HashMap;
-import java.util.HashSet;
 import java.util.List;
 import java.util.Locale;
 import java.util.Map;
 import java.util.Objects;
-import java.util.Set;
 import java.util.TreeMap;

 /**
@@ -110,24 +99,10 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
     private static final ParseField TIME_ZONE_FIELD = new ParseField("time_zone");
     private static final ParseField SPLIT_ON_WHITESPACE = new ParseField("split_on_whitespace")
         .withAllDeprecated("This setting is ignored, the parser always splits on logical operator");
-    private static final ParseField ALL_FIELDS_FIELD = new ParseField("all_fields");
+    private static final ParseField ALL_FIELDS_FIELD = new ParseField("all_fields")
+        .withAllDeprecated("Set [default_field] to `*` instead");
     private static final ParseField TYPE_FIELD = new ParseField("type");

-    // Mapping types the "all-ish" query can be executed against
-    public static final Set<String> ALLOWED_QUERY_MAPPER_TYPES;
-
-    static {
-        ALLOWED_QUERY_MAPPER_TYPES = new HashSet<>();
-        ALLOWED_QUERY_MAPPER_TYPES.add(DateFieldMapper.CONTENT_TYPE);
-        ALLOWED_QUERY_MAPPER_TYPES.add(IpFieldMapper.CONTENT_TYPE);
-        ALLOWED_QUERY_MAPPER_TYPES.add(KeywordFieldMapper.CONTENT_TYPE);
-        for (NumberFieldMapper.NumberType nt : NumberFieldMapper.NumberType.values()) {
-            ALLOWED_QUERY_MAPPER_TYPES.add(nt.typeName());
-        }
-        ALLOWED_QUERY_MAPPER_TYPES.add(ScaledFloatFieldMapper.CONTENT_TYPE);
-        ALLOWED_QUERY_MAPPER_TYPES.add(TextFieldMapper.CONTENT_TYPE);
-    }
-
     private final String queryString;
     private String defaultField;
@@ -179,8 +154,6 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
     private DateTimeZone timeZone;

-    private Boolean useAllFields;
-
     /** To limit effort spent determinizing regexp queries. */
     private int maxDeterminizedStates = DEFAULT_MAX_DETERMINED_STATES;
@@ -240,8 +213,11 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
         if (in.getVersion().onOrAfter(Version.V_5_1_1)) {
             if (in.getVersion().before(Version.V_6_0_0_beta1)) {
                 in.readBoolean(); // split_on_whitespace
+                Boolean useAllField = in.readOptionalBoolean();
+                if (useAllField != null && useAllField) {
+                    defaultField = "*";
+                }
             }
-            useAllFields = in.readOptionalBoolean();
         }
     }
@@ -291,8 +267,9 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
         if (out.getVersion().onOrAfter(Version.V_5_1_1)) {
             if (out.getVersion().before(Version.V_6_0_0_beta1)) {
                 out.writeBoolean(false); // split_on_whitespace
+                Boolean useAllFields = defaultField == null ? null : Regex.isMatchAllPattern(defaultField);
+                out.writeOptionalBoolean(useAllFields);
             }
-            out.writeOptionalBoolean(this.useAllFields);
         }
     }
@@ -314,17 +291,19 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
     }

     /**
-     * Tell the query_string query to use all fields explicitly, even if _all is
-     * enabled. If the "default_field" parameter or "fields" are specified, they
-     * will be ignored.
+     * This setting is deprecated, set {@link #defaultField(String)} to "*" instead.
      */
+    @Deprecated
     public QueryStringQueryBuilder useAllFields(Boolean useAllFields) {
-        this.useAllFields = useAllFields;
+        if (useAllFields) {
+            this.defaultField = "*";
+        }
         return this;
     }

+    @Deprecated
     public Boolean useAllFields() {
-        return this.useAllFields;
+        return defaultField == null ? null : Regex.isMatchAllPattern(defaultField);
     }

     /**
@@ -703,9 +682,6 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
             builder.field(TIME_ZONE_FIELD.getPreferredName(), this.timeZone.getID());
         }
         builder.field(ESCAPE_FIELD.getPreferredName(), this.escape);
-        if (this.useAllFields != null) {
-            builder.field(ALL_FIELDS_FIELD.getPreferredName(), this.useAllFields);
-        }
         printBoostAndQueryName(builder);
         builder.endObject();
     }
@@ -737,7 +713,6 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
         Fuzziness fuzziness = QueryStringQueryBuilder.DEFAULT_FUZZINESS;
         String fuzzyRewrite = null;
         String rewrite = null;
-        Boolean useAllFields = null;
         Map<String, Float> fieldsAndWeights = new HashMap<>();
         while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
             if (token == XContentParser.Token.FIELD_NAME) {
@@ -812,7 +787,7 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
                 } else if (LENIENT_FIELD.match(currentFieldName)) {
                     lenient = parser.booleanValue();
                 } else if (ALL_FIELDS_FIELD.match(currentFieldName)) {
-                    useAllFields = parser.booleanValue();
+                    defaultField = "*";
                 } else if (MAX_DETERMINIZED_STATES_FIELD.match(currentFieldName)) {
                     maxDeterminizedStates = parser.intValue();
                 } else if (TIME_ZONE_FIELD.match(currentFieldName)) {
@@ -847,12 +822,6 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
             throw new ParsingException(parser.getTokenLocation(), "[" + QueryStringQueryBuilder.NAME + "] must be provided with a [query]");
         }

-        if ((useAllFields != null && useAllFields) &&
-                (defaultField != null || fieldsAndWeights.size() != 0)) {
-            throw new ParsingException(parser.getTokenLocation(),
-                    "cannot use [all_fields] parameter in conjunction with [default_field] or [fields]");
-        }
-
         QueryStringQueryBuilder queryStringQuery = new QueryStringQueryBuilder(queryString);
         queryStringQuery.fields(fieldsAndWeights);
         queryStringQuery.defaultField(defaultField);
@@ -880,7 +849,6 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
         queryStringQuery.timeZone(timeZone);
         queryStringQuery.boost(boost);
         queryStringQuery.queryName(queryName);
-        queryStringQuery.useAllFields(useAllFields);
         return queryStringQuery;
     }
@@ -914,8 +882,7 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
             timeZone == null ? other.timeZone == null : other.timeZone != null &&
             Objects.equals(timeZone.getID(), other.timeZone.getID()) &&
             Objects.equals(escape, other.escape) &&
-            Objects.equals(maxDeterminizedStates, other.maxDeterminizedStates) &&
-            Objects.equals(useAllFields, other.useAllFields);
+            Objects.equals(maxDeterminizedStates, other.maxDeterminizedStates);
     }

     @Override
@@ -924,72 +891,37 @@ public class QueryStringQueryBuilder extends AbstractQueryBuilder<QueryStringQue
         quoteFieldSuffix, allowLeadingWildcard, analyzeWildcard,
         enablePositionIncrements, fuzziness, fuzzyPrefixLength,
         fuzzyMaxExpansions, fuzzyRewrite, phraseSlop, type, tieBreaker, rewrite, minimumShouldMatch, lenient,
-        timeZone == null ? 0 : timeZone.getID(), escape, maxDeterminizedStates, useAllFields);
+        timeZone == null ? 0 : timeZone.getID(), escape, maxDeterminizedStates);
     }

-    /**
-     * Given a shard context, return a map of all fields in the mappings that
-     * can be queried. The map will be field name to a float of 1.0f.
-     */
-    public static Map<String, Float> allQueryableDefaultFields(QueryShardContext context) {
-        Collection<String> allFields = context.simpleMatchToIndexNames("*");
-        Map<String, Float> fields = new HashMap<>();
-        for (String fieldName : allFields) {
-            if (MapperService.isMetadataField(fieldName)) {
-                // Ignore our metadata fields
-                continue;
-            }
-            MappedFieldType mft = context.fieldMapper(fieldName);
-            assert mft != null : "should never have a null mapper for an existing field";
-            // Ignore fields that are not in the allowed mapper types. Some
-            // types do not support term queries, and thus we cannot generate
-            // a special query for them.
-            String mappingType = mft.typeName();
-            if (ALLOWED_QUERY_MAPPER_TYPES.contains(mappingType)) {
-                fields.put(fieldName, 1.0f);
-            }
-        }
-        return fields;
-    }
-
     @Override
     protected Query doToQuery(QueryShardContext context) throws IOException {
         String rewrittenQueryString = escape ? org.apache.lucene.queryparser.classic.QueryParser.escape(this.queryString) : queryString;
-        if ((useAllFields != null && useAllFields) && (fieldsAndWeights.size() != 0 || this.defaultField != null)) {
-            throw addValidationError("cannot use [all_fields] parameter in conjunction with [default_field] or [fields]", null);
+        if (fieldsAndWeights.size() > 0 && this.defaultField != null) {
+            throw addValidationError("cannot use [fields] parameter in conjunction with [default_field]", null);
         }
         QueryStringQueryParser queryParser;
         boolean isLenient = lenient == null ? context.queryStringLenient() : lenient;
         if (defaultField != null) {
-            queryParser = new QueryStringQueryParser(context, defaultField, isLenient);
+            if (Regex.isMatchAllPattern(defaultField)) {
+                queryParser = new QueryStringQueryParser(context, lenient == null ? true : lenient);
+            } else {
+                queryParser = new QueryStringQueryParser(context, defaultField, isLenient);
+            }
         } else if (fieldsAndWeights.size() > 0) {
-            final Map<String, Float> resolvedFields = new TreeMap<>();
-            for (Map.Entry<String, Float> fieldsEntry : fieldsAndWeights.entrySet()) {
-                String fieldName = fieldsEntry.getKey();
-                Float weight = fieldsEntry.getValue();
-                if (Regex.isSimpleMatchPattern(fieldName)) {
-                    for (String resolvedFieldName : context.getMapperService().simpleMatchToIndexNames(fieldName)) {
-                        resolvedFields.put(resolvedFieldName, weight);
-                    }
-                } else {
-                    resolvedFields.put(fieldName, weight);
-                }
-            }
+            final Map<String, Float> resolvedFields = QueryStringQueryParser.resolveMappingFields(context, fieldsAndWeights);
             queryParser = new QueryStringQueryParser(context, resolvedFields, isLenient);
         } else {
-            // If explicitly required to use all fields, use all fields, OR:
-            // Automatically determine the fields (to replace the _all field) if all of the following are true:
-            // - The _all field is disabled,
-            // - and the default_field has not been changed in the settings
-            // - and default_field is not specified in the request
-            // - and no fields are specified in the request
-            if ((useAllFields != null && useAllFields) ||
+            // Expand to all fields if:
+            // - The index default search field is "*"
+            // - The index default search field is "_all" and _all is disabled
+            // TODO the index default search field should be "*" for new indices.
+            if (Regex.isMatchAllPattern(context.defaultField()) ||
                 (context.getMapperService().allEnabled() == false && "_all".equals(context.defaultField()))) {
                 // Automatically determine the fields from the index mapping.
                 // Automatically set leniency to "true" if unset so mismatched fields don't cause exceptions;
-                queryParser = new QueryStringQueryParser(context, allQueryableDefaultFields(context), lenient == null ? true : lenient);
+                queryParser = new QueryStringQueryParser(context, lenient == null ? true : lenient);
             } else {
                 queryParser = new QueryStringQueryParser(context, context.defaultField(), isLenient);
             }


@@ -33,6 +33,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.index.query.SimpleQueryParser.Settings;
+import org.elasticsearch.index.search.QueryStringQueryParser;

 import java.io.IOException;
 import java.util.HashMap;
@@ -376,7 +377,8 @@ public class SimpleQueryStringBuilder extends AbstractQueryBuilder<SimpleQuerySt
                 (context.getMapperService().allEnabled() == false &&
                     "_all".equals(context.defaultField()) &&
                     this.fieldsAndWeights.isEmpty())) {
-            resolvedFieldsAndWeights = QueryStringQueryBuilder.allQueryableDefaultFields(context);
+            resolvedFieldsAndWeights = QueryStringQueryParser.resolveMappingField(context, "*", 1.0f,
+                false, false);
             // Need to use lenient mode when using "all-mode" so exceptions aren't thrown due to mismatched types
             newSettings.lenient(lenientSet ? settings.lenient() : true);
         } else {


@@ -1,44 +0,0 @@
-/*
- * Licensed to Elasticsearch under one or more contributor
- * license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright
- * ownership. Elasticsearch licenses this file to you under
- * the Apache License, Version 2.0 (the "License"); you may
- * not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.elasticsearch.index.search;
-
-import org.apache.lucene.index.Term;
-import org.apache.lucene.search.Query;
-import org.apache.lucene.search.WildcardQuery;
-import org.elasticsearch.index.mapper.FieldNamesFieldMapper;
-import org.elasticsearch.index.query.ExistsQueryBuilder;
-import org.elasticsearch.index.query.QueryShardContext;
-
-public class ExistsFieldQueryExtension implements FieldQueryExtension {
-
-    public static final String NAME = "_exists_";
-
-    @Override
-    public Query query(QueryShardContext context, String queryText) {
-        final FieldNamesFieldMapper.FieldNamesFieldType fieldNamesFieldType =
-            (FieldNamesFieldMapper.FieldNamesFieldType) context.getMapperService().fullName(FieldNamesFieldMapper.NAME);
-        if (fieldNamesFieldType.isEnabled() == false) {
-            // The field_names_field is disabled so we switch to a wildcard query that matches all terms
-            return new WildcardQuery(new Term(queryText, "*"));
-        }
-        return ExistsQueryBuilder.newFilter(context, queryText);
-    }
-}


@@ -1,28 +0,0 @@
-/*
- * Licensed to Elasticsearch under one or more contributor
- * license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright
- * ownership. Elasticsearch licenses this file to you under
- * the Apache License, Version 2.0 (the "License"); you may
- * not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.elasticsearch.index.search;
-
-import org.apache.lucene.search.Query;
-import org.elasticsearch.index.query.QueryShardContext;
-
-public interface FieldQueryExtension {
-
-    Query query(QueryShardContext context, String queryText);
-}


@@ -29,6 +29,7 @@ import org.apache.lucene.search.BooleanClause.Occur;
 import org.apache.lucene.search.BooleanQuery;
 import org.apache.lucene.search.BoostQuery;
 import org.apache.lucene.search.FuzzyQuery;
+import org.apache.lucene.search.MatchNoDocsQuery;
 import org.apache.lucene.search.MultiPhraseQuery;
 import org.apache.lucene.search.MultiTermQuery;
 import org.apache.lucene.search.PhraseQuery;
@@ -41,9 +42,9 @@ import org.apache.lucene.search.spans.SpanNearQuery;
 import org.apache.lucene.search.spans.SpanOrQuery;
 import org.apache.lucene.search.spans.SpanQuery;
 import org.apache.lucene.search.spans.SpanTermQuery;
+import org.apache.lucene.util.BytesRef;
 import org.apache.lucene.util.QueryBuilder;
 import org.elasticsearch.ElasticsearchException;
-import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.io.stream.Writeable;
@@ -58,6 +59,9 @@ import org.elasticsearch.index.query.support.QueryParsers;

 import java.io.IOException;

+import static org.elasticsearch.common.lucene.search.Queries.newLenientFieldQuery;
+import static org.elasticsearch.common.lucene.search.Queries.newUnmappedFieldQuery;
+
 public class MatchQuery {

     public enum Type implements Writeable {
@@ -224,23 +228,18 @@ public class MatchQuery {
     protected Analyzer getAnalyzer(MappedFieldType fieldType, boolean quoted) {
         if (analyzer == null) {
-            if (fieldType != null) {
-                return quoted ? context.getSearchQuoteAnalyzer(fieldType) : context.getSearchAnalyzer(fieldType);
-            }
-            return quoted ? context.getMapperService().searchQuoteAnalyzer() : context.getMapperService().searchAnalyzer();
+            return quoted ? context.getSearchQuoteAnalyzer(fieldType) : context.getSearchAnalyzer(fieldType);
         } else {
             return analyzer;
         }
     }

     public Query parse(Type type, String fieldName, Object value) throws IOException {
-        final String field;
         MappedFieldType fieldType = context.fieldMapper(fieldName);
-        if (fieldType != null) {
-            field = fieldType.name();
-        } else {
-            field = fieldName;
+        if (fieldType == null) {
+            return newUnmappedFieldQuery(fieldName);
         }
+        final String field = fieldType.name();

         /*
          * If the user forced an analyzer we really don't care if they are
@@ -251,7 +250,7 @@ public class MatchQuery {
          * passing through QueryBuilder.
          */
         boolean noForcedAnalyzer = this.analyzer == null;
-        if (fieldType != null && fieldType.tokenized() == false && noForcedAnalyzer) {
+        if (fieldType.tokenized() == false && noForcedAnalyzer) {
             return blendTermQuery(new Term(fieldName, value.toString()), fieldType);
         }
@@ -286,12 +285,12 @@ public class MatchQuery {
         }
     }

-    protected final Query termQuery(MappedFieldType fieldType, Object value, boolean lenient) {
+    protected final Query termQuery(MappedFieldType fieldType, BytesRef value, boolean lenient) {
         try {
             return fieldType.termQuery(value, context);
         } catch (RuntimeException e) {
             if (lenient) {
-                return null;
+                return newLenientFieldQuery(fieldType.name(), e);
             }
             throw e;
         }
@@ -311,7 +310,7 @@ public class MatchQuery {
     /**
      * Creates a new QueryBuilder using the given analyzer.
      */
-    MatchQueryBuilder(Analyzer analyzer, @Nullable MappedFieldType mapper) {
+    MatchQueryBuilder(Analyzer analyzer, MappedFieldType mapper) {
         super(analyzer);
         this.mapper = mapper;
     }
@@ -454,30 +453,21 @@ public class MatchQuery {
     protected Query blendTermQuery(Term term, MappedFieldType fieldType) {
         if (fuzziness != null) {
-            if (fieldType != null) {
-                try {
-                    Query query = fieldType.fuzzyQuery(term.text(), fuzziness, fuzzyPrefixLength, maxExpansions, transpositions);
-                    if (query instanceof FuzzyQuery) {
-                        QueryParsers.setRewriteMethod((FuzzyQuery) query, fuzzyRewriteMethod);
-                    }
-                    return query;
-                } catch (RuntimeException e) {
-                    if (lenient) {
-                        return null;
-                    } else {
-                        throw e;
-                    }
+            try {
+                Query query = fieldType.fuzzyQuery(term.text(), fuzziness, fuzzyPrefixLength, maxExpansions, transpositions);
+                if (query instanceof FuzzyQuery) {
+                    QueryParsers.setRewriteMethod((FuzzyQuery) query, fuzzyRewriteMethod);
+                }
+                return query;
+            } catch (RuntimeException e) {
+                if (lenient) {
+                    return newLenientFieldQuery(fieldType.name(), e);
+                } else {
+                    throw e;
                 }
             }
-            int edits = fuzziness.asDistance(term.text());
-            FuzzyQuery query = new FuzzyQuery(term, edits, fuzzyPrefixLength, maxExpansions, transpositions);
-            QueryParsers.setRewriteMethod(query, fuzzyRewriteMethod);
-            return query;
         }
-        if (fieldType != null) {
-            return termQuery(fieldType, term.bytes(), lenient);
-        }
-        return new TermQuery(term);
+        return termQuery(fieldType, term.bytes(), lenient);
     }
 }


@@ -139,7 +139,7 @@ public class MultiMatchQuery extends MatchQuery {
                 return MultiMatchQuery.super.blendTermsQuery(terms, fieldType);
             }
-            public Query termQuery(MappedFieldType fieldType, Object value) {
+            public Query termQuery(MappedFieldType fieldType, BytesRef value) {
                 return MultiMatchQuery.this.termQuery(fieldType, value, lenient);
             }
         }
@@ -154,7 +154,7 @@ public class MultiMatchQuery extends MatchQuery {
         @Override
         public List<Query> buildGroupedQueries(MultiMatchQueryBuilder.Type type, Map<String, Float> fieldNames, Object value, String minimumShouldMatch) throws IOException {
             Map<Analyzer, List<FieldAndFieldType>> groups = new HashMap<>();
-            List<Tuple<String, Float>> missing = new ArrayList<>();
+            List<Query> queries = new ArrayList<>();
             for (Map.Entry<String, Float> entry : fieldNames.entrySet()) {
                 String name = entry.getKey();
                 MappedFieldType fieldType = context.fieldMapper(name);
@@ -168,15 +168,7 @@ public class MultiMatchQuery extends MatchQuery {
                     boost = boost == null ? Float.valueOf(1.0f) : boost;
                     groups.get(actualAnalyzer).add(new FieldAndFieldType(fieldType, boost));
                 } else {
-                    missing.add(new Tuple<>(name, entry.getValue()));
+                    queries.add(new MatchNoDocsQuery("unknown field " + name));
                 }
             }
-            List<Query> queries = new ArrayList<>();
-            for (Tuple<String, Float> tuple : missing) {
-                Query q = parseGroup(type.matchQueryType(), tuple.v1(), tuple.v2(), value, minimumShouldMatch);
-                if (q != null) {
-                    queries.add(q);
-                }
-            }
             for (List<FieldAndFieldType> group : groups.values()) {
@@ -225,13 +217,13 @@ public class MultiMatchQuery extends MatchQuery {
         }
         @Override
-        public Query termQuery(MappedFieldType fieldType, Object value) {
+        public Query termQuery(MappedFieldType fieldType, BytesRef value) {
            /*
             * Use the string value of the term because we're reusing the
             * portion of the query is usually after the analyzer has run on
             * each term. We just skip that analyzer phase.
             */
-            return blendTerm(new Term(fieldType.name(), value.toString()), fieldType);
+            return blendTerm(new Term(fieldType.name(), value.utf8ToString()), fieldType);
         }
     }
@@ -241,7 +233,7 @@ public class MultiMatchQuery extends MatchQuery {
     }
     static Query blendTerms(QueryShardContext context, BytesRef[] values, Float commonTermsCutoff, float tieBreaker,
                             FieldAndFieldType... blendedFields) {
         List<Query> queries = new ArrayList<>();
         Term[] terms = new Term[blendedFields.length * values.length];
         float[] blendedBoost = new float[blendedFields.length * values.length];
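The hunk above removes the second pass over "missing" fields: instead of re-parsing each unmapped field as its own group, an unknown field now contributes a match-no-docs clause in the same loop that groups mapped fields. A stdlib-only sketch of that control flow (hypothetical names, not the Elasticsearch or Lucene API):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch, not Elasticsearch code: unknown fields are rewritten to a
// match-no-docs placeholder in the same pass that handles mapped fields.
public class UnknownFieldSketch {
    static List<String> buildQueries(Map<String, Float> fieldsAndWeights, Set<String> mappedFields, String value) {
        List<String> queries = new ArrayList<>();
        for (Map.Entry<String, Float> entry : fieldsAndWeights.entrySet()) {
            if (mappedFields.contains(entry.getKey())) {
                // mapped field: a normal per-field term query with its boost
                queries.add(entry.getKey() + ":" + value + "^" + entry.getValue());
            } else {
                // unmapped field: rewrite to a query that matches no documents
                queries.add("MatchNoDocsQuery(unknown field " + entry.getKey() + ")");
            }
        }
        return queries;
    }
}
```

This mirrors the new single-pass behavior: the result list always has one entry per requested field, and unmapped fields simply cannot match.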


@@ -38,19 +38,31 @@ import org.apache.lucene.search.MultiTermQuery;
 import org.apache.lucene.search.PhraseQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.SynonymQuery;
+import org.apache.lucene.search.WildcardQuery;
 import org.apache.lucene.search.spans.SpanNearQuery;
 import org.apache.lucene.search.spans.SpanOrQuery;
 import org.apache.lucene.search.spans.SpanQuery;
 import org.apache.lucene.util.BytesRef;
 import org.apache.lucene.util.IOUtils;
 import org.elasticsearch.common.lucene.search.Queries;
+import org.elasticsearch.common.regex.Regex;
 import org.elasticsearch.common.unit.Fuzziness;
 import org.elasticsearch.index.analysis.ShingleTokenFilterFactory;
 import org.elasticsearch.index.mapper.AllFieldMapper;
 import org.elasticsearch.index.mapper.DateFieldMapper;
+import org.elasticsearch.index.mapper.DocumentMapper;
+import org.elasticsearch.index.mapper.FieldMapper;
+import org.elasticsearch.index.mapper.FieldNamesFieldMapper;
+import org.elasticsearch.index.mapper.IpFieldMapper;
+import org.elasticsearch.index.mapper.KeywordFieldMapper;
 import org.elasticsearch.index.mapper.MappedFieldType;
+import org.elasticsearch.index.mapper.MetadataFieldMapper;
+import org.elasticsearch.index.mapper.NumberFieldMapper;
+import org.elasticsearch.index.mapper.ScaledFloatFieldMapper;
 import org.elasticsearch.index.mapper.StringFieldType;
 import org.elasticsearch.index.mapper.MapperService;
+import org.elasticsearch.index.mapper.TextFieldMapper;
+import org.elasticsearch.index.query.ExistsQueryBuilder;
 import org.elasticsearch.index.query.MultiMatchQueryBuilder;
 import org.elasticsearch.index.query.QueryShardContext;
 import org.elasticsearch.index.query.support.QueryParsers;
@@ -61,11 +73,14 @@ import java.util.ArrayList;
 import java.util.Collection;
 import java.util.Collections;
 import java.util.HashMap;
+import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
+import java.util.Set;
-import static java.util.Collections.unmodifiableMap;
 import static org.elasticsearch.common.lucene.search.Queries.fixNegativeQueryIfNeeded;
+import static org.elasticsearch.common.lucene.search.Queries.newLenientFieldQuery;
+import static org.elasticsearch.common.lucene.search.Queries.newUnmappedFieldQuery;
 /**
  * A {@link XQueryParser} that uses the {@link MapperService} in order to build smarter
@@ -74,12 +89,20 @@ import static org.elasticsearch.common.lucene.search.Queries.fixNegativeQueryIfN
  * to assemble the result logically.
  */
 public class QueryStringQueryParser extends XQueryParser {
-    private static final Map<String, FieldQueryExtension> FIELD_QUERY_EXTENSIONS;
+    // Mapping types the "all-ish" query can be executed against
+    private static final Set<String> ALLOWED_QUERY_MAPPER_TYPES;
+    private static final String EXISTS_FIELD = "_exists_";
     static {
-        Map<String, FieldQueryExtension> fieldQueryExtensions = new HashMap<>();
-        fieldQueryExtensions.put(ExistsFieldQueryExtension.NAME, new ExistsFieldQueryExtension());
-        FIELD_QUERY_EXTENSIONS = unmodifiableMap(fieldQueryExtensions);
+        ALLOWED_QUERY_MAPPER_TYPES = new HashSet<>();
+        ALLOWED_QUERY_MAPPER_TYPES.add(DateFieldMapper.CONTENT_TYPE);
+        ALLOWED_QUERY_MAPPER_TYPES.add(IpFieldMapper.CONTENT_TYPE);
+        ALLOWED_QUERY_MAPPER_TYPES.add(KeywordFieldMapper.CONTENT_TYPE);
+        for (NumberFieldMapper.NumberType nt : NumberFieldMapper.NumberType.values()) {
+            ALLOWED_QUERY_MAPPER_TYPES.add(nt.typeName());
+        }
+        ALLOWED_QUERY_MAPPER_TYPES.add(ScaledFloatFieldMapper.CONTENT_TYPE);
+        ALLOWED_QUERY_MAPPER_TYPES.add(TextFieldMapper.CONTENT_TYPE);
     }
     private final QueryShardContext context;
@@ -134,7 +157,18 @@ public class QueryStringQueryParser extends XQueryParser {
         this(context, null, fieldsAndWeights, lenient, context.getMapperService().searchAnalyzer());
     }

+    /**
+     * Defaults to all queryable fields extracted from the mapping for query terms
+     * @param context The query shard context
+     * @param lenient If set to `true` will cause format based failures (like providing text to a numeric field) to be ignored.
+     */
+    public QueryStringQueryParser(QueryShardContext context, boolean lenient) {
+        this(context, "*", resolveMappingField(context, "*", 1.0f, false, false),
+            lenient, context.getMapperService().searchAnalyzer());
+    }
+
-    private QueryStringQueryParser(QueryShardContext context, String defaultField, Map<String, Float> fieldsAndWeights,
+    private QueryStringQueryParser(QueryShardContext context, String defaultField,
+                                   Map<String, Float> fieldsAndWeights,
                                    boolean lenient, Analyzer analyzer) {
         super(defaultField, analyzer);
         this.context = context;
@@ -144,6 +178,69 @@ public class QueryStringQueryParser extends XQueryParser {
         this.lenient = lenient;
     }

+    private static FieldMapper getFieldMapper(MapperService mapperService, String field) {
+        for (DocumentMapper mapper : mapperService.docMappers(true)) {
+            FieldMapper fieldMapper = mapper.mappers().smartNameFieldMapper(field);
+            if (fieldMapper != null) {
+                return fieldMapper;
+            }
+        }
+        return null;
+    }
+
+    public static Map<String, Float> resolveMappingFields(QueryShardContext context, Map<String, Float> fieldsAndWeights) {
+        Map<String, Float> resolvedFields = new HashMap<>();
+        for (Map.Entry<String, Float> fieldEntry : fieldsAndWeights.entrySet()) {
+            boolean allField = Regex.isMatchAllPattern(fieldEntry.getKey());
+            boolean multiField = Regex.isSimpleMatchPattern(fieldEntry.getKey());
+            float weight = fieldEntry.getValue() == null ? 1.0f : fieldEntry.getValue();
+            Map<String, Float> fieldMap = resolveMappingField(context, fieldEntry.getKey(), weight, !multiField, !allField);
+            resolvedFields.putAll(fieldMap);
+        }
+        return resolvedFields;
+    }
+
+    public static Map<String, Float> resolveMappingField(QueryShardContext context, String field, float weight,
+                                                         boolean acceptMetadataField, boolean acceptAllTypes) {
+        return resolveMappingField(context, field, weight, acceptMetadataField, acceptAllTypes, false, null);
+    }
+
+    /**
+     * Given a shard context, return a map of all fields in the mappings that
+     * can be queried. The map will be field name to a float of 1.0f.
+     */
+    private static Map<String, Float> resolveMappingField(QueryShardContext context, String field, float weight,
+                                                          boolean acceptAllTypes, boolean acceptMetadataField,
+                                                          boolean quoted, String quoteFieldSuffix) {
+        Collection<String> allFields = context.simpleMatchToIndexNames(field);
+        Map<String, Float> fields = new HashMap<>();
+        for (String fieldName : allFields) {
+            if (quoted && quoteFieldSuffix != null && context.fieldMapper(fieldName + quoteFieldSuffix) != null) {
+                fieldName = fieldName + quoteFieldSuffix;
+            }
+            FieldMapper mapper = getFieldMapper(context.getMapperService(), fieldName);
+            if (mapper == null) {
+                // Unmapped fields are not ignored
+                fields.put(field, weight);
+                continue;
+            }
+            if (acceptMetadataField == false && mapper instanceof MetadataFieldMapper) {
+                // Ignore metadata fields
+                continue;
+            }
+            // Ignore fields that are not in the allowed mapper types. Some
+            // types do not support term queries, and thus we cannot generate
+            // a special query for them.
+            String mappingType = mapper.fieldType().typeName();
+            if (acceptAllTypes == false && ALLOWED_QUERY_MAPPER_TYPES.contains(mappingType) == false) {
+                continue;
+            }
+            fields.put(fieldName, weight);
+        }
+        return fields;
+    }
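The expansion rules that `resolveMappingField` implements can be modeled outside of Elasticsearch. The sketch below uses assumed names and a plain map in place of the mapping service; it is not the real API, only the decision logic: exact names pass through, `foo*` patterns expand only to allowed mapper types, and the all-fields pattern `*` additionally filters metadata fields.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Simplified, stdlib-only model of the field-expansion rules (assumed names,
// not the real Elasticsearch API).
public class ResolveFieldsSketch {
    static final Set<String> ALLOWED_TYPES =
        new HashSet<>(Arrays.asList("text", "keyword", "date", "ip", "long", "double", "scaled_float"));

    static Map<String, Float> resolve(String pattern, float weight,
                                      Map<String, String> mapping,      // field name -> mapper type
                                      Set<String> metadataFields) {
        boolean allFields = pattern.equals("*");
        boolean multiField = pattern.contains("*");
        Map<String, Float> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> entry : mapping.entrySet()) {
            String name = entry.getKey();
            boolean matches = allFields
                || (multiField ? name.startsWith(pattern.substring(0, pattern.indexOf('*'))) : name.equals(pattern));
            if (!matches) {
                continue;
            }
            if (multiField && !ALLOWED_TYPES.contains(entry.getValue())) {
                continue;  // pattern expansion skips unsupported field types
            }
            if (allFields && metadataFields.contains(name)) {
                continue;  // "*" also skips metadata fields
            }
            out.put(name, weight);
        }
        return out;
    }
}
```

Note that an exact field name bypasses the type filter entirely, matching the PR description: only pattern expansion restricts to `keyword`, `text`, `date`, `ip` and number types.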
     @Override
     public void setDefaultOperator(Operator op) {
         super.setDefaultOperator(op);
@@ -234,18 +331,15 @@ public class QueryStringQueryParser extends XQueryParser {
     private Map<String, Float> extractMultiFields(String field, boolean quoted) {
         if (field != null) {
-            Collection<String> fields = queryBuilder.context.simpleMatchToIndexNames(field);
-            Map<String, Float> weights = new HashMap<>();
-            for (String fieldName : fields) {
-                Float weight = fieldsAndWeights.get(fieldName);
-                if (quoted && quoteFieldSuffix != null
-                        && queryBuilder.context.fieldMapper(fieldName + quoteFieldSuffix) != null) {
-                    fieldName = fieldName + quoteFieldSuffix;
-                    weight = fieldsAndWeights.get(fieldName);
-                }
-                weights.put(fieldName, weight == null ? 1.0f : weight);
-            }
-            return weights;
+            boolean allFields = Regex.isMatchAllPattern(field);
+            if (allFields && this.field != null && this.field.equals(field)) {
+                // "*" is the default field
+                return fieldsAndWeights;
+            }
+            boolean multiFields = Regex.isSimpleMatchPattern(field);
+            // Filters unsupported fields if a pattern is requested
+            // Filters metadata fields if all fields are requested
+            return resolveMappingField(context, field, 1.0f, !allFields, !multiFields, quoted, quoteFieldSuffix);
         } else {
             return fieldsAndWeights;
         }
@@ -269,14 +363,14 @@ public class QueryStringQueryParser extends XQueryParser {
     @Override
     public Query getFieldQuery(String field, String queryText, boolean quoted) throws ParseException {
-        FieldQueryExtension fieldQueryExtension = FIELD_QUERY_EXTENSIONS.get(field);
-        if (fieldQueryExtension != null) {
-            return fieldQueryExtension.query(queryBuilder.context, queryText);
-        }
         if (quoted) {
             return getFieldQuery(field, queryText, getPhraseSlop());
         }
+        if (field != null && EXISTS_FIELD.equals(field)) {
+            return existsQuery(queryText);
+        }
         // Detects additional operators '<', '<=', '>', '>=' to handle range query with one side unbounded.
         // It is required to use a prefix field operator to enable the detection since they are not treated
         // as logical operator by the query parser (e.g. age:>=10).
@@ -305,7 +399,7 @@ public class QueryStringQueryParser extends XQueryParser {
             // the requested fields do not match any field in the mapping
             // happens for wildcard fields only since we cannot expand to a valid field name
             // if there is no match in the mappings.
-            return new MatchNoDocsQuery("empty fields");
+            return newUnmappedFieldQuery(field);
         }
         Analyzer oldAnalyzer = queryBuilder.analyzer;
         try {
@@ -324,7 +418,7 @@ public class QueryStringQueryParser extends XQueryParser {
     protected Query getFieldQuery(String field, String queryText, int slop) throws ParseException {
         Map<String, Float> fields = extractMultiFields(field, true);
         if (fields.isEmpty()) {
-            return new MatchNoDocsQuery("empty fields");
+            return newUnmappedFieldQuery(field);
         }
         final Query query;
         Analyzer oldAnalyzer = queryBuilder.analyzer;
@@ -357,20 +451,18 @@ public class QueryStringQueryParser extends XQueryParser {
         }

         Map<String, Float> fields = extractMultiFields(field, false);
-        if (fields == null) {
-            return getRangeQuerySingle(field, part1, part2, startInclusive, endInclusive, queryBuilder.context);
+        if (fields.isEmpty()) {
+            return newUnmappedFieldQuery(field);
         }
         List<Query> queries = new ArrayList<>();
         for (Map.Entry<String, Float> entry : fields.entrySet()) {
             Query q = getRangeQuerySingle(entry.getKey(), part1, part2, startInclusive, endInclusive, context);
-            if (q != null) {
-                queries.add(applyBoost(q, entry.getValue()));
-            }
+            assert q != null;
+            queries.add(applyBoost(q, entry.getValue()));
         }
-        if (queries.size() == 0) {
-            return null;
-        } else if (queries.size() == 1) {
+        if (queries.size() == 1) {
             return queries.get(0);
         }
         float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
@@ -380,28 +472,28 @@ public class QueryStringQueryParser extends XQueryParser {
     private Query getRangeQuerySingle(String field, String part1, String part2,
                                       boolean startInclusive, boolean endInclusive, QueryShardContext context) {
         currentFieldType = context.fieldMapper(field);
-        if (currentFieldType != null) {
+        if (currentFieldType == null) {
+            return newUnmappedFieldQuery(field);
+        }
         try {
             Analyzer normalizer = forceAnalyzer == null ? queryBuilder.context.getSearchAnalyzer(currentFieldType) : forceAnalyzer;
             BytesRef part1Binary = part1 == null ? null : normalizer.normalize(field, part1);
             BytesRef part2Binary = part2 == null ? null : normalizer.normalize(field, part2);
             Query rangeQuery;
             if (currentFieldType instanceof DateFieldMapper.DateFieldType && timeZone != null) {
                 DateFieldMapper.DateFieldType dateFieldType = (DateFieldMapper.DateFieldType) this.currentFieldType;
                 rangeQuery = dateFieldType.rangeQuery(part1Binary, part2Binary,
                     startInclusive, endInclusive, timeZone, null, context);
             } else {
                 rangeQuery = currentFieldType.rangeQuery(part1Binary, part2Binary, startInclusive, endInclusive, context);
             }
             return rangeQuery;
         } catch (RuntimeException e) {
             if (lenient) {
-                return null;
+                return newLenientFieldQuery(field, e);
             }
             throw e;
         }
-        return newRangeQuery(field, part1, part2, startInclusive, endInclusive);
     }
     @Override
@@ -415,39 +507,40 @@ public class QueryStringQueryParser extends XQueryParser {
     @Override
     protected Query getFuzzyQuery(String field, String termStr, float minSimilarity) throws ParseException {
         Map<String, Float> fields = extractMultiFields(field, false);
-        float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
+        if (fields.isEmpty()) {
+            return newUnmappedFieldQuery(field);
+        }
         List<Query> queries = new ArrayList<>();
         for (Map.Entry<String, Float> entry : fields.entrySet()) {
             Query q = getFuzzyQuerySingle(entry.getKey(), termStr, minSimilarity);
-            if (q != null) {
-                queries.add(applyBoost(q, entry.getValue()));
-            }
+            assert q != null;
+            queries.add(applyBoost(q, entry.getValue()));
         }
-        if (queries.size() == 0) {
-            return null;
-        } else if (queries.size() == 1) {
+        if (queries.size() == 1) {
             return queries.get(0);
         } else {
+            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
             return new DisjunctionMaxQuery(queries, tiebreaker);
         }
     }

     private Query getFuzzyQuerySingle(String field, String termStr, float minSimilarity) throws ParseException {
         currentFieldType = context.fieldMapper(field);
-        if (currentFieldType != null) {
+        if (currentFieldType == null) {
+            return newUnmappedFieldQuery(field);
+        }
         try {
             Analyzer normalizer = forceAnalyzer == null ? queryBuilder.context.getSearchAnalyzer(currentFieldType) : forceAnalyzer;
             BytesRef term = termStr == null ? null : normalizer.normalize(field, termStr);
             return currentFieldType.fuzzyQuery(term, Fuzziness.fromEdits((int) minSimilarity),
                 getFuzzyPrefixLength(), fuzzyMaxExpansions, FuzzyQuery.defaultTranspositions);
         } catch (RuntimeException e) {
             if (lenient) {
-                return null;
+                return newLenientFieldQuery(field, e);
             }
             throw e;
         }
-        return super.getFuzzyQuery(field, termStr, minSimilarity);
     }
     @Override
@@ -462,24 +555,20 @@ public class QueryStringQueryParser extends XQueryParser {
     @Override
     protected Query getPrefixQuery(String field, String termStr) throws ParseException {
         Map<String, Float> fields = extractMultiFields(field, false);
-        if (fields != null) {
-            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
-            List<Query> queries = new ArrayList<>();
-            for (Map.Entry<String, Float> entry : fields.entrySet()) {
-                Query q = getPrefixQuerySingle(entry.getKey(), termStr);
-                if (q != null) {
-                    queries.add(applyBoost(q, entry.getValue()));
-                }
-            }
-            if (queries.size() == 0) {
-                return null;
-            } else if (queries.size() == 1) {
-                return queries.get(0);
-            } else {
-                return new DisjunctionMaxQuery(queries, tiebreaker);
-            }
-        } else {
-            return getPrefixQuerySingle(field, termStr);
+        if (fields.isEmpty()) {
+            return newUnmappedFieldQuery(termStr);
+        }
+        List<Query> queries = new ArrayList<>();
+        for (Map.Entry<String, Float> entry : fields.entrySet()) {
+            Query q = getPrefixQuerySingle(entry.getKey(), termStr);
+            assert q != null;
+            queries.add(applyBoost(q, entry.getValue()));
+        }
+        if (queries.size() == 1) {
+            return queries.get(0);
+        } else {
+            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
+            return new DisjunctionMaxQuery(queries, tiebreaker);
         }
     }
@@ -502,7 +591,7 @@ public class QueryStringQueryParser extends XQueryParser {
             return getPossiblyAnalyzedPrefixQuery(field, termStr);
         } catch (RuntimeException e) {
             if (lenient) {
-                return null;
+                return newLenientFieldQuery(field, e);
             }
             throw e;
         } finally {
@@ -551,7 +640,7 @@ public class QueryStringQueryParser extends XQueryParser {
         }

         if (tlist.size() == 0) {
-            return null;
+            return new MatchNoDocsQuery("analysis was empty for " + field + ":" + termStr);
         }

         if (tlist.size() == 1 && tlist.get(0).size() == 1) {
@@ -591,6 +680,17 @@ public class QueryStringQueryParser extends XQueryParser {
         return getBooleanQuery(clauses);
     }

+    private Query existsQuery(String fieldName) {
+        final FieldNamesFieldMapper.FieldNamesFieldType fieldNamesFieldType =
+            (FieldNamesFieldMapper.FieldNamesFieldType) context.getMapperService().fullName(FieldNamesFieldMapper.NAME);
+        if (fieldNamesFieldType.isEnabled() == false) {
+            // The _field_names field is disabled so we switch to a wildcard query that matches all terms
+            return new WildcardQuery(new Term(fieldName, "*"));
+        }
+        return ExistsQueryBuilder.newFilter(context, fieldName);
+    }
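The exists logic above has two paths: a cheap term lookup when the `_field_names` metadata field is indexed, and a wildcard fallback when it is disabled. A stdlib-only model of that decision (assumed structure, not the real Elasticsearch API):

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustrative sketch: existence is a term lookup against an inverted
// "_field_names"-style set when available, otherwise a "field:*" style scan
// over the field's own terms.
public class ExistsSketch {
    static boolean exists(Map<String, List<String>> docTerms, Set<String> indexedFieldNames,
                          boolean fieldNamesEnabled, String field) {
        if (fieldNamesEnabled) {
            return indexedFieldNames.contains(field);   // term lookup on the field-names index
        }
        List<String> terms = docTerms.get(field);       // wildcard fallback: any term in the field
        return terms != null && !terms.isEmpty();
    }
}
```

The fallback is more expensive in the real engine (a `WildcardQuery` must visit every term of the field), which is why the field-names path is preferred when enabled.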
     @Override
     protected Query getWildcardQuery(String field, String termStr) throws ParseException {
         if (termStr.equals("*") && field != null) {
@@ -598,7 +698,7 @@ public class QueryStringQueryParser extends XQueryParser {
              * We rewrite _all:* to a match all query.
              * TODO: We can remove this special case when _all is completely removed.
              */
-            if ("*".equals(field) || AllFieldMapper.NAME.equals(field)) {
+            if (Regex.isMatchAllPattern(field) || AllFieldMapper.NAME.equals(field)) {
                 return newMatchAllDocsQuery();
             }
             String actualField = field;
@@ -606,35 +706,31 @@ public class QueryStringQueryParser extends XQueryParser {
                 actualField = this.field;
             }
             // effectively, we check if a field exists or not
-            return FIELD_QUERY_EXTENSIONS.get(ExistsFieldQueryExtension.NAME).query(queryBuilder.context, actualField);
+            return existsQuery(actualField);
         }
         Map<String, Float> fields = extractMultiFields(field, false);
-        if (fields != null) {
-            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
-            List<Query> queries = new ArrayList<>();
-            for (Map.Entry<String, Float> entry : fields.entrySet()) {
-                Query q = getWildcardQuerySingle(entry.getKey(), termStr);
-                if (q != null) {
-                    queries.add(applyBoost(q, entry.getValue()));
-                }
-            }
-            if (queries.size() == 0) {
-                return null;
-            } else if (queries.size() == 1) {
-                return queries.get(0);
-            } else {
-                return new DisjunctionMaxQuery(queries, tiebreaker);
-            }
-        } else {
-            return getWildcardQuerySingle(field, termStr);
+        if (fields.isEmpty()) {
+            return newUnmappedFieldQuery(termStr);
+        }
+        List<Query> queries = new ArrayList<>();
+        for (Map.Entry<String, Float> entry : fields.entrySet()) {
+            Query q = getWildcardQuerySingle(entry.getKey(), termStr);
+            assert q != null;
+            queries.add(applyBoost(q, entry.getValue()));
+        }
+        if (queries.size() == 1) {
+            return queries.get(0);
+        } else {
+            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
+            return new DisjunctionMaxQuery(queries, tiebreaker);
         }
     }

     private Query getWildcardQuerySingle(String field, String termStr) throws ParseException {
         if ("*".equals(termStr)) {
             // effectively, we check if a field exists or not
-            return FIELD_QUERY_EXTENSIONS.get(ExistsFieldQueryExtension.NAME).query(queryBuilder.context, field);
+            return existsQuery(field);
         }
         String indexedNameField = field;
         currentFieldType = null;
@@ -648,7 +744,7 @@ public class QueryStringQueryParser extends XQueryParser {
             return super.getWildcardQuery(indexedNameField, termStr);
         } catch (RuntimeException e) {
             if (lenient) {
-                return null;
+                return newLenientFieldQuery(field, e);
             }
             throw e;
         } finally {
@@ -659,24 +755,20 @@ public class QueryStringQueryParser extends XQueryParser {
     @Override
     protected Query getRegexpQuery(String field, String termStr) throws ParseException {
         Map<String, Float> fields = extractMultiFields(field, false);
-        if (fields != null) {
-            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
-            List<Query> queries = new ArrayList<>();
-            for (Map.Entry<String, Float> entry : fields.entrySet()) {
-                Query q = getRegexpQuerySingle(entry.getKey(), termStr);
-                if (q != null) {
-                    queries.add(applyBoost(q, entry.getValue()));
-                }
-            }
-            if (queries.size() == 0) {
-                return null;
-            } else if (queries.size() == 1) {
-                return queries.get(0);
-            } else {
-                return new DisjunctionMaxQuery(queries, tiebreaker);
-            }
-        } else {
-            return getRegexpQuerySingle(field, termStr);
+        if (fields.isEmpty()) {
+            return newUnmappedFieldQuery(termStr);
+        }
+        List<Query> queries = new ArrayList<>();
+        for (Map.Entry<String, Float> entry : fields.entrySet()) {
+            Query q = getRegexpQuerySingle(entry.getKey(), termStr);
+            assert q != null;
+            queries.add(applyBoost(q, entry.getValue()));
+        }
+        if (queries.size() == 1) {
+            return queries.get(0);
+        } else {
+            float tiebreaker = groupTieBreaker == null ? type.tieBreaker() : groupTieBreaker;
+            return new DisjunctionMaxQuery(queries, tiebreaker);
         }
     }
@@ -693,7 +785,7 @@ public class QueryStringQueryParser extends XQueryParser {
             return super.getRegexpQuery(field, termStr);
         } catch (RuntimeException e) {
             if (lenient) {
-                return null;
+                return newLenientFieldQuery(field, e);
             }
             throw e;
         } finally {
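A recurring change in this file is the lenient error path: instead of silently returning `null` (which every caller then had to filter out), a failing per-field query is now rewritten into a query that matches nothing but records the failure. A hedged, stdlib-only sketch of that pattern (hypothetical names, not the Elasticsearch API):

```java
// Sketch of the "lenient" pattern used throughout the parser: a per-field
// parse failure becomes a no-match query carrying the reason, rather than a
// null that callers must special-case.
public class LenientQuerySketch {
    static String buildFieldQuery(String field, String value, boolean lenient) {
        try {
            // Pretend this field only accepts numeric values, like a `long` mapper.
            if (!value.matches("\\d+")) {
                throw new IllegalArgumentException("failed to parse [" + value + "]");
            }
            return field + ":" + value;
        } catch (RuntimeException e) {
            if (lenient) {
                // analogue of newLenientFieldQuery(field, e)
                return "MatchNoDocsQuery(" + e.getMessage() + ")";
            }
            throw e;
        }
    }
}
```

This keeps the multi-field combiners simple: every requested field yields a query object, and lenient failures just contribute nothing to the result set.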


@@ -215,14 +215,14 @@ public class MultiMatchQueryBuilderTests extends AbstractQueryTestCase<MultiMatc
         DisjunctionMaxQuery dQuery = (DisjunctionMaxQuery) query;
         assertThat(dQuery.getTieBreakerMultiplier(), equalTo(1.0f));
         assertThat(dQuery.getDisjuncts().size(), equalTo(2));
-        assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(), equalTo(new Term(STRING_FIELD_NAME, "test")));
-        assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(), equalTo(new Term(STRING_FIELD_NAME_2, "test")));
+        assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(), equalTo(new Term(STRING_FIELD_NAME_2, "test")));
+        assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(), equalTo(new Term(STRING_FIELD_NAME, "test")));
     }

     public void testToQueryFieldMissing() throws Exception {
         assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
         assertThat(multiMatchQuery("test").field(MISSING_WILDCARD_FIELD_NAME).toQuery(createShardContext()), instanceOf(MatchNoDocsQuery.class));
-        assertThat(multiMatchQuery("test").field(MISSING_FIELD_NAME).toQuery(createShardContext()), instanceOf(TermQuery.class));
+        assertThat(multiMatchQuery("test").field(MISSING_FIELD_NAME).toQuery(createShardContext()), instanceOf(MatchNoDocsQuery.class));
     }

     public void testFromJson() throws IOException {


@@ -167,14 +167,11 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
     @Override
     protected void doAssertLuceneQuery(QueryStringQueryBuilder queryBuilder,
                                        Query query, SearchContext context) throws IOException {
-        if ("".equals(queryBuilder.queryString())) {
-            assertThat(query, instanceOf(MatchNoDocsQuery.class));
-        } else {
-            assertThat(query, either(instanceOf(TermQuery.class)).or(instanceOf(AllTermQuery.class))
-                .or(instanceOf(BooleanQuery.class)).or(instanceOf(DisjunctionMaxQuery.class))
-                .or(instanceOf(PhraseQuery.class)).or(instanceOf(BoostQuery.class))
-                .or(instanceOf(MultiPhrasePrefixQuery.class)).or(instanceOf(PrefixQuery.class)).or(instanceOf(SpanQuery.class)));
-        }
+        assertThat(query, either(instanceOf(TermQuery.class)).or(instanceOf(AllTermQuery.class))
+            .or(instanceOf(BooleanQuery.class)).or(instanceOf(DisjunctionMaxQuery.class))
+            .or(instanceOf(PhraseQuery.class)).or(instanceOf(BoostQuery.class))
+            .or(instanceOf(MultiPhrasePrefixQuery.class)).or(instanceOf(PrefixQuery.class)).or(instanceOf(SpanQuery.class))
+            .or(instanceOf(MatchNoDocsQuery.class)));
     }

     public void testIllegalArguments() {
@@ -293,9 +290,9 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
         DisjunctionMaxQuery dQuery = (DisjunctionMaxQuery) query;
         assertThat(dQuery.getDisjuncts().size(), equalTo(2));
         assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(),
-            equalTo(new Term(STRING_FIELD_NAME, "test")));
-        assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(),
-            equalTo(new Term(STRING_FIELD_NAME_2, "test")));
+            equalTo(new Term(STRING_FIELD_NAME_2, "test")));
+        assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(),
+            equalTo(new Term(STRING_FIELD_NAME, "test")));
     }

     public void testToQueryDisMaxQuery() throws Exception {
@@ -310,7 +307,7 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
         assertTermOrBoostQuery(disjuncts.get(1), STRING_FIELD_NAME_2, "test", 1.0f);
     }

-    public void testToQueryWildcarQuery() throws Exception {
+    public void testToQueryWildcardQuery() throws Exception {
         assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
         for (Operator op : Operator.values()) {
             BooleanClause.Occur defaultOp = op.toBooleanClauseOccur();
@@ -676,10 +673,10 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
         assertThat(expectedQuery, equalTo(query));

         queryStringQueryBuilder =
-            new QueryStringQueryBuilder("field:foo bar").field("invalid*");
+            new QueryStringQueryBuilder(STRING_FIELD_NAME + ":foo bar").field("invalid*");
         query = queryStringQueryBuilder.toQuery(createShardContext());
         expectedQuery = new BooleanQuery.Builder()
-            .add(new TermQuery(new Term("field", "foo")), Occur.SHOULD)
+            .add(new TermQuery(new Term(STRING_FIELD_NAME, "foo")), Occur.SHOULD)
             .add(new MatchNoDocsQuery("empty fields"), Occur.SHOULD)
             .build();
         assertThat(expectedQuery, equalTo(query));
@@ -783,8 +780,6 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
     public void testExistsFieldQuery() throws Exception {
         assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
-        assumeTrue("5.x behaves differently, so skip on non-6.x indices",
-            indexVersionCreated.onOrAfter(Version.V_6_0_0_alpha1));
         QueryShardContext context = createShardContext();
         QueryStringQueryBuilder queryBuilder = new QueryStringQueryBuilder("foo:*");
@@ -804,11 +799,7 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
         queryBuilder = new QueryStringQueryBuilder("*");
         query = queryBuilder.toQuery(context);
-        List<Query> fieldQueries = new ArrayList<> ();
-        for (String type : QueryStringQueryBuilder.allQueryableDefaultFields(context).keySet()) {
-            fieldQueries.add(new ConstantScoreQuery(new TermQuery(new Term("_field_names", type))));
-        }
-        expected = new DisjunctionMaxQuery(fieldQueries, 0f);
+        expected = new MatchAllDocsQuery();
         assertThat(query, equalTo(expected));
     }
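The hunk above replaces the old expansion of a lone `*` query — a disjunction of per-field `_field_names` term queries — with a direct match-all rewrite. A minimal sketch of that short-circuit follows; the types below are illustrative stand-ins, not the real Lucene or Elasticsearch classes:

```java
import java.util.List;
import java.util.stream.Collectors;

// Stand-in query model -- illustrative only, NOT the Lucene/Elasticsearch API.
public class StarQueryRewriteSketch {
    interface Query {}
    static class MatchAllDocs implements Query {}
    static class ExistsQuery implements Query {
        final String field;
        ExistsQuery(String field) { this.field = field; }
    }
    static class Disjunction implements Query {
        final List<Query> clauses;
        Disjunction(List<Query> clauses) { this.clauses = clauses; }
    }

    // Before this change, `*` over all default fields expanded into a
    // disjunction of per-field exists queries; the change short-circuits
    // to a match-all query when every field is requested.
    static Query rewriteStar(boolean allFieldsRequested, List<String> fields) {
        if (allFieldsRequested) {
            return new MatchAllDocs();
        }
        return new Disjunction(fields.stream().map(ExistsQuery::new).collect(Collectors.toList()));
    }

    public static void main(String[] args) {
        System.out.println(rewriteStar(true, List.of()).getClass().getSimpleName());      // MatchAllDocs
        System.out.println(rewriteStar(false, List.of("f1")).getClass().getSimpleName()); // Disjunction
    }
}
```

The design point being tested: matching every document via an explicit match-all is both semantically equivalent and far cheaper than unioning exists-queries over every mapped field.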
@@ -863,6 +854,7 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
     }

     public void testExpandedTerms() throws Exception {
+        assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
         // Prefix
         Query query = new QueryStringQueryBuilder("aBc*")
             .field(STRING_FIELD_NAME)
@@ -914,31 +906,59 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
         assertEquals(new TermRangeQuery(STRING_FIELD_NAME, new BytesRef("abc"), new BytesRef("bcd"), true, true), query);
     }

-    public void testAllFieldsWithFields() throws IOException {
-        String json =
-            "{\n" +
-            "  \"query_string\" : {\n" +
-            "    \"query\" : \"this AND that OR thus\",\n" +
-            "    \"fields\" : [\"foo\"],\n" +
-            "    \"all_fields\" : true\n" +
-            "  }\n" +
-            "}";
-        ParsingException e = expectThrows(ParsingException.class, () -> parseQuery(json));
+    public void testDefaultFieldsWithFields() throws IOException {
+        QueryShardContext context = createShardContext();
+        QueryStringQueryBuilder builder = new QueryStringQueryBuilder("aBc*")
+            .field("field")
+            .defaultField("*");
+        QueryValidationException e = expectThrows(QueryValidationException.class, () -> builder.toQuery(context));
         assertThat(e.getMessage(),
-            containsString("cannot use [all_fields] parameter in conjunction with [default_field] or [fields]"));
-
-        String json2 =
-            "{\n" +
-            "  \"query_string\" : {\n" +
-            "    \"query\" : \"this AND that OR thus\",\n" +
-            "    \"default_field\" : \"foo\",\n" +
-            "    \"all_fields\" : true\n" +
-            "  }\n" +
-            "}";
-        e = expectThrows(ParsingException.class, () -> parseQuery(json2));
-        assertThat(e.getMessage(),
-            containsString("cannot use [all_fields] parameter in conjunction with [default_field] or [fields]"));
+            containsString("cannot use [fields] parameter in conjunction with [default_field]"));
+    }
+
+    public void testLenientRewriteToMatchNoDocs() throws IOException {
+        assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
+        // Term
+        Query query = new QueryStringQueryBuilder("hello")
+            .field(INT_FIELD_NAME)
+            .lenient(true)
+            .toQuery(createShardContext());
+        assertEquals(new MatchNoDocsQuery(""), query);
+        // prefix
+        query = new QueryStringQueryBuilder("hello*")
+            .field(INT_FIELD_NAME)
+            .lenient(true)
+            .toQuery(createShardContext());
+        assertEquals(new MatchNoDocsQuery(""), query);
+        // Fuzzy
+        query = new QueryStringQueryBuilder("hello~2")
+            .field(INT_FIELD_NAME)
+            .lenient(true)
+            .toQuery(createShardContext());
+        assertEquals(new MatchNoDocsQuery(""), query);
+    }
+
+    public void testUnmappedFieldRewriteToMatchNoDocs() throws IOException {
+        // Default unmapped field
+        Query query = new QueryStringQueryBuilder("hello")
+            .field("unmapped_field")
+            .lenient(true)
+            .toQuery(createShardContext());
+        assertEquals(new MatchNoDocsQuery(""), query);
+        // Unmapped prefix field
+        query = new QueryStringQueryBuilder("unmapped_field:hello")
+            .lenient(true)
+            .toQuery(createShardContext());
+        assertEquals(new MatchNoDocsQuery(""), query);
+        // Unmapped fields
+        query = new QueryStringQueryBuilder("hello")
+            .lenient(true)
+            .field("unmapped_field")
+            .toQuery(createShardContext());
+        assertEquals(new MatchNoDocsQuery(""), query);
     }
 }
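The tests in this file exercise the new field-expansion rules from the commit message: exact unmapped names rewrite to match-no-docs, `*`-suffixed patterns expand only over `keyword`, `text`, `date`, `ip` and `number` fields, and the all-fields `*` notation additionally filters metadata fields. A rough simulation of those rules, with hypothetical names (this is not the actual Elasticsearch implementation, which works against the real mapping service):

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

// Hypothetical sketch of the field-expansion rules described in this change.
public class FieldExpansionSketch {
    // Field types eligible for expansion under the new rules.
    static final Set<String> ELIGIBLE = Set.of("keyword", "text", "date", "ip", "number");
    // Metadata fields are filtered out when expanding the all-fields `*` notation.
    static final Set<String> METADATA = Set.of("_id", "_type", "_field_names", "_routing");

    static List<String> expand(String pattern, Map<String, String> mapping) {
        if ("*".equals(pattern)) {
            // All fields: eligible types only, metadata excluded.
            return mapping.entrySet().stream()
                .filter(e -> ELIGIBLE.contains(e.getValue()))
                .filter(e -> !METADATA.contains(e.getKey()))
                .map(Map.Entry::getKey).sorted().collect(Collectors.toList());
        } else if (pattern.endsWith("*")) {
            // Partial name: expand on eligible types, silently ignore the rest.
            String prefix = pattern.substring(0, pattern.length() - 1);
            return mapping.entrySet().stream()
                .filter(e -> e.getKey().startsWith(prefix))
                .filter(e -> ELIGIBLE.contains(e.getValue()))
                .map(Map.Entry::getKey).sorted().collect(Collectors.toList());
        }
        // Exact name: an unmapped field yields an empty expansion, which the
        // query builder then rewrites to a match-no-docs query.
        return mapping.containsKey(pattern) ? List.of(pattern) : List.of();
    }

    public static void main(String[] args) {
        Map<String, String> mapping = Map.of(
            "f1", "text", "f2", "keyword", "f_geo", "geo_point", "_field_names", "keyword");
        System.out.println(expand("f*", mapping));       // [f1, f2] -- f_geo is ignored
        System.out.println(expand("*", mapping));        // [f1, f2] -- metadata filtered
        System.out.println(expand("missing", mapping));  // [] -> match-no-docs
    }
}
```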


@@ -27,6 +27,7 @@ import org.apache.lucene.search.BooleanQuery;
 import org.apache.lucene.search.BoostQuery;
 import org.apache.lucene.search.DisjunctionMaxQuery;
 import org.apache.lucene.search.MatchAllDocsQuery;
+import org.apache.lucene.search.MatchNoDocsQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.SynonymQuery;
 import org.apache.lucene.search.TermQuery;
@@ -97,8 +98,8 @@ public class MultiMatchQueryTests extends ESSingleNodeTestCase {
         Query tq2 = new BoostQuery(new TermQuery(new Term("name.last", "banon")), 3);
         Query expected = new DisjunctionMaxQuery(
             Arrays.asList(
-                new TermQuery(new Term("foobar", "banon")),
-                new DisjunctionMaxQuery(Arrays.asList(tq1, tq2), 0f)
+                new MatchNoDocsQuery("unknown field foobar"),
+                new DisjunctionMaxQuery(Arrays.asList(tq2, tq1), 0f)
             ), 0f);
         assertEquals(expected, rewrittenQuery);
     }


@@ -207,31 +207,41 @@ public class QueryStringIT extends ESIntegTestCase {
         assertHitCount(resp, 3L);
     }

-    public void testExplicitAllFieldsRequested() throws Exception {
-        String indexBody = copyToStringFromClasspath("/org/elasticsearch/search/query/all-query-index-with-all.json");
-        prepareCreate("test2").setSource(indexBody, XContentType.JSON).get();
-        ensureGreen("test2");
+    public void testAllFields() throws Exception {
+        String indexBodyWithAll = copyToStringFromClasspath("/org/elasticsearch/search/query/all-query-index-with-all.json");
+        String indexBody = copyToStringFromClasspath("/org/elasticsearch/search/query/all-query-index.json");
+
+        // Defaults to index.query.default_field=_all
+        prepareCreate("test_1").setSource(indexBodyWithAll, XContentType.JSON).get();
+        Settings.Builder settings = Settings.builder().put("index.query.default_field", "*");
+        prepareCreate("test_2").setSource(indexBody, XContentType.JSON).setSettings(settings).get();
+        ensureGreen("test_1","test_2");

         List<IndexRequestBuilder> reqs = new ArrayList<>();
-        reqs.add(client().prepareIndex("test2", "doc", "1").setSource("f1", "foo", "f2", "eggplant"));
+        reqs.add(client().prepareIndex("test_1", "doc", "1").setSource("f1", "foo", "f2", "eggplant"));
+        reqs.add(client().prepareIndex("test_2", "doc", "1").setSource("f1", "foo", "f2", "eggplant"));
         indexRandom(true, false, reqs);

-        SearchResponse resp = client().prepareSearch("test2").setQuery(
+        SearchResponse resp = client().prepareSearch("test_1").setQuery(
             queryStringQuery("foo eggplant").defaultOperator(Operator.AND)).get();
         assertHitCount(resp, 0L);

-        resp = client().prepareSearch("test2").setQuery(
-            queryStringQuery("foo eggplant").defaultOperator(Operator.OR).useAllFields(true)).get();
+        resp = client().prepareSearch("test_2").setQuery(
+            queryStringQuery("foo eggplant").defaultOperator(Operator.AND)).get();
+        assertHitCount(resp, 0L);
+
+        resp = client().prepareSearch("test_1").setQuery(
+            queryStringQuery("foo eggplant").defaultOperator(Operator.OR)).get();
         assertHits(resp.getHits(), "1");
         assertHitCount(resp, 1L);

-        Exception e = expectThrows(Exception.class, () ->
-            client().prepareSearch("test2").setQuery(
-                queryStringQuery("blah").field("f1").useAllFields(true)).get());
-        assertThat(ExceptionsHelper.detailedMessage(e),
-            containsString("cannot use [all_fields] parameter in conjunction with [default_field] or [fields]"));
+        resp = client().prepareSearch("test_2").setQuery(
+            queryStringQuery("foo eggplant").defaultOperator(Operator.OR)).get();
+        assertHits(resp.getHits(), "1");
+        assertHitCount(resp, 1L);
     }

     @LuceneTestCase.AwaitsFix(bugUrl="currently can't perform phrase queries on fields that don't support positions")
     public void testPhraseQueryOnFieldWithNoPositions() throws Exception {
         List<IndexRequestBuilder> reqs = new ArrayList<>();


@@ -111,7 +111,7 @@ public class SimpleValidateQueryIT extends ESIntegTestCase {
             .execute().actionGet();
         assertThat(response.isValid(), equalTo(true));
         assertThat(response.getQueryExplanation().size(), equalTo(1));
-        assertThat(response.getQueryExplanation().get(0).getExplanation(), equalTo("(foo:foo | baz:foo)"));
+        assertThat(response.getQueryExplanation().get(0).getExplanation(), equalTo("(MatchNoDocsQuery(\"failed [bar] query, caused by number_format_exception:[For input string: \"foo\"]\") | foo:foo | baz:foo)"));
         assertThat(response.getQueryExplanation().get(0).getError(), nullValue());
} }
} }
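The validate-query explanation above changes because a field whose type cannot parse the term (here a numeric `bar` field fed the string "foo") now contributes a match-no-docs clause carrying the failure reason, instead of failing the whole query. A rough sketch of that lenient per-field behavior, using hypothetical stand-in types rather than the real Lucene classes:

```java
// Stand-in query model -- illustrative only, NOT the actual Lucene classes.
public class LenientFieldQuerySketch {
    interface Query {}
    static class MatchNoDocs implements Query {
        final String reason;
        MatchNoDocs(String reason) { this.reason = reason; }
    }
    static class TermQuery implements Query {
        final String field, term;
        TermQuery(String field, String term) { this.field = field; this.term = term; }
    }

    // With lenient=true, a field type that rejects the value produces a
    // match-no-docs clause (carrying the failure reason) instead of throwing.
    static Query termQuery(String field, String type, String value, boolean lenient) {
        if ("number".equals(type)) {
            try {
                Long.parseLong(value);
            } catch (NumberFormatException e) {
                if (lenient) {
                    return new MatchNoDocs("failed [" + field + "] query, caused by number_format_exception:[" + e.getMessage() + "]");
                }
                throw e;
            }
        }
        return new TermQuery(field, value);
    }

    public static void main(String[] args) {
        // "foo" cannot be parsed as a number, so the numeric field falls back
        // to a match-no-docs clause rather than an exception.
        System.out.println(termQuery("bar", "number", "foo", true) instanceof MatchNoDocs); // true
    }
}
```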


@@ -68,6 +68,10 @@
 use an explicit quoted query instead.
 If provided, it will be ignored and issue a deprecation warning.

+* The `all_fields` parameter for the `query_string` has been removed.
+Set `default_field` to `*` instead.
+If provided, `default_field` will be automatically set to `*`.
+
 * The `index` parameter in the terms filter, used to look up terms in a dedicated index is
 now mandatory. Previously, the index defaulted to the index the query was executed on. Now this index
 must be explicitly set in the request.


@@ -50,7 +50,10 @@ The `query_string` top level parameters include:
 |`default_field` |The default field for query terms if no prefix field
 is specified. Defaults to the `index.query.default_field` index
-settings, which in turn defaults to `_all`.
+settings, which in turn defaults to `*`.
+`*` extracts all fields in the mapping that are eligible to term queries
+and filters the metadata fields. All extracted fields are then combined
+to build a query when no prefix field is provided.

 |`default_operator` |The default operator used if no explicit operator
 is specified. For example, with a default operator of `OR`, the query
@@ -107,7 +110,8 @@ the query string. This allows to use a field that has a different analysis chain
 for exact matching. Look <<mixing-exact-search-with-stemming,here>> for a
 comprehensive example.

-|`all_fields` | Perform the query on all fields detected in the mapping that can
+|`all_fields` | deprecated[6.0.0, set `default_field` to `*` instead]
+Perform the query on all fields detected in the mapping that can
 be queried. Will be used by default when the `_all` field is disabled and no
 `default_field` is specified (either in the index settings or in the request
 body) and no `fields` are specified.
@@ -124,11 +128,9 @@ parameter.

 When not explicitly specifying the field to search on in the query
 string syntax, the `index.query.default_field` will be used to derive
-which field to search on. It defaults to `_all` field.
-
-If the `_all` field is disabled, the `query_string` query will automatically
-attempt to determine the existing fields in the index's mapping that are
-queryable, and perform the search on those fields. Note that this will not
+which field to search on. If the `index.query.default_field` is not specified,
+the `query_string` will automatically attempt to determine the existing fields in the index's
+mapping that are queryable, and perform the search on those fields. Note that this will not
 include nested documents, use a nested query to search those documents.

 [float]