inner_hits: Added another more compact syntax for inner hits.
Closes #8770
parent
cc71f7730a
commit
d8054ec299
@ -11,32 +11,21 @@ it's very useful to know which inner nested objects (in the case of nested or ch
of parent/child) caused certain information to be returned. The inner hits feature can be used for this. This feature
returns per search hit in the search response additional nested hits that caused a search hit to match in a different scope.

The following snippet explains the basic structure of inner hits:
Inner hits can be used by defining an `inner_hits` definition on a `nested`, `has_child` or `has_parent` query and filter.
The structure looks like this:

[source,js]
--------------------------------------------------
"inner_hits" : {
"<inner_hits_name>" : {
"<path|type>" : {
"<path-to-nested-object-field|child-or-parent-type>" : {
<inner_hits_body>
[,"inner_hits" : { [<sub_inner_hits>]+ } ]?
}
"<query>" : {
"inner_hits" : {
<inner_hits_options>
}
}
[,"<inner_hits_name_2>" : { ... } ]*
}
--------------------------------------------------

Inside the `inner_hits` definition, first the name of the inner hit is defined, then whether the inner hit
is nested (by defining `path`) or parent/child based (by defining `type`). The next object layer contains
the name of the nested object field if the inner hit is nested, or the parent or child type if the inner hit definition
is parent/child based.

Multiple inner hit definitions can be defined in a single request. In the `<inner_hits_body>` any option for features
that `inner_hits` supports can be defined. Optionally another `inner_hits` definition can be defined in the `<inner_hits_body>`.

If `inner_hits` is defined, each search will contain an `inner_hits` json object with the following structure:
If `inner_hits` is defined on a query that supports it then each search hit will contain an `inner_hits` json object with the following structure:

[source,js]
--------------------------------------------------
@ -71,15 +60,13 @@ If `inner_hits` is defined, each search will contain a `inner_hits` json object
Inner hits support the following options:

[horizontal]
`path`:: Defines the nested scope where hits will be collected from.
`type`:: Defines the parent or child type where hits will be collected from.
`query`:: Defines the query that will run in the defined nested, parent or child scope to collect and score hits. By default all documents in the scope will be matched.
`from`:: The offset from which to fetch the first hit for each `inner_hits` in the returned regular search hits.
`size`:: The maximum number of hits to return per `inner_hits`. By default the top three matching hits are returned.
`sort`:: How the inner hits should be sorted per `inner_hits`. By default the hits are sorted by the score.

Either `path` or `type` must be defined. The `path` or `type` defines the scope from where hits are fetched and
used as inner hits.
`name`:: The name to be used for the particular inner hit definition in the response. Useful when multiple inner hits
have been defined in a single search request. The default depends on the query in which the inner hit is defined.
For the `has_child` query and filter this is the child type, for the `has_parent` query and filter this is the parent type
and the nested query and filter this is the nested path.
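
These options map onto the Java API through the `QueryInnerHitBuilder` added elsewhere in this commit. A minimal,
illustrative sketch, assuming `QueryInnerHitBuilder` has a no-arg constructor and inherits the setters shown on
`BaseInnerHitBuilder` later in this commit (`comments.date` is just a made-up sort field):

[source,java]
--------------------------------------------------
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
import org.elasticsearch.search.sort.SortOrder;

// Sketch only: configure the from, size and sort options of an inner hit definition.
QueryInnerHitBuilder innerHit = new QueryInnerHitBuilder()
        .setFrom(0)                                // offset of the first inner hit to return
        .setSize(3)                                // return at most three inner hits (the default)
        .addSort("comments.date", SortOrder.DESC); // sort inner hits by a field instead of by score
--------------------------------------------------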

Inner hits also support the following per document features:

@ -105,29 +92,14 @@ The example below assumes that there is a nested object field defined with the n
"path" : "comments",
"query" : {
"match" : {"comments.message" : "[actual query]"}
}
}
},
"inner_hits" : {
"comment" : {
"path" : { <1>
"comments" : { <2>
"query" : {
"match" : {"comments.message" : "[actual query]"}
}
}
}
},
"inner_hits" : {} <1>
}
}
}
--------------------------------------------------

<1> The inner hit definition is nested and requires the `path` option.
<2> The path option refers to the nested object field `comments`

Above, the query is repeated in both the main query and the `comment` inner hit definition. At the moment there is
no query referencing support, so in order to make sure that only the inner nested objects that contributed to
the matching of the regular hits are returned, the inner query of the `nested` query also needs to be defined on the inner hits definition.
<1> The inner hit definition in the nested query. No other options need to be defined.
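
On the Java API the compact form corresponds to the `innerHit(QueryInnerHitBuilder)` method that this commit adds to
`NestedQueryBuilder`. A minimal, illustrative sketch, assuming `QueryInnerHitBuilder` has a no-arg constructor:

[source,java]
--------------------------------------------------
import org.elasticsearch.index.query.NestedQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;

// Sketch only: the empty builder mirrors the empty "inner_hits" : {} object above,
// reusing the path and query already defined on the nested query.
NestedQueryBuilder query = QueryBuilders
        .nestedQuery("comments", QueryBuilders.matchQuery("comments.message", "[actual query]"))
        .innerHit(new QueryInnerHitBuilder());
--------------------------------------------------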
An example of a response snippet that could be generated from the above search request:

@ -143,7 +115,7 @@ An example of a response snippet that could be generated from the above search r
"_id": "1",
"_source": ...,
"inner_hits": {
"comment": {
"comments": { <1>
"hits": {
"total": ...,
"hits": [
@ -165,6 +137,8 @@ An example of a response snippet that could be generated from the above search r
...
--------------------------------------------------

<1> The name used in the inner hit definition in the search request. A custom key can be used via the `name` option.

The `_nested` metadata is crucial in the above example, because it defines from which inner nested object this inner
hit came. The `field` defines the object array field the nested hit is from and the `offset` is relative to its location
in the `_source`. Due to sorting and scoring the actual location of the hit objects in the `inner_hits` is usually
@ -193,25 +167,14 @@ The examples below assumes that there is a `_parent` field mapping in the `comme
"type" : "comment",
"query" : {
"match" : {"message" : "[actual query]"}
}
}
},
"inner_hits" : {
"comment" : {
"type" : { <1>
"comment" : { <2>
"query" : {
"match" : {"message" : "[actual query]"}
}
}
}
},
"inner_hits" : {} <1>
}
}
}
--------------------------------------------------

<1> This is a parent/child inner hit definition and requires the `type` option.
<2> Refers to the document type `comment`
<1> The inner hit definition like in the nested example.
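
The Java API equivalent uses the `innerHit(QueryInnerHitBuilder)` method that this commit adds to
`HasChildQueryBuilder`. A minimal, illustrative sketch, assuming `QueryInnerHitBuilder` has a no-arg constructor:

[source,java]
--------------------------------------------------
import org.elasticsearch.index.query.HasChildQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;

// Sketch only: the empty builder mirrors the empty "inner_hits" : {} object above,
// reusing the child type and query already defined on the has_child query.
HasChildQueryBuilder query = QueryBuilders
        .hasChildQuery("comment", QueryBuilders.matchQuery("message", "[actual query]"))
        .innerHit(new QueryInnerHitBuilder());
--------------------------------------------------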
An example of a response snippet that could be generated from the above search request:

@ -243,4 +206,81 @@ An example of a response snippet that could be generated from the above search r
}
},
...
--------------------------------------------------
--------------------------------------------------

[[top-level-inner-hits]]
==== top level inner hits

Besides defining inner hits on queries and filters, inner hits can also be defined as a top level construct alongside the
`query` and `aggregations` definition. The main reason for using the top level inner hits definition is to let the
inner hits return documents that don't match the main query. Also, inner hits definitions can be nested via the
top level notation. Other than that, the inner hit definition inside the query should be used, because that is the most
compact way of defining inner hits.

The following snippet explains the basic structure of inner hits defined at the top level of the search request body:
[source,js]
--------------------------------------------------
"inner_hits" : {
    "<inner_hits_name>" : {
        "<path|type>" : {
            "<path-to-nested-object-field|child-or-parent-type>" : {
                <inner_hits_body>
                [,"inner_hits" : { [<sub_inner_hits>]+ } ]?
            }
        }
    }
    [,"<inner_hits_name_2>" : { ... } ]*
}
--------------------------------------------------

Inside the `inner_hits` definition, first the name of the inner hit is defined, then whether the inner hit
is nested (by defining `path`) or parent/child based (by defining `type`). The next object layer contains
the name of the nested object field if the inner hit is nested, or the parent or child type if the inner hit definition
is parent/child based.

Multiple inner hit definitions can be defined in a single request. In the `<inner_hits_body>` any option for features
that `inner_hits` supports can be defined. Optionally another `inner_hits` definition can be defined in the `<inner_hits_body>`.

An example that shows the use of nested inner hits via the top level notation:
[source,js]
--------------------------------------------------
{
    "query" : {
        "nested" : {
            "path" : "comments",
            "query" : {
                "match" : {"comments.message" : "[actual query]"}
            }
        }
    },
    "inner_hits" : {
        "comment" : {
            "path" : { <1>
                "comments" : { <2>
                    "query" : {
                        "match" : {"comments.message" : "[different query]"} <3>
                    }
                }
            }
        }
    }
}
--------------------------------------------------

<1> The inner hit definition is nested and requires the `path` option.
<2> The path option refers to the nested object field `comments`
<3> A query that runs to collect the nested inner documents for each search hit returned. If no query is defined all nested
inner documents belonging to a search hit will be included. This shows that it only makes sense to use the top level
inner hit definition if no query or a different query is specified.

Additional options that are only available when using the top level inner hits notation:

[horizontal]
`path`:: Defines the nested scope where hits will be collected from.
`type`:: Defines the parent or child type where hits will be collected from.
`query`:: Defines the query that will run in the defined nested, parent or child scope to collect and score hits. By default all documents in the scope will be matched.

Either `path` or `type` must be defined. The `path` or `type` defines the scope from where hits are fetched and
used as inner hits.
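
The diff above does not show a dedicated Java builder for the top level notation, so one option from the Java API is to
send the request body as raw source. A rough, illustrative sketch, assuming an existing `Client` instance named `client`
and an index named `index`:

[source,java]
--------------------------------------------------
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;

// Sketch only: top level inner hits are part of the search request body,
// so the body can be passed verbatim as the search source.
String source = "{" +
        "  \"query\" : { \"match_all\" : {} }," +
        "  \"inner_hits\" : {" +
        "    \"comment\" : {" +
        "      \"path\" : { \"comments\" : { \"query\" : { \"match\" : { \"comments.message\" : \"[actual query]\" } } } }" +
        "    }" +
        "  }" +
        "}";
SearchResponse response = client.prepareSearch("index").setSource(source).execute().actionGet();
--------------------------------------------------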
@ -19,8 +19,6 @@
|
|||
|
||||
package org.elasticsearch.action.deletebyquery;
|
||||
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
import org.apache.lucene.search.Filter;
|
||||
import org.elasticsearch.ElasticsearchIllegalStateException;
|
||||
import org.elasticsearch.action.support.ActionFilters;
|
||||
import org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction;
|
||||
|
@ -110,7 +108,7 @@ public class TransportShardDeleteByQueryAction extends TransportShardReplication
|
|||
pageCacheRecycler, bigArrays, threadPool.estimatedTimeInMillisCounter()));
|
||||
try {
|
||||
Engine.DeleteByQuery deleteByQuery = indexShard.prepareDeleteByQuery(request.source(), request.filteringAliases(), Engine.Operation.Origin.PRIMARY, request.types());
|
||||
SearchContext.current().parsedQuery(new ParsedQuery(deleteByQuery.query(), ImmutableMap.<String, Filter>of()));
|
||||
SearchContext.current().parsedQuery(new ParsedQuery(deleteByQuery.query()));
|
||||
indexShard.deleteByQuery(deleteByQuery);
|
||||
} finally {
|
||||
try (SearchContext searchContext = SearchContext.current()) {
|
||||
|
@ -132,7 +130,7 @@ public class TransportShardDeleteByQueryAction extends TransportShardReplication
|
|||
pageCacheRecycler, bigArrays, threadPool.estimatedTimeInMillisCounter()));
|
||||
try {
|
||||
Engine.DeleteByQuery deleteByQuery = indexShard.prepareDeleteByQuery(request.source(), request.filteringAliases(), Engine.Operation.Origin.REPLICA, request.types());
|
||||
SearchContext.current().parsedQuery(new ParsedQuery(deleteByQuery.query(), ImmutableMap.<String, Filter>of()));
|
||||
SearchContext.current().parsedQuery(new ParsedQuery(deleteByQuery.query()));
|
||||
indexShard.deleteByQuery(deleteByQuery);
|
||||
} finally {
|
||||
try (SearchContext searchContext = SearchContext.current()) {
|
||||
|
|
|
@ -701,7 +701,16 @@ public class MapperService extends AbstractIndexComponent {
|
|||
}
|
||||
ObjectMappers mappers = objectMapper(smartName);
|
||||
if (mappers != null) {
|
||||
return new SmartNameObjectMapper(mappers.mapper(), null);
|
||||
return new SmartNameObjectMapper(mappers.mapper(), guessDocMapper(smartName));
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
private DocumentMapper guessDocMapper(String path) {
|
||||
for (DocumentMapper documentMapper : docMappers(false)) {
|
||||
if (documentMapper.objectMappers().containsKey(path)) {
|
||||
return documentMapper;
|
||||
}
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
|
|
@ -19,6 +19,7 @@
|
|||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -34,7 +35,7 @@ public class HasChildFilterBuilder extends BaseFilterBuilder {
|
|||
private Integer shortCircuitCutoff;
|
||||
private Integer minChildren;
|
||||
private Integer maxChildren;
|
||||
|
||||
private QueryInnerHitBuilder innerHit = null;
|
||||
|
||||
public HasChildFilterBuilder(String type, QueryBuilder queryBuilder) {
|
||||
this.childType = type;
|
||||
|
@ -96,6 +97,14 @@ public class HasChildFilterBuilder extends BaseFilterBuilder {
|
|||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets inner hit definition in the scope of this filter and reusing the defined type and query.
|
||||
*/
|
||||
public HasChildFilterBuilder innerHit(QueryInnerHitBuilder innerHit) {
|
||||
this.innerHit = innerHit;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(HasChildFilterParser.NAME);
|
||||
|
@ -119,6 +128,11 @@ public class HasChildFilterBuilder extends BaseFilterBuilder {
|
|||
if (shortCircuitCutoff != null) {
|
||||
builder.field("short_circuit_cutoff", shortCircuitCutoff);
|
||||
}
|
||||
if (innerHit != null) {
|
||||
builder.startObject("inner_hits");
|
||||
builder.value(innerHit);
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -23,17 +23,21 @@ import org.apache.lucene.search.FilteredQuery;
|
|||
import org.apache.lucene.search.Query;
|
||||
import org.apache.lucene.search.join.BitDocIdSetFilter;
|
||||
import org.elasticsearch.common.Strings;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.index.fielddata.plain.ParentChildIndexFieldData;
|
||||
import org.elasticsearch.index.mapper.DocumentMapper;
|
||||
import org.elasticsearch.index.mapper.internal.ParentFieldMapper;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
import org.elasticsearch.index.query.support.XContentStructure;
|
||||
import org.elasticsearch.index.search.child.ChildrenConstantScoreQuery;
|
||||
import org.elasticsearch.index.search.child.ChildrenQuery;
|
||||
import org.elasticsearch.index.search.child.CustomQueryWrappingFilter;
|
||||
import org.elasticsearch.index.search.child.ScoreType;
|
||||
import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -46,8 +50,11 @@ public class HasChildFilterParser implements FilterParser {
|
|||
|
||||
public static final String NAME = "has_child";
|
||||
|
||||
private final InnerHitsQueryParserHelper innerHitsQueryParserHelper;
|
||||
|
||||
@Inject
|
||||
public HasChildFilterParser() {
|
||||
public HasChildFilterParser(InnerHitsQueryParserHelper innerHitsQueryParserHelper) {
|
||||
this.innerHitsQueryParserHelper = innerHitsQueryParserHelper;
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -66,6 +73,7 @@ public class HasChildFilterParser implements FilterParser {
|
|||
int shortCircuitParentDocSet = 8192; // Tests show a cut of point between 8192 and 16384.
|
||||
int minChildren = 0;
|
||||
int maxChildren = 0;
|
||||
Tuple<String, SubSearchContext> innerHits = null;
|
||||
|
||||
String filterName = null;
|
||||
String currentFieldName = null;
|
||||
|
@ -86,6 +94,8 @@ public class HasChildFilterParser implements FilterParser {
|
|||
} else if ("filter".equals(currentFieldName)) {
|
||||
innerFilter = new XContentStructure.InnerFilter(parseContext, childType == null ? null : new String[] {childType});
|
||||
filterFound = true;
|
||||
} else if ("inner_hits".equals(currentFieldName)) {
|
||||
innerHits = innerHitsQueryParserHelper.parse(parseContext);
|
||||
} else {
|
||||
throw new QueryParsingException(parseContext.index(), "[has_child] filter does not support [" + currentFieldName + "]");
|
||||
}
|
||||
|
@ -131,6 +141,11 @@ public class HasChildFilterParser implements FilterParser {
|
|||
if (childDocMapper == null) {
|
||||
throw new QueryParsingException(parseContext.index(), "No mapping for for type [" + childType + "]");
|
||||
}
|
||||
if (innerHits != null) {
|
||||
InnerHitsContext.ParentChildInnerHits parentChildInnerHits = new InnerHitsContext.ParentChildInnerHits(innerHits.v2(), query, null, childDocMapper);
|
||||
String name = innerHits.v1() != null ? innerHits.v1() : childType;
|
||||
parseContext.addInnerHits(name, parentChildInnerHits);
|
||||
}
|
||||
ParentFieldMapper parentFieldMapper = childDocMapper.parentFieldMapper();
|
||||
if (!parentFieldMapper.active()) {
|
||||
throw new QueryParsingException(parseContext.index(), "Type [" + childType + "] does not have parent mapping");
|
||||
|
|
|
@ -19,6 +19,7 @@
|
|||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -43,6 +44,8 @@ public class HasChildQueryBuilder extends BaseQueryBuilder implements BoostableQ
|
|||
|
||||
private String queryName;
|
||||
|
||||
private QueryInnerHitBuilder innerHit = null;
|
||||
|
||||
public HasChildQueryBuilder(String type, QueryBuilder queryBuilder) {
|
||||
this.childType = type;
|
||||
this.queryBuilder = queryBuilder;
|
||||
|
@ -98,6 +101,14 @@ public class HasChildQueryBuilder extends BaseQueryBuilder implements BoostableQ
|
|||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets inner hit definition in the scope of this query and reusing the defined type and query.
|
||||
*/
|
||||
public HasChildQueryBuilder innerHit(QueryInnerHitBuilder innerHit) {
|
||||
this.innerHit = innerHit;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(HasChildQueryParser.NAME);
|
||||
|
@ -122,6 +133,11 @@ public class HasChildQueryBuilder extends BaseQueryBuilder implements BoostableQ
|
|||
if (queryName != null) {
|
||||
builder.field("_name", queryName);
|
||||
}
|
||||
if (innerHit != null) {
|
||||
builder.startObject("inner_hits");
|
||||
builder.value(innerHit);
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -19,22 +19,26 @@
|
|||
|
||||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.apache.lucene.search.FilteredQuery;
|
||||
import org.apache.lucene.search.Filter;
|
||||
import org.apache.lucene.search.FilteredQuery;
|
||||
import org.apache.lucene.search.Query;
|
||||
import org.apache.lucene.search.join.BitDocIdSetFilter;
|
||||
import org.elasticsearch.common.Strings;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.index.fielddata.plain.ParentChildIndexFieldData;
|
||||
import org.elasticsearch.index.mapper.DocumentMapper;
|
||||
import org.elasticsearch.index.mapper.internal.ParentFieldMapper;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
import org.elasticsearch.index.query.support.XContentStructure;
|
||||
import org.elasticsearch.index.search.child.ChildrenConstantScoreQuery;
|
||||
import org.elasticsearch.index.search.child.ChildrenQuery;
|
||||
import org.elasticsearch.index.search.child.CustomQueryWrappingFilter;
|
||||
import org.elasticsearch.index.search.child.ScoreType;
|
||||
import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -47,8 +51,11 @@ public class HasChildQueryParser implements QueryParser {
|
|||
|
||||
public static final String NAME = "has_child";
|
||||
|
||||
private final InnerHitsQueryParserHelper innerHitsQueryParserHelper;
|
||||
|
||||
@Inject
|
||||
public HasChildQueryParser() {
|
||||
public HasChildQueryParser(InnerHitsQueryParserHelper innerHitsQueryParserHelper) {
|
||||
this.innerHitsQueryParserHelper = innerHitsQueryParserHelper;
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -69,6 +76,7 @@ public class HasChildQueryParser implements QueryParser {
|
|||
int maxChildren = 0;
|
||||
int shortCircuitParentDocSet = 8192;
|
||||
String queryName = null;
|
||||
Tuple<String, SubSearchContext> innerHits = null;
|
||||
|
||||
String currentFieldName = null;
|
||||
XContentParser.Token token;
|
||||
|
@ -84,6 +92,8 @@ public class HasChildQueryParser implements QueryParser {
|
|||
if ("query".equals(currentFieldName)) {
|
||||
iq = new XContentStructure.InnerQuery(parseContext, childType == null ? null : new String[] { childType });
|
||||
queryFound = true;
|
||||
} else if ("inner_hits".equals(currentFieldName)) {
|
||||
innerHits = innerHitsQueryParserHelper.parse(parseContext);
|
||||
} else {
|
||||
throw new QueryParsingException(parseContext.index(), "[has_child] query does not support [" + currentFieldName + "]");
|
||||
}
|
||||
|
@ -131,6 +141,12 @@ public class HasChildQueryParser implements QueryParser {
|
|||
throw new QueryParsingException(parseContext.index(), "[has_child] Type [" + childType + "] does not have parent mapping");
|
||||
}
|
||||
|
||||
if (innerHits != null) {
|
||||
InnerHitsContext.ParentChildInnerHits parentChildInnerHits = new InnerHitsContext.ParentChildInnerHits(innerHits.v2(), innerQuery, null, childDocMapper);
|
||||
String name = innerHits.v1() != null ? innerHits.v1() : childType;
|
||||
parseContext.addInnerHits(name, parentChildInnerHits);
|
||||
}
|
||||
|
||||
ParentFieldMapper parentFieldMapper = childDocMapper.parentFieldMapper();
|
||||
if (!parentFieldMapper.active()) {
|
||||
throw new QueryParsingException(parseContext.index(), "[has_child] _parent field not configured");
|
||||
|
|
|
@ -19,6 +19,7 @@
|
|||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -31,6 +32,7 @@ public class HasParentFilterBuilder extends BaseFilterBuilder {
|
|||
private final FilterBuilder filterBuilder;
|
||||
private final String parentType;
|
||||
private String filterName;
|
||||
private QueryInnerHitBuilder innerHit = null;
|
||||
|
||||
/**
|
||||
* @param parentType The parent type
|
||||
|
@ -74,6 +76,14 @@ public class HasParentFilterBuilder extends BaseFilterBuilder {
|
|||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets inner hit definition in the scope of this filter and reusing the defined type and query.
|
||||
*/
|
||||
public HasParentFilterBuilder innerHit(QueryInnerHitBuilder innerHit) {
|
||||
this.innerHit = innerHit;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(HasParentFilterParser.NAME);
|
||||
|
@ -88,6 +98,11 @@ public class HasParentFilterBuilder extends BaseFilterBuilder {
|
|||
if (filterName != null) {
|
||||
builder.field("_name", filterName);
|
||||
}
|
||||
if (innerHit != null) {
|
||||
builder.startObject("inner_hits");
|
||||
builder.value(innerHit);
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -21,10 +21,13 @@ package org.elasticsearch.index.query;
|
|||
import org.apache.lucene.search.Filter;
|
||||
import org.apache.lucene.search.Query;
|
||||
import org.elasticsearch.common.Strings;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
import org.elasticsearch.index.query.support.XContentStructure;
|
||||
import org.elasticsearch.index.search.child.CustomQueryWrappingFilter;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -38,8 +41,11 @@ public class HasParentFilterParser implements FilterParser {
|
|||
|
||||
public static final String NAME = "has_parent";
|
||||
|
||||
private final InnerHitsQueryParserHelper innerHitsQueryParserHelper;
|
||||
|
||||
@Inject
|
||||
public HasParentFilterParser() {
|
||||
public HasParentFilterParser(InnerHitsQueryParserHelper innerHitsQueryParserHelper) {
|
||||
this.innerHitsQueryParserHelper = innerHitsQueryParserHelper;
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -55,6 +61,7 @@ public class HasParentFilterParser implements FilterParser {
|
|||
boolean queryFound = false;
|
||||
boolean filterFound = false;
|
||||
String parentType = null;
|
||||
Tuple<String, SubSearchContext> innerHits = null;
|
||||
|
||||
String filterName = null;
|
||||
String currentFieldName = null;
|
||||
|
@ -75,6 +82,8 @@ public class HasParentFilterParser implements FilterParser {
|
|||
} else if ("filter".equals(currentFieldName)) {
|
||||
innerFilter = new XContentStructure.InnerFilter(parseContext, parentType == null ? null : new String[] {parentType});
|
||||
filterFound = true;
|
||||
} else if ("inner_hits".equals(currentFieldName)) {
|
||||
innerHits = innerHitsQueryParserHelper.parse(parseContext);
|
||||
} else {
|
||||
throw new QueryParsingException(parseContext.index(), "[has_parent] filter does not support [" + currentFieldName + "]");
|
||||
}
|
||||
|
@ -110,7 +119,7 @@ public class HasParentFilterParser implements FilterParser {
|
|||
return null;
|
||||
}
|
||||
|
||||
Query parentQuery = createParentQuery(innerQuery, parentType, false, parseContext);
|
||||
Query parentQuery = createParentQuery(innerQuery, parentType, false, parseContext, innerHits);
|
||||
if (parentQuery == null) {
|
||||
return null;
|
||||
}
|
||||
|
|
|
@ -19,6 +19,7 @@
|
|||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -32,6 +33,7 @@ public class HasParentQueryBuilder extends BaseQueryBuilder implements Boostable
|
|||
private String scoreType;
|
||||
private float boost = 1.0f;
|
||||
private String queryName;
|
||||
private QueryInnerHitBuilder innerHit = null;
|
||||
|
||||
/**
|
||||
* @param parentType The parent type
|
||||
|
@ -63,6 +65,14 @@ public class HasParentQueryBuilder extends BaseQueryBuilder implements Boostable
|
|||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets inner hit definition in the scope of this query and reusing the defined type and query.
|
||||
*/
|
||||
public HasParentQueryBuilder innerHit(QueryInnerHitBuilder innerHit) {
|
||||
this.innerHit = innerHit;
|
||||
return this;
|
||||
}
|
||||
|
||||
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(HasParentQueryParser.NAME);
|
||||
builder.field("query");
|
||||
|
@ -77,7 +87,11 @@ public class HasParentQueryBuilder extends BaseQueryBuilder implements Boostable
|
|||
if (queryName != null) {
|
||||
builder.field("_name", queryName);
|
||||
}
|
||||
|
||||
if (innerHit != null) {
|
||||
builder.startObject("inner_hits");
|
||||
builder.value(innerHit);
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -23,6 +23,7 @@ import org.apache.lucene.search.Filter;
|
|||
import org.apache.lucene.search.FilteredQuery;
|
||||
import org.apache.lucene.search.Query;
|
||||
import org.elasticsearch.common.Strings;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.lucene.search.NotFilter;
|
||||
import org.elasticsearch.common.lucene.search.XBooleanFilter;
|
||||
|
@ -30,10 +31,13 @@ import org.elasticsearch.common.xcontent.XContentParser;
|
|||
import org.elasticsearch.index.fielddata.plain.ParentChildIndexFieldData;
|
||||
import org.elasticsearch.index.mapper.DocumentMapper;
|
||||
import org.elasticsearch.index.mapper.internal.ParentFieldMapper;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
import org.elasticsearch.index.query.support.XContentStructure;
|
||||
import org.elasticsearch.index.search.child.CustomQueryWrappingFilter;
|
||||
import org.elasticsearch.index.search.child.ParentConstantScoreQuery;
|
||||
import org.elasticsearch.index.search.child.ParentQuery;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.HashSet;
|
||||
|
@ -45,8 +49,11 @@ public class HasParentQueryParser implements QueryParser {
|
|||
|
||||
public static final String NAME = "has_parent";
|
||||
|
||||
private final InnerHitsQueryParserHelper innerHitsQueryParserHelper;
|
||||
|
||||
@Inject
|
||||
public HasParentQueryParser() {
|
||||
public HasParentQueryParser(InnerHitsQueryParserHelper innerHitsQueryParserHelper) {
|
||||
this.innerHitsQueryParserHelper = innerHitsQueryParserHelper;
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -64,6 +71,7 @@ public class HasParentQueryParser implements QueryParser {
|
|||
String parentType = null;
|
||||
boolean score = false;
|
||||
String queryName = null;
|
||||
Tuple<String, SubSearchContext> innerHits = null;
|
||||
|
||||
String currentFieldName = null;
|
||||
XContentParser.Token token;
|
||||
|
@ -79,6 +87,8 @@ public class HasParentQueryParser implements QueryParser {
|
|||
if ("query".equals(currentFieldName)) {
|
||||
iq = new XContentStructure.InnerQuery(parseContext, parentType == null ? null : new String[] {parentType});
|
||||
queryFound = true;
|
||||
} else if ("inner_hits".equals(currentFieldName)) {
|
||||
innerHits = innerHitsQueryParserHelper.parse(parseContext);
|
||||
} else {
|
||||
throw new QueryParsingException(parseContext.index(), "[has_parent] query does not support [" + currentFieldName + "]");
|
||||
}
|
||||
|
@ -122,7 +132,7 @@ public class HasParentQueryParser implements QueryParser {
|
|||
}
|
||||
|
||||
innerQuery.setBoost(boost);
|
||||
Query query = createParentQuery(innerQuery, parentType, score, parseContext);
|
||||
Query query = createParentQuery(innerQuery, parentType, score, parseContext, innerHits);
|
||||
if (query == null) {
|
||||
return null;
|
||||
}
|
||||
|
@ -134,12 +144,18 @@ public class HasParentQueryParser implements QueryParser {
|
|||
return query;
|
||||
}
|
||||
|
||||
static Query createParentQuery(Query innerQuery, String parentType, boolean score, QueryParseContext parseContext) {
|
||||
static Query createParentQuery(Query innerQuery, String parentType, boolean score, QueryParseContext parseContext, Tuple<String, SubSearchContext> innerHits) {
|
||||
DocumentMapper parentDocMapper = parseContext.mapperService().documentMapper(parentType);
|
||||
if (parentDocMapper == null) {
|
||||
throw new QueryParsingException(parseContext.index(), "[has_parent] query configured 'parent_type' [" + parentType + "] is not a valid type");
|
||||
}
|
||||
|
||||
if (innerHits != null) {
|
||||
InnerHitsContext.ParentChildInnerHits parentChildInnerHits = new InnerHitsContext.ParentChildInnerHits(innerHits.v2(), innerQuery, null, parentDocMapper);
|
||||
String name = innerHits.v1() != null ? innerHits.v1() : parentType;
|
||||
parseContext.addInnerHits(name, parentChildInnerHits);
|
||||
}
|
||||
|
||||
Set<String> parentTypes = new HashSet<>(5);
|
||||
parentTypes.add(parentDocMapper.type());
|
||||
ParentChildIndexFieldData parentChildIndexFieldData = null;
|
||||
|
|
|
@ -26,6 +26,7 @@ import org.elasticsearch.common.inject.Scopes;
|
|||
import org.elasticsearch.common.inject.assistedinject.FactoryProvider;
|
||||
import org.elasticsearch.common.inject.multibindings.MapBinder;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
|
||||
import java.util.LinkedList;
|
||||
import java.util.Map;
|
||||
|
@ -146,6 +147,7 @@ public class IndexQueryParserModule extends AbstractModule {
|
|||
protected void configure() {
|
||||
|
||||
bind(IndexQueryParserService.class).asEagerSingleton();
|
||||
bind(InnerHitsQueryParserHelper.class).asEagerSingleton();
|
||||
|
||||
// handle XContenQueryParsers
|
||||
MapBinder<String, QueryParserFactory> queryBinder
|
||||
|
|
|
@ -20,6 +20,7 @@
|
|||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -35,6 +36,8 @@ public class NestedFilterBuilder extends BaseFilterBuilder {
|
|||
private String cacheKey;
|
||||
private String filterName;
|
||||
|
||||
private QueryInnerHitBuilder innerHit = null;
|
||||
|
||||
public NestedFilterBuilder(String path, QueryBuilder queryBuilder) {
|
||||
this.path = path;
|
||||
this.queryBuilder = queryBuilder;
|
||||
|
@ -73,6 +76,14 @@ public class NestedFilterBuilder extends BaseFilterBuilder {
|
|||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets inner hit definition in the scope of this nested filter and reusing the defined path and query.
|
||||
*/
|
||||
public NestedFilterBuilder innerHit(QueryInnerHitBuilder innerHit) {
|
||||
this.innerHit = innerHit;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(NestedFilterParser.NAME);
|
||||
|
@ -96,6 +107,11 @@ public class NestedFilterBuilder extends BaseFilterBuilder {
|
|||
if (cacheKey != null) {
|
||||
builder.field("_cache_key", cacheKey);
|
||||
}
|
||||
if (innerHit != null) {
|
||||
builder.startObject("inner_hits");
|
||||
builder.value(innerHit);
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
}
|
||||
}
|
|
@ -27,13 +27,18 @@ import org.apache.lucene.search.join.BitDocIdSetFilter;
|
|||
import org.apache.lucene.search.join.ScoreMode;
|
||||
import org.apache.lucene.search.join.ToParentBlockJoinQuery;
|
||||
import org.elasticsearch.common.Strings;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.lucene.HashedBytesRef;
|
||||
import org.elasticsearch.common.lucene.search.Queries;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.index.mapper.DocumentMapper;
|
||||
import org.elasticsearch.index.mapper.MapperService;
|
||||
import org.elasticsearch.index.mapper.object.ObjectMapper;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -41,8 +46,11 @@ public class NestedFilterParser implements FilterParser {
|
|||
|
||||
public static final String NAME = "nested";
|
||||
|
||||
private final InnerHitsQueryParserHelper innerHitsQueryParserHelper;
|
||||
|
||||
@Inject
|
||||
public NestedFilterParser() {
|
||||
public NestedFilterParser(InnerHitsQueryParserHelper innerHitsQueryParserHelper) {
|
||||
this.innerHitsQueryParserHelper = innerHitsQueryParserHelper;
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -63,6 +71,7 @@ public class NestedFilterParser implements FilterParser {
|
|||
boolean cache = false;
|
||||
HashedBytesRef cacheKey = null;
|
||||
String filterName = null;
|
||||
Tuple<String, SubSearchContext> innerHits = null;
|
||||
|
||||
// we need a late binding filter so we can inject a parent nested filter inner nested queries
|
||||
NestedQueryParser.LateBindingParentFilter currentParentFilterContext = NestedQueryParser.parentFilterContext.get();
|
||||
|
@ -83,6 +92,8 @@ public class NestedFilterParser implements FilterParser {
|
|||
} else if ("filter".equals(currentFieldName)) {
|
||||
filterFound = true;
|
||||
filter = parseContext.parseInnerFilter();
|
||||
} else if ("inner_hits".equals(currentFieldName)) {
|
||||
innerHits = innerHitsQueryParserHelper.parse(parseContext);
|
||||
} else {
|
||||
throw new QueryParsingException(parseContext.index(), "[nested] filter does not support [" + currentFieldName + "]");
|
||||
}
|
||||
|
@ -135,6 +146,13 @@ public class NestedFilterParser implements FilterParser {
|
|||
usAsParentFilter.filter = childFilter;
|
||||
// wrap the child query to only work on the nested path type
|
||||
query = new FilteredQuery(query, childFilter);
|
||||
if (innerHits != null) {
|
||||
DocumentMapper childDocumentMapper = mapper.docMapper();
|
||||
ObjectMapper parentObjectMapper = childDocumentMapper.findParentObjectMapper(objectMapper);
|
||||
InnerHitsContext.NestedInnerHits nestedInnerHits = new InnerHitsContext.NestedInnerHits(innerHits.v2(), query, null, parentObjectMapper, objectMapper);
|
||||
String name = innerHits.v1() != null ? innerHits.v1() : path;
|
||||
parseContext.addInnerHits(name, nestedInnerHits);
|
||||
}
|
||||
|
||||
BitDocIdSetFilter parentFilter = currentParentFilterContext;
|
||||
if (parentFilter == null) {
|
||||
|
|
|
@ -20,6 +20,7 @@
|
|||
package org.elasticsearch.index.query;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -36,6 +37,8 @@ public class NestedQueryBuilder extends BaseQueryBuilder implements BoostableQue
|
|||
|
||||
private String queryName;
|
||||
|
||||
private QueryInnerHitBuilder innerHit;
|
||||
|
||||
public NestedQueryBuilder(String path, QueryBuilder queryBuilder) {
|
||||
this.path = path;
|
||||
this.queryBuilder = queryBuilder;
|
||||
|
@ -73,6 +76,14 @@ public class NestedQueryBuilder extends BaseQueryBuilder implements BoostableQue
|
|||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets inner hit definition in the scope of this nested query and reusing the defined path and query.
|
||||
*/
|
||||
public NestedQueryBuilder innerHit(QueryInnerHitBuilder innerHit) {
|
||||
this.innerHit = innerHit;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
builder.startObject(NestedQueryParser.NAME);
|
||||
|
@ -93,6 +104,12 @@ public class NestedQueryBuilder extends BaseQueryBuilder implements BoostableQue
|
|||
if (queryName != null) {
|
||||
builder.field("_name", queryName);
|
||||
}
|
||||
if (innerHit != null) {
|
||||
builder.startObject("inner_hits");
|
||||
builder.value(innerHit);
|
||||
builder.endObject();
|
||||
}
|
||||
builder.endObject();
|
||||
}
|
||||
|
||||
}
|
||||
|
|
|
@ -29,11 +29,16 @@ import org.apache.lucene.search.join.ScoreMode;
|
|||
import org.apache.lucene.search.join.ToParentBlockJoinQuery;
|
||||
import org.apache.lucene.util.BitDocIdSet;
|
||||
import org.elasticsearch.common.Strings;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.index.mapper.DocumentMapper;
|
||||
import org.elasticsearch.index.mapper.MapperService;
|
||||
import org.elasticsearch.index.mapper.object.ObjectMapper;
|
||||
import org.elasticsearch.index.query.support.InnerHitsQueryParserHelper;
|
||||
import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext.NestedInnerHits;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
|
@ -41,8 +46,11 @@ public class NestedQueryParser implements QueryParser {
|
|||
|
||||
public static final String NAME = "nested";
|
||||
|
||||
private final InnerHitsQueryParserHelper innerHitsQueryParserHelper;
|
||||
|
||||
@Inject
|
||||
public NestedQueryParser() {
|
||||
public NestedQueryParser(InnerHitsQueryParserHelper innerHitsQueryParserHelper) {
|
||||
this.innerHitsQueryParserHelper = innerHitsQueryParserHelper;
|
||||
}
|
||||
|
||||
@Override
|
||||
|
@ -62,6 +70,7 @@ public class NestedQueryParser implements QueryParser {
|
|||
String path = null;
|
||||
ScoreMode scoreMode = ScoreMode.Avg;
|
||||
String queryName = null;
|
||||
Tuple<String, SubSearchContext> innerHits = null;
|
||||
|
||||
// we need a late binding filter so we can inject a parent nested filter inner nested queries
|
||||
LateBindingParentFilter currentParentFilterContext = parentFilterContext.get();
|
||||
|
@ -82,6 +91,8 @@ public class NestedQueryParser implements QueryParser {
|
|||
} else if ("filter".equals(currentFieldName)) {
|
||||
filterFound = true;
|
||||
filter = parseContext.parseInnerFilter();
|
||||
} else if ("inner_hits".equals(currentFieldName)) {
|
||||
innerHits = innerHitsQueryParserHelper.parse(parseContext);
|
||||
} else {
|
||||
throw new QueryParsingException(parseContext.index(), "[nested] query does not support [" + currentFieldName + "]");
|
||||
}
|
||||
|
@ -141,6 +152,13 @@ public class NestedQueryParser implements QueryParser {
|
|||
usAsParentFilter.filter = childFilter;
|
||||
// wrap the child query to only work on the nested path type
|
||||
query = new FilteredQuery(query, childFilter);
|
||||
if (innerHits != null) {
|
||||
DocumentMapper childDocumentMapper = mapper.docMapper();
|
||||
ObjectMapper parentObjectMapper = childDocumentMapper.findParentObjectMapper(objectMapper);
|
||||
NestedInnerHits nestedInnerHits = new NestedInnerHits(innerHits.v2(), query, null, parentObjectMapper, objectMapper);
|
||||
String name = innerHits.v1() != null ? innerHits.v1() : path;
|
||||
parseContext.addInnerHits(name, nestedInnerHits);
|
||||
}
|
||||
|
||||
BitDocIdSetFilter parentFilter = currentParentFilterContext;
|
||||
if (parentFilter == null) {
|
||||
|
|
|
@ -32,7 +32,6 @@ import org.elasticsearch.common.lucene.search.Queries;
|
|||
public class ParsedQuery {
|
||||
|
||||
private final Query query;
|
||||
|
||||
private final ImmutableMap<String, Filter> namedFilters;
|
||||
|
||||
public ParsedQuery(Query query, ImmutableMap<String, Filter> namedFilters) {
|
||||
|
@ -45,6 +44,11 @@ public class ParsedQuery {
|
|||
this.namedFilters = parsedQuery.namedFilters;
|
||||
}
|
||||
|
||||
public ParsedQuery(Query query) {
|
||||
this.query = query;
|
||||
this.namedFilters = ImmutableMap.of();
|
||||
}
|
||||
|
||||
/**
|
||||
* The query parsed.
|
||||
*/
|
||||
|
@ -55,7 +59,7 @@ public class ParsedQuery {
|
|||
public ImmutableMap<String, Filter> namedFilters() {
|
||||
return this.namedFilters;
|
||||
}
|
||||
|
||||
|
||||
public static ParsedQuery parsedMatchAllQuery() {
|
||||
return new ParsedQuery(Queries.newMatchAllQuery(), ImmutableMap.<String, Filter>of());
|
||||
}
|
||||
|
|
|
@ -21,7 +21,6 @@ package org.elasticsearch.index.query;
|
|||
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
import com.google.common.collect.Maps;
|
||||
|
||||
import org.apache.lucene.index.LeafReaderContext;
|
||||
import org.apache.lucene.queryparser.classic.MapperQueryParser;
|
||||
import org.apache.lucene.queryparser.classic.QueryParserSettings;
|
||||
|
@ -51,15 +50,12 @@ import org.elasticsearch.index.mapper.MapperService;
|
|||
import org.elasticsearch.index.search.child.CustomQueryWrappingFilter;
|
||||
import org.elasticsearch.index.similarity.SimilarityService;
|
||||
import org.elasticsearch.script.ScriptService;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
|
||||
import org.elasticsearch.search.internal.SearchContext;
|
||||
import org.elasticsearch.search.lookup.SearchLookup;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collection;
|
||||
import java.util.EnumSet;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.*;
|
||||
|
||||
/**
|
||||
*
|
||||
|
@ -251,12 +247,21 @@ public class QueryParseContext {
|
|||
}
|
||||
|
||||
public ImmutableMap<String, Filter> copyNamedFilters() {
|
||||
if (namedFilters.isEmpty()) {
|
||||
return ImmutableMap.of();
|
||||
}
|
||||
return ImmutableMap.copyOf(namedFilters);
|
||||
}
|
||||
|
||||
public void addInnerHits(String name, InnerHitsContext.BaseInnerHits context) {
|
||||
SearchContext sc = SearchContext.current();
|
||||
InnerHitsContext innerHitsContext;
|
||||
if (sc.innerHits() == null) {
|
||||
innerHitsContext = new InnerHitsContext(new HashMap<String, InnerHitsContext.BaseInnerHits>());
|
||||
sc.innerHits(innerHitsContext);
|
||||
} else {
|
||||
innerHitsContext = sc.innerHits();
|
||||
}
|
||||
innerHitsContext.addInnerHitDefinition(name, context);
|
||||
}
|
||||
|
||||
@Nullable
|
||||
public Query parseInnerQuery() throws IOException, QueryParsingException {
|
||||
// move to START object
|
||||
|
|
|
@ -0,0 +1,394 @@
|
|||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.elasticsearch.index.query.support;
|
||||
|
||||
import org.elasticsearch.common.Nullable;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.QueryBuilder;
|
||||
import org.elasticsearch.search.builder.SearchSourceBuilder;
|
||||
import org.elasticsearch.search.highlight.HighlightBuilder;
|
||||
import org.elasticsearch.search.sort.SortBuilder;
|
||||
import org.elasticsearch.search.sort.SortOrder;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Map;
|
||||
|
||||
/**
|
||||
*/
|
||||
@SuppressWarnings("unchecked")
|
||||
public abstract class BaseInnerHitBuilder<T extends BaseInnerHitBuilder> implements ToXContent {
|
||||
|
||||
protected SearchSourceBuilder sourceBuilder;
|
||||
|
||||
/**
|
||||
* The index to start to return hits from. Defaults to <tt>0</tt>.
|
||||
*/
|
||||
public T setFrom(int from) {
|
||||
sourceBuilder().from(from);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* The number of search hits to return. Defaults to <tt>10</tt>.
|
||||
*/
|
||||
public T setSize(int size) {
|
||||
sourceBuilder().size(size);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Applies when sorting, and controls if scores will be tracked as well. Defaults to
|
||||
* <tt>false</tt>.
|
||||
*/
|
||||
public T setTrackScores(boolean trackScores) {
|
||||
sourceBuilder().trackScores(trackScores);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Should each {@link org.elasticsearch.search.SearchHit} be returned with an
|
||||
* explanation of the hit (ranking).
|
||||
*/
|
||||
public T setExplain(boolean explain) {
|
||||
sourceBuilder().explain(explain);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Should each {@link org.elasticsearch.search.SearchHit} be returned with its
|
||||
* version.
|
||||
*/
|
||||
public T setVersion(boolean version) {
|
||||
sourceBuilder().version(version);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets no fields to be loaded, resulting in only id and type to be returned per field.
|
||||
*/
|
||||
public T setNoFields() {
|
||||
sourceBuilder().noFields();
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Indicates whether the response should contain the stored _source for every hit
|
||||
*/
|
||||
public T setFetchSource(boolean fetch) {
|
||||
sourceBuilder().fetchSource(fetch);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Indicate that _source should be returned with every hit, with an "include" and/or "exclude" set which can include simple wildcard
|
||||
* elements.
|
||||
*
|
||||
* @param include An optional include (optionally wildcarded) pattern to filter the returned _source
|
||||
* @param exclude An optional exclude (optionally wildcarded) pattern to filter the returned _source
|
||||
*/
|
||||
public T setFetchSource(@Nullable String include, @Nullable String exclude) {
|
||||
sourceBuilder().fetchSource(include, exclude);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Indicate that _source should be returned with every hit, with an "include" and/or "exclude" set which can include simple wildcard
|
||||
* elements.
|
||||
*
|
||||
* @param includes An optional list of include (optionally wildcarded) pattern to filter the returned _source
|
||||
* @param excludes An optional list of exclude (optionally wildcarded) pattern to filter the returned _source
|
||||
*/
|
||||
public T setFetchSource(@Nullable String[] includes, @Nullable String[] excludes) {
|
||||
sourceBuilder().fetchSource(includes, excludes);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field data based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The field to get from the field data cache
|
||||
*/
|
||||
public T addFieldDataField(String name) {
|
||||
sourceBuilder().fieldDataField(name);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a script based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The name that will represent this value in the return hit
|
||||
* @param script The script to use
|
||||
*/
|
||||
public T addScriptField(String name, String script) {
|
||||
sourceBuilder().scriptField(name, script);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a script based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The name that will represent this value in the return hit
|
||||
* @param script The script to use
|
||||
* @param params Parameters that the script can use.
|
||||
*/
|
||||
public T addScriptField(String name, String script, Map<String, Object> params) {
|
||||
sourceBuilder().scriptField(name, script, params);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a script based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The name that will represent this value in the return hit
|
||||
* @param lang The language of the script
|
||||
* @param script The script to use
|
||||
* @param params Parameters that the script can use (can be <tt>null</tt>).
|
||||
*/
|
||||
public T addScriptField(String name, String lang, String script, Map<String, Object> params) {
|
||||
sourceBuilder().scriptField(name, lang, script, params);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a sort against the given field name and the sort ordering.
|
||||
*
|
||||
* @param field The name of the field
|
||||
* @param order The sort ordering
|
||||
*/
|
||||
public T addSort(String field, SortOrder order) {
|
||||
sourceBuilder().sort(field, order);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a generic sort builder.
|
||||
*
|
||||
* @see org.elasticsearch.search.sort.SortBuilders
|
||||
*/
|
||||
public T addSort(SortBuilder sort) {
|
||||
sourceBuilder().sort(sort);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public HighlightBuilder highlightBuilder() {
|
||||
return sourceBuilder().highlighter();
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with default fragment size of 100 characters, and
|
||||
* default number of fragments of 5.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
*/
|
||||
public T addHighlightedField(String name) {
|
||||
highlightBuilder().field(name);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with a provided fragment size (in characters), and
|
||||
* default number of fragments of 5.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
* @param fragmentSize The size of a fragment in characters
|
||||
*/
|
||||
public T addHighlightedField(String name, int fragmentSize) {
|
||||
highlightBuilder().field(name, fragmentSize);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with a provided fragment size (in characters), and
|
||||
* a provided (maximum) number of fragments.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
* @param fragmentSize The size of a fragment in characters
|
||||
* @param numberOfFragments The (maximum) number of fragments
|
||||
*/
|
||||
public T addHighlightedField(String name, int fragmentSize, int numberOfFragments) {
|
||||
highlightBuilder().field(name, fragmentSize, numberOfFragments);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with a provided fragment size (in characters),
|
||||
* a provided (maximum) number of fragments and an offset for the highlight.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
* @param fragmentSize The size of a fragment in characters
|
||||
* @param numberOfFragments The (maximum) number of fragments
|
||||
*/
|
||||
public T addHighlightedField(String name, int fragmentSize, int numberOfFragments,
|
||||
int fragmentOffset) {
|
||||
highlightBuilder().field(name, fragmentSize, numberOfFragments, fragmentOffset);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a highlighted field.
|
||||
*/
|
||||
public T addHighlightedField(HighlightBuilder.Field field) {
|
||||
highlightBuilder().field(field);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Set a tag scheme that encapsulates a built in pre and post tags. The allows schemes
|
||||
* are <tt>styled</tt> and <tt>default</tt>.
|
||||
*
|
||||
* @param schemaName The tag scheme name
|
||||
*/
|
||||
public T setHighlighterTagsSchema(String schemaName) {
|
||||
highlightBuilder().tagsSchema(schemaName);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterFragmentSize(Integer fragmentSize) {
|
||||
highlightBuilder().fragmentSize(fragmentSize);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterNumOfFragments(Integer numOfFragments) {
|
||||
highlightBuilder().numOfFragments(numOfFragments);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterFilter(Boolean highlightFilter) {
|
||||
highlightBuilder().highlightFilter(highlightFilter);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The encoder to set for highlighting
|
||||
*/
|
||||
public T setHighlighterEncoder(String encoder) {
|
||||
highlightBuilder().encoder(encoder);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Explicitly set the pre tags that will be used for highlighting.
|
||||
*/
|
||||
public T setHighlighterPreTags(String... preTags) {
|
||||
highlightBuilder().preTags(preTags);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Explicitly set the post tags that will be used for highlighting.
|
||||
*/
|
||||
public T setHighlighterPostTags(String... postTags) {
|
||||
highlightBuilder().postTags(postTags);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The order of fragments per field. By default, fragments are ordered by their position in the
* highlighted text. Can be <tt>score</tt>, in which case fragments are ordered
* by their score.
|
||||
*/
|
||||
public T setHighlighterOrder(String order) {
|
||||
highlightBuilder().order(order);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterRequireFieldMatch(boolean requireFieldMatch) {
|
||||
highlightBuilder().requireFieldMatch(requireFieldMatch);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterBoundaryMaxScan(Integer boundaryMaxScan) {
|
||||
highlightBuilder().boundaryMaxScan(boundaryMaxScan);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterBoundaryChars(char[] boundaryChars) {
|
||||
highlightBuilder().boundaryChars(boundaryChars);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The highlighter type to use.
|
||||
*/
|
||||
public T setHighlighterType(String type) {
|
||||
highlightBuilder().highlighterType(type);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterFragmenter(String fragmenter) {
|
||||
highlightBuilder().fragmenter(fragmenter);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets a query to be used for highlighting all fields instead of the search query.
|
||||
*/
|
||||
public T setHighlighterQuery(QueryBuilder highlightQuery) {
|
||||
highlightBuilder().highlightQuery(highlightQuery);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the size of the fragment to return from the beginning of the field if there are no matches to
|
||||
* highlight and the field doesn't also define noMatchSize.
|
||||
* @param noMatchSize integer to set or null to leave out of request. default is null.
|
||||
* @return this builder for chaining
|
||||
*/
|
||||
public T setHighlighterNoMatchSize(Integer noMatchSize) {
|
||||
highlightBuilder().noMatchSize(noMatchSize);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the maximum number of phrases the fvh will consider if the field doesn't also define phraseLimit.
|
||||
*/
|
||||
public T setHighlighterPhraseLimit(Integer phraseLimit) {
|
||||
highlightBuilder().phraseLimit(phraseLimit);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
public T setHighlighterOptions(Map<String, Object> options) {
|
||||
highlightBuilder().options(options);
|
||||
return (T) this;
|
||||
}
|
||||
|
||||
protected SearchSourceBuilder sourceBuilder() {
|
||||
if (sourceBuilder == null) {
|
||||
sourceBuilder = new SearchSourceBuilder();
|
||||
}
|
||||
return sourceBuilder;
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
if (sourceBuilder != null) {
|
||||
sourceBuilder.innerToXContent(builder, params);
|
||||
}
|
||||
return builder;
|
||||
}
|
||||
|
||||
}
@ -0,0 +1,132 @@
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.elasticsearch.index.query.support;
|
||||
|
||||
import org.elasticsearch.ElasticsearchIllegalArgumentException;
|
||||
import org.elasticsearch.common.collect.Tuple;
|
||||
import org.elasticsearch.common.inject.Inject;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.index.query.QueryParseContext;
|
||||
import org.elasticsearch.index.query.QueryParsingException;
|
||||
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsParseElement;
|
||||
import org.elasticsearch.search.fetch.script.ScriptFieldsParseElement;
|
||||
import org.elasticsearch.search.fetch.source.FetchSourceParseElement;
|
||||
import org.elasticsearch.search.highlight.HighlighterParseElement;
|
||||
import org.elasticsearch.search.internal.SearchContext;
|
||||
import org.elasticsearch.search.internal.SubSearchContext;
|
||||
import org.elasticsearch.search.sort.SortParseElement;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
public class InnerHitsQueryParserHelper {
|
||||
|
||||
private final SortParseElement sortParseElement;
|
||||
private final FetchSourceParseElement sourceParseElement;
|
||||
private final HighlighterParseElement highlighterParseElement;
|
||||
private final ScriptFieldsParseElement scriptFieldsParseElement;
|
||||
private final FieldDataFieldsParseElement fieldDataFieldsParseElement;
|
||||
|
||||
@Inject
|
||||
public InnerHitsQueryParserHelper(SortParseElement sortParseElement, FetchSourceParseElement sourceParseElement, HighlighterParseElement highlighterParseElement, ScriptFieldsParseElement scriptFieldsParseElement, FieldDataFieldsParseElement fieldDataFieldsParseElement) {
|
||||
this.sortParseElement = sortParseElement;
|
||||
this.sourceParseElement = sourceParseElement;
|
||||
this.highlighterParseElement = highlighterParseElement;
|
||||
this.scriptFieldsParseElement = scriptFieldsParseElement;
|
||||
this.fieldDataFieldsParseElement = fieldDataFieldsParseElement;
|
||||
}
|
||||
|
||||
public Tuple<String, SubSearchContext> parse(QueryParseContext parserContext) throws IOException, QueryParsingException {
|
||||
String fieldName = null;
|
||||
XContentParser.Token token;
|
||||
String innerHitName = null;
|
||||
SubSearchContext subSearchContext = new SubSearchContext(SearchContext.current());
|
||||
try {
|
||||
XContentParser parser = parserContext.parser();
|
||||
while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
|
||||
if (token == XContentParser.Token.FIELD_NAME) {
|
||||
fieldName = parser.currentName();
|
||||
} else if (token.isValue()) {
|
||||
if ("name".equals(fieldName)) {
|
||||
innerHitName = parser.textOrNull();
|
||||
} else {
|
||||
parseCommonInnerHitOptions(parser, token, fieldName, subSearchContext, sortParseElement, sourceParseElement, highlighterParseElement, scriptFieldsParseElement, fieldDataFieldsParseElement);
|
||||
}
|
||||
} else {
|
||||
parseCommonInnerHitOptions(parser, token, fieldName, subSearchContext, sortParseElement, sourceParseElement, highlighterParseElement, scriptFieldsParseElement, fieldDataFieldsParseElement);
|
||||
}
|
||||
}
|
||||
} catch (Exception e) {
|
||||
throw new QueryParsingException(parserContext.index(), "Failed to parse [_inner_hits]", e);
|
||||
}
|
||||
return new Tuple<>(innerHitName, subSearchContext);
|
||||
}
|
||||
|
||||
public static void parseCommonInnerHitOptions(XContentParser parser, XContentParser.Token token, String fieldName, SubSearchContext subSearchContext,
|
||||
SortParseElement sortParseElement, FetchSourceParseElement sourceParseElement, HighlighterParseElement highlighterParseElement,
|
||||
ScriptFieldsParseElement scriptFieldsParseElement, FieldDataFieldsParseElement fieldDataFieldsParseElement) throws Exception {
|
||||
if ("sort".equals(fieldName)) {
|
||||
sortParseElement.parse(parser, subSearchContext);
|
||||
} else if ("_source".equals(fieldName)) {
|
||||
sourceParseElement.parse(parser, subSearchContext);
|
||||
} else if (token == XContentParser.Token.START_OBJECT) {
|
||||
switch (fieldName) {
|
||||
case "highlight":
|
||||
highlighterParseElement.parse(parser, subSearchContext);
|
||||
break;
|
||||
case "scriptFields":
|
||||
case "script_fields":
|
||||
scriptFieldsParseElement.parse(parser, subSearchContext);
|
||||
break;
|
||||
default:
|
||||
throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " for nested query: [" + fieldName + "].");
|
||||
}
|
||||
} else if (token == XContentParser.Token.START_ARRAY) {
|
||||
switch (fieldName) {
|
||||
case "fielddataFields":
|
||||
case "fielddata_fields":
|
||||
fieldDataFieldsParseElement.parse(parser, subSearchContext);
|
||||
break;
|
||||
default:
|
||||
throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " for nested query: [" + fieldName + "].");
|
||||
}
|
||||
} else if (token.isValue()) {
|
||||
switch (fieldName) {
|
||||
case "from":
|
||||
subSearchContext.from(parser.intValue());
|
||||
break;
|
||||
case "size":
|
||||
subSearchContext.size(parser.intValue());
|
||||
break;
|
||||
case "track_scores":
|
||||
case "trackScores":
|
||||
subSearchContext.trackScores(parser.booleanValue());
|
||||
break;
|
||||
case "version":
|
||||
subSearchContext.version(parser.booleanValue());
|
||||
break;
|
||||
case "explain":
|
||||
subSearchContext.explain(parser.booleanValue());
|
||||
break;
|
||||
default:
|
||||
throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " for nested query: [" + fieldName + "].");
|
||||
}
|
||||
}
|
||||
}
|
||||
}
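
For orientation, the keys handled by `parseCommonInnerHitOptions` above (`sort`, `_source`, `highlight`, `script_fields`, `fielddata_fields`, `from`, `size`, `track_scores`, `version`, `explain`, plus `name`) map onto the fetch options any inner hit builder exposes. A hedged sketch of the equivalent Java builder calls — the `comments.message` and `comments.date` field names are invented for illustration and are not part of this commit; assumes `QueryInnerHitBuilder` and `SortOrder` are imported:

[source,java]
--------------------------------------------------
// Illustrative only: field names are made up, the chained methods mirror
// the parsed keys handled by the helper above.
QueryInnerHitBuilder innerHit = new QueryInnerHitBuilder()
        .setName("comment")                                         // "name"
        .setSize(3)                                                  // "size"
        .addSort("comments.date", SortOrder.DESC)                    // "sort"
        .addHighlightedField("comments.message")                     // "highlight"
        .addFieldDataField("comments.message")                       // "fielddata_fields"
        .addScriptField("script", "doc['comments.message'].value")   // "script_fields"
        .setExplain(true);                                           // "explain"
--------------------------------------------------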
@ -0,0 +1,51 @@
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.elasticsearch.index.query.support;
|
||||
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
/**
|
||||
*/
|
||||
public class QueryInnerHitBuilder extends BaseInnerHitBuilder<QueryInnerHitBuilder> {
|
||||
|
||||
private String name;
|
||||
|
||||
/**
|
||||
* Set the key name to be used in the response.
|
||||
*
|
||||
* Defaults to the path if used in nested query, child type if used in has_child query and parent type if used in has_parent.
|
||||
*/
|
||||
public QueryInnerHitBuilder setName(String name) {
|
||||
this.name = name;
|
||||
return this;
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
super.toXContent(builder, params);
|
||||
if (name != null) {
|
||||
builder.field("name", name);
|
||||
}
|
||||
return builder;
|
||||
}
|
||||
|
||||
}
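
This small builder is what the new compact syntax attaches to `nested`, `has_child` and `has_parent` queries. A minimal usage sketch, mirroring the integration tests further down (index, field and inner-hit names come from those tests; assumes a test-style `client()` and the usual `QueryBuilders` static imports):

[source,java]
--------------------------------------------------
// Compact form: the inner hit definition rides along on the query itself.
SearchResponse response = client().prepareSearch("articles")
        .setQuery(nestedQuery("comments", matchQuery("comments.message", "fox"))
                .innerHit(new QueryInnerHitBuilder().setName("comment")))
        .get();

// Per matching article, the nested comment hits are keyed by the chosen name.
SearchHits comments = response.getHits().getAt(0).getInnerHits().get("comment");
--------------------------------------------------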
@ -19,20 +19,11 @@
package org.elasticsearch.percolator;
|
||||
|
||||
import com.carrotsearch.hppc.ByteObjectOpenHashMap;
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
|
||||
import org.apache.lucene.index.LeafReaderContext;
|
||||
import org.apache.lucene.index.ReaderUtil;
|
||||
import org.apache.lucene.index.memory.ExtendedMemoryIndex;
|
||||
import org.apache.lucene.index.memory.MemoryIndex;
|
||||
import org.apache.lucene.search.Collector;
|
||||
import org.apache.lucene.search.ConstantScoreQuery;
|
||||
import org.apache.lucene.search.Filter;
|
||||
import org.apache.lucene.search.FilteredQuery;
|
||||
import org.apache.lucene.search.MatchAllDocsQuery;
|
||||
import org.apache.lucene.search.Query;
|
||||
import org.apache.lucene.search.ScoreDoc;
|
||||
import org.apache.lucene.search.TopDocs;
|
||||
import org.apache.lucene.search.*;
|
||||
import org.apache.lucene.util.BytesRef;
|
||||
import org.apache.lucene.util.CloseableThreadLocal;
|
||||
import org.elasticsearch.ElasticsearchException;
|
||||
|
@ -63,6 +54,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
|
|||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
import org.elasticsearch.common.xcontent.XContentParser;
|
||||
import org.elasticsearch.common.xcontent.XContentType;
|
||||
import org.elasticsearch.index.IndexService;
|
||||
import org.elasticsearch.index.engine.Engine;
|
||||
import org.elasticsearch.index.fielddata.IndexFieldData;
|
||||
import org.elasticsearch.index.fielddata.SortedBinaryDocValues;
|
||||
|
@ -74,13 +66,9 @@ import org.elasticsearch.index.mapper.internal.IdFieldMapper;
|
|||
import org.elasticsearch.index.percolator.stats.ShardPercolateService;
|
||||
import org.elasticsearch.index.query.ParsedQuery;
|
||||
import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
|
||||
import org.elasticsearch.index.IndexService;
|
||||
import org.elasticsearch.index.shard.IndexShard;
|
||||
import org.elasticsearch.indices.IndicesService;
|
||||
import org.elasticsearch.percolator.QueryCollector.Count;
|
||||
import org.elasticsearch.percolator.QueryCollector.Match;
|
||||
import org.elasticsearch.percolator.QueryCollector.MatchAndScore;
|
||||
import org.elasticsearch.percolator.QueryCollector.MatchAndSort;
|
||||
import org.elasticsearch.percolator.QueryCollector.*;
|
||||
import org.elasticsearch.script.ScriptService;
|
||||
import org.elasticsearch.search.SearchParseElement;
|
||||
import org.elasticsearch.search.SearchShardTarget;
|
||||
|
@ -98,9 +86,7 @@ import java.util.List;
|
|||
import java.util.Map;
|
||||
|
||||
import static org.elasticsearch.index.mapper.SourceToParse.source;
|
||||
import static org.elasticsearch.percolator.QueryCollector.count;
|
||||
import static org.elasticsearch.percolator.QueryCollector.match;
|
||||
import static org.elasticsearch.percolator.QueryCollector.matchAndScore;
|
||||
import static org.elasticsearch.percolator.QueryCollector.*;
|
||||
|
||||
/**
|
||||
*/
|
||||
|
@ -539,7 +525,7 @@ public class PercolatorService extends AbstractComponent {
|
|||
|
||||
for (Map.Entry<BytesRef, Query> entry : context.percolateQueries().entrySet()) {
|
||||
if (context.highlight() != null) {
|
||||
context.parsedQuery(new ParsedQuery(entry.getValue(), ImmutableMap.<String, Filter>of()));
|
||||
context.parsedQuery(new ParsedQuery(entry.getValue()));
|
||||
context.hitContext().cache().clear();
|
||||
}
|
||||
try {
|
||||
|
@ -758,7 +744,7 @@ public class PercolatorService extends AbstractComponent {
|
|||
matches.add(BytesRef.deepCopyOf(bytes));
|
||||
if (hls != null) {
|
||||
Query query = context.percolateQueries().get(bytes);
|
||||
context.parsedQuery(new ParsedQuery(query, ImmutableMap.<String, Filter>of()));
|
||||
context.parsedQuery(new ParsedQuery(query));
|
||||
context.hitContext().cache().clear();
|
||||
highlightPhase.hitExecute(context, context.hitContext());
|
||||
hls.add(i, context.hitContext().hit().getHighlightFields());
|
||||
|
|
|
@ -20,7 +20,6 @@ package org.elasticsearch.percolator;
|
|||
|
||||
import com.carrotsearch.hppc.FloatArrayList;
|
||||
import com.google.common.collect.ImmutableList;
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
import org.apache.lucene.index.LeafReaderContext;
|
||||
import org.apache.lucene.search.*;
|
||||
import org.apache.lucene.util.BytesRef;
|
||||
|
@ -186,7 +185,7 @@ abstract class QueryCollector extends SimpleCollector {
|
|||
// run the query
|
||||
try {
|
||||
if (context.highlight() != null) {
|
||||
context.parsedQuery(new ParsedQuery(query, ImmutableMap.<String, Filter>of()));
|
||||
context.parsedQuery(new ParsedQuery(query));
|
||||
context.hitContext().cache().clear();
|
||||
}
|
||||
|
||||
|
@ -308,7 +307,7 @@ abstract class QueryCollector extends SimpleCollector {
|
|||
// run the query
|
||||
try {
|
||||
if (context.highlight() != null) {
|
||||
context.parsedQuery(new ParsedQuery(query, ImmutableMap.<String, Filter>of()));
|
||||
context.parsedQuery(new ParsedQuery(query));
|
||||
context.hitContext().cache().clear();
|
||||
}
|
||||
if (isNestedDoc) {
|
||||
|
|
|
@ -19,14 +19,10 @@
|
|||
|
||||
package org.elasticsearch.search.fetch.innerhits;
|
||||
|
||||
import org.elasticsearch.common.Nullable;
|
||||
import org.elasticsearch.common.xcontent.ToXContent;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.QueryBuilder;
|
||||
import org.elasticsearch.search.builder.SearchSourceBuilder;
|
||||
import org.elasticsearch.search.highlight.HighlightBuilder;
|
||||
import org.elasticsearch.search.sort.SortBuilder;
|
||||
import org.elasticsearch.search.sort.SortOrder;
|
||||
import org.elasticsearch.index.query.support.BaseInnerHitBuilder;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.HashMap;
|
||||
|
@ -53,378 +49,44 @@ public class InnerHitsBuilder implements ToXContent {
|
|||
innerHits.put(name, innerHit);
|
||||
}
|
||||
|
||||
public static class InnerHit implements ToXContent {
|
||||
public static class InnerHit extends BaseInnerHitBuilder<InnerHit> {
|
||||
|
||||
private String path;
|
||||
private String type;
|
||||
|
||||
private SearchSourceBuilder sourceBuilder;
|
||||
|
||||
/**
|
||||
* Sets the query to run for collecting the inner hits.
|
||||
*/
|
||||
public InnerHit setQuery(QueryBuilder query) {
|
||||
sourceBuilder().query(query);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* For nested inner hits the path to collect child nested docs for.
|
||||
*/
|
||||
public InnerHit setPath(String path) {
|
||||
this.path = path;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* For parent/child inner hits the type to collect inner hits for.
|
||||
*/
|
||||
public InnerHit setType(String type) {
|
||||
this.type = type;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The index to start to return hits from. Defaults to <tt>0</tt>.
|
||||
* Adds a nested inner hit definition that collects inner hits for hits
|
||||
* on this inner hit level.
|
||||
*/
|
||||
public InnerHit setFrom(int from) {
|
||||
sourceBuilder().from(from);
|
||||
return this;
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* The number of search hits to return. Defaults to <tt>10</tt>.
|
||||
*/
|
||||
public InnerHit setSize(int size) {
|
||||
sourceBuilder().size(size);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Applies when sorting, and controls if scores will be tracked as well. Defaults to
|
||||
* <tt>false</tt>.
|
||||
*/
|
||||
public InnerHit setTrackScores(boolean trackScores) {
|
||||
sourceBuilder().trackScores(trackScores);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Should each {@link org.elasticsearch.search.SearchHit} be returned with an
|
||||
* explanation of the hit (ranking).
|
||||
*/
|
||||
public InnerHit setExplain(boolean explain) {
|
||||
sourceBuilder().explain(explain);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Should each {@link org.elasticsearch.search.SearchHit} be returned with its
|
||||
* version.
|
||||
*/
|
||||
public InnerHit setVersion(boolean version) {
|
||||
sourceBuilder().version(version);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets no fields to be loaded, resulting in only id and type to be returned per field.
|
||||
*/
|
||||
public InnerHit setNoFields() {
|
||||
sourceBuilder().noFields();
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Indicates whether the response should contain the stored _source for every hit
|
||||
*/
|
||||
public InnerHit setFetchSource(boolean fetch) {
|
||||
sourceBuilder().fetchSource(fetch);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Indicate that _source should be returned with every hit, with an "include" and/or "exclude" set which can include simple wildcard
|
||||
* elements.
|
||||
*
|
||||
* @param include An optional include (optionally wildcarded) pattern to filter the returned _source
|
||||
* @param exclude An optional exclude (optionally wildcarded) pattern to filter the returned _source
|
||||
*/
|
||||
public InnerHit setFetchSource(@Nullable String include, @Nullable String exclude) {
|
||||
sourceBuilder().fetchSource(include, exclude);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Indicate that _source should be returned with every hit, with an "include" and/or "exclude" set which can include simple wildcard
|
||||
* elements.
|
||||
*
|
||||
* @param includes An optional list of include (optionally wildcarded) pattern to filter the returned _source
|
||||
* @param excludes An optional list of exclude (optionally wildcarded) pattern to filter the returned _source
|
||||
*/
|
||||
public InnerHit setFetchSource(@Nullable String[] includes, @Nullable String[] excludes) {
|
||||
sourceBuilder().fetchSource(includes, excludes);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field data based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The field to get from the field data cache
|
||||
*/
|
||||
public InnerHit addFieldDataField(String name) {
|
||||
sourceBuilder().fieldDataField(name);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a script based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The name that will represent this value in the return hit
|
||||
* @param script The script to use
|
||||
*/
|
||||
public InnerHit addScriptField(String name, String script) {
|
||||
sourceBuilder().scriptField(name, script);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a script based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The name that will represent this value in the return hit
|
||||
* @param script The script to use
|
||||
* @param params Parameters that the script can use.
|
||||
*/
|
||||
public InnerHit addScriptField(String name, String script, Map<String, Object> params) {
|
||||
sourceBuilder().scriptField(name, script, params);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a script based field to load and return. The field does not have to be stored,
|
||||
* but its recommended to use non analyzed or numeric fields.
|
||||
*
|
||||
* @param name The name that will represent this value in the return hit
|
||||
* @param lang The language of the script
|
||||
* @param script The script to use
|
||||
* @param params Parameters that the script can use (can be <tt>null</tt>).
|
||||
*/
|
||||
public InnerHit addScriptField(String name, String lang, String script, Map<String, Object> params) {
|
||||
sourceBuilder().scriptField(name, lang, script, params);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a sort against the given field name and the sort ordering.
|
||||
*
|
||||
* @param field The name of the field
|
||||
* @param order The sort ordering
|
||||
*/
|
||||
public InnerHit addSort(String field, SortOrder order) {
|
||||
sourceBuilder().sort(field, order);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a generic sort builder.
|
||||
*
|
||||
* @see org.elasticsearch.search.sort.SortBuilders
|
||||
*/
|
||||
public InnerHit addSort(SortBuilder sort) {
|
||||
sourceBuilder().sort(sort);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with default fragment size of 100 characters, and
|
||||
* default number of fragments of 5.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
*/
|
||||
public InnerHit addHighlightedField(String name) {
|
||||
highlightBuilder().field(name);
|
||||
return this;
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with a provided fragment size (in characters), and
|
||||
* default number of fragments of 5.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
* @param fragmentSize The size of a fragment in characters
|
||||
*/
|
||||
public InnerHit addHighlightedField(String name, int fragmentSize) {
|
||||
highlightBuilder().field(name, fragmentSize);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with a provided fragment size (in characters), and
|
||||
* a provided (maximum) number of fragments.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
* @param fragmentSize The size of a fragment in characters
|
||||
* @param numberOfFragments The (maximum) number of fragments
|
||||
*/
|
||||
public InnerHit addHighlightedField(String name, int fragmentSize, int numberOfFragments) {
|
||||
highlightBuilder().field(name, fragmentSize, numberOfFragments);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a field to be highlighted with a provided fragment size (in characters),
|
||||
* a provided (maximum) number of fragments and an offset for the highlight.
|
||||
*
|
||||
* @param name The field to highlight
|
||||
* @param fragmentSize The size of a fragment in characters
|
||||
* @param numberOfFragments The (maximum) number of fragments
|
||||
*/
|
||||
public InnerHit addHighlightedField(String name, int fragmentSize, int numberOfFragments,
|
||||
int fragmentOffset) {
|
||||
highlightBuilder().field(name, fragmentSize, numberOfFragments, fragmentOffset);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Adds a highlighted field.
|
||||
*/
|
||||
public InnerHit addHighlightedField(HighlightBuilder.Field field) {
|
||||
highlightBuilder().field(field);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Set a tag scheme that encapsulates a built in pre and post tags. The allows schemes
|
||||
* are <tt>styled</tt> and <tt>default</tt>.
|
||||
*
|
||||
* @param schemaName The tag scheme name
|
||||
*/
|
||||
public InnerHit setHighlighterTagsSchema(String schemaName) {
|
||||
highlightBuilder().tagsSchema(schemaName);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterFragmentSize(Integer fragmentSize) {
|
||||
highlightBuilder().fragmentSize(fragmentSize);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterNumOfFragments(Integer numOfFragments) {
|
||||
highlightBuilder().numOfFragments(numOfFragments);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterFilter(Boolean highlightFilter) {
|
||||
highlightBuilder().highlightFilter(highlightFilter);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The encoder to set for highlighting
|
||||
*/
|
||||
public InnerHit setHighlighterEncoder(String encoder) {
|
||||
highlightBuilder().encoder(encoder);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Explicitly set the pre tags that will be used for highlighting.
|
||||
*/
|
||||
public InnerHit setHighlighterPreTags(String... preTags) {
|
||||
highlightBuilder().preTags(preTags);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Explicitly set the post tags that will be used for highlighting.
|
||||
*/
|
||||
public InnerHit setHighlighterPostTags(String... postTags) {
|
||||
highlightBuilder().postTags(postTags);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The order of fragments per field. By default, ordered by the order in the
|
||||
* highlighted text. Can be <tt>score</tt>, which then it will be ordered
|
||||
* by score of the fragments.
|
||||
*/
|
||||
public InnerHit setHighlighterOrder(String order) {
|
||||
highlightBuilder().order(order);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterRequireFieldMatch(boolean requireFieldMatch) {
|
||||
highlightBuilder().requireFieldMatch(requireFieldMatch);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterBoundaryMaxScan(Integer boundaryMaxScan) {
|
||||
highlightBuilder().boundaryMaxScan(boundaryMaxScan);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterBoundaryChars(char[] boundaryChars) {
|
||||
highlightBuilder().boundaryChars(boundaryChars);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* The highlighter type to use.
|
||||
*/
|
||||
public InnerHit setHighlighterType(String type) {
|
||||
highlightBuilder().highlighterType(type);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterFragmenter(String fragmenter) {
|
||||
highlightBuilder().fragmenter(fragmenter);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets a query to be used for highlighting all fields instead of the search query.
|
||||
*/
|
||||
public InnerHit setHighlighterQuery(QueryBuilder highlightQuery) {
|
||||
highlightBuilder().highlightQuery(highlightQuery);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the size of the fragment to return from the beginning of the field if there are no matches to
|
||||
* highlight and the field doesn't also define noMatchSize.
|
||||
* @param noMatchSize integer to set or null to leave out of request. default is null.
|
||||
* @return this builder for chaining
|
||||
*/
|
||||
public InnerHit setHighlighterNoMatchSize(Integer noMatchSize) {
|
||||
highlightBuilder().noMatchSize(noMatchSize);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sets the maximum number of phrases the fvh will consider if the field doesn't also define phraseLimit.
|
||||
*/
|
||||
public InnerHit setHighlighterPhraseLimit(Integer phraseLimit) {
|
||||
highlightBuilder().phraseLimit(phraseLimit);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit setHighlighterOptions(Map<String, Object> options) {
|
||||
highlightBuilder().options(options);
|
||||
return this;
|
||||
}
|
||||
|
||||
public InnerHit addInnerHit(String name, InnerHit innerHit) {
|
||||
sourceBuilder().innerHitsBuilder().addInnerHit(name, innerHit);
|
||||
return this;
|
||||
}
|
||||
|
||||
private SearchSourceBuilder sourceBuilder() {
|
||||
if (sourceBuilder == null) {
|
||||
sourceBuilder = new SearchSourceBuilder();
|
||||
}
|
||||
return sourceBuilder;
|
||||
}
|
||||
|
||||
public HighlightBuilder highlightBuilder() {
|
||||
return sourceBuilder().highlighter();
|
||||
}
|
||||
|
||||
@Override
|
||||
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
|
||||
if (path != null) {
|
||||
|
@ -432,9 +94,7 @@ public class InnerHitsBuilder implements ToXContent {
|
|||
} else {
|
||||
builder.startObject("type").startObject(type);
|
||||
}
|
||||
if (sourceBuilder != null) {
|
||||
sourceBuilder.innerToXContent(builder, params);
|
||||
}
|
||||
super.toXContent(builder, params);
|
||||
return builder.endObject().endObject();
|
||||
}
|
||||
}
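
The separate, top-level style stays available through this `InnerHit` class; both forms yield the same per-hit `inner_hits` section in the response, and definitions can nest further via `addInnerHit`. A sketch of the separate form, following the tests below (index, field and inner-hit names are taken from those tests):

[source,java]
--------------------------------------------------
// Separate form: the inner hit is registered on the search request, next to the query.
SearchResponse response = client().prepareSearch("articles")
        .setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
        .addInnerHit("comment", new InnerHitsBuilder.InnerHit()
                .setPath("comments")
                .setQuery(matchQuery("comments.message", "fox")))
        .get();
--------------------------------------------------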
@ -49,7 +49,7 @@ import java.util.Map;
*/
|
||||
public final class InnerHitsContext {
|
||||
|
||||
private Map<String, BaseInnerHits> innerHits;
|
||||
private final Map<String, BaseInnerHits> innerHits;
|
||||
|
||||
public InnerHitsContext(Map<String, BaseInnerHits> innerHits) {
|
||||
this.innerHits = innerHits;
|
||||
|
@ -59,6 +59,10 @@ public final class InnerHitsContext {
|
|||
return innerHits;
|
||||
}
|
||||
|
||||
public void addInnerHitDefinition(String name, BaseInnerHits innerHit) {
|
||||
innerHits.put(name, innerHit);
|
||||
}
|
||||
|
||||
public static abstract class BaseInnerHits extends FilteredSearchContext {
|
||||
|
||||
protected final Query query;
|
||||
|
|
|
@ -76,10 +76,8 @@ public class InnerHitsFetchSubPhase implements FetchSubPhase {
|
|||
|
||||
@Override
|
||||
public void hitExecute(SearchContext context, HitContext hitContext) throws ElasticsearchException {
|
||||
InnerHitsContext innerHitsContext = context.innerHits();
|
||||
Map<String, InternalSearchHits> results = new HashMap<>();
|
||||
Map<String, InnerHitsContext.BaseInnerHits> innerHitsByKey = innerHitsContext.getInnerHits();
|
||||
for (Map.Entry<String, InnerHitsContext.BaseInnerHits> entry : innerHitsByKey.entrySet()) {
|
||||
for (Map.Entry<String, InnerHitsContext.BaseInnerHits> entry : context.innerHits().getInnerHits().entrySet()) {
|
||||
InnerHitsContext.BaseInnerHits innerHits = entry.getValue();
|
||||
TopDocs topDocs = innerHits.topDocs(context, hitContext);
|
||||
innerHits.queryResult().topDocs(topDocs);
|
||||
|
|
|
@ -39,6 +39,8 @@ import org.elasticsearch.search.sort.SortParseElement;
|
|||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
|
||||
import static org.elasticsearch.index.query.support.InnerHitsQueryParserHelper.parseCommonInnerHitOptions;
|
||||
|
||||
/**
|
||||
*/
|
||||
public class InnerHitsParseElement implements SearchParseElement {
|
||||
|
@ -70,7 +72,7 @@ public class InnerHitsParseElement implements SearchParseElement {
|
|||
Map<String, InnerHitsContext.BaseInnerHits> innerHitsMap = null;
|
||||
while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
|
||||
if (token != XContentParser.Token.FIELD_NAME) {
|
||||
throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " in [inner_hits]: aggregations definitions must start with the name of the aggregation.");
|
||||
throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " in [inner_hits]: inner_hit definitions must start with the name of the inner_hit.");
|
||||
}
|
||||
final String innerHitName = parser.currentName();
|
||||
token = parser.nextToken();
|
||||
|
@ -138,63 +140,16 @@ public class InnerHitsParseElement implements SearchParseElement {
|
|||
while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
|
||||
if (token == XContentParser.Token.FIELD_NAME) {
|
||||
fieldName = parser.currentName();
|
||||
} else if ("sort".equals(fieldName)) {
|
||||
sortParseElement.parse(parser, subSearchContext);
|
||||
} else if ("_source".equals(fieldName)) {
|
||||
sourceParseElement.parse(parser, subSearchContext);
|
||||
} else if (token == XContentParser.Token.START_OBJECT) {
|
||||
switch (fieldName) {
|
||||
case "highlight":
|
||||
highlighterParseElement.parse(parser, subSearchContext);
|
||||
break;
|
||||
case "scriptFields":
|
||||
case "script_fields":
|
||||
scriptFieldsParseElement.parse(parser, subSearchContext);
|
||||
break;
|
||||
case "inner_hits":
|
||||
childInnerHits = parseInnerHits(parser, subSearchContext);
|
||||
break;
|
||||
case "query":
|
||||
query = context.queryParserService().parse(parser).query();
|
||||
break;
|
||||
default:
|
||||
throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " in [" + innerHitName + "]: [" + fieldName + "].");
|
||||
}
|
||||
} else if (token == XContentParser.Token.START_ARRAY) {
|
||||
switch (fieldName) {
|
||||
case "fielddataFields":
|
||||
case "fielddata_fields":
|
||||
fieldDataFieldsParseElement.parse(parser, subSearchContext);
|
||||
break;
|
||||
default:
|
||||
throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " in [" + innerHitName + "]: [" + fieldName + "].");
|
||||
}
|
||||
} else if (token.isValue()) {
|
||||
switch (fieldName) {
|
||||
case "query" :
|
||||
query = context.queryParserService().parse(parser).query();
|
||||
break;
|
||||
case "from":
|
||||
subSearchContext.from(parser.intValue());
|
||||
break;
|
||||
case "size":
|
||||
subSearchContext.size(parser.intValue());
|
||||
break;
|
||||
case "track_scores":
|
||||
case "trackScores":
|
||||
subSearchContext.trackScores(parser.booleanValue());
|
||||
break;
|
||||
case "version":
|
||||
subSearchContext.version(parser.booleanValue());
|
||||
break;
|
||||
case "explain":
|
||||
subSearchContext.explain(parser.booleanValue());
|
||||
break;
|
||||
default:
|
||||
throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " in [" + innerHitName + "]: [" + fieldName + "].");
|
||||
if ("query".equals(fieldName)) {
|
||||
query = context.queryParserService().parse(parser).query();
|
||||
} else if ("inner_hits".equals(fieldName)) {
|
||||
childInnerHits = parseInnerHits(parser, context);
|
||||
} else {
|
||||
parseCommonInnerHitOptions(parser, token, fieldName, subSearchContext, sortParseElement, sourceParseElement, highlighterParseElement, scriptFieldsParseElement, fieldDataFieldsParseElement);
|
||||
}
|
||||
} else {
|
||||
throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " in [" + innerHitName + "].");
|
||||
parseCommonInnerHitOptions(parser, token, fieldName, subSearchContext, sortParseElement, sourceParseElement, highlighterParseElement, scriptFieldsParseElement, fieldDataFieldsParseElement);
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -224,14 +179,6 @@ public class InnerHitsParseElement implements SearchParseElement {
|
|||
throw new ElasticsearchIllegalArgumentException("path [" + nestedPath +"] isn't nested");
|
||||
}
|
||||
DocumentMapper childDocumentMapper = smartNameObjectMapper.docMapper();
|
||||
if (childDocumentMapper == null) {
|
||||
for (DocumentMapper documentMapper : context.mapperService().docMappers(false)) {
|
||||
if (documentMapper.objectMappers().containsKey(nestedPath)) {
|
||||
childDocumentMapper = documentMapper;
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
if (currentFilter != null && childDocumentMapper != null) {
|
||||
currentFilter.filter = context.bitsetFilterCache().getBitDocIdSetFilter(childObjectMapper.nestedTypeFilter());
|
||||
NestedQueryParser.parentFilterContext.set(parentFilter);
|
||||
@ -20,8 +20,11 @@
package org.elasticsearch.search.innerhits;
|
||||
|
||||
import org.elasticsearch.action.index.IndexRequestBuilder;
|
||||
import org.elasticsearch.action.search.SearchRequest;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.index.query.BoolQueryBuilder;
|
||||
import org.elasticsearch.index.query.support.QueryInnerHitBuilder;
|
||||
import org.elasticsearch.search.SearchHit;
|
||||
import org.elasticsearch.search.SearchHits;
|
||||
import org.elasticsearch.search.fetch.innerhits.InnerHitsBuilder;
|
||||
|
@ -34,11 +37,11 @@ import java.util.List;
|
|||
import java.util.Locale;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.FilterBuilders.hasChildFilter;
|
||||
import static org.elasticsearch.index.query.FilterBuilders.nestedFilter;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.*;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
|
||||
import static org.hamcrest.Matchers.containsString;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.nullValue;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
|
||||
/**
|
||||
*/
|
||||
|
@ -79,63 +82,86 @@ public class InnerHitsTests extends ElasticsearchIntegrationTest {
|
|||
.endObject()));
|
||||
indexRandom(true, requests);
|
||||
|
||||
SearchResponse response = client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments").setQuery(matchQuery("comments.message", "fox")))
|
||||
.get();
|
||||
assertNoFailures(response);
|
||||
assertHitCount(response, 1);
|
||||
assertSearchHit(response, 1, hasId("1"));
|
||||
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
|
||||
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.totalHits(), equalTo(2l));
|
||||
assertThat(innerHits.getHits().length, equalTo(2));
|
||||
assertThat(innerHits.getAt(0).getId(), equalTo("1"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
|
||||
assertThat(innerHits.getAt(1).getId(), equalTo("1"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getOffset(), equalTo(1));
|
||||
// Inner hits can be defined in two ways: 1) directly on the query 2) as a separate inner_hits definition
|
||||
SearchRequest[] searchRequests = new SearchRequest[]{
|
||||
client().prepareSearch("articles").setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")).innerHit(new QueryInnerHitBuilder().setName("comment"))).request(),
|
||||
client().prepareSearch("articles").setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments").setQuery(matchQuery("comments.message", "fox"))).request()
|
||||
};
|
||||
for (SearchRequest searchRequest : searchRequests) {
|
||||
SearchResponse response = client().search(searchRequest).actionGet();
|
||||
assertNoFailures(response);
|
||||
assertHitCount(response, 1);
|
||||
assertSearchHit(response, 1, hasId("1"));
|
||||
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
|
||||
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.totalHits(), equalTo(2l));
|
||||
assertThat(innerHits.getHits().length, equalTo(2));
|
||||
assertThat(innerHits.getAt(0).getId(), equalTo("1"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
|
||||
assertThat(innerHits.getAt(1).getId(), equalTo("1"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getOffset(), equalTo(1));
|
||||
}
|
||||
|
||||
response = client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "elephant")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments").setQuery(matchQuery("comments.message", "elephant")))
|
||||
.get();
|
||||
assertNoFailures(response);
|
||||
assertHitCount(response, 1);
|
||||
assertSearchHit(response, 1, hasId("2"));
|
||||
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
|
||||
innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.totalHits(), equalTo(3l));
|
||||
assertThat(innerHits.getHits().length, equalTo(3));
|
||||
assertThat(innerHits.getAt(0).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
|
||||
assertThat(innerHits.getAt(1).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getOffset(), equalTo(1));
|
||||
assertThat(innerHits.getAt(2).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(2).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(2).getNestedIdentity().getOffset(), equalTo(2));
|
||||
searchRequests = new SearchRequest[] {
|
||||
client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "elephant")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments").setQuery(matchQuery("comments.message", "elephant"))).request(),
|
||||
client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "elephant")).innerHit(new QueryInnerHitBuilder().setName("comment"))).request()
|
||||
};
|
||||
for (SearchRequest searchRequest : searchRequests) {
|
||||
SearchResponse response = client().search(searchRequest).actionGet();
|
||||
assertNoFailures(response);
|
||||
assertHitCount(response, 1);
|
||||
assertSearchHit(response, 1, hasId("2"));
|
||||
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
|
||||
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.totalHits(), equalTo(3l));
|
||||
assertThat(innerHits.getHits().length, equalTo(3));
|
||||
assertThat(innerHits.getAt(0).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
|
||||
assertThat(innerHits.getAt(1).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(1).getNestedIdentity().getOffset(), equalTo(1));
|
||||
assertThat(innerHits.getAt(2).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(2).getNestedIdentity().getField().string(), equalTo("comments"));
|
||||
assertThat(innerHits.getAt(2).getNestedIdentity().getOffset(), equalTo(2));
|
||||
}
|
||||
|
||||
response = client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments")
|
||||
.setQuery(matchQuery("comments.message", "fox"))
|
||||
.addHighlightedField("comments.message")
|
||||
.setExplain(true)
|
||||
.addFieldDataField("comments.message")
|
||||
.addScriptField("script", "doc['comments.message'].value")
|
||||
.setSize(1)
|
||||
).get();
|
||||
searchRequests = new SearchRequest[] {
|
||||
client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
|
||||
.addInnerHit("comments", new InnerHitsBuilder.InnerHit().setPath("comments")
|
||||
.setQuery(matchQuery("comments.message", "fox"))
|
||||
.addHighlightedField("comments.message")
|
||||
.setExplain(true)
|
||||
.addFieldDataField("comments.message")
|
||||
.addScriptField("script", "doc['comments.message'].value")
|
||||
.setSize(1)).request(),
|
||||
client().prepareSearch("articles")
|
||||
.setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")).innerHit(new QueryInnerHitBuilder()
|
||||
.addHighlightedField("comments.message")
|
||||
.setExplain(true)
|
||||
.addFieldDataField("comments.message")
|
||||
.addScriptField("script", "doc['comments.message'].value")
|
||||
.setSize(1))).request()
|
||||
};
|
||||
|
||||
assertNoFailures(response);
|
||||
innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.getHits().length, equalTo(1));
|
||||
assertThat(innerHits.getAt(0).getHighlightFields().get("comments.message").getFragments()[0].string(), equalTo("<em>fox</em> eat quick"));
|
||||
assertThat(innerHits.getAt(0).explanation().toString(), containsString("(MATCH) weight(comments.message:fox in"));
|
||||
assertThat(innerHits.getAt(0).getFields().get("comments.message").getValue().toString(), equalTo("eat"));
|
||||
assertThat(innerHits.getAt(0).getFields().get("script").getValue().toString(), equalTo("eat"));
|
||||
for (SearchRequest searchRequest : searchRequests) {
|
||||
SearchResponse response = client().search(searchRequest).actionGet();
|
||||
assertNoFailures(response);
|
||||
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comments");
|
||||
assertThat(innerHits.getTotalHits(), equalTo(2l));
|
||||
assertThat(innerHits.getHits().length, equalTo(1));
|
||||
assertThat(innerHits.getAt(0).getHighlightFields().get("comments.message").getFragments()[0].string(), equalTo("<em>fox</em> eat quick"));
|
||||
assertThat(innerHits.getAt(0).explanation().toString(), containsString("(MATCH) weight(comments.message:fox in"));
|
||||
assertThat(innerHits.getAt(0).getFields().get("comments.message").getValue().toString(), equalTo("eat"));
|
||||
assertThat(innerHits.getAt(0).getFields().get("script").getValue().toString(), equalTo("eat"));
|
||||
}
|
||||
}
|
||||
|
||||
@Test
|
||||
|
@ -147,29 +173,44 @@ public class InnerHitsTests extends ElasticsearchIntegrationTest {
|
|||
int[] field1InnerObjects = new int[numDocs];
|
||||
int[] field2InnerObjects = new int[numDocs];
|
||||
for (int i = 0; i < numDocs; i++) {
|
||||
int numInnerObjects = field1InnerObjects[i] = scaledRandomIntBetween(0, numDocs);
|
||||
int numInnerObjects = field1InnerObjects[i] = scaledRandomIntBetween(1, numDocs);
|
||||
XContentBuilder source = jsonBuilder().startObject().startArray("field1");
|
||||
for (int j = 0; j < numInnerObjects; j++) {
|
||||
source.startObject().field("x", "y").endObject();
|
||||
}
|
||||
numInnerObjects = field2InnerObjects[i] = scaledRandomIntBetween(0, numDocs);
|
||||
numInnerObjects = field2InnerObjects[i] = scaledRandomIntBetween(1, numDocs);
|
||||
source.endArray().startArray("field2");
|
||||
for (int j = 0; j < numInnerObjects; j++) {
|
||||
source.startObject().field("x", "y").endObject();
|
||||
}
|
||||
source.endArray().endObject();
|
||||
|
||||
requestBuilders.add(client().prepareIndex("idx", "type", String.format(Locale.ENGLISH, "%03d", i)).setSource(source));
|
||||
}
|
||||
|
||||
indexRandom(true, requestBuilders);
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch("idx")
|
||||
.setSize(numDocs)
|
||||
.addSort("_uid", SortOrder.ASC)
|
||||
.addInnerHit("a", new InnerHitsBuilder.InnerHit().setPath("field1").addSort("_doc", SortOrder.DESC).setSize(numDocs)) // Sort order is DESC, because we reverse the inner objects during indexing!
|
||||
.addInnerHit("b", new InnerHitsBuilder.InnerHit().setPath("field2").addSort("_doc", SortOrder.DESC).setSize(numDocs))
|
||||
.get();
|
||||
SearchResponse searchResponse;
|
||||
if (randomBoolean()) {
|
||||
searchResponse = client().prepareSearch("idx")
|
||||
.setSize(numDocs)
|
||||
.addSort("_uid", SortOrder.ASC)
|
||||
.addInnerHit("a", new InnerHitsBuilder.InnerHit().setPath("field1").addSort("_doc", SortOrder.DESC).setSize(numDocs)) // Sort order is DESC, because we reverse the inner objects during indexing!
|
||||
.addInnerHit("b", new InnerHitsBuilder.InnerHit().setPath("field2").addSort("_doc", SortOrder.DESC).setSize(numDocs))
|
||||
.get();
|
||||
} else {
|
||||
BoolQueryBuilder boolQuery = new BoolQueryBuilder();
|
||||
if (randomBoolean()) {
|
||||
boolQuery.should(nestedQuery("field1", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("a").addSort("_doc", SortOrder.DESC).setSize(numDocs)));
|
||||
boolQuery.should(nestedQuery("field2", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("b").addSort("_doc", SortOrder.DESC).setSize(numDocs)));
|
||||
} else {
|
||||
boolQuery.should(constantScoreQuery(nestedFilter("field1", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("a").addSort("_doc", SortOrder.DESC).setSize(numDocs))));
|
||||
boolQuery.should(constantScoreQuery(nestedFilter("field2", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("b").addSort("_doc", SortOrder.DESC).setSize(numDocs))));
|
||||
}
|
||||
searchResponse = client().prepareSearch("idx")
|
||||
.setQuery(boolQuery)
|
||||
.setSize(numDocs)
|
||||
.addSort("_uid", SortOrder.ASC)
|
||||
.get();
|
||||
}
|
||||
|
||||
assertHitCount(searchResponse, numDocs);
|
||||
assertThat(searchResponse.getHits().getHits().length, equalTo(numDocs));
|
||||
|
@ -213,62 +254,89 @@ public class InnerHitsTests extends ElasticsearchIntegrationTest {
|
|||
requests.add(client().prepareIndex("articles", "comment", "6").setParent("2").setSource("message", "elephant scared by mice x y"));
|
||||
indexRandom(true, requests);
|
||||
|
||||
SearchResponse response = client().prepareSearch("articles")
|
||||
.setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment").setQuery(matchQuery("message", "fox")))
|
||||
.get();
|
||||
SearchRequest[] searchRequests = new SearchRequest[]{
|
||||
client().prepareSearch("articles")
|
||||
.setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
|
||||
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment").setQuery(matchQuery("message", "fox")))
|
||||
.request(),
|
||||
client().prepareSearch("articles")
|
||||
.setQuery(hasChildQuery("comment", matchQuery("message", "fox")).innerHit(new QueryInnerHitBuilder().setName("comment")))
|
||||
.request()
|
||||
};
|
||||
for (SearchRequest searchRequest : searchRequests) {
|
||||
SearchResponse response = client().search(searchRequest).actionGet();
|
||||
assertNoFailures(response);
|
||||
assertHitCount(response, 1);
|
||||
assertSearchHit(response, 1, hasId("1"));
|
||||
|
||||
assertNoFailures(response);
|
||||
assertHitCount(response, 1);
|
||||
assertSearchHit(response, 1, hasId("1"));
|
||||
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
|
||||
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.totalHits(), equalTo(2l));
|
||||
|
||||
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
|
||||
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
|
||||
assertThat(innerHits.totalHits(), equalTo(2l));
|
||||
assertThat(innerHits.getAt(0).getId(), equalTo("1"));
|
||||
assertThat(innerHits.getAt(0).type(), equalTo("comment"));
|
||||
assertThat(innerHits.getAt(1).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(1).type(), equalTo("comment"));
|
||||
}
|
||||
|
||||
assertThat(innerHits.getAt(0).getId(), equalTo("1"));
|
||||
assertThat(innerHits.getAt(0).type(), equalTo("comment"));
|
||||
assertThat(innerHits.getAt(1).getId(), equalTo("2"));
|
||||
assertThat(innerHits.getAt(1).type(), equalTo("comment"));
|
||||
searchRequests = new SearchRequest[] {
client().prepareSearch("articles")
.setQuery(hasChildQuery("comment", matchQuery("message", "elephant")))
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment").setQuery(matchQuery("message", "elephant")))
.request(),
client().prepareSearch("articles")
.setQuery(hasChildQuery("comment", matchQuery("message", "elephant")).innerHit(new QueryInnerHitBuilder()))
.request()
};
for (SearchRequest searchRequest : searchRequests) {
SearchResponse response = client().search(searchRequest).actionGet();
assertNoFailures(response);
assertHitCount(response, 1);
assertSearchHit(response, 1, hasId("2"));

response = client().prepareSearch("articles")
.setQuery(hasChildQuery("comment", matchQuery("message", "elephant")))
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment").setQuery(matchQuery("message", "elephant")))
.get();
assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
assertThat(innerHits.totalHits(), equalTo(3l));

assertNoFailures(response);
assertHitCount(response, 1);
assertSearchHit(response, 1, hasId("2"));
assertThat(innerHits.getAt(0).getId(), equalTo("4"));
assertThat(innerHits.getAt(0).type(), equalTo("comment"));
assertThat(innerHits.getAt(1).getId(), equalTo("5"));
assertThat(innerHits.getAt(1).type(), equalTo("comment"));
assertThat(innerHits.getAt(2).getId(), equalTo("6"));
assertThat(innerHits.getAt(2).type(), equalTo("comment"));
}

assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
assertThat(innerHits.totalHits(), equalTo(3l));
searchRequests = new SearchRequest[] {
client().prepareSearch("articles")
.setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment")
.setQuery(matchQuery("message", "fox"))
.addHighlightedField("message")
.setExplain(true)
.addFieldDataField("message")
.addScriptField("script", "doc['message'].value")
.setSize(1)
).request(),
client().prepareSearch("articles")
.setQuery(hasChildQuery("comment", matchQuery("message", "fox")).innerHit(new QueryInnerHitBuilder()
.addHighlightedField("message")
.setExplain(true)
.addFieldDataField("message")
.addScriptField("script", "doc['message'].value")
.setSize(1))
).request()
};

assertThat(innerHits.getAt(0).getId(), equalTo("4"));
assertThat(innerHits.getAt(0).type(), equalTo("comment"));
assertThat(innerHits.getAt(1).getId(), equalTo("5"));
assertThat(innerHits.getAt(1).type(), equalTo("comment"));
assertThat(innerHits.getAt(2).getId(), equalTo("6"));
assertThat(innerHits.getAt(2).type(), equalTo("comment"));

response = client().prepareSearch("articles")
.setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
.addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment")
.setQuery(matchQuery("message", "fox"))
.addHighlightedField("message")
.setExplain(true)
.addFieldDataField("message")
.addScriptField("script", "doc['message'].value")
.setSize(1)
).get();
assertNoFailures(response);
innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
assertThat(innerHits.getHits().length, equalTo(1));
assertThat(innerHits.getAt(0).getHighlightFields().get("message").getFragments()[0].string(), equalTo("<em>fox</em> eat quick"));
assertThat(innerHits.getAt(0).explanation().toString(), containsString("(MATCH) weight(message:fox"));
assertThat(innerHits.getAt(0).getFields().get("message").getValue().toString(), equalTo("eat"));
assertThat(innerHits.getAt(0).getFields().get("script").getValue().toString(), equalTo("eat"));
for (SearchRequest searchRequest : searchRequests) {
SearchResponse response = client().search(searchRequest).actionGet();
assertNoFailures(response);
SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
assertThat(innerHits.getHits().length, equalTo(1));
assertThat(innerHits.getAt(0).getHighlightFields().get("message").getFragments()[0].string(), equalTo("<em>fox</em> eat quick"));
assertThat(innerHits.getAt(0).explanation().toString(), containsString("(MATCH) weight(message:fox"));
assertThat(innerHits.getAt(0).getFields().get("message").getValue().toString(), equalTo("eat"));
assertThat(innerHits.getAt(0).getFields().get("script").getValue().toString(), equalTo("eat"));
}
}

@Test
@@ -289,12 +357,12 @@ public class InnerHitsTests extends ElasticsearchIntegrationTest {
String parentId = String.format(Locale.ENGLISH, "%03d", parent);
requestBuilders.add(client().prepareIndex("idx", "parent", parentId).setSource("{}"));

int numChildDocs = child1InnerObjects[parent] = scaledRandomIntBetween(0, numDocs);
int numChildDocs = child1InnerObjects[parent] = scaledRandomIntBetween(1, numDocs);
int limit = child1 + numChildDocs;
for (; child1 < limit; child1++) {
requestBuilders.add(client().prepareIndex("idx", "child1", String.format(Locale.ENGLISH, "%04d", child1)).setParent(parentId).setSource("{}"));
}
numChildDocs = child2InnerObjects[parent] = scaledRandomIntBetween(0, numDocs);
numChildDocs = child2InnerObjects[parent] = scaledRandomIntBetween(1, numDocs);
limit = child2 + numChildDocs;
for (; child2 < limit; child2++) {
requestBuilders.add(client().prepareIndex("idx", "child2", String.format(Locale.ENGLISH, "%04d", child2)).setParent(parentId).setSource("{}"));
@@ -302,13 +370,31 @@ public class InnerHitsTests extends ElasticsearchIntegrationTest {
}
indexRandom(true, requestBuilders);

SearchResponse searchResponse = client().prepareSearch("idx")
.setSize(numDocs)
.setTypes("parent")
.addSort("_uid", SortOrder.ASC)
.addInnerHit("a", new InnerHitsBuilder.InnerHit().setType("child1").addSort("_uid", SortOrder.ASC).setSize(numDocs))
.addInnerHit("b", new InnerHitsBuilder.InnerHit().setType("child2").addSort("_uid", SortOrder.ASC).setSize(numDocs))
.get();
SearchResponse searchResponse;
if (randomBoolean()) {
searchResponse = client().prepareSearch("idx")
.setSize(numDocs)
.setTypes("parent")
.addSort("_uid", SortOrder.ASC)
.addInnerHit("a", new InnerHitsBuilder.InnerHit().setType("child1").addSort("_uid", SortOrder.ASC).setSize(numDocs))
.addInnerHit("b", new InnerHitsBuilder.InnerHit().setType("child2").addSort("_uid", SortOrder.ASC).setSize(numDocs))
.get();
} else {
BoolQueryBuilder boolQuery = new BoolQueryBuilder();
if (randomBoolean()) {
boolQuery.should(hasChildQuery("child1", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("a").addSort("_uid", SortOrder.ASC).setSize(numDocs)));
boolQuery.should(hasChildQuery("child2", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("b").addSort("_uid", SortOrder.ASC).setSize(numDocs)));
} else {
boolQuery.should(constantScoreQuery(hasChildFilter("child1", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("a").addSort("_uid", SortOrder.ASC).setSize(numDocs))));
boolQuery.should(constantScoreQuery(hasChildFilter("child2", matchAllQuery()).innerHit(new QueryInnerHitBuilder().setName("b").addSort("_uid", SortOrder.ASC).setSize(numDocs))));
}
searchResponse = client().prepareSearch("idx")
.setSize(numDocs)
.setTypes("parent")
.addSort("_uid", SortOrder.ASC)
.setQuery(boolQuery)
.get();
}

assertHitCount(searchResponse, numDocs);
assertThat(searchResponse.getHits().getHits().length, equalTo(numDocs));
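
The two request styles exercised by these tests differ only in where the inner hit definition is attached. A minimal sketch of the contrast, assuming the same client() helper and static query-builder imports used by the test class (illustration only, not part of the committed code):

// Existing top-level syntax: the inner hit is registered on the search request itself
client().prepareSearch("articles")
        .setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
        .addInnerHit("comment", new InnerHitsBuilder.InnerHit()
                .setType("comment")
                .setQuery(matchQuery("message", "fox")))
        .get();

// Compact syntax introduced by this commit: the inner hit is embedded in the query
client().prepareSearch("articles")
        .setQuery(hasChildQuery("comment", matchQuery("message", "fox"))
                .innerHit(new QueryInnerHitBuilder().setName("comment")))
        .get();

Both forms are expected to return the same hits, which is what the searchRequests arrays and loops above assert for each pair of requests.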