Added `inner_hits` feature that allows including nested hits.
Inner hits allows embedding the nested inner objects, child documents, or parent document that contributed to the matching of a returned search hit as inner hits, which would otherwise be hidden. Closes #8153 Closes #3022 Closes #3152
This commit is contained in:
parent
e92ff00192
commit
d7e224da04
@ -127,3 +127,5 @@ include::request/index-boost.asciidoc[]
include::request/min-score.asciidoc[]

include::request/named-queries-and-filters.asciidoc[]

include::request/inner-hits.asciidoc[]
@ -0,0 +1,246 @@
[[search-request-inner-hits]]
=== Inner hits

The <<mapping-parent-field, parent/child>> and <<mapping-nested-type, nested>> features allow returning documents that
have matches in a different scope. In the parent/child case, parent documents are returned based on matches in child
documents, or child documents are returned based on matches in parent documents. In the nested case, documents are
returned based on matches in nested inner objects.

In both cases the actual matches in the different scopes that caused a document to be returned are hidden. It is often
very useful to know which inner nested objects (in the case of nested) or which child or parent documents (in the case
of parent/child) caused certain information to be returned. The inner hits feature can be used for this: for each
search hit in the response it returns the additional hits in the other scope that caused that search hit to match.

The following snippet explains the basic structure of inner hits:

[source,js]
--------------------------------------------------
"inner_hits" : {
    "<inner_hits_name>" : {
        "<path|type>" : {
            "<path-to-nested-object-field|child-or-parent-type>" : {
                <inner_hits_body>
                [,"inner_hits" : { [<sub_inner_hits>]+ } ]?
            }
        }
    }
    [,"<inner_hits_name_2>" : { ... } ]*
}
--------------------------------------------------

Inside the `inner_hits` definition the name of the inner hit is defined first, followed by either `path`, if the inner
hit is nested, or `type`, if it is parent/child based. The next object layer contains the name of the nested object
field if the inner hit is nested, or the parent or child type if the inner hit definition is parent/child based.

Multiple inner hit definitions can be defined in a single request. In the `<inner_hits_body>` any option that
`inner_hits` supports can be defined. Optionally another `inner_hits` definition can be nested inside the
`<inner_hits_body>`.

If `inner_hits` is defined, each search hit will contain an `inner_hits` JSON object with the following structure:

[source,js]
--------------------------------------------------
"hits": [
    {
        "_index": ...,
        "_type": ...,
        "_id": ...,
        "inner_hits": {
            "<inner_hits_name>": {
                "hits": {
                    "total": ...,
                    "hits": [
                        {
                            "_type": ...,
                            "_id": ...,
                            ...
                        },
                        ...
                    ]
                }
            }
        },
        ...
    },
    ...
]
--------------------------------------------------

==== Options

Inner hits support the following options:

[horizontal]
`path`:: Defines the nested scope where hits will be collected from.
`type`:: Defines the parent or child type where hits will be collected from.
`query`:: Defines the query that will run in the defined nested, parent or child scope to collect and score hits. By default all documents in the scope will be matched.
`from`:: The offset from which the first hit is fetched for each `inner_hits` in the returned regular search hits.
`size`:: The maximum number of hits to return per `inner_hits`. By default the top three matching hits are returned.
`sort`:: How the inner hits should be sorted per `inner_hits`. By default the hits are sorted by score.

Either `path` or `type` must be defined. The `path` or `type` defines the scope from which hits are fetched and
used as inner hits.

Inner hits also support the following per document features, combined in the Java sketch after this list:

* <<search-request-highlighting,Highlighting>>
* <<search-request-explain,Explain>>
* <<search-request-source-filtering,Source filtering>>
* <<search-request-script-fields,Script fields>>
* <<search-request-fielddata-fields,Fielddata fields>>
* <<search-request-version,Include versions>>
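
For Java API users, an inner hit combining several of these features can be put together with the
`InnerHitsBuilder.InnerHit` class added in this commit. A minimal sketch; `client` is an existing `Client` instance
and the index, field and inner hit names are illustrative assumptions:

[source,java]
--------------------------------------------------
InnerHitsBuilder.InnerHit innerHit = new InnerHitsBuilder.InnerHit()
    .setPath("comments")                                             // or setType(...) for parent/child
    .setQuery(QueryBuilders.matchQuery("comments.message", "some text"))
    .setFrom(0)
    .setSize(3)
    .addSort("comments.date", SortOrder.DESC)
    .setExplain(true)                                                // explain
    .setFetchSource("comments.message", null)                        // source filtering
    .addHighlightedField("comments.message")                         // highlighting
    .setVersion(true);                                               // include versions

client.prepareSearch("my-index")
    .addInnerHit("comment", innerHit)
    .get();
--------------------------------------------------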

[[nested-inner-hits]]
==== Nested inner hits

The nested `inner_hits` can be used to include nested inner objects as inner hits to a search hit.

The example below assumes that there is a nested object field defined with the name `comments`:

[source,js]
--------------------------------------------------
{
    "query" : {
        "nested" : {
            "path" : "comments",
            "query" : {
                "match" : {"comments.message" : "[actual query]"}
            }
        }
    },
    "inner_hits" : {
        "comment" : {
            "path" : { <1>
                "comments" : { <2>
                    "query" : {
                        "match" : {"comments.message" : "[actual query]"}
                    }
                }
            }
        }
    }
}
--------------------------------------------------
<1> The inner hit definition is nested and requires the `path` option.
<2> The path option refers to the nested object field `comments`.

In the above example the query is repeated in both the `nested` query and the `comment` inner hit definition. At the
moment there is no query referencing support, so in order to make sure that only those inner nested objects that
contributed to the matching of the regular hits are returned, the inner query of the `nested` query also needs to be
defined on the inner hits definition.
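
The equivalent request through the Java API, as a rough sketch (`client` and the index name are assumptions), repeats
the match query in the same way:

[source,java]
--------------------------------------------------
QueryBuilder commentQuery = QueryBuilders.matchQuery("comments.message", "some text");

client.prepareSearch("my-index")
    .setQuery(QueryBuilders.nestedQuery("comments", commentQuery))
    .addInnerHit("comment", new InnerHitsBuilder.InnerHit()
        .setPath("comments")
        .setQuery(commentQuery)) // repeated, since there is no query referencing support
    .get();
--------------------------------------------------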

An example of a response snippet that could be generated from the above search request:

[source,js]
--------------------------------------------------
...
"hits": {
    ...
    "hits": [
        {
            "_index": "my-index",
            "_type": "question",
            "_id": "1",
            "_source": ...,
            "inner_hits": {
                "comment": {
                    "hits": {
                        "total": ...,
                        "hits": [
                            {
                                "_type": "question",
                                "_id": "1",
                                "_nested": {
                                    "field": "comments",
                                    "offset": 2
                                },
                                "_source": ...
                            },
                            ...
                        ]
                    }
                }
            }
        },
        ...
--------------------------------------------------

The `_nested` metadata is crucial in the above example, because it defines from which inner nested object this inner
hit came. The `field` defines the object array field the nested hit is from, and the `offset` its position inside that
array in the `_source`. Due to sorting and scoring, the actual location of the hit objects in `inner_hits` is usually
different from the location where the nested inner object was defined in the `_source`.

By default the `_source` is also returned for the hit objects in `inner_hits`, but this can be changed: via the
`_source` filtering feature part of the source can be returned, or the source can be disabled altogether. If stored
fields are defined on the nested level these can also be returned via the `fields` feature.

An important default is that the `_source` returned in hits inside `inner_hits` is relative to the `_nested` metadata.
So in the above example only the comment part is returned per nested hit and not the entire source of the top level
document that contained the comment.
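
Through the Java API, trimming or disabling the `_source` per nested hit comes down to the fetch source setters on the
inner hit definition. A small sketch; the field names are illustrative assumptions:

[source,java]
--------------------------------------------------
// Return only an include pattern of the source per nested hit:
new InnerHitsBuilder.InnerHit()
    .setPath("comments")
    .setFetchSource("comments.message", null);

// Or disable the _source for the nested hits entirely:
new InnerHitsBuilder.InnerHit()
    .setPath("comments")
    .setFetchSource(false);
--------------------------------------------------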

[[parent-child-inner-hits]]
==== Parent/child inner hits

The parent/child `inner_hits` can be used to include parent or child documents as inner hits to a search hit.

The example below assumes that there is a `_parent` field mapping in the `comment` type:

[source,js]
--------------------------------------------------
{
    "query" : {
        "has_child" : {
            "type" : "comment",
            "query" : {
                "match" : {"message" : "[actual query]"}
            }
        }
    },
    "inner_hits" : {
        "comment" : {
            "type" : { <1>
                "comment" : { <2>
                    "query" : {
                        "match" : {"message" : "[actual query]"}
                    }
                }
            }
        }
    }
}
--------------------------------------------------
<1> This is a parent/child inner hit definition and requires the `type` option.
<2> Refers to the document type `comment`.
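
A rough Java API equivalent of the above request (`client` and the index name are assumptions):

[source,java]
--------------------------------------------------
QueryBuilder commentQuery = QueryBuilders.matchQuery("message", "some text");

client.prepareSearch("my-index")
    .setQuery(QueryBuilders.hasChildQuery("comment", commentQuery))
    .addInnerHit("comment", new InnerHitsBuilder.InnerHit()
        .setType("comment") // parent/child inner hits use setType instead of setPath
        .setQuery(commentQuery))
    .get();
--------------------------------------------------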

An example of a response snippet that could be generated from the above search request:

[source,js]
--------------------------------------------------
...
"hits": {
    ...
    "hits": [
        {
            "_index": "my-index",
            "_type": "question",
            "_id": "1",
            "_source": ...,
            "inner_hits": {
                "comment": {
                    "hits": {
                        "total": ...,
                        "hits": [
                            {
                                "_type": "comment",
                                "_id": "5",
                                "_source": ...
                            },
                            ...
                        ]
                    }
                }
            }
        },
        ...
--------------------------------------------------

@ -34,6 +34,7 @@ import org.elasticsearch.script.ScriptService;
import org.elasticsearch.search.Scroll;
import org.elasticsearch.search.aggregations.AbstractAggregationBuilder;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.fetch.innerhits.InnerHitsBuilder;
import org.elasticsearch.search.highlight.HighlightBuilder;
import org.elasticsearch.search.rescore.RescoreBuilder;
import org.elasticsearch.search.sort.SortBuilder;

@ -741,6 +742,11 @@ public class SearchRequestBuilder extends ActionRequestBuilder<SearchRequest, Se
        return this;
    }
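
    /**
     * Adds an inner hit definition to the search request. A hypothetical usage sketch
     * (names and query are illustrative):
     * <pre>
     * addInnerHit("comment", new InnerHitsBuilder.InnerHit()
     *         .setPath("comments")
     *         .setQuery(QueryBuilders.matchQuery("comments.message", "some text")))
     * </pre>
     */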
    public SearchRequestBuilder addInnerHit(String name, InnerHitsBuilder.InnerHit innerHit) {
        innerHitsBuilder().addInnerHit(name, innerHit);
        return this;
    }

    /**
     * Delegates to {@link org.elasticsearch.search.suggest.SuggestBuilder#setText(String)}.
     */

@ -1032,6 +1038,10 @@ public class SearchRequestBuilder extends ActionRequestBuilder<SearchRequest, Se
        return sourceBuilder().highlighter();
    }

    private InnerHitsBuilder innerHitsBuilder() {
        return sourceBuilder().innerHitsBuilder();
    }

    private SuggestBuilder suggestBuilder() {
        return sourceBuilder().suggest();
    }

@ -165,11 +165,13 @@ public class NestedQueryParser implements QueryParser {
        }
    }

    static ThreadLocal<LateBindingParentFilter> parentFilterContext = new ThreadLocal<>();
    // TODO: Change this mechanism in favour of how parent nested object type is resolved in nested and reverse_nested agg
    // with this also proper validation can be performed on what is a valid nested child nested object type to be used
    public static ThreadLocal<LateBindingParentFilter> parentFilterContext = new ThreadLocal<>();

    static class LateBindingParentFilter extends BitDocIdSetFilter {
    public static class LateBindingParentFilter extends BitDocIdSetFilter {

        BitDocIdSetFilter filter;
        public BitDocIdSetFilter filter;

        @Override
        public int hashCode() {

@ -178,7 +180,8 @@ public class NestedQueryParser implements QueryParser {
        @Override
        public boolean equals(Object obj) {
            return filter.equals(obj);
            if (!(obj instanceof LateBindingParentFilter)) return false;
            return filter.equals(((LateBindingParentFilter) obj).filter);
        }

        @Override

@ -55,6 +55,7 @@ import org.elasticsearch.search.dfs.DfsSearchResult;
import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
import org.elasticsearch.search.fetch.source.FetchSourceContext;
import org.elasticsearch.search.highlight.SearchContextHighlight;

@ -681,4 +682,14 @@ public class PercolateContext extends SearchContext {
    public Counter timeEstimateCounter() {
        throw new UnsupportedOperationException();
    }

    @Override
    public void innerHits(InnerHitsContext innerHitsContext) {
        throw new UnsupportedOperationException();
    }

    @Override
    public InnerHitsContext innerHits() {
        throw new UnsupportedOperationException();
    }
}

@ -199,6 +199,11 @@ public interface SearchHit extends Streamable, ToXContent, Iterable<SearchHitFie
     */
    SearchShardTarget getShard();

    /**
     * @return Inner hits or <code>null</code> if there are none
     */
    Map<String, SearchHits> getInnerHits();

    /**
     * Encapsulates the nested identity of a hit.
     */

@ -32,6 +32,7 @@ import org.elasticsearch.search.dfs.DfsPhase;
import org.elasticsearch.search.fetch.FetchPhase;
import org.elasticsearch.search.fetch.explain.ExplainFetchSubPhase;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsFetchSubPhase;
import org.elasticsearch.search.fetch.innerhits.InnerHitsFetchSubPhase;
import org.elasticsearch.search.fetch.matchedqueries.MatchedQueriesFetchSubPhase;
import org.elasticsearch.search.fetch.script.ScriptFieldsFetchSubPhase;
import org.elasticsearch.search.fetch.source.FetchSourceSubPhase;

@ -66,6 +67,7 @@ public class SearchModule extends AbstractModule implements SpawnModules {
        bind(VersionFetchSubPhase.class).asEagerSingleton();
        bind(MatchedQueriesFetchSubPhase.class).asEagerSingleton();
        bind(HighlightPhase.class).asEagerSingleton();
        bind(InnerHitsFetchSubPhase.class).asEagerSingleton();

        bind(SearchServiceTransportAction.class).asEagerSingleton();
        bind(MoreLikeThisFetchService.class).asEagerSingleton();

@ -33,6 +33,7 @@ import org.elasticsearch.search.fetch.FetchPhase;
import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.internal.InternalSearchHit;
import org.elasticsearch.search.internal.InternalSearchHits;
import org.elasticsearch.search.internal.SubSearchContext;

import java.io.IOException;
import java.util.Map;

@ -52,17 +53,17 @@ public class TopHitsAggregator extends MetricsAggregator implements ScorerAware
    }

    private final FetchPhase fetchPhase;
    private final TopHitsContext topHitsContext;
    private final SubSearchContext subSearchContext;
    private final LongObjectPagedHashMap<TopDocsAndLeafCollector> topDocsCollectors;

    private Scorer currentScorer;
    private LeafReaderContext currentContext;

    public TopHitsAggregator(FetchPhase fetchPhase, TopHitsContext topHitsContext, String name, long estimatedBucketsCount, AggregationContext context, Aggregator parent, Map<String, Object> metaData) {
    public TopHitsAggregator(FetchPhase fetchPhase, SubSearchContext subSearchContext, String name, long estimatedBucketsCount, AggregationContext context, Aggregator parent, Map<String, Object> metaData) {
        super(name, estimatedBucketsCount, context, parent, metaData);
        this.fetchPhase = fetchPhase;
        topDocsCollectors = new LongObjectPagedHashMap<>(estimatedBucketsCount, context.bigArrays());
        this.topHitsContext = topHitsContext;
        this.subSearchContext = subSearchContext;
        context.registerScorerAware(this);
    }

@ -82,41 +83,41 @@ public class TopHitsAggregator extends MetricsAggregator implements ScorerAware
            return buildEmptyAggregation();
        }

        topHitsContext.queryResult().topDocs(topDocs);
        subSearchContext.queryResult().topDocs(topDocs);
        int[] docIdsToLoad = new int[topDocs.scoreDocs.length];
        for (int i = 0; i < topDocs.scoreDocs.length; i++) {
            docIdsToLoad[i] = topDocs.scoreDocs[i].doc;
        }
        topHitsContext.docIdsToLoad(docIdsToLoad, 0, docIdsToLoad.length);
        fetchPhase.execute(topHitsContext);
        FetchSearchResult fetchResult = topHitsContext.fetchResult();
        subSearchContext.docIdsToLoad(docIdsToLoad, 0, docIdsToLoad.length);
        fetchPhase.execute(subSearchContext);
        FetchSearchResult fetchResult = subSearchContext.fetchResult();
        InternalSearchHit[] internalHits = fetchResult.fetchResult().hits().internalHits();
        for (int i = 0; i < internalHits.length; i++) {
            ScoreDoc scoreDoc = topDocs.scoreDocs[i];
            InternalSearchHit searchHitFields = internalHits[i];
            searchHitFields.shard(topHitsContext.shardTarget());
            searchHitFields.shard(subSearchContext.shardTarget());
            searchHitFields.score(scoreDoc.score);
            if (scoreDoc instanceof FieldDoc) {
                FieldDoc fieldDoc = (FieldDoc) scoreDoc;
                searchHitFields.sortValues(fieldDoc.fields);
            }
        }
        return new InternalTopHits(name, topHitsContext.from(), topHitsContext.size(), topDocs, fetchResult.hits());
        return new InternalTopHits(name, subSearchContext.from(), subSearchContext.size(), topDocs, fetchResult.hits());
    }
    }

    @Override
    public InternalAggregation buildEmptyAggregation() {
        return new InternalTopHits(name, topHitsContext.from(), topHitsContext.size(), Lucene.EMPTY_TOP_DOCS, InternalSearchHits.empty());
        return new InternalTopHits(name, subSearchContext.from(), subSearchContext.size(), Lucene.EMPTY_TOP_DOCS, InternalSearchHits.empty());
    }

    @Override
    public void collect(int docId, long bucketOrdinal) throws IOException {
        TopDocsAndLeafCollector collectors = topDocsCollectors.get(bucketOrdinal);
        if (collectors == null) {
            Sort sort = topHitsContext.sort();
            int topN = topHitsContext.from() + topHitsContext.size();
            TopDocsCollector<?> topLevelCollector = sort != null ? TopFieldCollector.create(sort, topN, true, topHitsContext.trackScores(), topHitsContext.trackScores(), false) : TopScoreDocCollector.create(topN, false);
            Sort sort = subSearchContext.sort();
            int topN = subSearchContext.from() + subSearchContext.size();
            TopDocsCollector<?> topLevelCollector = sort != null ? TopFieldCollector.create(sort, topN, true, subSearchContext.trackScores(), subSearchContext.trackScores(), false) : TopScoreDocCollector.create(topN, false);
            collectors = new TopDocsAndLeafCollector(topLevelCollector);
            collectors.leafCollector = collectors.topLevelCollector.getLeafCollector(currentContext);
            collectors.leafCollector.setScorer(currentScorer);

@ -157,17 +158,17 @@ public class TopHitsAggregator extends MetricsAggregator implements ScorerAware
    public static class Factory extends AggregatorFactory {

        private final FetchPhase fetchPhase;
        private final TopHitsContext topHitsContext;
        private final SubSearchContext subSearchContext;

        public Factory(String name, FetchPhase fetchPhase, TopHitsContext topHitsContext) {
        public Factory(String name, FetchPhase fetchPhase, SubSearchContext subSearchContext) {
            super(name, InternalTopHits.TYPE.name());
            this.fetchPhase = fetchPhase;
            this.topHitsContext = topHitsContext;
            this.subSearchContext = subSearchContext;
        }

        @Override
        public Aggregator createInternal(AggregationContext aggregationContext, Aggregator parent, long expectedBucketsCount, Map<String, Object> metaData) {
            return new TopHitsAggregator(fetchPhase, topHitsContext, name, expectedBucketsCount, aggregationContext, parent, metaData);
            return new TopHitsAggregator(fetchPhase, subSearchContext, name, expectedBucketsCount, aggregationContext, parent, metaData);
        }

        @Override

@ -30,6 +30,7 @@ import org.elasticsearch.search.fetch.script.ScriptFieldsParseElement;
import org.elasticsearch.search.fetch.source.FetchSourceParseElement;
import org.elasticsearch.search.highlight.HighlighterParseElement;
import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.search.internal.SubSearchContext;
import org.elasticsearch.search.sort.SortParseElement;

import java.io.IOException;

@ -63,7 +64,7 @@ public class TopHitsParser implements Aggregator.Parser {
    @Override
    public AggregatorFactory parse(String aggregationName, XContentParser parser, SearchContext context) throws IOException {
        TopHitsContext topHitsContext = new TopHitsContext(context);
        SubSearchContext subSearchContext = new SubSearchContext(context);
        XContentParser.Token token;
        String currentFieldName = null;
        try {

@ -71,26 +72,26 @@ public class TopHitsParser implements Aggregator.Parser {
                if (token == XContentParser.Token.FIELD_NAME) {
                    currentFieldName = parser.currentName();
                } else if ("sort".equals(currentFieldName)) {
                    sortParseElement.parse(parser, topHitsContext);
                    sortParseElement.parse(parser, subSearchContext);
                } else if ("_source".equals(currentFieldName)) {
                    sourceParseElement.parse(parser, topHitsContext);
                    sourceParseElement.parse(parser, subSearchContext);
                } else if (token.isValue()) {
                    switch (currentFieldName) {
                        case "from":
                            topHitsContext.from(parser.intValue());
                            subSearchContext.from(parser.intValue());
                            break;
                        case "size":
                            topHitsContext.size(parser.intValue());
                            subSearchContext.size(parser.intValue());
                            break;
                        case "track_scores":
                        case "trackScores":
                            topHitsContext.trackScores(parser.booleanValue());
                            subSearchContext.trackScores(parser.booleanValue());
                            break;
                        case "version":
                            topHitsContext.version(parser.booleanValue());
                            subSearchContext.version(parser.booleanValue());
                            break;
                        case "explain":
                            topHitsContext.explain(parser.booleanValue());
                            subSearchContext.explain(parser.booleanValue());
                            break;
                        default:
                            throw new SearchParseException(context, "Unknown key for a " + token + " in [" + aggregationName + "]: [" + currentFieldName + "].");

@ -98,11 +99,11 @@ public class TopHitsParser implements Aggregator.Parser {
                } else if (token == XContentParser.Token.START_OBJECT) {
                    switch (currentFieldName) {
                        case "highlight":
                            highlighterParseElement.parse(parser, topHitsContext);
                            highlighterParseElement.parse(parser, subSearchContext);
                            break;
                        case "scriptFields":
                        case "script_fields":
                            scriptFieldsParseElement.parse(parser, topHitsContext);
                            scriptFieldsParseElement.parse(parser, subSearchContext);
                            break;
                        default:
                            throw new SearchParseException(context, "Unknown key for a " + token + " in [" + aggregationName + "]: [" + currentFieldName + "].");

@ -111,7 +112,7 @@ public class TopHitsParser implements Aggregator.Parser {
                    switch (currentFieldName) {
                        case "fielddataFields":
                        case "fielddata_fields":
                            fieldDataFieldsParseElement.parse(parser, topHitsContext);
                            fieldDataFieldsParseElement.parse(parser, subSearchContext);
                            break;
                        default:
                            throw new SearchParseException(context, "Unknown key for a " + token + " in [" + aggregationName + "]: [" + currentFieldName + "].");

@ -123,7 +124,7 @@ public class TopHitsParser implements Aggregator.Parser {
        } catch (Exception e) {
            throw ExceptionsHelper.convertToElastic(e);
        }
        return new TopHitsAggregator.Factory(aggregationName, fetchPhase, topHitsContext);
        return new TopHitsAggregator.Factory(aggregationName, fetchPhase, subSearchContext);
    }

}

@ -38,6 +38,7 @@ import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.query.FilterBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.search.aggregations.AbstractAggregationBuilder;
import org.elasticsearch.search.fetch.innerhits.InnerHitsBuilder;
import org.elasticsearch.search.fetch.source.FetchSourceContext;
import org.elasticsearch.search.highlight.HighlightBuilder;
import org.elasticsearch.search.internal.SearchContext;

@ -113,6 +114,8 @@ public class SearchSourceBuilder implements ToXContent {
    private SuggestBuilder suggestBuilder;

    private InnerHitsBuilder innerHitsBuilder;

    private List<RescoreBuilder> rescoreBuilders;
    private Integer defaultRescoreWindowSize;

@ -434,6 +437,13 @@ public class SearchSourceBuilder implements ToXContent {
        return this;
    }

    public InnerHitsBuilder innerHitsBuilder() {
        if (innerHitsBuilder == null) {
            innerHitsBuilder = new InnerHitsBuilder();
        }
        return innerHitsBuilder;
    }

    public SuggestBuilder suggest() {
        if (suggestBuilder == null) {
            suggestBuilder = new SuggestBuilder("suggest");

@ -642,7 +652,12 @@ public class SearchSourceBuilder implements ToXContent {
    @Override
    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject();
        innerToXContent(builder, params);
        builder.endObject();
        return builder;
    }

    public void innerToXContent(XContentBuilder builder, Params params) throws IOException {
        if (from != -1) {
            builder.field("from", from);
        }

@ -792,6 +807,10 @@ public class SearchSourceBuilder implements ToXContent {
            highlightBuilder.toXContent(builder, params);
        }

        if (innerHitsBuilder != null) {
            innerHitsBuilder.toXContent(builder, params);
        }

        if (suggestBuilder != null) {
            suggestBuilder.toXContent(builder, params);
        }

@ -835,9 +854,6 @@ public class SearchSourceBuilder implements ToXContent {
            }
            builder.endArray();
        }

        builder.endObject();
        return builder;
    }

    private static class ScriptField {

@ -48,6 +48,7 @@ import org.elasticsearch.search.SearchParseElement;
import org.elasticsearch.search.SearchPhase;
import org.elasticsearch.search.fetch.explain.ExplainFetchSubPhase;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsFetchSubPhase;
import org.elasticsearch.search.fetch.innerhits.InnerHitsFetchSubPhase;
import org.elasticsearch.search.fetch.matchedqueries.MatchedQueriesFetchSubPhase;
import org.elasticsearch.search.fetch.script.ScriptFieldsFetchSubPhase;
import org.elasticsearch.search.fetch.source.FetchSourceContext;

@ -80,9 +81,11 @@ public class FetchPhase implements SearchPhase {
    @Inject
    public FetchPhase(HighlightPhase highlightPhase, ScriptFieldsFetchSubPhase scriptFieldsPhase,
                      MatchedQueriesFetchSubPhase matchedQueriesPhase, ExplainFetchSubPhase explainPhase, VersionFetchSubPhase versionPhase,
                      FetchSourceSubPhase fetchSourceSubPhase, FieldDataFieldsFetchSubPhase fieldDataFieldsFetchSubPhase) {
                      FetchSourceSubPhase fetchSourceSubPhase, FieldDataFieldsFetchSubPhase fieldDataFieldsFetchSubPhase,
                      InnerHitsFetchSubPhase innerHitsFetchSubPhase) {
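        // Inner hits recursively run the fetch phase per top level hit, so the sub
        // phase needs a reference back to this fetch phase before it is registered.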
        innerHitsFetchSubPhase.setFetchPhase(this);
        this.fetchSubPhases = new FetchSubPhase[]{scriptFieldsPhase, matchedQueriesPhase, explainPhase, highlightPhase,
                fetchSourceSubPhase, versionPhase, fieldDataFieldsFetchSubPhase};
                fetchSourceSubPhase, versionPhase, fieldDataFieldsFetchSubPhase, innerHitsFetchSubPhase};
    }

    @Override

@ -262,7 +265,10 @@ public class FetchPhase implements SearchPhase {
    private InternalSearchHit createNestedSearchHit(SearchContext context, int nestedTopDocId, int nestedSubDocId, int rootSubDocId, List<String> extractFieldNames, boolean loadAllStored, Set<String> fieldNames, LeafReaderContext subReaderContext) throws IOException {
        final FieldsVisitor rootFieldsVisitor;
        if (context.sourceRequested() || extractFieldNames != null) {
        if (context.sourceRequested() || extractFieldNames != null || context.highlight() != null) {
            // Also if highlighting is requested on nested documents we need to fetch the _source from the root document,
            // otherwise highlighting will attempt to fetch the _source from the nested doc, which will fail,
            // because the entire _source is only stored with the root document.
            rootFieldsVisitor = new UidAndSourceFieldsVisitor();
        } else {
            rootFieldsVisitor = new JustUidFieldsVisitor();

@ -0,0 +1,442 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.search.fetch.innerhits;

import org.elasticsearch.common.Nullable;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.highlight.HighlightBuilder;
import org.elasticsearch.search.sort.SortBuilder;
import org.elasticsearch.search.sort.SortOrder;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

/**
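 * Builds the "inner_hits" section of a search request, mapping inner hit
 * names to their definitions.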
*/
public class InnerHitsBuilder implements ToXContent {

    private Map<String, InnerHit> innerHits = new HashMap<>();

    @Override
    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject("inner_hits");
        for (Map.Entry<String, InnerHit> entry : innerHits.entrySet()) {
            builder.startObject(entry.getKey());
            entry.getValue().toXContent(builder, params);
            builder.endObject();
        }
        return builder.endObject();
    }

    public void addInnerHit(String name, InnerHit innerHit) {
        innerHits.put(name, innerHit);
    }

    public static class InnerHit implements ToXContent {

        private String path;
        private String type;

        private SearchSourceBuilder sourceBuilder;

        public InnerHit setQuery(QueryBuilder query) {
            sourceBuilder().query(query);
            return this;
        }

        public InnerHit setPath(String path) {
            this.path = path;
            return this;
        }

        public InnerHit setType(String type) {
            this.type = type;
            return this;
        }

        /**
         * The index to start returning hits from. Defaults to <tt>0</tt>.
         */
        public InnerHit setFrom(int from) {
            sourceBuilder().from(from);
            return this;
        }

        /**
         * The number of search hits to return. Defaults to <tt>10</tt>.
         */
        public InnerHit setSize(int size) {
            sourceBuilder().size(size);
            return this;
        }

        /**
         * Applies when sorting, and controls if scores will be tracked as well. Defaults to
         * <tt>false</tt>.
         */
        public InnerHit setTrackScores(boolean trackScores) {
            sourceBuilder().trackScores(trackScores);
            return this;
        }

        /**
         * Should each {@link org.elasticsearch.search.SearchHit} be returned with an
         * explanation of the hit (ranking).
         */
        public InnerHit setExplain(boolean explain) {
            sourceBuilder().explain(explain);
            return this;
        }

        /**
         * Should each {@link org.elasticsearch.search.SearchHit} be returned with its
         * version.
         */
        public InnerHit setVersion(boolean version) {
            sourceBuilder().version(version);
            return this;
        }

        /**
         * Sets no fields to be loaded, resulting in only id and type to be returned per hit.
         */
        public InnerHit setNoFields() {
            sourceBuilder().noFields();
            return this;
        }

        /**
         * Indicates whether the response should contain the stored _source for every hit.
         */
        public InnerHit setFetchSource(boolean fetch) {
            sourceBuilder().fetchSource(fetch);
            return this;
        }

        /**
         * Indicate that _source should be returned with every hit, with an "include" and/or "exclude" set which can include simple wildcard
         * elements.
         *
         * @param include An optional include (optionally wildcarded) pattern to filter the returned _source
         * @param exclude An optional exclude (optionally wildcarded) pattern to filter the returned _source
         */
        public InnerHit setFetchSource(@Nullable String include, @Nullable String exclude) {
            sourceBuilder().fetchSource(include, exclude);
            return this;
        }

        /**
         * Indicate that _source should be returned with every hit, with an "include" and/or "exclude" set which can include simple wildcard
         * elements.
         *
         * @param includes An optional list of include (optionally wildcarded) patterns to filter the returned _source
         * @param excludes An optional list of exclude (optionally wildcarded) patterns to filter the returned _source
         */
        public InnerHit setFetchSource(@Nullable String[] includes, @Nullable String[] excludes) {
            sourceBuilder().fetchSource(includes, excludes);
            return this;
        }

        /**
         * Adds a field data based field to load and return. The field does not have to be stored,
         * but it's recommended to use non-analyzed or numeric fields.
         *
         * @param name The field to get from the field data cache
         */
        public InnerHit addFieldDataField(String name) {
            sourceBuilder().fieldDataField(name);
            return this;
        }

        /**
         * Adds a script based field to load and return. The field does not have to be stored,
         * but it's recommended to use non-analyzed or numeric fields.
         *
         * @param name The name that will represent this value in the return hit
         * @param script The script to use
         */
        public InnerHit addScriptField(String name, String script) {
            sourceBuilder().scriptField(name, script);
            return this;
        }

        /**
         * Adds a script based field to load and return. The field does not have to be stored,
         * but it's recommended to use non-analyzed or numeric fields.
         *
         * @param name The name that will represent this value in the return hit
         * @param script The script to use
         * @param params Parameters that the script can use.
         */
        public InnerHit addScriptField(String name, String script, Map<String, Object> params) {
            sourceBuilder().scriptField(name, script, params);
            return this;
        }

        /**
         * Adds a script based field to load and return. The field does not have to be stored,
         * but it's recommended to use non-analyzed or numeric fields.
         *
         * @param name The name that will represent this value in the return hit
         * @param lang The language of the script
         * @param script The script to use
         * @param params Parameters that the script can use (can be <tt>null</tt>).
         */
        public InnerHit addScriptField(String name, String lang, String script, Map<String, Object> params) {
            sourceBuilder().scriptField(name, lang, script, params);
            return this;
        }

        /**
         * Adds a sort against the given field name and the sort ordering.
         *
         * @param field The name of the field
         * @param order The sort ordering
         */
        public InnerHit addSort(String field, SortOrder order) {
            sourceBuilder().sort(field, order);
            return this;
        }

        /**
         * Adds a generic sort builder.
         *
         * @see org.elasticsearch.search.sort.SortBuilders
         */
        public InnerHit addSort(SortBuilder sort) {
            sourceBuilder().sort(sort);
            return this;
        }

        /**
         * Adds a field to be highlighted with default fragment size of 100 characters, and
         * default number of fragments of 5.
         *
         * @param name The field to highlight
         */
        public InnerHit addHighlightedField(String name) {
            highlightBuilder().field(name);
            return this;
        }

        /**
         * Adds a field to be highlighted with a provided fragment size (in characters), and
         * default number of fragments of 5.
         *
         * @param name The field to highlight
         * @param fragmentSize The size of a fragment in characters
         */
        public InnerHit addHighlightedField(String name, int fragmentSize) {
            highlightBuilder().field(name, fragmentSize);
            return this;
        }

        /**
         * Adds a field to be highlighted with a provided fragment size (in characters), and
         * a provided (maximum) number of fragments.
         *
         * @param name The field to highlight
         * @param fragmentSize The size of a fragment in characters
         * @param numberOfFragments The (maximum) number of fragments
         */
        public InnerHit addHighlightedField(String name, int fragmentSize, int numberOfFragments) {
            highlightBuilder().field(name, fragmentSize, numberOfFragments);
            return this;
        }

        /**
         * Adds a field to be highlighted with a provided fragment size (in characters),
         * a provided (maximum) number of fragments and an offset for the highlight.
         *
         * @param name The field to highlight
         * @param fragmentSize The size of a fragment in characters
         * @param numberOfFragments The (maximum) number of fragments
         */
        public InnerHit addHighlightedField(String name, int fragmentSize, int numberOfFragments,
                                            int fragmentOffset) {
            highlightBuilder().field(name, fragmentSize, numberOfFragments, fragmentOffset);
            return this;
        }

        /**
         * Adds a highlighted field.
         */
        public InnerHit addHighlightedField(HighlightBuilder.Field field) {
            highlightBuilder().field(field);
            return this;
        }

        /**
         * Set a tag scheme that encapsulates built in pre and post tags. The allowed schemes
         * are <tt>styled</tt> and <tt>default</tt>.
         *
         * @param schemaName The tag scheme name
         */
        public InnerHit setHighlighterTagsSchema(String schemaName) {
            highlightBuilder().tagsSchema(schemaName);
            return this;
        }

        public InnerHit setHighlighterFragmentSize(Integer fragmentSize) {
            highlightBuilder().fragmentSize(fragmentSize);
            return this;
        }

        public InnerHit setHighlighterNumOfFragments(Integer numOfFragments) {
            highlightBuilder().numOfFragments(numOfFragments);
            return this;
        }

        public InnerHit setHighlighterFilter(Boolean highlightFilter) {
            highlightBuilder().highlightFilter(highlightFilter);
            return this;
        }

        /**
         * The encoder to set for highlighting
         */
        public InnerHit setHighlighterEncoder(String encoder) {
            highlightBuilder().encoder(encoder);
            return this;
        }

        /**
         * Explicitly set the pre tags that will be used for highlighting.
         */
        public InnerHit setHighlighterPreTags(String... preTags) {
            highlightBuilder().preTags(preTags);
            return this;
        }

        /**
         * Explicitly set the post tags that will be used for highlighting.
         */
        public InnerHit setHighlighterPostTags(String... postTags) {
            highlightBuilder().postTags(postTags);
            return this;
        }

        /**
         * The order of fragments per field. By default, ordered by the order in the
         * highlighted text. Can be <tt>score</tt>, in which case fragments are ordered
         * by the score of the fragments.
         */
        public InnerHit setHighlighterOrder(String order) {
            highlightBuilder().order(order);
            return this;
        }

        public InnerHit setHighlighterRequireFieldMatch(boolean requireFieldMatch) {
            highlightBuilder().requireFieldMatch(requireFieldMatch);
            return this;
        }

        public InnerHit setHighlighterBoundaryMaxScan(Integer boundaryMaxScan) {
            highlightBuilder().boundaryMaxScan(boundaryMaxScan);
            return this;
        }

        public InnerHit setHighlighterBoundaryChars(char[] boundaryChars) {
            highlightBuilder().boundaryChars(boundaryChars);
            return this;
        }

        /**
         * The highlighter type to use.
         */
        public InnerHit setHighlighterType(String type) {
            highlightBuilder().highlighterType(type);
            return this;
        }

        public InnerHit setHighlighterFragmenter(String fragmenter) {
            highlightBuilder().fragmenter(fragmenter);
            return this;
        }

        /**
         * Sets a query to be used for highlighting all fields instead of the search query.
         */
        public InnerHit setHighlighterQuery(QueryBuilder highlightQuery) {
            highlightBuilder().highlightQuery(highlightQuery);
            return this;
        }

        /**
         * Sets the size of the fragment to return from the beginning of the field if there are no matches to
         * highlight and the field doesn't also define noMatchSize.
         * @param noMatchSize integer to set or null to leave out of request. default is null.
         * @return this builder for chaining
         */
        public InnerHit setHighlighterNoMatchSize(Integer noMatchSize) {
            highlightBuilder().noMatchSize(noMatchSize);
            return this;
        }

        /**
         * Sets the maximum number of phrases the fvh will consider if the field doesn't also define phraseLimit.
         */
        public InnerHit setHighlighterPhraseLimit(Integer phraseLimit) {
            highlightBuilder().phraseLimit(phraseLimit);
            return this;
        }

        public InnerHit setHighlighterOptions(Map<String, Object> options) {
            highlightBuilder().options(options);
            return this;
        }

        public InnerHit addInnerHit(String name, InnerHit innerHit) {
            sourceBuilder().innerHitsBuilder().addInnerHit(name, innerHit);
            return this;
        }

        private SearchSourceBuilder sourceBuilder() {
            if (sourceBuilder == null) {
                sourceBuilder = new SearchSourceBuilder();
            }
            return sourceBuilder;
        }

        public HighlightBuilder highlightBuilder() {
            return sourceBuilder().highlighter();
        }

        @Override
        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
            if (path != null) {
                builder.startObject("path").startObject(path);
            } else {
                builder.startObject("type").startObject(type);
            }
            if (sourceBuilder != null) {
                sourceBuilder.innerToXContent(builder, params);
            }
            return builder.endObject().endObject();
        }
    }

}

@ -0,0 +1,279 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.search.fetch.innerhits;

import com.google.common.collect.ImmutableMap;
import org.apache.lucene.index.LeafReader;
import org.apache.lucene.index.LeafReaderContext;
import org.apache.lucene.index.Term;
import org.apache.lucene.queries.TermFilter;
import org.apache.lucene.search.*;
import org.apache.lucene.search.join.BitDocIdSetFilter;
import org.apache.lucene.util.BitSet;
import org.apache.lucene.util.Bits;
import org.elasticsearch.ExceptionsHelper;
import org.elasticsearch.common.lucene.search.AndFilter;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.Uid;
import org.elasticsearch.index.mapper.internal.ParentFieldMapper;
import org.elasticsearch.index.mapper.internal.UidFieldMapper;
import org.elasticsearch.index.mapper.object.ObjectMapper;
import org.elasticsearch.index.query.ParsedQuery;
import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.internal.FilteredSearchContext;
import org.elasticsearch.search.internal.SearchContext;

import java.io.IOException;
import java.util.Arrays;
import java.util.Map;

/**
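 * Holds the inner hit definitions of a search request. Each definition knows
 * how to compute its own top docs for a given top level search hit.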
*/
public final class InnerHitsContext {

    private Map<String, BaseInnerHits> innerHits;

    public InnerHitsContext(Map<String, BaseInnerHits> innerHits) {
        this.innerHits = innerHits;
    }

    public Map<String, BaseInnerHits> getInnerHits() {
        return innerHits;
    }

    public static abstract class BaseInnerHits extends FilteredSearchContext {

        protected final Query query;
        private final InnerHitsContext childInnerHits;

        protected BaseInnerHits(SearchContext context, Query query, Map<String, BaseInnerHits> childInnerHits) {
            super(context);
            this.query = query;
            if (childInnerHits != null && !childInnerHits.isEmpty()) {
                this.childInnerHits = new InnerHitsContext(childInnerHits);
            } else {
                this.childInnerHits = null;
            }
        }

        @Override
        public Query query() {
            return query;
        }

        @Override
        public ParsedQuery parsedQuery() {
            return new ParsedQuery(query, ImmutableMap.<String, Filter>of());
        }

        public abstract TopDocs topDocs(SearchContext context, FetchSubPhase.HitContext hitContext);

        @Override
        public InnerHitsContext innerHits() {
            return childInnerHits;
        }

    }

    public static final class NestedInnerHits extends BaseInnerHits {

        private final ObjectMapper parentObjectMapper;
        private final ObjectMapper childObjectMapper;

        public NestedInnerHits(SearchContext context, Query query, Map<String, BaseInnerHits> childInnerHits, ObjectMapper parentObjectMapper, ObjectMapper childObjectMapper) {
            super(context, query, childInnerHits);
            this.parentObjectMapper = parentObjectMapper;
            this.childObjectMapper = childObjectMapper;
        }

        @Override
        public TopDocs topDocs(SearchContext context, FetchSubPhase.HitContext hitContext) {
            TopDocsCollector topDocsCollector;
            int topN = from() + size();
            if (sort() != null) {
                try {
                    topDocsCollector = TopFieldCollector.create(sort(), topN, true, trackScores(), trackScores(), true);
                } catch (IOException e) {
                    throw ExceptionsHelper.convertToElastic(e);
                }
            } else {
                topDocsCollector = TopScoreDocCollector.create(topN, true);
            }

            Filter rawParentFilter;
            if (parentObjectMapper == null) {
                rawParentFilter = NonNestedDocsFilter.INSTANCE;
            } else {
                rawParentFilter = parentObjectMapper.nestedTypeFilter();
            }
            BitDocIdSetFilter parentFilter = context.bitsetFilterCache().getBitDocIdSetFilter(rawParentFilter);
            Filter childFilter = context.filterCache().cache(childObjectMapper.nestedTypeFilter());
            try {
                Query q = new FilteredQuery(query, new NestedChildrenFilter(parentFilter, childFilter, hitContext));
                context.searcher().search(q, topDocsCollector);
            } catch (IOException e) {
                throw ExceptionsHelper.convertToElastic(e);
            }
            return topDocsCollector.topDocs(from(), size());
        }

        // A filter that only emits the nested children docs of a specific nested parent doc
        static class NestedChildrenFilter extends Filter {

            private final BitDocIdSetFilter parentFilter;
            private final Filter childFilter;
            private final int docId;
            private final LeafReader atomicReader;

            NestedChildrenFilter(BitDocIdSetFilter parentFilter, Filter childFilter, FetchSubPhase.HitContext hitContext) {
                this.parentFilter = parentFilter;
                this.childFilter = childFilter;
                this.docId = hitContext.docId();
                this.atomicReader = hitContext.readerContext().reader();
            }

            @Override
            public DocIdSet getDocIdSet(LeafReaderContext context, final Bits acceptDocs) throws IOException {
                // Nested docs only reside in a single segment, so no need to evaluate all segments
                if (!context.reader().getCoreCacheKey().equals(this.atomicReader.getCoreCacheKey())) {
                    return null;
                }

                // If docId == 0 then the parent doc doesn't have child docs, because child docs are stored
                // before the parent doc, and since the parent doc is 0 we can safely assume there are no child docs.
                if (docId == 0) {
                    return null;
                }
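
                // Child docs of the parent at docId occupy the doc id range between the previous
                // parent doc (exclusive) and docId (exclusive), so the first candidate child is
                // the doc right after the previous set bit in the parents bitset.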
                final BitSet parents = parentFilter.getDocIdSet(context).bits();
                final int firstChildDocId = parents.prevSetBit(docId - 1) + 1;
                // A parent doc doesn't have child docs, so we can early exit here:
                if (firstChildDocId == docId) {
                    return null;
                }

                final DocIdSet children = childFilter.getDocIdSet(context, acceptDocs);
                if (children == null) {
                    return null;
                }
                final DocIdSetIterator childrenIterator = children.iterator();
                if (childrenIterator == null) {
                    return null;
                }
                return new DocIdSet() {

                    @Override
                    public long ramBytesUsed() {
                        return parents.ramBytesUsed() + children.ramBytesUsed();
                    }

                    @Override
                    public DocIdSetIterator iterator() throws IOException {
                        return new DocIdSetIterator() {

                            int currentDocId = -1;

                            @Override
                            public int docID() {
                                return currentDocId;
                            }

                            @Override
                            public int nextDoc() throws IOException {
                                return advance(currentDocId + 1);
                            }

                            @Override
                            public int advance(int target) throws IOException {
                                target = Math.max(firstChildDocId, target);
                                if (target >= docId) {
                                    // We're outside the child nested scope, so it is done
                                    return currentDocId = NO_MORE_DOCS;
                                } else {
                                    int advanced = childrenIterator.advance(target);
                                    if (advanced >= docId) {
                                        // We're outside the child nested scope, so it is done
                                        return currentDocId = NO_MORE_DOCS;
                                    } else {
                                        return currentDocId = advanced;
                                    }
                                }
                            }

                            @Override
                            public long cost() {
                                return childrenIterator.cost();
                            }
                        };
                    }
                };
            }
        }

    }

    public static final class ParentChildInnerHits extends BaseInnerHits {

        private final DocumentMapper documentMapper;

        public ParentChildInnerHits(SearchContext context, Query query, Map<String, BaseInnerHits> childInnerHits, DocumentMapper documentMapper) {
            super(context, query, childInnerHits);
            this.documentMapper = documentMapper;
        }

        @Override
        public TopDocs topDocs(SearchContext context, FetchSubPhase.HitContext hitContext) {
            TopDocsCollector topDocsCollector;
            int topN = from() + size();
            if (sort() != null) {
                try {
                    topDocsCollector = TopFieldCollector.create(sort(), topN, true, trackScores(), trackScores(), false);
                } catch (IOException e) {
                    throw ExceptionsHelper.convertToElastic(e);
                }
            } else {
                topDocsCollector = TopScoreDocCollector.create(topN, false);
            }

            String field;
            ParentFieldMapper hitParentFieldMapper = documentMapper.parentFieldMapper();
            if (hitParentFieldMapper.active()) {
                // Hit has an active _parent field and is a child doc, so we want parent docs as inner hits.
                field = ParentFieldMapper.NAME;
            } else {
                // Hit has no active _parent field and is a parent doc, so we want child docs as inner hits.
                field = UidFieldMapper.NAME;
            }
            String term = Uid.createUid(hitContext.hit().type(), hitContext.hit().id());
            Filter filter = new TermFilter(new Term(field, term)); // Only include docs that have the current hit as parent
            Filter typeFilter = documentMapper.typeFilter(); // Only include docs that have this inner hits type.
            try {
                context.searcher().search(
                        new FilteredQuery(query, new AndFilter(Arrays.asList(filter, typeFilter))),
                        topDocsCollector
                );
            } catch (IOException e) {
                throw ExceptionsHelper.convertToElastic(e);
            }
            return topDocsCollector.topDocs(from(), size());
        }
    }
}

@ -0,0 +1,122 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.search.fetch.innerhits;

import com.google.common.collect.ImmutableMap;
import org.apache.lucene.search.FieldDoc;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.TopDocs;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.search.SearchParseElement;
import org.elasticsearch.search.fetch.FetchPhase;
import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsParseElement;
import org.elasticsearch.search.fetch.script.ScriptFieldsParseElement;
import org.elasticsearch.search.fetch.source.FetchSourceParseElement;
import org.elasticsearch.search.highlight.HighlighterParseElement;
import org.elasticsearch.search.internal.InternalSearchHit;
import org.elasticsearch.search.internal.InternalSearchHits;
import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.search.sort.SortParseElement;

import java.util.HashMap;
import java.util.Map;

/**
|
||||
*/
|
||||
public class InnerHitsFetchSubPhase implements FetchSubPhase {
|
||||
|
||||
private final SortParseElement sortParseElement;
|
||||
private final FetchSourceParseElement sourceParseElement;
|
||||
private final HighlighterParseElement highlighterParseElement;
|
||||
private final FieldDataFieldsParseElement fieldDataFieldsParseElement;
|
||||
private final ScriptFieldsParseElement scriptFieldsParseElement;
|
||||
|
||||
private FetchPhase fetchPhase;
|
||||
|
||||
@Inject
|
||||
public InnerHitsFetchSubPhase(SortParseElement sortParseElement, FetchSourceParseElement sourceParseElement, HighlighterParseElement highlighterParseElement, FieldDataFieldsParseElement fieldDataFieldsParseElement, ScriptFieldsParseElement scriptFieldsParseElement) {
|
||||
this.sortParseElement = sortParseElement;
|
||||
this.sourceParseElement = sourceParseElement;
|
||||
this.highlighterParseElement = highlighterParseElement;
|
||||
this.fieldDataFieldsParseElement = fieldDataFieldsParseElement;
|
||||
this.scriptFieldsParseElement = scriptFieldsParseElement;
|
||||
}
|
||||
|
||||
@Override
|
||||
public Map<String, ? extends SearchParseElement> parseElements() {
|
||||
return ImmutableMap.of("inner_hits", new InnerHitsParseElement(
|
||||
sortParseElement, sourceParseElement, highlighterParseElement, fieldDataFieldsParseElement, scriptFieldsParseElement
|
||||
));
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean hitExecutionNeeded(SearchContext context) {
|
||||
return context.innerHits() != null;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void hitExecute(SearchContext context, HitContext hitContext) throws ElasticsearchException {
|
||||
InnerHitsContext innerHitsContext = context.innerHits();
|
||||
Map<String, InternalSearchHits> results = new HashMap<>();
|
||||
Map<String, InnerHitsContext.BaseInnerHits> innerHitsByKey = innerHitsContext.getInnerHits();
|
||||
for (Map.Entry<String, InnerHitsContext.BaseInnerHits> entry : innerHitsByKey.entrySet()) {
|
||||
InnerHitsContext.BaseInnerHits innerHits = entry.getValue();
|
||||
TopDocs topDocs = innerHits.topDocs(context, hitContext);
|
||||
innerHits.queryResult().topDocs(topDocs);
|
||||
int[] docIdsToLoad = new int[topDocs.scoreDocs.length];
|
||||
for (int i = 0; i < topDocs.scoreDocs.length; i++) {
|
||||
docIdsToLoad[i] = topDocs.scoreDocs[i].doc;
|
||||
}
|
||||
innerHits.docIdsToLoad(docIdsToLoad, 0, docIdsToLoad.length);
|
||||
fetchPhase.execute(innerHits);
|
||||
FetchSearchResult fetchResult = innerHits.fetchResult();
|
||||
InternalSearchHit[] internalHits = fetchResult.fetchResult().hits().internalHits();
|
||||
for (int i = 0; i < internalHits.length; i++) {
|
||||
ScoreDoc scoreDoc = topDocs.scoreDocs[i];
|
||||
InternalSearchHit searchHitFields = internalHits[i];
|
||||
searchHitFields.shard(innerHits.shardTarget());
|
||||
searchHitFields.score(scoreDoc.score);
|
||||
if (scoreDoc instanceof FieldDoc) {
|
||||
FieldDoc fieldDoc = (FieldDoc) scoreDoc;
|
||||
searchHitFields.sortValues(fieldDoc.fields);
|
||||
}
|
||||
}
|
||||
results.put(entry.getKey(), fetchResult.hits());
|
||||
}
|
||||
hitContext.hit().setInnerHits(results);
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean hitsExecutionNeeded(SearchContext context) {
|
||||
return false;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void hitsExecute(SearchContext context, InternalSearchHit[] hits) throws ElasticsearchException {
|
||||
}
|
||||
|
||||
// To get around cyclic dependency issue
|
||||
public void setFetchPhase(FetchPhase fetchPhase) {
|
||||
this.fetchPhase = fetchPhase;
|
||||
}
|
||||
}
|
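One detail worth calling out: `hitExecute` recursively drives the whole fetch phase on each inner-hit sub context (`fetchPhase.execute(innerHits)`), while `FetchPhase` in turn owns this sub-phase — hence the `setFetchPhase` setter at the bottom instead of constructor injection. A toy illustration of that cycle-breaking pattern, with hypothetical class names:

// Hypothetical names; the real wiring calls setFetchPhase after construction.
class SubPhase {
    private Phase phase; // not final: injected after both objects exist
    void setPhase(Phase phase) { this.phase = phase; }
    void execute() {
        // ...may recursively drive phase.run() on a nested context, as hitExecute
        // does with fetchPhase.execute(innerHits) above
    }
}

class Phase {
    private final SubPhase sub;
    Phase(SubPhase sub) { this.sub = sub; }
    void run() { sub.execute(); }
}

class Wiring {
    static Phase build() {
        SubPhase sub = new SubPhase();   // constructed without its Phase
        Phase phase = new Phase(sub);    // the Phase owns the sub-phase...
        sub.setPhase(phase);             // ...and the setter closes the cycle
        return phase;
    }
}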

@@ -0,0 +1,252 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.search.fetch.innerhits;

import org.apache.lucene.search.MatchAllDocsQuery;
import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.mapper.object.ObjectMapper;
import org.elasticsearch.index.query.NestedQueryParser;
import org.elasticsearch.search.SearchParseElement;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsParseElement;
import org.elasticsearch.search.fetch.script.ScriptFieldsParseElement;
import org.elasticsearch.search.fetch.source.FetchSourceParseElement;
import org.elasticsearch.search.highlight.HighlighterParseElement;
import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.search.internal.SubSearchContext;
import org.elasticsearch.search.sort.SortParseElement;

import java.util.HashMap;
import java.util.Map;

/**
 */
public class InnerHitsParseElement implements SearchParseElement {

    private final SortParseElement sortParseElement;
    private final FetchSourceParseElement sourceParseElement;
    private final HighlighterParseElement highlighterParseElement;
    private final FieldDataFieldsParseElement fieldDataFieldsParseElement;
    private final ScriptFieldsParseElement scriptFieldsParseElement;

    public InnerHitsParseElement(SortParseElement sortParseElement, FetchSourceParseElement sourceParseElement, HighlighterParseElement highlighterParseElement, FieldDataFieldsParseElement fieldDataFieldsParseElement, ScriptFieldsParseElement scriptFieldsParseElement) {
        this.sortParseElement = sortParseElement;
        this.sourceParseElement = sourceParseElement;
        this.highlighterParseElement = highlighterParseElement;
        this.fieldDataFieldsParseElement = fieldDataFieldsParseElement;
        this.scriptFieldsParseElement = scriptFieldsParseElement;
    }

    @Override
    public void parse(XContentParser parser, SearchContext context) throws Exception {
        Map<String, InnerHitsContext.BaseInnerHits> innerHitsMap = parseInnerHits(parser, context);
        if (innerHitsMap != null) {
            context.innerHits(new InnerHitsContext(innerHitsMap));
        }
    }

    private Map<String, InnerHitsContext.BaseInnerHits> parseInnerHits(XContentParser parser, SearchContext context) throws Exception {
        XContentParser.Token token;
        Map<String, InnerHitsContext.BaseInnerHits> innerHitsMap = null;
        while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
            if (token != XContentParser.Token.FIELD_NAME) {
                throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " in [inner_hits]: aggregations definitions must start with the name of the aggregation.");
            }
            final String innerHitName = parser.currentName();
            token = parser.nextToken();
            if (token != XContentParser.Token.START_OBJECT) {
                throw new ElasticsearchIllegalArgumentException("Inner hit definition for [" + innerHitName + " starts with a [" + token + "], expected a [" + XContentParser.Token.START_OBJECT + "].");
            }
            InnerHitsContext.BaseInnerHits innerHits = parseInnerHit(parser, context, innerHitName);
            if (innerHitsMap == null) {
                innerHitsMap = new HashMap<>();
            }
            innerHitsMap.put(innerHitName, innerHits);
        }
        return innerHitsMap;
    }

    private InnerHitsContext.BaseInnerHits parseInnerHit(XContentParser parser, SearchContext context, String innerHitName) throws Exception {
        XContentParser.Token token = parser.nextToken();
        if (token != XContentParser.Token.FIELD_NAME) {
            throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " inside inner hit definition. Either specify [path] or [type] object");
        }
        String fieldName = parser.currentName();
        token = parser.nextToken();
        if (token != XContentParser.Token.START_OBJECT) {
            throw new ElasticsearchIllegalArgumentException("Inner hit definition for [" + innerHitName + " starts with a [" + token + "], expected a [" + XContentParser.Token.START_OBJECT + "].");
        }
        final boolean nested;
        switch (fieldName) {
            case "path":
                nested = true;
                break;
            case "type":
                nested = false;
                break;
            default:
                throw new ElasticsearchIllegalArgumentException("Either path or type object must be defined");
        }
        token = parser.nextToken();
        if (token != XContentParser.Token.FIELD_NAME) {
            throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " inside inner hit definition. Either specify [path] or [type] object");
        }
        fieldName = parser.currentName();
        token = parser.nextToken();
        if (token != XContentParser.Token.START_OBJECT) {
            throw new ElasticsearchIllegalArgumentException("Inner hit definition for [" + innerHitName + " starts with a [" + token + "], expected a [" + XContentParser.Token.START_OBJECT + "].");
        }

        NestedQueryParser.LateBindingParentFilter parentFilter = null;
        NestedQueryParser.LateBindingParentFilter currentFilter = null;

        String nestedPath = null;
        String type = null;
        if (nested) {
            nestedPath = fieldName;
            currentFilter = new NestedQueryParser.LateBindingParentFilter();
            parentFilter = NestedQueryParser.parentFilterContext.get();
            NestedQueryParser.parentFilterContext.set(currentFilter);
        } else {
            type = fieldName;
        }

        Query query = null;
        Map<String, InnerHitsContext.BaseInnerHits> childInnerHits = null;
        SubSearchContext subSearchContext = new SubSearchContext(context);
        while ((token = parser.nextToken()) != XContentParser.Token.END_OBJECT) {
            if (token == XContentParser.Token.FIELD_NAME) {
                fieldName = parser.currentName();
            } else if ("sort".equals(fieldName)) {
                sortParseElement.parse(parser, subSearchContext);
            } else if ("_source".equals(fieldName)) {
                sourceParseElement.parse(parser, subSearchContext);
            } else if (token == XContentParser.Token.START_OBJECT) {
                switch (fieldName) {
                    case "highlight":
                        highlighterParseElement.parse(parser, subSearchContext);
                        break;
                    case "scriptFields":
                    case "script_fields":
                        scriptFieldsParseElement.parse(parser, subSearchContext);
                        break;
                    case "inner_hits":
                        childInnerHits = parseInnerHits(parser, subSearchContext);
                        break;
                    case "query":
                        query = context.queryParserService().parse(parser).query();
                        break;
                    default:
                        throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " in [" + innerHitName + "]: [" + fieldName + "].");
                }
            } else if (token == XContentParser.Token.START_ARRAY) {
                switch (fieldName) {
                    case "fielddataFields":
                    case "fielddata_fields":
                        fieldDataFieldsParseElement.parse(parser, subSearchContext);
                        break;
                    default:
                        throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " in [" + innerHitName + "]: [" + fieldName + "].");
                }
            } else if (token.isValue()) {
                switch (fieldName) {
                    case "query":
                        query = context.queryParserService().parse(parser).query();
                        break;
                    case "from":
                        subSearchContext.from(parser.intValue());
                        break;
                    case "size":
                        subSearchContext.size(parser.intValue());
                        break;
                    case "track_scores":
                    case "trackScores":
                        subSearchContext.trackScores(parser.booleanValue());
                        break;
                    case "version":
                        subSearchContext.version(parser.booleanValue());
                        break;
                    case "explain":
                        subSearchContext.explain(parser.booleanValue());
                        break;
                    default:
                        throw new ElasticsearchIllegalArgumentException("Unknown key for a " + token + " in [" + innerHitName + "]: [" + fieldName + "].");
                }
            } else {
                throw new ElasticsearchIllegalArgumentException("Unexpected token " + token + " in [" + innerHitName + "].");
            }
        }

        // Completely consume all json objects:
        token = parser.nextToken();
        if (token != XContentParser.Token.END_OBJECT) {
            throw new ElasticsearchIllegalArgumentException("Expected [" + XContentParser.Token.END_OBJECT + "] token, but got a [" + token + "] token.");
        }
        token = parser.nextToken();
        if (token != XContentParser.Token.END_OBJECT) {
            throw new ElasticsearchIllegalArgumentException("Expected [" + XContentParser.Token.END_OBJECT + "] token, but got a [" + token + "] token.");
        }

        if (query == null) {
            query = new MatchAllDocsQuery();
        }

        if (nestedPath != null && type != null) {
            throw new ElasticsearchIllegalArgumentException("Either [path] or [type] can be defined not both");
        } else if (nestedPath != null) {
            MapperService.SmartNameObjectMapper smartNameObjectMapper = context.smartNameObjectMapper(nestedPath);
            if (smartNameObjectMapper == null || !smartNameObjectMapper.hasMapper()) {
                throw new ElasticsearchIllegalArgumentException("path [" + nestedPath + "] doesn't exist");
            }
            ObjectMapper childObjectMapper = smartNameObjectMapper.mapper();
            if (!childObjectMapper.nested().isNested()) {
                throw new ElasticsearchIllegalArgumentException("path [" + nestedPath + "] isn't nested");
            }
            DocumentMapper childDocumentMapper = smartNameObjectMapper.docMapper();
            if (childDocumentMapper == null) {
                for (DocumentMapper documentMapper : context.mapperService().docMappers(false)) {
                    if (documentMapper.objectMappers().containsKey(nestedPath)) {
                        childDocumentMapper = documentMapper;
                        break;
                    }
                }
            }
            if (currentFilter != null && childDocumentMapper != null) {
                currentFilter.filter = context.bitsetFilterCache().getBitDocIdSetFilter(childObjectMapper.nestedTypeFilter());
                NestedQueryParser.parentFilterContext.set(parentFilter);
            }

            ObjectMapper parentObjectMapper = childDocumentMapper.findParentObjectMapper(childObjectMapper);
            return new InnerHitsContext.NestedInnerHits(subSearchContext, query, childInnerHits, parentObjectMapper, childObjectMapper);
        } else if (type != null) {
            DocumentMapper documentMapper = context.mapperService().documentMapper(type);
            if (documentMapper == null) {
                throw new ElasticsearchIllegalArgumentException("type [" + type + "] doesn't exist");
            }
            return new InnerHitsContext.ParentChildInnerHits(subSearchContext, query, childInnerHits, documentMapper);
        } else {
            throw new ElasticsearchIllegalArgumentException("Either [path] or [type] must be defined");
        }
    }
}
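The parse element is a hand-rolled pull parser: advance a token, check it is the token you expect, dispatch on the current field name, and fail loudly otherwise. The same loop shape, shown against Jackson's streaming API purely as an illustration (Elasticsearch's `XContentParser` wraps an equivalent token model; the JSON body here is a made-up inner hits definition):

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class PullParseSketch {
    public static void main(String[] args) throws Exception {
        // A made-up inner hits body: one definition named "comments" over a nested path.
        String json = "{\"comments\":{\"path\":{\"comments\":{\"size\":2}}}}";
        JsonParser p = new JsonFactory().createParser(json);
        p.nextToken(); // consume the outer START_OBJECT
        while (p.nextToken() != JsonToken.END_OBJECT) {
            // Each definition must start with its name...
            if (p.getCurrentToken() != JsonToken.FIELD_NAME) {
                throw new IllegalArgumentException("expected a field name, got " + p.getCurrentToken());
            }
            String innerHitName = p.getCurrentName();
            // ...followed by an object body (here simply skipped instead of parsed).
            if (p.nextToken() != JsonToken.START_OBJECT) {
                throw new IllegalArgumentException("definition for [" + innerHitName + "] must be an object");
            }
            p.skipChildren();
            System.out.println("parsed inner hit definition: " + innerHitName);
        }
    }
}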

@@ -59,6 +59,7 @@ import org.elasticsearch.search.aggregations.SearchContextAggregations;
 import org.elasticsearch.search.dfs.DfsSearchResult;
 import org.elasticsearch.search.fetch.FetchSearchResult;
 import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
+import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
 import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
 import org.elasticsearch.search.fetch.source.FetchSourceContext;
 import org.elasticsearch.search.highlight.SearchContextHighlight;

@@ -176,6 +177,8 @@ public class DefaultSearchContext extends SearchContext {

     private volatile boolean useSlowScroll;

+    private InnerHitsContext innerHitsContext;
+
     public DefaultSearchContext(long id, ShardSearchRequest request, SearchShardTarget shardTarget,
                                 Engine.Searcher engineSearcher, IndexService indexService, IndexShard indexShard,
                                 ScriptService scriptService, PageCacheRecycler pageCacheRecycler,

@@ -700,4 +703,14 @@ public class DefaultSearchContext extends SearchContext {
     public Counter timeEstimateCounter() {
         return timeEstimateCounter;
     }
+
+    @Override
+    public void innerHits(InnerHitsContext innerHitsContext) {
+        this.innerHitsContext = innerHitsContext;
+    }
+
+    @Override
+    public InnerHitsContext innerHits() {
+        return innerHitsContext;
+    }
 }

@@ -16,10 +16,9 @@
  * specific language governing permissions and limitations
  * under the License.
  */
-package org.elasticsearch.search.aggregations.metrics.tophits;
-
-import com.google.common.collect.ImmutableList;
-import com.google.common.collect.Lists;
+package org.elasticsearch.search.internal;

 import org.apache.lucene.search.Filter;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.ScoreDoc;

@@ -47,12 +46,10 @@ import org.elasticsearch.search.aggregations.SearchContextAggregations;
 import org.elasticsearch.search.dfs.DfsSearchResult;
 import org.elasticsearch.search.fetch.FetchSearchResult;
 import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
+import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
 import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
 import org.elasticsearch.search.fetch.source.FetchSourceContext;
 import org.elasticsearch.search.highlight.SearchContextHighlight;
-import org.elasticsearch.search.internal.ContextIndexSearcher;
-import org.elasticsearch.search.internal.SearchContext;
-import org.elasticsearch.search.internal.ShardSearchRequest;
 import org.elasticsearch.search.lookup.SearchLookup;
 import org.elasticsearch.search.query.QuerySearchResult;
 import org.elasticsearch.search.rescore.RescoreSearchContext;
@@ -63,541 +60,511 @@ import java.util.List;

 /**
  */
-public class TopHitsContext extends SearchContext {
+public abstract class FilteredSearchContext extends SearchContext {

-    // By default return 3 hits per bucket. A higher default would make the response really large by default, since
-    // the to hits are returned per bucket.
-    private final static int DEFAULT_SIZE = 3;
+    private final SearchContext in;

-    private int from;
-    private int size = DEFAULT_SIZE;
-    private Sort sort;
-
-    private final FetchSearchResult fetchSearchResult;
-    private final QuerySearchResult querySearchResult;
-
-    private int[] docIdsToLoad;
-    private int docsIdsToLoadFrom;
-    private int docsIdsToLoadSize;
-
-    private final SearchContext context;
-
-    private List<String> fieldNames;
-    private FieldDataFieldsContext fieldDataFields;
-    private ScriptFieldsContext scriptFields;
-    private FetchSourceContext fetchSourceContext;
-    private SearchContextHighlight highlight;
-
-    private boolean explain;
-    private boolean trackScores;
-    private boolean version;
-
-    public TopHitsContext(SearchContext context) {
-        this.fetchSearchResult = new FetchSearchResult();
-        this.querySearchResult = new QuerySearchResult();
-        this.context = context;
+    public FilteredSearchContext(SearchContext in) {
+        this.in = in;
     }

+    @Override
+    protected void doClose() {
+        in.doClose();
+    }
+
+    @Override
+    public void preProcess() {
+        in.preProcess();
+    }
+
     @Override
     public Filter searchFilter(String[] types) {
-        throw new UnsupportedOperationException("this context should be read only");
+        return in.searchFilter(types);
     }

     @Override
     public long id() {
-        return context.id();
+        return in.id();
     }

     @Override
     public String source() {
-        return context.source();
+        return in.source();
     }

     @Override
     public ShardSearchRequest request() {
-        return context.request();
+        return in.request();
     }

     @Override
     public SearchType searchType() {
-        return context.searchType();
+        return in.searchType();
     }

     @Override
     public SearchContext searchType(SearchType searchType) {
-        throw new UnsupportedOperationException("this context should be read only");
+        return in.searchType(searchType);
     }

     @Override
     public SearchShardTarget shardTarget() {
-        return context.shardTarget();
+        return in.shardTarget();
     }

     @Override
     public int numberOfShards() {
-        return context.numberOfShards();
+        return in.numberOfShards();
     }

     @Override
     public boolean hasTypes() {
-        return context.hasTypes();
+        return in.hasTypes();
     }

     @Override
     public String[] types() {
-        return context.types();
+        return in.types();
     }

     @Override
     public float queryBoost() {
-        return context.queryBoost();
+        return in.queryBoost();
     }

     @Override
     public SearchContext queryBoost(float queryBoost) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.queryBoost(queryBoost);
     }

     @Override
     protected long nowInMillisImpl() {
-        return context.nowInMillis();
+        return in.nowInMillisImpl();
     }

     @Override
     public Scroll scroll() {
-        return context.scroll();
+        return in.scroll();
     }

     @Override
     public SearchContext scroll(Scroll scroll) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.scroll(scroll);
     }

     @Override
     public SearchContextAggregations aggregations() {
-        return context.aggregations();
+        return in.aggregations();
     }

     @Override
     public SearchContext aggregations(SearchContextAggregations aggregations) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.aggregations(aggregations);
     }

     @Override
     public SearchContextHighlight highlight() {
-        return highlight;
+        return in.highlight();
     }

     @Override
     public void highlight(SearchContextHighlight highlight) {
-        this.highlight = highlight;
+        in.highlight(highlight);
     }

+    @Override
+    public void innerHits(InnerHitsContext innerHitsContext) {
+        in.innerHits(innerHitsContext);
+    }
+
+    @Override
+    public InnerHitsContext innerHits() {
+        return in.innerHits();
+    }
+
     @Override
     public SuggestionSearchContext suggest() {
-        return context.suggest();
+        return in.suggest();
     }

     @Override
     public void suggest(SuggestionSearchContext suggest) {
-        throw new UnsupportedOperationException("Not supported");
+        in.suggest(suggest);
     }

     @Override
     public List<RescoreSearchContext> rescore() {
-        return context.rescore();
+        return in.rescore();
     }

     @Override
     public void addRescore(RescoreSearchContext rescore) {
-        throw new UnsupportedOperationException("Not supported");
+        in.addRescore(rescore);
     }

     @Override
     public boolean hasFieldDataFields() {
-        return fieldDataFields != null;
+        return in.hasFieldDataFields();
     }

     @Override
     public FieldDataFieldsContext fieldDataFields() {
-        if (fieldDataFields == null) {
-            fieldDataFields = new FieldDataFieldsContext();
-        }
-        return this.fieldDataFields;
+        return in.fieldDataFields();
     }

     @Override
     public boolean hasScriptFields() {
-        return scriptFields != null;
+        return in.hasScriptFields();
     }

     @Override
     public ScriptFieldsContext scriptFields() {
-        if (scriptFields == null) {
-            scriptFields = new ScriptFieldsContext();
-        }
-        return this.scriptFields;
+        return in.scriptFields();
     }

     @Override
     public boolean sourceRequested() {
-        return fetchSourceContext != null && fetchSourceContext.fetchSource();
+        return in.sourceRequested();
     }

     @Override
     public boolean hasFetchSourceContext() {
-        return fetchSourceContext != null;
+        return in.hasFetchSourceContext();
     }

     @Override
     public FetchSourceContext fetchSourceContext() {
-        return fetchSourceContext;
+        return in.fetchSourceContext();
     }

     @Override
     public SearchContext fetchSourceContext(FetchSourceContext fetchSourceContext) {
-        this.fetchSourceContext = fetchSourceContext;
-        return this;
+        return in.fetchSourceContext(fetchSourceContext);
     }

     @Override
     public ContextIndexSearcher searcher() {
-        return context.searcher();
+        return in.searcher();
     }

     @Override
     public IndexShard indexShard() {
-        return context.indexShard();
+        return in.indexShard();
     }

     @Override
     public MapperService mapperService() {
-        return context.mapperService();
+        return in.mapperService();
     }

     @Override
     public AnalysisService analysisService() {
-        return context.analysisService();
+        return in.analysisService();
     }

     @Override
     public IndexQueryParserService queryParserService() {
-        return context.queryParserService();
+        return in.queryParserService();
     }

     @Override
     public SimilarityService similarityService() {
-        return context.similarityService();
+        return in.similarityService();
     }

     @Override
     public ScriptService scriptService() {
-        return context.scriptService();
+        return in.scriptService();
     }

     @Override
     public PageCacheRecycler pageCacheRecycler() {
-        return context.pageCacheRecycler();
+        return in.pageCacheRecycler();
     }

     @Override
     public BigArrays bigArrays() {
-        return context.bigArrays();
+        return in.bigArrays();
     }

     @Override
     public FilterCache filterCache() {
-        return context.filterCache();
+        return in.filterCache();
     }

     @Override
     public BitsetFilterCache bitsetFilterCache() {
-        return context.bitsetFilterCache();
+        return in.bitsetFilterCache();
     }

     @Override
     public IndexFieldDataService fieldData() {
-        return context.fieldData();
+        return in.fieldData();
     }

     @Override
     public long timeoutInMillis() {
-        return context.timeoutInMillis();
+        return in.timeoutInMillis();
     }

     @Override
     public void timeoutInMillis(long timeoutInMillis) {
-        throw new UnsupportedOperationException("Not supported");
+        in.timeoutInMillis(timeoutInMillis);
     }

     @Override
     public int terminateAfter() {
-        return context.terminateAfter();
+        return in.terminateAfter();
     }

     @Override
     public void terminateAfter(int terminateAfter) {
-        throw new UnsupportedOperationException("Not supported");
+        in.terminateAfter(terminateAfter);
     }

     @Override
     public SearchContext minimumScore(float minimumScore) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.minimumScore(minimumScore);
     }

     @Override
     public Float minimumScore() {
-        return context.minimumScore();
+        return in.minimumScore();
     }

     @Override
     public SearchContext sort(Sort sort) {
-        this.sort = sort;
-        return null;
+        return in.sort(sort);
     }

     @Override
     public Sort sort() {
-        return sort;
+        return in.sort();
     }

     @Override
     public SearchContext trackScores(boolean trackScores) {
-        this.trackScores = trackScores;
-        return this;
+        return in.trackScores(trackScores);
     }

     @Override
     public boolean trackScores() {
-        return trackScores;
+        return in.trackScores();
     }

     @Override
     public SearchContext parsedPostFilter(ParsedFilter postFilter) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.parsedPostFilter(postFilter);
     }

     @Override
     public ParsedFilter parsedPostFilter() {
-        return context.parsedPostFilter();
+        return in.parsedPostFilter();
     }

     @Override
     public Filter aliasFilter() {
-        return context.aliasFilter();
+        return in.aliasFilter();
     }

     @Override
     public SearchContext parsedQuery(ParsedQuery query) {
-        return context.parsedQuery(query);
+        return in.parsedQuery(query);
     }

     @Override
     public ParsedQuery parsedQuery() {
-        return context.parsedQuery();
+        return in.parsedQuery();
     }

     @Override
     public Query query() {
-        return context.query();
+        return in.query();
     }

     @Override
     public boolean queryRewritten() {
-        return context.queryRewritten();
+        return in.queryRewritten();
     }

     @Override
     public SearchContext updateRewriteQuery(Query rewriteQuery) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.updateRewriteQuery(rewriteQuery);
     }

     @Override
     public int from() {
-        return from;
+        return in.from();
     }

     @Override
     public SearchContext from(int from) {
-        this.from = from;
-        return this;
+        return in.from(from);
     }

     @Override
     public int size() {
-        return size;
+        return in.size();
     }

     @Override
     public SearchContext size(int size) {
-        this.size = size;
-        return this;
+        return in.size(size);
     }

     @Override
     public boolean hasFieldNames() {
-        return fieldNames != null;
+        return in.hasFieldNames();
     }

     @Override
     public List<String> fieldNames() {
-        if (fieldNames == null) {
-            fieldNames = Lists.newArrayList();
-        }
-        return fieldNames;
+        return in.fieldNames();
     }

     @Override
     public void emptyFieldNames() {
-        this.fieldNames = ImmutableList.of();
+        in.emptyFieldNames();
     }

     @Override
     public boolean explain() {
-        return explain;
+        return in.explain();
     }

     @Override
     public void explain(boolean explain) {
-        this.explain = explain;
+        in.explain(explain);
     }

     @Override
     public List<String> groupStats() {
-        return context.groupStats();
+        return in.groupStats();
     }

     @Override
     public void groupStats(List<String> groupStats) {
-        throw new UnsupportedOperationException("Not supported");
+        in.groupStats(groupStats);
     }

     @Override
     public boolean version() {
-        return version;
+        return in.version();
     }

     @Override
     public void version(boolean version) {
-        this.version = version;
+        in.version(version);
     }

     @Override
     public int[] docIdsToLoad() {
-        return docIdsToLoad;
+        return in.docIdsToLoad();
     }

     @Override
     public int docIdsToLoadFrom() {
-        return docsIdsToLoadFrom;
+        return in.docIdsToLoadFrom();
     }

     @Override
     public int docIdsToLoadSize() {
-        return docsIdsToLoadSize;
+        return in.docIdsToLoadSize();
     }

     @Override
     public SearchContext docIdsToLoad(int[] docIdsToLoad, int docsIdsToLoadFrom, int docsIdsToLoadSize) {
-        this.docIdsToLoad = docIdsToLoad;
-        this.docsIdsToLoadFrom = docsIdsToLoadFrom;
-        this.docsIdsToLoadSize = docsIdsToLoadSize;
-        return this;
+        return in.docIdsToLoad(docIdsToLoad, docsIdsToLoadFrom, docsIdsToLoadSize);
     }

     @Override
     public void accessed(long accessTime) {
-        throw new UnsupportedOperationException("Not supported");
+        in.accessed(accessTime);
     }

     @Override
     public long lastAccessTime() {
-        return context.lastAccessTime();
+        return in.lastAccessTime();
     }

     @Override
     public long keepAlive() {
-        return context.keepAlive();
+        return in.keepAlive();
     }

     @Override
     public void keepAlive(long keepAlive) {
-        throw new UnsupportedOperationException("Not supported");
+        in.keepAlive(keepAlive);
     }

     @Override
     public void lastEmittedDoc(ScoreDoc doc) {
-        throw new UnsupportedOperationException("Not supported");
+        in.lastEmittedDoc(doc);
     }

     @Override
     public ScoreDoc lastEmittedDoc() {
-        return context.lastEmittedDoc();
+        return in.lastEmittedDoc();
     }

     @Override
     public SearchLookup lookup() {
-        return context.lookup();
+        return in.lookup();
     }

     @Override
     public DfsSearchResult dfsResult() {
-        return context.dfsResult();
+        return in.dfsResult();
     }

     @Override
     public QuerySearchResult queryResult() {
-        return querySearchResult;
+        return in.queryResult();
     }

     @Override
     public FetchSearchResult fetchResult() {
-        return fetchSearchResult;
+        return in.fetchResult();
     }

     @Override
     public ScanContext scanContext() {
-        return context.scanContext();
+        return in.scanContext();
     }

     @Override
     public MapperService.SmartNameFieldMappers smartFieldMappers(String name) {
-        return context.smartFieldMappers(name);
+        return in.smartFieldMappers(name);
     }

     @Override
     public FieldMappers smartNameFieldMappers(String name) {
-        return context.smartNameFieldMappers(name);
+        return in.smartNameFieldMappers(name);
     }

     @Override
     public FieldMapper smartNameFieldMapper(String name) {
-        return context.smartNameFieldMapper(name);
+        return in.smartNameFieldMapper(name);
     }

     @Override
     public MapperService.SmartNameObjectMapper smartNameObjectMapper(String name) {
-        return context.smartNameObjectMapper(name);
+        return in.smartNameObjectMapper(name);
     }

     @Override
     public boolean useSlowScroll() {
-        return context.useSlowScroll();
+        return in.useSlowScroll();
     }

     @Override
     public SearchContext useSlowScroll(boolean useSlowScroll) {
-        throw new UnsupportedOperationException("Not supported");
+        return in.useSlowScroll(useSlowScroll);
     }

     @Override
     public Counter timeEstimateCounter() {
-        throw new UnsupportedOperationException("Not supported");
+        return in.timeEstimateCounter();
     }
 }

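What replaced `TopHitsContext` here is a textbook forwarding wrapper: `FilteredSearchContext` delegates every `SearchContext` method to the wrapped instance, so subclasses override only the state they actually own. Reduced to a toy interface (hypothetical names, not Elasticsearch types), the pattern is:

// Toy version of the forwarding-wrapper pattern used by FilteredSearchContext.
interface Context {
    int size();
    String source();
}

class ForwardingContext implements Context {
    private final Context in;
    ForwardingContext(Context in) { this.in = in; }
    @Override public int size() { return in.size(); }        // forward by default
    @Override public String source() { return in.source(); }
}

// A sub context overrides only what it owns; everything else still forwards.
class SubContext extends ForwardingContext {
    private int size = 3; // independent local value, like SubSearchContext's DEFAULT_SIZE
    SubContext(Context in) { super(in); }
    @Override public int size() { return size; }
}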
@@ -41,11 +41,13 @@ import org.elasticsearch.common.xcontent.XContentHelper;
 import org.elasticsearch.index.fielddata.fieldcomparator.BytesRefFieldComparatorSource;
 import org.elasticsearch.search.SearchHit;
 import org.elasticsearch.search.SearchHitField;
+import org.elasticsearch.search.SearchHits;
 import org.elasticsearch.search.SearchShardTarget;
 import org.elasticsearch.search.highlight.HighlightField;
 import org.elasticsearch.search.lookup.SourceLookup;

 import java.io.IOException;
+import java.util.HashMap;
 import java.util.Iterator;
 import java.util.Map;

@@ -92,6 +94,8 @@ public class InternalSearchHit implements SearchHit {
     private Map<String, Object> sourceAsMap;
     private byte[] sourceAsBytes;

+    private Map<String, InternalSearchHits> innerHits;
+
     private InternalSearchHit() {

     }

@@ -117,6 +121,11 @@ public class InternalSearchHit implements SearchHit {

     public void shardTarget(SearchShardTarget shardTarget) {
         this.shard = shardTarget;
+        if (innerHits != null) {
+            for (InternalSearchHits searchHits : innerHits.values()) {
+                searchHits.shardTarget(shardTarget);
+            }
+        }
     }

     public void score(float score) {

@@ -392,6 +401,15 @@ public class InternalSearchHit implements SearchHit {
         return this.matchedQueries;
     }

+    @SuppressWarnings("unchecked")
+    public Map<String, SearchHits> getInnerHits() {
+        return (Map) innerHits;
+    }
+
+    public void setInnerHits(Map<String, InternalSearchHits> innerHits) {
+        this.innerHits = innerHits;
+    }
+
     public static class Fields {
         static final XContentBuilderString _INDEX = new XContentBuilderString("_index");
         static final XContentBuilderString _TYPE = new XContentBuilderString("_type");

@@ -406,16 +424,21 @@ public class InternalSearchHit implements SearchHit {
         static final XContentBuilderString VALUE = new XContentBuilderString("value");
         static final XContentBuilderString DESCRIPTION = new XContentBuilderString("description");
         static final XContentBuilderString DETAILS = new XContentBuilderString("details");
+        static final XContentBuilderString INNER_HITS = new XContentBuilderString("inner_hits");
     }

     @Override
     public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
         builder.startObject();
-        if (explanation() != null) {
+        // For inner_hit hits shard is null and that is ok, because the parent search hit has all this information.
+        // Even if this was included in the inner_hit hits this would be the same, so better leave it out.
+        if (explanation() != null && shard != null) {
             builder.field("_shard", shard.shardId());
             builder.field("_node", shard.nodeIdText());
         }
-        builder.field(Fields._INDEX, shard.indexText());
+        if (shard != null) {
+            builder.field(Fields._INDEX, shard.indexText());
+        }
         builder.field(Fields._TYPE, type);
         builder.field(Fields._ID, id);
         if (nestedIdentity != null) {

@@ -491,6 +514,15 @@ public class InternalSearchHit implements SearchHit {
             builder.field(Fields._EXPLANATION);
             buildExplanation(builder, explanation());
         }
+        if (innerHits != null) {
+            builder.startObject(Fields.INNER_HITS);
+            for (Map.Entry<String, InternalSearchHits> entry : innerHits.entrySet()) {
+                builder.startObject(entry.getKey());
+                entry.getValue().toXContent(builder, params);
+                builder.endObject();
+            }
+            builder.endObject();
+        }
         builder.endObject();
         return builder;
     }

@@ -652,6 +684,18 @@ public class InternalSearchHit implements SearchHit {
                 shard = context.handleShardLookup().get(lookupId);
             }
         }
+
+        if (in.getVersion().onOrAfter(Version.V_1_5_0)) {
+            size = in.readVInt();
+            if (size > 0) {
+                innerHits = new HashMap<>(size);
+                for (int i = 0; i < size; i++) {
+                    String key = in.readString();
+                    InternalSearchHits value = InternalSearchHits.readSearchHits(in, InternalSearchHits.streamContext().streamShardTarget(InternalSearchHits.StreamContext.ShardTargetType.NO_STREAM));
+                    innerHits.put(key, value);
+                }
+            }
+        }
     }

     @Override

@@ -757,6 +801,18 @@ public class InternalSearchHit implements SearchHit {
                 out.writeVInt(context.shardHandleLookup().get(shard));
             }
         }
+
+        if (out.getVersion().onOrAfter(Version.V_1_5_0)) {
+            if (innerHits == null) {
+                out.writeVInt(0);
+            } else {
+                out.writeVInt(innerHits.size());
+                for (Map.Entry<String, InternalSearchHits> entry : innerHits.entrySet()) {
+                    out.writeString(entry.getKey());
+                    entry.getValue().writeTo(out, InternalSearchHits.streamContext().streamShardTarget(InternalSearchHits.StreamContext.ShardTargetType.NO_STREAM));
+                }
+            }
+        }
     }

     public final static class InternalNestedIdentity implements NestedIdentity, Streamable, ToXContent {

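Both serialization sides gate the new field on `Version.V_1_5_0` and use a count-prefixed map layout, with a zero count doubling as "no inner hits". A self-contained sketch of the same layout over plain `java.io` streams — `writeInt`/`writeUTF` stand in for Elasticsearch's vint and string codecs, and the `String` payload is a placeholder for the serialized hits:

import java.io.*;
import java.util.*;

public class MapWireSketch {
    static void write(DataOutputStream out, Map<String, String> innerHits) throws IOException {
        if (innerHits == null) {
            out.writeInt(0);               // zero doubles as "absent"
            return;
        }
        out.writeInt(innerHits.size());
        for (Map.Entry<String, String> e : innerHits.entrySet()) {
            out.writeUTF(e.getKey());
            out.writeUTF(e.getValue());
        }
    }

    static Map<String, String> read(DataInputStream in) throws IOException {
        int size = in.readInt();
        if (size == 0) {
            return null;                   // matches the write side's convention
        }
        Map<String, String> innerHits = new HashMap<>(size);
        for (int i = 0; i < size; i++) {
            innerHits.put(in.readUTF(), in.readUTF());
        }
        return innerHits;
    }
}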
@@ -115,7 +115,7 @@ public class InternalSearchHits implements SearchHits {

     public void shardTarget(SearchShardTarget shardTarget) {
         for (InternalSearchHit hit : hits) {
-            hit.shardTarget(shardTarget);
+            hit.shard(shardTarget);
         }
     }

@@ -52,6 +52,7 @@ import org.elasticsearch.search.aggregations.SearchContextAggregations;
 import org.elasticsearch.search.dfs.DfsSearchResult;
 import org.elasticsearch.search.fetch.FetchSearchResult;
 import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
+import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
 import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
 import org.elasticsearch.search.fetch.source.FetchSourceContext;
 import org.elasticsearch.search.highlight.SearchContextHighlight;

@@ -156,6 +157,10 @@ public abstract class SearchContext implements Releasable {

     public abstract void highlight(SearchContextHighlight highlight);

+    public abstract void innerHits(InnerHitsContext innerHitsContext);
+
+    public abstract InnerHitsContext innerHits();
+
     public abstract SuggestionSearchContext suggest();

     public abstract void suggest(SuggestionSearchContext suggest);

@@ -0,0 +1,369 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
package org.elasticsearch.search.internal;

import com.google.common.collect.ImmutableList;
import com.google.common.collect.Lists;
import org.apache.lucene.search.Filter;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.Sort;
import org.apache.lucene.util.Counter;
import org.elasticsearch.action.search.SearchType;
import org.elasticsearch.index.query.ParsedFilter;
import org.elasticsearch.search.Scroll;
import org.elasticsearch.search.aggregations.SearchContextAggregations;
import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
import org.elasticsearch.search.fetch.source.FetchSourceContext;
import org.elasticsearch.search.highlight.SearchContextHighlight;
import org.elasticsearch.search.lookup.SearchLookup;
import org.elasticsearch.search.query.QuerySearchResult;
import org.elasticsearch.search.rescore.RescoreSearchContext;
import org.elasticsearch.search.suggest.SuggestionSearchContext;

import java.util.List;

/**
 */
public class SubSearchContext extends FilteredSearchContext {

    // By default return 3 hits per bucket. A higher default would make the response really large by default, since
    // the to hits are returned per bucket.
    private final static int DEFAULT_SIZE = 3;

    private int from;
    private int size = DEFAULT_SIZE;
    private Sort sort;

    private final FetchSearchResult fetchSearchResult;
    private final QuerySearchResult querySearchResult;

    private int[] docIdsToLoad;
    private int docsIdsToLoadFrom;
    private int docsIdsToLoadSize;

    private List<String> fieldNames;
    private FieldDataFieldsContext fieldDataFields;
    private ScriptFieldsContext scriptFields;
    private FetchSourceContext fetchSourceContext;
    private SearchContextHighlight highlight;

    private boolean explain;
    private boolean trackScores;
    private boolean version;

    public SubSearchContext(SearchContext context) {
        super(context);
        this.fetchSearchResult = new FetchSearchResult();
        this.querySearchResult = new QuerySearchResult();
    }

    @Override
    protected void doClose() {
    }

    @Override
    public void preProcess() {
    }

    @Override
    public Filter searchFilter(String[] types) {
        throw new UnsupportedOperationException("this context should be read only");
    }

    @Override
    public SearchContext searchType(SearchType searchType) {
        throw new UnsupportedOperationException("this context should be read only");
    }

    @Override
    public SearchContext queryBoost(float queryBoost) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public SearchContext scroll(Scroll scroll) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public SearchContext aggregations(SearchContextAggregations aggregations) {
        throw new UnsupportedOperationException("Not supported");
    }

    public SearchContextHighlight highlight() {
        return highlight;
    }

    public void highlight(SearchContextHighlight highlight) {
        this.highlight = highlight;
    }

    @Override
    public void suggest(SuggestionSearchContext suggest) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public void addRescore(RescoreSearchContext rescore) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public boolean hasFieldDataFields() {
        return fieldDataFields != null;
    }

    @Override
    public FieldDataFieldsContext fieldDataFields() {
        if (fieldDataFields == null) {
            fieldDataFields = new FieldDataFieldsContext();
        }
        return this.fieldDataFields;
    }

    @Override
    public boolean hasScriptFields() {
        return scriptFields != null;
    }

    @Override
    public ScriptFieldsContext scriptFields() {
        if (scriptFields == null) {
            scriptFields = new ScriptFieldsContext();
        }
        return this.scriptFields;
    }

    @Override
    public boolean sourceRequested() {
        return fetchSourceContext != null && fetchSourceContext.fetchSource();
    }

    @Override
    public boolean hasFetchSourceContext() {
        return fetchSourceContext != null;
    }

    @Override
    public FetchSourceContext fetchSourceContext() {
        return fetchSourceContext;
    }

    @Override
    public SearchContext fetchSourceContext(FetchSourceContext fetchSourceContext) {
        this.fetchSourceContext = fetchSourceContext;
        return this;
    }

    @Override
    public void timeoutInMillis(long timeoutInMillis) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public void terminateAfter(int terminateAfter) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public SearchContext minimumScore(float minimumScore) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public SearchContext sort(Sort sort) {
        this.sort = sort;
        return this;
    }

    @Override
    public Sort sort() {
        return sort;
    }

    @Override
    public SearchContext trackScores(boolean trackScores) {
        this.trackScores = trackScores;
        return this;
    }

    @Override
    public boolean trackScores() {
        return trackScores;
    }

    @Override
    public SearchContext parsedPostFilter(ParsedFilter postFilter) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public SearchContext updateRewriteQuery(Query rewriteQuery) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public int from() {
        return from;
    }

    @Override
    public SearchContext from(int from) {
        this.from = from;
        return this;
    }

    @Override
    public int size() {
        return size;
    }

    @Override
    public SearchContext size(int size) {
        this.size = size;
        return this;
    }

    @Override
    public boolean hasFieldNames() {
        return fieldNames != null;
    }

    @Override
    public List<String> fieldNames() {
        if (fieldNames == null) {
            fieldNames = Lists.newArrayList();
        }
        return fieldNames;
    }

    @Override
    public void emptyFieldNames() {
        this.fieldNames = ImmutableList.of();
    }

    @Override
    public boolean explain() {
        return explain;
    }

    @Override
    public void explain(boolean explain) {
        this.explain = explain;
    }

    @Override
    public void groupStats(List<String> groupStats) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public boolean version() {
        return version;
    }

    @Override
    public void version(boolean version) {
        this.version = version;
    }

    @Override
    public int[] docIdsToLoad() {
        return docIdsToLoad;
    }

    @Override
    public int docIdsToLoadFrom() {
        return docsIdsToLoadFrom;
    }

    @Override
    public int docIdsToLoadSize() {
        return docsIdsToLoadSize;
    }

    @Override
    public SearchContext docIdsToLoad(int[] docIdsToLoad, int docsIdsToLoadFrom, int docsIdsToLoadSize) {
        this.docIdsToLoad = docIdsToLoad;
        this.docsIdsToLoadFrom = docsIdsToLoadFrom;
        this.docsIdsToLoadSize = docsIdsToLoadSize;
        return this;
    }

    @Override
    public void accessed(long accessTime) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public void keepAlive(long keepAlive) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public void lastEmittedDoc(ScoreDoc doc) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public QuerySearchResult queryResult() {
        return querySearchResult;
    }

    @Override
    public FetchSearchResult fetchResult() {
        return fetchSearchResult;
    }

    private SearchLookup searchLookup;

    @Override
    public SearchLookup lookup() {
        if (searchLookup == null) {
            searchLookup = new SearchLookup(mapperService(), fieldData(), request().types());
        }
        return searchLookup;
    }

    @Override
    public SearchContext useSlowScroll(boolean useSlowScroll) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public Counter timeEstimateCounter() {
        throw new UnsupportedOperationException("Not supported");
    }

    private InnerHitsContext innerHitsContext;

    @Override
    public void innerHits(InnerHitsContext innerHitsContext) {
        this.innerHitsContext = innerHitsContext;
    }

    @Override
    public InnerHitsContext innerHits() {
        return innerHitsContext;
    }
}
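`SubSearchContext` is the piece both `top_hits` and `inner_hits` now share: per definition, the parse code creates one, mutates only the local fetch state (`from`, `size`, `sort`, flags), and lets everything else forward to the enclosing request context. Based solely on the setters visible in this diff, wiring one up looks roughly like the following sketch (not a snippet from the commit):

import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.search.internal.SubSearchContext;

class SubContextWiring {
    // Sketch only: exercises the setters shown above; 'context' is the enclosing SearchContext.
    static SubSearchContext newSubContext(SearchContext context) {
        SubSearchContext sub = new SubSearchContext(context);
        sub.from(0);
        sub.size(3);           // mirrors DEFAULT_SIZE
        sub.trackScores(true);
        return sub;
    }
}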

@@ -41,7 +41,7 @@ import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
 import org.elasticsearch.search.MultiValueMode;
 import org.elasticsearch.search.SearchParseElement;
 import org.elasticsearch.search.SearchParseException;
-import org.elasticsearch.search.aggregations.metrics.tophits.TopHitsContext;
+import org.elasticsearch.search.internal.SubSearchContext;
 import org.elasticsearch.search.internal.SearchContext;

 import java.util.List;

@@ -244,7 +244,7 @@ public class SortParseElement implements SearchParseElement {
             if (!objectMapper.nested().isNested()) {
                 throw new ElasticsearchIllegalArgumentException("mapping for explicit nested path is not mapped as nested: [" + nestedPath + "]");
             }
-        } else if (!(context instanceof TopHitsContext)) {
+        } else if (!(context instanceof SubSearchContext)) {
             // Only automatically resolve nested path when sort isn't defined for top_hits
             objectMapper = context.mapperService().resolveClosestNestedObjectMapper(fieldName);
         }

@@ -0,0 +1,97 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.search.fetch.innerhits;

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.IntField;
import org.apache.lucene.document.StringField;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.LeafReaderContext;
import org.apache.lucene.index.RandomIndexWriter;
import org.apache.lucene.index.Term;
import org.apache.lucene.queries.TermFilter;
import org.apache.lucene.search.*;
import org.apache.lucene.search.join.BitDocIdSetCachingWrapperFilter;
import org.apache.lucene.search.join.BitDocIdSetFilter;
import org.apache.lucene.store.Directory;
import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext.NestedInnerHits.NestedChildrenFilter;
import org.elasticsearch.test.ElasticsearchLuceneTestCase;
import org.junit.Test;

import java.util.ArrayList;
import java.util.List;

import static org.hamcrest.Matchers.equalTo;

public class NestedChildrenFilterTest extends ElasticsearchLuceneTestCase {
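    // Indexes parent/child document blocks (children first, parent last, as Lucene's block
    // join API expects) and then verifies that NestedChildrenFilter matches exactly the
    // children that were indexed in the block of each parent hit.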
    @Test
    public void testNestedChildrenFilter() throws Exception {
        int numParentDocs = scaledRandomIntBetween(0, 32);
        int maxChildDocsPerParent = scaledRandomIntBetween(8, 16);

        Directory dir = newDirectory();
        RandomIndexWriter writer = new RandomIndexWriter(random(), dir);
        for (int i = 0; i < numParentDocs; i++) {
            int numChildDocs = scaledRandomIntBetween(0, maxChildDocsPerParent);
            List<Document> docs = new ArrayList<>(numChildDocs + 1);
            for (int j = 0; j < numChildDocs; j++) {
                Document childDoc = new Document();
                childDoc.add(new StringField("type", "child", Field.Store.NO));
                docs.add(childDoc);
            }

            Document parentDoc = new Document();
            parentDoc.add(new StringField("type", "parent", Field.Store.NO));
            parentDoc.add(new IntField("num_child_docs", numChildDocs, Field.Store.YES));
            docs.add(parentDoc);
            writer.addDocuments(docs);
        }

        IndexReader reader = writer.getReader();
        writer.close();

        IndexSearcher searcher = new IndexSearcher(reader);
        FetchSubPhase.HitContext hitContext = new FetchSubPhase.HitContext();
        BitDocIdSetFilter parentFilter = new BitDocIdSetCachingWrapperFilter(new TermFilter(new Term("type", "parent")));
        Filter childFilter = new TermFilter(new Term("type", "child"));
        int checkedParents = 0;
        for (LeafReaderContext leaf : reader.leaves()) {
            DocIdSetIterator parents = parentFilter.getDocIdSet(leaf).iterator();
            for (int parentDoc = parents.nextDoc(); parentDoc != DocIdSetIterator.NO_MORE_DOCS; parentDoc = parents.nextDoc()) {
                int expectedChildDocs = leaf.reader().document(parentDoc).getField("num_child_docs").numericValue().intValue();
                hitContext.reset(null, leaf, parentDoc, reader);
                NestedChildrenFilter nestedChildrenFilter = new NestedChildrenFilter(parentFilter, childFilter, hitContext);
                TotalHitCountCollector totalHitCountCollector = new TotalHitCountCollector();
                searcher.search(new ConstantScoreQuery(nestedChildrenFilter), totalHitCountCollector);
                assertThat(totalHitCountCollector.getTotalHits(), equalTo(expectedChildDocs));
                checkedParents++;
            }
        }
        assertThat(checkedParents, equalTo(numParentDocs));
        reader.close();
        dir.close();
    }

}

@@ -0,0 +1,523 @@
/*
 * Licensed to Elasticsearch under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.elasticsearch.search.innerhits;

import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.SearchHits;
import org.elasticsearch.search.fetch.innerhits.InnerHitsBuilder;
import org.elasticsearch.search.sort.SortOrder;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.*;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.nullValue;

public class InnerHitsTests extends ElasticsearchIntegrationTest {
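    // Nested case: the top-level hit is the article, and the inner hits are the matching
    // comment objects, addressed by nested identity (field name plus offset within the array).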
    @Test
    public void testSimpleNested() throws Exception {
        assertAcked(prepareCreate("articles").addMapping("article", jsonBuilder().startObject().startObject("article").startObject("properties")
                .startObject("comments")
                    .field("type", "nested")
                    .startObject("properties")
                        .startObject("message")
                            .field("type", "string")
                        .endObject()
                    .endObject()
                .endObject()
                .startObject("title")
                    .field("type", "string")
                .endObject()
                .endObject().endObject().endObject()));

        List<IndexRequestBuilder> requests = new ArrayList<>();
        requests.add(client().prepareIndex("articles", "article", "1").setSource(jsonBuilder().startObject()
                .field("title", "quick brown fox")
                .startArray("comments")
                    .startObject().field("message", "fox eat quick").endObject()
                    .startObject().field("message", "fox ate rabbit x y z").endObject()
                    .startObject().field("message", "rabbit got away").endObject()
                .endArray()
                .endObject()));
        requests.add(client().prepareIndex("articles", "article", "2").setSource(jsonBuilder().startObject()
                .field("title", "big gray elephant")
                .startArray("comments")
                    .startObject().field("message", "elephant captured").endObject()
                    .startObject().field("message", "mice squashed by elephant x").endObject()
                    .startObject().field("message", "elephant scared by mice x y").endObject()
                .endArray()
                .endObject()));
        indexRandom(true, requests);

        SearchResponse response = client().prepareSearch("articles")
                .setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments").setQuery(matchQuery("comments.message", "fox")))
                .get();
        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("1"));
        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(2L));
        assertThat(innerHits.getHits().length, equalTo(2));
        assertThat(innerHits.getAt(0).getId(), equalTo("1"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
        assertThat(innerHits.getAt(1).getId(), equalTo("1"));
        assertThat(innerHits.getAt(1).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(1).getNestedIdentity().getOffset(), equalTo(1));

        response = client().prepareSearch("articles")
                .setQuery(nestedQuery("comments", matchQuery("comments.message", "elephant")))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments").setQuery(matchQuery("comments.message", "elephant")))
                .get();
        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("2"));
        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(3L));
        assertThat(innerHits.getHits().length, equalTo(3));
        assertThat(innerHits.getAt(0).getId(), equalTo("2"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
        assertThat(innerHits.getAt(1).getId(), equalTo("2"));
        assertThat(innerHits.getAt(1).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(1).getNestedIdentity().getOffset(), equalTo(1));
        assertThat(innerHits.getAt(2).getId(), equalTo("2"));
        assertThat(innerHits.getAt(2).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(2).getNestedIdentity().getOffset(), equalTo(2));

        response = client().prepareSearch("articles")
                .setQuery(nestedQuery("comments", matchQuery("comments.message", "fox")))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit().setPath("comments")
                        .setQuery(matchQuery("comments.message", "fox"))
                        .addHighlightedField("comments.message")
                        .setExplain(true)
                        .addFieldDataField("comments.message")
                        .addScriptField("script", "doc['comments.message'].value")
                        .setSize(1)
                ).get();

        assertNoFailures(response);
        innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.getHits().length, equalTo(1));
        assertThat(innerHits.getAt(0).getHighlightFields().get("comments.message").getFragments()[0].string(), equalTo("<em>fox</em> eat quick"));
        assertThat(innerHits.getAt(0).explanation().toString(), containsString("(MATCH) weight(comments.message:fox in"));
        assertThat(innerHits.getAt(0).getFields().get("comments.message").getValue().toString(), equalTo("eat"));
        assertThat(innerHits.getAt(0).getFields().get("script").getValue().toString(), equalTo("eat"));
    }

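    // Randomized nested case: indexes a random number of inner objects under two nested
    // fields and checks that each document's inner hits report the expected totals and offsets.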
    @Test
    public void testRandomNested() throws Exception {
        assertAcked(prepareCreate("idx").addMapping("type", "field1", "type=nested", "field2", "type=nested"));
        int numDocs = scaledRandomIntBetween(25, 100);
        List<IndexRequestBuilder> requestBuilders = new ArrayList<>();

        int[] field1InnerObjects = new int[numDocs];
        int[] field2InnerObjects = new int[numDocs];
        for (int i = 0; i < numDocs; i++) {
            int numInnerObjects = field1InnerObjects[i] = scaledRandomIntBetween(0, numDocs);
            XContentBuilder source = jsonBuilder().startObject().startArray("field1");
            for (int j = 0; j < numInnerObjects; j++) {
                source.startObject().field("x", "y").endObject();
            }
            numInnerObjects = field2InnerObjects[i] = scaledRandomIntBetween(0, numDocs);
            source.endArray().startArray("field2");
            for (int j = 0; j < numInnerObjects; j++) {
                source.startObject().field("x", "y").endObject();
            }
            source.endArray().endObject();

            requestBuilders.add(client().prepareIndex("idx", "type", String.format(Locale.ENGLISH, "%03d", i)).setSource(source));
        }

        indexRandom(true, requestBuilders);

        SearchResponse searchResponse = client().prepareSearch("idx")
                .setSize(numDocs)
                .addSort("_uid", SortOrder.ASC)
                .addInnerHit("a", new InnerHitsBuilder.InnerHit().setPath("field1").addSort("_doc", SortOrder.DESC).setSize(numDocs)) // Sort order is DESC, because we reverse the inner objects during indexing!
                .addInnerHit("b", new InnerHitsBuilder.InnerHit().setPath("field2").addSort("_doc", SortOrder.DESC).setSize(numDocs))
                .get();

        assertHitCount(searchResponse, numDocs);
        assertThat(searchResponse.getHits().getHits().length, equalTo(numDocs));
        for (int i = 0; i < numDocs; i++) {
            SearchHit searchHit = searchResponse.getHits().getAt(i);
            SearchHits inner = searchHit.getInnerHits().get("a");
            assertThat(inner.totalHits(), equalTo((long) field1InnerObjects[i]));
            for (int j = 0; j < field1InnerObjects[i]; j++) {
                SearchHit innerHit = inner.getAt(j);
                assertThat(innerHit.getNestedIdentity().getField().string(), equalTo("field1"));
                assertThat(innerHit.getNestedIdentity().getOffset(), equalTo(j));
                assertThat(innerHit.getNestedIdentity().getChild(), nullValue());
            }

            inner = searchHit.getInnerHits().get("b");
            assertThat(inner.totalHits(), equalTo((long) field2InnerObjects[i]));
            for (int j = 0; j < field2InnerObjects[i]; j++) {
                SearchHit innerHit = inner.getAt(j);
                assertThat(innerHit.getNestedIdentity().getField().string(), equalTo("field2"));
                assertThat(innerHit.getNestedIdentity().getOffset(), equalTo(j));
                assertThat(innerHit.getNestedIdentity().getChild(), nullValue());
            }
        }
    }

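    // Parent/child case: the top-level hit is the article and the inner hits are the child
    // comment documents, returned with their own _type and _id instead of a nested identity.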
    @Test
    public void testSimpleParentChild() throws Exception {
        assertAcked(prepareCreate("articles")
                .addMapping("article", "title", "type=string")
                .addMapping("comment", "_parent", "type=article", "message", "type=string")
        );

        List<IndexRequestBuilder> requests = new ArrayList<>();
        requests.add(client().prepareIndex("articles", "article", "1").setSource("title", "quick brown fox"));
        requests.add(client().prepareIndex("articles", "comment", "1").setParent("1").setSource("message", "fox eat quick"));
        requests.add(client().prepareIndex("articles", "comment", "2").setParent("1").setSource("message", "fox ate rabbit x y z"));
        requests.add(client().prepareIndex("articles", "comment", "3").setParent("1").setSource("message", "rabbit got away"));
        requests.add(client().prepareIndex("articles", "article", "2").setSource("title", "big gray elephant"));
        requests.add(client().prepareIndex("articles", "comment", "4").setParent("2").setSource("message", "elephant captured"));
        requests.add(client().prepareIndex("articles", "comment", "5").setParent("2").setSource("message", "mice squashed by elephant x"));
        requests.add(client().prepareIndex("articles", "comment", "6").setParent("2").setSource("message", "elephant scared by mice x y"));
        indexRandom(true, requests);

        SearchResponse response = client().prepareSearch("articles")
                .setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment").setQuery(matchQuery("message", "fox")))
                .get();

        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("1"));

        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(2L));

        assertThat(innerHits.getAt(0).getId(), equalTo("1"));
        assertThat(innerHits.getAt(0).type(), equalTo("comment"));
        assertThat(innerHits.getAt(1).getId(), equalTo("2"));
        assertThat(innerHits.getAt(1).type(), equalTo("comment"));

        response = client().prepareSearch("articles")
                .setQuery(hasChildQuery("comment", matchQuery("message", "elephant")))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment").setQuery(matchQuery("message", "elephant")))
                .get();

        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("2"));

        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(3L));

        assertThat(innerHits.getAt(0).getId(), equalTo("4"));
        assertThat(innerHits.getAt(0).type(), equalTo("comment"));
        assertThat(innerHits.getAt(1).getId(), equalTo("5"));
        assertThat(innerHits.getAt(1).type(), equalTo("comment"));
        assertThat(innerHits.getAt(2).getId(), equalTo("6"));
        assertThat(innerHits.getAt(2).type(), equalTo("comment"));

        response = client().prepareSearch("articles")
                .setQuery(hasChildQuery("comment", matchQuery("message", "fox")))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit().setType("comment")
                        .setQuery(matchQuery("message", "fox"))
                        .addHighlightedField("message")
                        .setExplain(true)
                        .addFieldDataField("message")
                        .addScriptField("script", "doc['message'].value")
                        .setSize(1)
                ).get();

        assertNoFailures(response);
        innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.getHits().length, equalTo(1));
        assertThat(innerHits.getAt(0).getHighlightFields().get("message").getFragments()[0].string(), equalTo("<em>fox</em> eat quick"));
        assertThat(innerHits.getAt(0).explanation().toString(), containsString("(MATCH) weight(message:fox"));
        assertThat(innerHits.getAt(0).getFields().get("message").getValue().toString(), equalTo("eat"));
        assertThat(innerHits.getAt(0).getFields().get("script").getValue().toString(), equalTo("eat"));
    }

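    // Randomized parent/child case: each parent gets a random number of child1/child2 docs,
    // and the per-parent inner hits must line up with the sequential ids generated at index time.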
    @Test
    public void testRandomParentChild() throws Exception {
        assertAcked(prepareCreate("idx")
                .addMapping("parent")
                .addMapping("child1", "_parent", "type=parent")
                .addMapping("child2", "_parent", "type=parent")
        );
        int numDocs = scaledRandomIntBetween(5, 50);
        List<IndexRequestBuilder> requestBuilders = new ArrayList<>();

        int child1 = 0;
        int child2 = 0;
        int[] child1InnerObjects = new int[numDocs];
        int[] child2InnerObjects = new int[numDocs];
        for (int parent = 0; parent < numDocs; parent++) {
            String parentId = String.format(Locale.ENGLISH, "%03d", parent);
            requestBuilders.add(client().prepareIndex("idx", "parent", parentId).setSource("{}"));

            int numChildDocs = child1InnerObjects[parent] = scaledRandomIntBetween(0, numDocs);
            int limit = child1 + numChildDocs;
            for (; child1 < limit; child1++) {
                requestBuilders.add(client().prepareIndex("idx", "child1", String.format(Locale.ENGLISH, "%04d", child1)).setParent(parentId).setSource("{}"));
            }
            numChildDocs = child2InnerObjects[parent] = scaledRandomIntBetween(0, numDocs);
            limit = child2 + numChildDocs;
            for (; child2 < limit; child2++) {
                requestBuilders.add(client().prepareIndex("idx", "child2", String.format(Locale.ENGLISH, "%04d", child2)).setParent(parentId).setSource("{}"));
            }
        }
        indexRandom(true, requestBuilders);

        SearchResponse searchResponse = client().prepareSearch("idx")
                .setSize(numDocs)
                .setTypes("parent")
                .addSort("_uid", SortOrder.ASC)
                .addInnerHit("a", new InnerHitsBuilder.InnerHit().setType("child1").addSort("_uid", SortOrder.ASC).setSize(numDocs))
                .addInnerHit("b", new InnerHitsBuilder.InnerHit().setType("child2").addSort("_uid", SortOrder.ASC).setSize(numDocs))
                .get();

        assertHitCount(searchResponse, numDocs);
        assertThat(searchResponse.getHits().getHits().length, equalTo(numDocs));

        int offset1 = 0;
        int offset2 = 0;
        for (int parent = 0; parent < numDocs; parent++) {
            SearchHit searchHit = searchResponse.getHits().getAt(parent);
            assertThat(searchHit.getType(), equalTo("parent"));
            assertThat(searchHit.getId(), equalTo(String.format(Locale.ENGLISH, "%03d", parent)));

            SearchHits inner = searchHit.getInnerHits().get("a");
            assertThat(inner.totalHits(), equalTo((long) child1InnerObjects[parent]));
            for (int child = 0; child < child1InnerObjects[parent]; child++) {
                SearchHit innerHit = inner.getAt(child);
                assertThat(innerHit.getType(), equalTo("child1"));
                String childId = String.format(Locale.ENGLISH, "%04d", offset1 + child);
                assertThat(innerHit.getId(), equalTo(childId));
                assertThat(innerHit.getNestedIdentity(), nullValue());
            }
            offset1 += child1InnerObjects[parent];

            inner = searchHit.getInnerHits().get("b");
            assertThat(inner.totalHits(), equalTo((long) child2InnerObjects[parent]));
            for (int child = 0; child < child2InnerObjects[parent]; child++) {
                SearchHit innerHit = inner.getAt(child);
                assertThat(innerHit.getType(), equalTo("child2"));
                String childId = String.format(Locale.ENGLISH, "%04d", offset2 + child);
                assertThat(innerHit.getId(), equalTo(childId));
                assertThat(innerHit.getNestedIdentity(), nullValue());
            }
            offset2 += child2InnerObjects[parent];
        }
    }

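    // An inner hit definition must specify either a nested path or a parent/child type;
    // a definition with neither should make the request fail to build.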
    @Test
    public void testPathOrTypeMustBeDefined() {
        createIndex("articles");
        ensureGreen("articles");
        try {
            client().prepareSearch("articles")
                    .addInnerHit("comment", new InnerHitsBuilder.InnerHit())
                    .get();
            fail("expected the search to fail, since neither a path nor a type was set on the inner hit");
        } catch (Exception e) {
            assertThat(e.getMessage(), containsString("Failed to build search source"));
        }
    }

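    // Inner hit definitions can be chained: the comment inner hits of an article carry
    // their own remark inner hits, one parent/child level deeper.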
    @Test
    public void testParentChildMultipleLayers() throws Exception {
        assertAcked(prepareCreate("articles")
                .addMapping("article", "title", "type=string")
                .addMapping("comment", "_parent", "type=article", "message", "type=string")
                .addMapping("remark", "_parent", "type=comment", "message", "type=string")
        );

        List<IndexRequestBuilder> requests = new ArrayList<>();
        requests.add(client().prepareIndex("articles", "article", "1").setSource("title", "quick brown fox"));
        requests.add(client().prepareIndex("articles", "comment", "1").setParent("1").setSource("message", "fox eat quick"));
        requests.add(client().prepareIndex("articles", "remark", "1").setParent("1").setRouting("1").setSource("message", "good"));
        requests.add(client().prepareIndex("articles", "article", "2").setSource("title", "big gray elephant"));
        requests.add(client().prepareIndex("articles", "comment", "2").setParent("2").setSource("message", "elephant captured"));
        requests.add(client().prepareIndex("articles", "remark", "2").setParent("2").setRouting("2").setSource("message", "bad"));
        indexRandom(true, requests);

        SearchResponse response = client().prepareSearch("articles")
                .setQuery(hasChildQuery("comment", hasChildQuery("remark", matchQuery("message", "good"))))
                .addInnerHit("comment",
                        new InnerHitsBuilder.InnerHit().setType("comment")
                                .setQuery(hasChildQuery("remark", matchQuery("message", "good")))
                                .addInnerHit("remark", new InnerHitsBuilder.InnerHit().setType("remark").setQuery(matchQuery("message", "good")))
                )
                .get();

        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("1"));

        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getAt(0).getId(), equalTo("1"));
        assertThat(innerHits.getAt(0).type(), equalTo("comment"));

        innerHits = innerHits.getAt(0).getInnerHits().get("remark");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getAt(0).getId(), equalTo("1"));
        assertThat(innerHits.getAt(0).type(), equalTo("remark"));

        response = client().prepareSearch("articles")
                .setQuery(hasChildQuery("comment", hasChildQuery("remark", matchQuery("message", "bad"))))
                .addInnerHit("comment",
                        new InnerHitsBuilder.InnerHit().setType("comment")
                                .setQuery(hasChildQuery("remark", matchQuery("message", "bad")))
                                .addInnerHit("remark", new InnerHitsBuilder.InnerHit().setType("remark").setQuery(matchQuery("message", "bad")))
                )
                .get();

        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("2"));

        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getAt(0).getId(), equalTo("2"));
        assertThat(innerHits.getAt(0).type(), equalTo("comment"));

        innerHits = innerHits.getAt(0).getInnerHits().get("remark");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getAt(0).getId(), equalTo("2"));
        assertThat(innerHits.getAt(0).type(), equalTo("remark"));
    }

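    // Same chaining for nested objects: remarks nested inside comments show up as inner hits
    // of the comment inner hits, with the child nested identity pointing at "remarks".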
    @Test
    public void testNestedMultipleLayers() throws Exception {
        assertAcked(prepareCreate("articles").addMapping("article", jsonBuilder().startObject().startObject("article").startObject("properties")
                .startObject("comments")
                    .field("type", "nested")
                    .startObject("properties")
                        .startObject("message")
                            .field("type", "string")
                        .endObject()
                        .startObject("remarks")
                            .field("type", "nested")
                            .startObject("properties")
                                .startObject("message").field("type", "string").endObject()
                            .endObject()
                        .endObject()
                    .endObject()
                .endObject()
                .startObject("title")
                    .field("type", "string")
                .endObject()
                .endObject().endObject().endObject()));

        List<IndexRequestBuilder> requests = new ArrayList<>();
        requests.add(client().prepareIndex("articles", "article", "1").setSource(jsonBuilder().startObject()
                .field("title", "quick brown fox")
                .startArray("comments")
                    .startObject()
                        .field("message", "fox eat quick")
                        .startArray("remarks").startObject().field("message", "good").endObject().endArray()
                    .endObject()
                .endArray()
                .endObject()));
        requests.add(client().prepareIndex("articles", "article", "2").setSource(jsonBuilder().startObject()
                .field("title", "big gray elephant")
                .startArray("comments")
                    .startObject()
                        .field("message", "elephant captured")
                        .startArray("remarks").startObject().field("message", "bad").endObject().endArray()
                    .endObject()
                .endArray()
                .endObject()));
        indexRandom(true, requests);

        SearchResponse response = client().prepareSearch("articles")
                .setQuery(nestedQuery("comments", nestedQuery("comments.remarks", matchQuery("comments.remarks.message", "good"))))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit()
                        .setPath("comments")
                        .setQuery(nestedQuery("comments.remarks", matchQuery("comments.remarks.message", "good")))
                        .addInnerHit("remark", new InnerHitsBuilder.InnerHit().setPath("comments.remarks").setQuery(matchQuery("comments.remarks.message", "good")))
                ).get();
        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("1"));
        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        SearchHits innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getHits().length, equalTo(1));
        assertThat(innerHits.getAt(0).getId(), equalTo("1"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
        innerHits = innerHits.getAt(0).getInnerHits().get("remark");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getHits().length, equalTo(1));
        assertThat(innerHits.getAt(0).getId(), equalTo("1"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
        assertThat(innerHits.getAt(0).getNestedIdentity().getChild().getField().string(), equalTo("remarks"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getChild().getOffset(), equalTo(0));

        response = client().prepareSearch("articles")
                .setQuery(nestedQuery("comments", nestedQuery("comments.remarks", matchQuery("comments.remarks.message", "bad"))))
                .addInnerHit("comment", new InnerHitsBuilder.InnerHit()
                        .setPath("comments")
                        .setQuery(nestedQuery("comments.remarks", matchQuery("comments.remarks.message", "bad")))
                        .addInnerHit("remark", new InnerHitsBuilder.InnerHit().setPath("comments.remarks").setQuery(matchQuery("comments.remarks.message", "bad")))
                ).get();
        assertNoFailures(response);
        assertHitCount(response, 1);
        assertSearchHit(response, 1, hasId("2"));
        assertThat(response.getHits().getAt(0).getInnerHits().size(), equalTo(1));
        innerHits = response.getHits().getAt(0).getInnerHits().get("comment");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getHits().length, equalTo(1));
        assertThat(innerHits.getAt(0).getId(), equalTo("2"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
        innerHits = innerHits.getAt(0).getInnerHits().get("remark");
        assertThat(innerHits.totalHits(), equalTo(1L));
        assertThat(innerHits.getHits().length, equalTo(1));
        assertThat(innerHits.getAt(0).getId(), equalTo("2"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getField().string(), equalTo("comments"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getOffset(), equalTo(0));
        assertThat(innerHits.getAt(0).getNestedIdentity().getChild().getField().string(), equalTo("remarks"));
        assertThat(innerHits.getAt(0).getNestedIdentity().getChild().getOffset(), equalTo(0));
    }

}

@@ -47,6 +47,7 @@ import org.elasticsearch.search.aggregations.SearchContextAggregations;
 import org.elasticsearch.search.dfs.DfsSearchResult;
 import org.elasticsearch.search.fetch.FetchSearchResult;
 import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
+import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
 import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
 import org.elasticsearch.search.fetch.source.FetchSourceContext;
 import org.elasticsearch.search.highlight.SearchContextHighlight;

@@ -592,4 +593,14 @@ public class TestSearchContext extends SearchContext {
     public Counter timeEstimateCounter() {
         throw new UnsupportedOperationException();
     }
+
+    @Override
+    public void innerHits(InnerHitsContext innerHitsContext) {
+        throw new UnsupportedOperationException();
+    }
+
+    @Override
+    public InnerHitsContext innerHits() {
+        throw new UnsupportedOperationException();
+    }
 }