Aggregations: Renaming reducers to Pipeline Aggregators

This commit is contained in:
Colin Goodheart-Smithe 2015-05-21 10:39:38 +01:00
parent e7958f46dc
commit 35deb7efea
178 changed files with 1336 additions and 1242 deletions

View File

@@ -23,7 +23,7 @@ it is often easier to break them into two main families:
 <<search-aggregations-metrics, _Metric_>>::
 Aggregations that keep track and compute metrics over a set of documents.
-<<search-aggregations-reducer, _Reducer_>>::
+<<search-aggregations-pipeline, _Pipeline_>>::
 Aggregations that aggregate the output of other aggregations and their associated metrics
 The interesting part comes next. Since each bucket effectively defines a document set (all documents belonging to
@@ -100,6 +100,6 @@ include::aggregations/metrics.asciidoc[]
 include::aggregations/bucket.asciidoc[]
-include::aggregations/reducer.asciidoc[]
+include::aggregations/pipeline.asciidoc[]
 include::aggregations/misc.asciidoc[]

View File

@@ -1,39 +1,39 @@
-[[search-aggregations-reducer]]
-== Reducer Aggregations
+[[search-aggregations-pipeline]]
+== Pipeline Aggregations
 coming[2.0.0]
 experimental[]
-Reducer aggregations work on the outputs produced from other aggregations rather than from document sets, adding
-information to the output tree. There are many different types of reducer, each computing different information from
+Pipeline aggregations work on the outputs produced from other aggregations rather than from document sets, adding
+information to the output tree. There are many different types of pipeline aggregation, each computing different information from
 other aggregations, but these types can be broken down into two families:
 _Parent_::
-A family of reducer aggregations that is provided with the output of its parent aggregation and is able
+A family of pipeline aggregations that is provided with the output of its parent aggregation and is able
 to compute new buckets or new aggregations to add to existing buckets.
 _Sibling_::
-Reducer aggregations that are provided with the output of a sibling aggregation and are able to compute a
+Pipeline aggregations that are provided with the output of a sibling aggregation and are able to compute a
 new aggregation which will be at the same level as the sibling aggregation.
-Reducer aggregations can reference the aggregations they need to perform their computation by using the `buckets_paths`
+Pipeline aggregations can reference the aggregations they need to perform their computation by using the `buckets_path`
 parameter to indicate the paths to the required metrics. The syntax for defining these paths can be found in the
 <<bucket-path-syntax, `buckets_path` Syntax>> section below.
-Reducer aggregations cannot have sub-aggregations but depending on the type it can reference another reducer in the `buckets_path`
-allowing reducers to be chained. For example, you can chain together two derivatives to calculate the second derivative
+Pipeline aggregations cannot have sub-aggregations, but depending on the type they can reference another pipeline aggregation in the `buckets_path`,
+allowing pipeline aggregations to be chained. For example, you can chain together two derivatives to calculate the second derivative
 (e.g. a derivative of a derivative).
-NOTE: Because reducer aggregations only add to the output, when chaining reducer aggregations the output of each reducer will be
-included in the final output.
+NOTE: Because pipeline aggregations only add to the output, when chaining pipeline aggregations the output of each pipeline aggregation
+will be included in the final output.
 [[bucket-path-syntax]]
 [float]
 === `buckets_path` Syntax
-Most reducers require another aggregation as their input. The input aggregation is defined via the `buckets_path`
+Most pipeline aggregations require another aggregation as their input. The input aggregation is defined via the `buckets_path`
 parameter, which follows a specific format:
 --------------------------------------------------
@@ -47,7 +47,7 @@ PATH := <AGG_NAME>[<AGG_SEPARATOR><AGG_NAME>]*[<METRIC_SEPARATOR
 For example, the path `"my_bucket>my_stats.avg"` will path to the `avg` value in the `"my_stats"` metric, which is
 contained in the `"my_bucket"` bucket aggregation.
-Paths are relative from the position of the reducer; they are not absolute paths, and the path cannot go back "up" the
+Paths are relative from the position of the pipeline aggregation; they are not absolute paths, and the path cannot go back "up" the
 aggregation tree. For example, this moving average is embedded inside a date_histogram and refers to a "sibling"
 metric `"the_sum"`:
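As a sketch, such an embedded setup could look like the following request body (all names except `the_sum` are illustrative, not taken from this commit):

```javascript
// Hypothetical request body: a moving_avg pipeline aggregation embedded
// inside a date_histogram, referencing its sibling metric "the_sum" via a
// relative buckets_path. Names other than "the_sum" are invented.
const body = {
  aggs: {
    my_date_histo: {
      date_histogram: { field: "timestamp", interval: "day" },
      aggs: {
        the_sum: { sum: { field: "price" } },
        the_movavg: { moving_avg: { buckets_path: "the_sum" } }
      }
    }
  }
};

// The path is resolved relative to the moving_avg's position in the tree.
console.log(body.aggs.my_date_histo.aggs.the_movavg.moving_avg.buckets_path);
```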
@@ -73,7 +73,7 @@ metric `"the_sum"`:
 <1> The metric is called `"the_sum"`
 <2> The `buckets_path` refers to the metric via a relative path `"the_sum"`
-`buckets_path` is also used for Sibling reducer aggregations, where the aggregation is "next" to a series of buckets
+`buckets_path` is also used for Sibling pipeline aggregations, where the aggregation is "next" to a series of buckets
 instead of embedded "inside" them. For example, the `max_bucket` aggregation uses the `buckets_path` to specify
 a metric embedded inside a sibling aggregation:
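A sketch of such a sibling setup (the `max_bucket` aggregation and `>` separator are from the docs above; the other names are invented for this sketch):

```javascript
// Hypothetical request body: a sibling max_bucket aggregation sitting "next
// to" a multi-bucket aggregation, pathing "into" it with ">" to reach a
// metric. "sales_per_month", "sales", and the field names are invented.
const body = {
  aggs: {
    sales_per_month: {
      date_histogram: { field: "timestamp", interval: "month" },
      aggs: { sales: { sum: { field: "price" } } }
    },
    max_monthly_sales: {
      max_bucket: { buckets_path: "sales_per_month>sales" }
    }
  }
};

// ">" separates aggregation names on the way down the tree.
console.log(body.aggs.max_monthly_sales.max_bucket.buckets_path);
```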
@@ -109,7 +109,7 @@ a metric embedded inside a sibling aggregation:
 ==== Special Paths
 Instead of pathing to a metric, `buckets_path` can use a special `"_count"` path. This instructs
-the reducer to use the document count as it's input. For example, a moving average can be calculated on the document
+the pipeline aggregation to use the document count as its input. For example, a moving average can be calculated on the document
 count of each bucket, instead of a specific metric:
 [source,js]
@@ -141,7 +141,7 @@ There are a couple of reasons why the data output by the enclosing histogram may
 on the enclosing histogram or with a query matching only a small number of documents)
 Where there is no data available in a bucket for a given metric it presents a problem for calculating the derivative value for both
-the current bucket and the next bucket. In the derivative reducer aggregation has a `gap policy` parameter to define what the behavior
+the current bucket and the next bucket. The derivative pipeline aggregation has a `gap_policy` parameter to define what the behavior
 should be when a gap in the data is found. There are currently two options for controlling the gap policy:
 _skip_::
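The two policies named here (`skip` and `insert_zeros`) can be sketched as a small simulation; this is only an illustration of the behavior described, not the actual Elasticsearch implementation:

```javascript
// Illustrative simulation of the two gap policies on a bucket series,
// where null marks a bucket with no data. Not the real implementation.
function applyGapPolicy(values, policy) {
  if (policy === "insert_zeros") {
    // Gaps are treated as if the metric had the value 0.
    return values.map(v => (v == null ? 0 : v));
  }
  // "skip": buckets with gaps are left out of the calculation entirely.
  return values.filter(v => v != null);
}

const series = [3, null, 7];
console.log(applyGapPolicy(series, "skip"));          // [3, 7]
console.log(applyGapPolicy(series, "insert_zeros"));  // [3, 0, 7]
```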
@@ -154,9 +154,9 @@ _insert_zeros_::
-include::reducer/avg-bucket-aggregation.asciidoc[]
-include::reducer/derivative-aggregation.asciidoc[]
-include::reducer/max-bucket-aggregation.asciidoc[]
-include::reducer/min-bucket-aggregation.asciidoc[]
-include::reducer/sum-bucket-aggregation.asciidoc[]
-include::reducer/movavg-aggregation.asciidoc[]
+include::pipeline/avg-bucket-aggregation.asciidoc[]
+include::pipeline/derivative-aggregation.asciidoc[]
+include::pipeline/max-bucket-aggregation.asciidoc[]
+include::pipeline/min-bucket-aggregation.asciidoc[]
+include::pipeline/sum-bucket-aggregation.asciidoc[]
+include::pipeline/movavg-aggregation.asciidoc[]

View File

@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-avg-bucket-aggregation]]
+[[search-aggregations-pipeline-avg-bucket-aggregation]]
 === Avg Bucket Aggregation
-A sibling reducer aggregation which calculates the (mean) average value of a specified metric in a sibling aggregation.
+A sibling pipeline aggregation which calculates the (mean) average value of a specified metric in a sibling aggregation.
 The specified metric must be numeric and the sibling aggregation must be a multi-bucket aggregation.
 ==== Syntax

View File

@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-derivative-aggregation]]
+[[search-aggregations-pipeline-derivative-aggregation]]
 === Derivative Aggregation
-A parent reducer aggregation which calculates the derivative of a specified metric in a parent histogram (or date_histogram)
+A parent pipeline aggregation which calculates the derivative of a specified metric in a parent histogram (or date_histogram)
 aggregation. The specified metric must be numeric and the enclosing histogram must have `min_doc_count` set to `0` (default
 for `histogram` aggregations).
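What the derivative computes per bucket can be sketched numerically; the function and sample values below are illustrative, not the actual implementation:

```javascript
// Illustrative: a derivative pipeline aggregation emits, for each bucket,
// the difference from the previous bucket's metric value. The first bucket
// has no previous value, so it gets no derivative (null here).
function derivative(values) {
  const out = [null];
  for (let i = 1; i < values.length; i++) {
    out.push(values[i] - values[i - 1]);
  }
  return out;
}

const monthlySales = [550, 60, 375]; // hypothetical sums per month
const first = derivative(monthlySales);
// Chaining a second derivative aggregation differences the result again,
// yielding the second order derivative:
const second = derivative(first.slice(1));
console.log(first);  // [null, -490, 315]
console.log(second); // [null, 805]
```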
@@ -112,8 +112,8 @@ would be $/month assuming the `price` field has units of $.
 ==== Second Order Derivative
-A second order derivative can be calculated by chaining the derivative reducer aggregation onto the result of another derivative
-reducer aggregation as in the following example which will calculate both the first and the second order derivative of the total
+A second order derivative can be calculated by chaining the derivative pipeline aggregation onto the result of another derivative
+pipeline aggregation as in the following example which will calculate both the first and the second order derivative of the total
 monthly sales:
 [source,js]

View File

@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-max-bucket-aggregation]]
+[[search-aggregations-pipeline-max-bucket-aggregation]]
 === Max Bucket Aggregation
-A sibling reducer aggregation which identifies the bucket(s) with the maximum value of a specified metric in a sibling aggregation
+A sibling pipeline aggregation which identifies the bucket(s) with the maximum value of a specified metric in a sibling aggregation
 and outputs both the value and the key(s) of the bucket(s). The specified metric must be numeric and the sibling aggregation must
 be a multi-bucket aggregation.

View File

@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-min-bucket-aggregation]]
+[[search-aggregations-pipeline-min-bucket-aggregation]]
 === Min Bucket Aggregation
-A sibling reducer aggregation which identifies the bucket(s) with the minimum value of a specified metric in a sibling aggregation
+A sibling pipeline aggregation which identifies the bucket(s) with the minimum value of a specified metric in a sibling aggregation
 and outputs both the value and the key(s) of the bucket(s). The specified metric must be numeric and the sibling aggregation must
 be a multi-bucket aggregation.

View File

@@ -1,4 +1,4 @@
-[[search-aggregations-reducers-movavg-reducer]]
+[[search-aggregations-pipeline-movavg-aggregation]]
 === Moving Average Aggregation
 Given an ordered series of data, the Moving Average aggregation will slide a window across the data and emit the average
@@ -109,14 +109,14 @@ track the data and only smooth out small scale fluctuations:
 [[movavg_10window]]
 .Moving average with window of size 10
-image::images/reducers_movavg/movavg_10window.png[]
+image::images/pipeline_movavg/movavg_10window.png[]
 In contrast, a `simple` moving average with larger window (`"window": 100`) will smooth out all higher-frequency fluctuations,
 leaving only low-frequency, long term trends. It also tends to "lag" behind the actual data by a substantial amount:
 [[movavg_100window]]
 .Moving average with window of size 100
-image::images/reducers_movavg/movavg_100window.png[]
+image::images/pipeline_movavg/movavg_100window.png[]
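The `simple` model described above can be sketched as follows (an illustration of equally-weighted windowed averaging, not the actual implementation):

```javascript
// Illustrative "simple" moving average: each point is the plain mean of the
// last `window` values, all weighted equally. Early points use however many
// values are available so far.
function simpleMovAvg(values, window) {
  return values.map((_, i) => {
    const slice = values.slice(Math.max(0, i - window + 1), i + 1);
    return slice.reduce((a, b) => a + b, 0) / slice.length;
  });
}

console.log(simpleMovAvg([1, 2, 3, 4, 5, 6], 3)); // last point: (4+5+6)/3 = 5
```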
 ==== Linear
@@ -143,7 +143,7 @@ will closely track the data and only smooth out small scale fluctuations:
 [[linear_10window]]
 .Linear moving average with window of size 10
-image::images/reducers_movavg/linear_10window.png[]
+image::images/pipeline_movavg/linear_10window.png[]
 In contrast, a `linear` moving average with larger window (`"window": 100`) will smooth out all higher-frequency fluctuations,
 leaving only low-frequency, long term trends. It also tends to "lag" behind the actual data by a substantial amount,
@@ -151,7 +151,7 @@ although typically less than the `simple` model:
 [[linear_100window]]
 .Linear moving average with window of size 100
-image::images/reducers_movavg/linear_100window.png[]
+image::images/pipeline_movavg/linear_100window.png[]
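A sketch of the `linear` model's weighting for a single window (illustrative only; newer points get linearly larger weights, which is why it lags less than `simple`):

```javascript
// Illustrative linearly-weighted moving average over the last `window`
// values: the oldest point gets weight 1, the newest gets weight `window`.
function linearMovAvg(values, window) {
  const slice = values.slice(-window);
  let total = 0, weightTotal = 0;
  slice.forEach((v, i) => {
    const weight = i + 1; // oldest = 1 ... newest = slice.length
    total += v * weight;
    weightTotal += weight;
  });
  return total / weightTotal;
}

console.log(linearMovAvg([1, 2, 3, 4, 5, 6], 3)); // (4*1 + 5*2 + 6*3) / 6
```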
 ==== EWMA (Exponentially Weighted)
@@ -181,11 +181,11 @@ The default value of `alpha` is `0.5`, and the setting accepts any float from 0-
 [[single_0.2alpha]]
 .Single Exponential moving average with window of size 10, alpha = 0.2
-image::images/reducers_movavg/single_0.2alpha.png[]
+image::images/pipeline_movavg/single_0.2alpha.png[]
 [[single_0.7alpha]]
 .Single Exponential moving average with window of size 10, alpha = 0.7
-image::images/reducers_movavg/single_0.7alpha.png[]
+image::images/pipeline_movavg/single_0.7alpha.png[]
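The single exponential recurrence can be sketched like this (illustrative; a higher `alpha` discounts older values faster, matching the plots above):

```javascript
// Illustrative EWMA ("single exponential"): each step blends the new value
// with the running average; older values decay by a factor of (1 - alpha).
function ewma(values, alpha) {
  let avg = values[0];
  for (let i = 1; i < values.length; i++) {
    avg = alpha * values[i] + (1 - alpha) * avg;
  }
  return avg;
}

console.log(ewma([1, 2, 3], 0.5)); // 0.5*3 + 0.5*(0.5*2 + 0.5*1) = 2.25
```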
 ==== Holt-Linear
@@ -224,11 +224,11 @@ values emphasize short-term trends. This will become more apparently when you a
 [[double_0.2beta]]
 .Double Exponential moving average with window of size 100, alpha = 0.5, beta = 0.2
-image::images/reducers_movavg/double_0.2beta.png[]
+image::images/pipeline_movavg/double_0.2beta.png[]
 [[double_0.7beta]]
 .Double Exponential moving average with window of size 100, alpha = 0.5, beta = 0.7
-image::images/reducers_movavg/double_0.7beta.png[]
+image::images/pipeline_movavg/double_0.7beta.png[]
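Holt-linear smoothing tracks two exponentially-smoothed terms, a level (`alpha`) and a trend (`beta`); a minimal sketch of the standard recurrence, not the actual Elasticsearch code:

```javascript
// Illustrative Holt-linear ("double exponential") smoothing: `alpha`
// smooths the level, `beta` smooths the trend between consecutive levels.
function holt(values, alpha, beta) {
  let level = values[0];
  let trend = values[1] - values[0];
  for (let i = 1; i < values.length; i++) {
    const lastLevel = level;
    level = alpha * values[i] + (1 - alpha) * (level + trend);
    trend = beta * (level - lastLevel) + (1 - beta) * trend;
  }
  return { level, trend };
}

// A perfectly linear series keeps level = last value and trend = slope.
const { level, trend } = holt([10, 12, 14, 16], 0.5, 0.5);
console.log(level, trend); // 16 2
```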
 ==== Prediction
@@ -256,7 +256,7 @@ of the last value in the series, producing a flat:
 [[simple_prediction]]
 .Simple moving average with window of size 10, predict = 50
-image::images/reducers_movavg/simple_prediction.png[]
+image::images/pipeline_movavg/simple_prediction.png[]
 In contrast, the `holt` model can extrapolate based on local or global constant trends. If we set a high `beta`
 value, we can extrapolate based on local constant trends (in this case the predictions head down, because the data at the end
@@ -264,11 +264,11 @@ of the series was heading in a downward direction):
 [[double_prediction_local]]
 .Double Exponential moving average with window of size 100, predict = 20, alpha = 0.5, beta = 0.8
-image::images/reducers_movavg/double_prediction_local.png[]
+image::images/pipeline_movavg/double_prediction_local.png[]
 In contrast, if we choose a small `beta`, the predictions are based on the global constant trend. In this series, the
 global trend is slightly positive, so the prediction makes a sharp u-turn and begins a positive slope:
 [[double_prediction_global]]
 .Double Exponential moving average with window of size 100, predict = 20, alpha = 0.5, beta = 0.1
-image::images/reducers_movavg/double_prediction_global.png[]
+image::images/pipeline_movavg/double_prediction_global.png[]
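Why `holt` predictions slope while `simple` predictions stay flat can be sketched as follows (illustrative: each step ahead adds one more multiple of the final trend; a model without a trend term just repeats the last level):

```javascript
// Illustrative extrapolation from a final level and trend, as a trend-aware
// model would predict. With trend = 0 (no trend term) every prediction
// equals the level, producing the flat line described for "simple".
function predictAhead(level, trend, predict) {
  const out = [];
  for (let k = 1; k <= predict; k++) out.push(level + k * trend);
  return out;
}

console.log(predictAhead(16, 2, 3)); // [18, 20, 22]
console.log(predictAhead(16, 0, 3)); // [16, 16, 16]  (flat, "simple"-style)
```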

View File

@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-sum-bucket-aggregation]]
+[[search-aggregations-pipeline-sum-bucket-aggregation]]
 === Sum Bucket Aggregation
-A sibling reducer aggregation which calculates the sum across all bucket of a specified metric in a sibling aggregation.
+A sibling pipeline aggregation which calculates the sum across all buckets of a specified metric in a sibling aggregation.
 The specified metric must be numeric and the sibling aggregation must be a multi-bucket aggregation.
 ==== Syntax

View File

(11 binary image files moved from images/reducers_movavg/ to images/pipeline_movavg/; sizes unchanged, 63-72 KiB each.)

View File

@@ -20,6 +20,7 @@
 package org.elasticsearch.action;
 import com.google.common.base.Preconditions;
 import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.action.support.PlainListenableActionFuture;
 import org.elasticsearch.client.Client;
@@ -27,7 +28,7 @@ import org.elasticsearch.client.ClusterAdminClient;
 import org.elasticsearch.client.ElasticsearchClient;
 import org.elasticsearch.client.IndicesAdminClient;
 import org.elasticsearch.common.unit.TimeValue;
-import org.elasticsearch.search.aggregations.reducers.ReducerBuilder;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorBuilder;
 import org.elasticsearch.threadpool.ThreadPool;
 /**

View File

@@ -28,9 +28,9 @@ import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.index.shard.ShardId;
 import org.elasticsearch.percolator.PercolateContext;
 import org.elasticsearch.search.aggregations.InternalAggregations;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.ReducerStreams;
-import org.elasticsearch.search.aggregations.reducers.SiblingReducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorStreams;
+import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
 import org.elasticsearch.search.highlight.HighlightField;
 import org.elasticsearch.search.query.QuerySearchResult;
@@ -56,7 +56,7 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 private int requestedSize;
 private InternalAggregations aggregations;
-private List<SiblingReducer> reducers;
+private List<SiblingPipelineAggregator> pipelineAggregators;
 PercolateShardResponse() {
 hls = new ArrayList<>();
@@ -75,7 +75,7 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 if (result.aggregations() != null) {
 this.aggregations = (InternalAggregations) result.aggregations();
 }
-this.reducers = result.reducers();
+this.pipelineAggregators = result.pipelineAggregators();
 }
 }
@@ -119,8 +119,8 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 return aggregations;
 }
-public List<SiblingReducer> reducers() {
-return reducers;
+public List<SiblingPipelineAggregator> pipelineAggregators() {
+return pipelineAggregators;
 }
 public byte percolatorTypeId() {
@@ -156,14 +156,14 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 }
 aggregations = InternalAggregations.readOptionalAggregations(in);
 if (in.readBoolean()) {
-int reducersSize = in.readVInt();
-List<SiblingReducer> reducers = new ArrayList<>(reducersSize);
-for (int i = 0; i < reducersSize; i++) {
+int pipelineAggregatorsSize = in.readVInt();
+List<SiblingPipelineAggregator> pipelineAggregators = new ArrayList<>(pipelineAggregatorsSize);
+for (int i = 0; i < pipelineAggregatorsSize; i++) {
 BytesReference type = in.readBytesReference();
-Reducer reducer = ReducerStreams.stream(type).readResult(in);
-reducers.add((SiblingReducer) reducer);
+PipelineAggregator pipelineAggregator = PipelineAggregatorStreams.stream(type).readResult(in);
+pipelineAggregators.add((SiblingPipelineAggregator) pipelineAggregator);
 }
-this.reducers = reducers;
+this.pipelineAggregators = pipelineAggregators;
 }
 }
@@ -190,14 +190,14 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 }
 }
 out.writeOptionalStreamable(aggregations);
-if (reducers == null) {
+if (pipelineAggregators == null) {
 out.writeBoolean(false);
 } else {
 out.writeBoolean(true);
-out.writeVInt(reducers.size());
-for (Reducer reducer : reducers) {
-out.writeBytesReference(reducer.type().stream());
-reducer.writeTo(out);
+out.writeVInt(pipelineAggregators.size());
+for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
+out.writeBytesReference(pipelineAggregator.type().stream());
+pipelineAggregator.writeTo(out);
 }
 }
 }

View File

@@ -86,8 +86,8 @@ import org.elasticsearch.search.aggregations.AggregationPhase;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregation.ReduceContext;
 import org.elasticsearch.search.aggregations.InternalAggregations;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.SiblingReducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
 import org.elasticsearch.search.highlight.HighlightField;
 import org.elasticsearch.search.highlight.HighlightPhase;
 import org.elasticsearch.search.internal.SearchContext;
@@ -852,11 +852,11 @@ public class PercolatorService extends AbstractComponent {
 }
 InternalAggregations aggregations = InternalAggregations.reduce(aggregationsList, new ReduceContext(bigArrays, scriptService));
 if (aggregations != null) {
-List<SiblingReducer> reducers = shardResults.get(0).reducers();
-if (reducers != null) {
-List<InternalAggregation> newAggs = new ArrayList<>(Lists.transform(aggregations.asList(), Reducer.AGGREGATION_TRANFORM_FUNCTION));
-for (SiblingReducer reducer : reducers) {
-InternalAggregation newAgg = reducer.doReduce(new InternalAggregations(newAggs), new ReduceContext(bigArrays,
+List<SiblingPipelineAggregator> pipelineAggregators = shardResults.get(0).pipelineAggregators();
+if (pipelineAggregators != null) {
+List<InternalAggregation> newAggs = new ArrayList<>(Lists.transform(aggregations.asList(), PipelineAggregator.AGGREGATION_TRANFORM_FUNCTION));
+for (SiblingPipelineAggregator pipelineAggregator : pipelineAggregators) {
+InternalAggregation newAgg = pipelineAggregator.doReduce(new InternalAggregations(newAggs), new ReduceContext(bigArrays,
 scriptService));
 newAggs.add(newAgg);
 }

View File

@@ -56,14 +56,14 @@ import org.elasticsearch.search.aggregations.metrics.stats.extended.ExtendedStat
 import org.elasticsearch.search.aggregations.metrics.sum.SumParser;
 import org.elasticsearch.search.aggregations.metrics.tophits.TopHitsParser;
 import org.elasticsearch.search.aggregations.metrics.valuecount.ValueCountParser;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.avg.AvgBucketParser;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.max.MaxBucketParser;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.min.MinBucketParser;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.sum.SumBucketParser;
-import org.elasticsearch.search.aggregations.reducers.derivative.DerivativeParser;
-import org.elasticsearch.search.aggregations.reducers.movavg.MovAvgParser;
-import org.elasticsearch.search.aggregations.reducers.movavg.models.MovAvgModelModule;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.avg.AvgBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.max.MaxBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.min.MinBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.sum.SumBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.derivative.DerivativeParser;
+import org.elasticsearch.search.aggregations.pipeline.movavg.MovAvgParser;
+import org.elasticsearch.search.aggregations.pipeline.movavg.models.MovAvgModelModule;
 import java.util.List;
@@ -73,7 +73,7 @@ import java.util.List;
 public class AggregationModule extends AbstractModule implements SpawnModules{
 private List<Class<? extends Aggregator.Parser>> aggParsers = Lists.newArrayList();
-private List<Class<? extends Reducer.Parser>> reducerParsers = Lists.newArrayList();
+private List<Class<? extends PipelineAggregator.Parser>> pipelineAggParsers = Lists.newArrayList();
 public AggregationModule() {
 aggParsers.add(AvgParser.class);
@@ -108,12 +108,12 @@ public class AggregationModule extends AbstractModule implements SpawnModules{
 aggParsers.add(ScriptedMetricParser.class);
 aggParsers.add(ChildrenParser.class);
-reducerParsers.add(DerivativeParser.class);
-reducerParsers.add(MaxBucketParser.class);
-reducerParsers.add(MinBucketParser.class);
-reducerParsers.add(AvgBucketParser.class);
-reducerParsers.add(SumBucketParser.class);
-reducerParsers.add(MovAvgParser.class);
+pipelineAggParsers.add(DerivativeParser.class);
+pipelineAggParsers.add(MaxBucketParser.class);
+pipelineAggParsers.add(MinBucketParser.class);
+pipelineAggParsers.add(AvgBucketParser.class);
+pipelineAggParsers.add(SumBucketParser.class);
+pipelineAggParsers.add(MovAvgParser.class);
 }
 /**
@@ -131,9 +131,9 @@ public class AggregationModule extends AbstractModule implements SpawnModules{
 for (Class<? extends Aggregator.Parser> parser : aggParsers) {
 multibinderAggParser.addBinding().to(parser);
 }
-Multibinder<Reducer.Parser> multibinderReducerParser = Multibinder.newSetBinder(binder(), Reducer.Parser.class);
-for (Class<? extends Reducer.Parser> parser : reducerParsers) {
-multibinderReducerParser.addBinding().to(parser);
+Multibinder<PipelineAggregator.Parser> multibinderPipelineAggParser = Multibinder.newSetBinder(binder(), PipelineAggregator.Parser.class);
+for (Class<? extends PipelineAggregator.Parser> parser : pipelineAggParsers) {
+multibinderPipelineAggParser.addBinding().to(parser);
 }
 bind(AggregatorParsers.class).asEagerSingleton();
 bind(AggregationParseElement.class).asEagerSingleton();
@ -23,14 +23,13 @@ import com.google.common.collect.ImmutableMap;
import org.apache.lucene.search.BooleanClause.Occur; import org.apache.lucene.search.BooleanClause.Occur;
import org.apache.lucene.search.BooleanQuery; import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.Query; import org.apache.lucene.search.Query;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.inject.Inject; import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.lucene.search.Queries; import org.elasticsearch.common.lucene.search.Queries;
import org.elasticsearch.search.SearchParseElement; import org.elasticsearch.search.SearchParseElement;
import org.elasticsearch.search.SearchPhase; import org.elasticsearch.search.SearchPhase;
import org.elasticsearch.search.aggregations.bucket.global.GlobalAggregator; import org.elasticsearch.search.aggregations.bucket.global.GlobalAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.SiblingReducer; import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.internal.SearchContext; import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.search.query.QueryPhaseExecutionException; import org.elasticsearch.search.query.QueryPhaseExecutionException;
@ -145,19 +144,20 @@ public class AggregationPhase implements SearchPhase {
} }
context.queryResult().aggregations(new InternalAggregations(aggregations)); context.queryResult().aggregations(new InternalAggregations(aggregations));
try { try {
List<Reducer> reducers = context.aggregations().factories().createReducers(); List<PipelineAggregator> pipelineAggregators = context.aggregations().factories().createPipelineAggregators();
List<SiblingReducer> siblingReducers = new ArrayList<>(reducers.size()); List<SiblingPipelineAggregator> siblingPipelineAggregators = new ArrayList<>(pipelineAggregators.size());
for (Reducer reducer : reducers) { for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
if (reducer instanceof SiblingReducer) { if (pipelineAggregator instanceof SiblingPipelineAggregator) {
siblingReducers.add((SiblingReducer) reducer); siblingPipelineAggregators.add((SiblingPipelineAggregator) pipelineAggregator);
} else { } else {
throw new AggregationExecutionException("Invalid reducer named [" + reducer.name() + "] of type [" throw new AggregationExecutionException("Invalid pipeline aggregation named [" + pipelineAggregator.name()
+ reducer.type().name() + "]. Only sibling reducers are allowed at the top level"); + "] of type [" + pipelineAggregator.type().name()
+ "]. Only sibling pipeline aggregations are allowed at the top level");
} }
} }
context.queryResult().reducers(siblingReducers); context.queryResult().pipelineAggregators(siblingPipelineAggregators);
} catch (IOException e) { } catch (IOException e) {
throw new AggregationExecutionException("Failed to build top level reducers", e); throw new AggregationExecutionException("Failed to build top level pipeline aggregators", e);
} }
// disable aggregations so that they don't run on next pages in case of scrolling // disable aggregations so that they don't run on next pages in case of scrolling
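The loop above enforces that only sibling pipeline aggregations may appear at the top level of a request; anything that needs a parent aggregation (a derivative inside a histogram, say) is rejected with an `AggregationExecutionException`. A reduced sketch of that check, using hypothetical marker interfaces rather than the real Elasticsearch classes:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical marker types standing in for PipelineAggregator and
// SiblingPipelineAggregator in AggregationPhase.
interface PipelineAgg { String name(); }
interface SiblingPipelineAgg extends PipelineAgg {}

class TopLevelCheck {
    // Keeps only sibling pipeline aggregations; rejects everything else,
    // mirroring the instanceof filter in AggregationPhase above.
    static List<SiblingPipelineAgg> onlySiblings(List<PipelineAgg> aggs) {
        List<SiblingPipelineAgg> siblings = new ArrayList<>(aggs.size());
        for (PipelineAgg agg : aggs) {
            if (agg instanceof SiblingPipelineAgg sibling) {
                siblings.add(sibling);
            } else {
                throw new IllegalArgumentException("Invalid pipeline aggregation named ["
                        + agg.name() + "]. Only sibling pipeline aggregations are allowed at the top level");
            }
        }
        return siblings;
    }
}
```

A `max_bucket` over a sibling histogram passes this filter; a parent-level aggregation such as a derivative does not, because it has no enclosing bucket aggregation to operate on at the top level.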
@ -21,7 +21,7 @@ package org.elasticsearch.search.aggregations;
import org.apache.lucene.index.LeafReaderContext; import org.apache.lucene.index.LeafReaderContext;
import org.elasticsearch.search.aggregations.bucket.BestBucketsDeferringCollector; import org.elasticsearch.search.aggregations.bucket.BestBucketsDeferringCollector;
import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector; import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.internal.SearchContext.Lifetime; import org.elasticsearch.search.internal.SearchContext.Lifetime;
import org.elasticsearch.search.query.QueryPhaseExecutionException; import org.elasticsearch.search.query.QueryPhaseExecutionException;
@ -47,7 +47,7 @@ public abstract class AggregatorBase extends Aggregator {
private Map<String, Aggregator> subAggregatorbyName; private Map<String, Aggregator> subAggregatorbyName;
private DeferringBucketCollector recordingWrapper; private DeferringBucketCollector recordingWrapper;
private final List<Reducer> reducers; private final List<PipelineAggregator> pipelineAggregators;
/** /**
* Constructs a new Aggregator. * Constructs a new Aggregator.
@ -59,9 +59,9 @@ public abstract class AggregatorBase extends Aggregator {
* @param metaData The metaData associated with this aggregator * @param metaData The metaData associated with this aggregator
*/ */
protected AggregatorBase(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent, protected AggregatorBase(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
this.name = name; this.name = name;
this.reducers = reducers; this.pipelineAggregators = pipelineAggregators;
this.metaData = metaData; this.metaData = metaData;
this.parent = parent; this.parent = parent;
this.context = context; this.context = context;
@ -116,8 +116,8 @@ public abstract class AggregatorBase extends Aggregator {
return this.metaData; return this.metaData;
} }
public List<Reducer> reducers() { public List<PipelineAggregator> pipelineAggregators() {
return this.reducers; return this.pipelineAggregators;
} }
/** /**
@ -18,8 +18,8 @@
*/ */
package org.elasticsearch.search.aggregations; package org.elasticsearch.search.aggregations;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.ReducerFactory; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorFactory;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.AggregationPath; import org.elasticsearch.search.aggregations.support.AggregationPath;
@ -41,23 +41,23 @@ public class AggregatorFactories {
private AggregatorFactory parent; private AggregatorFactory parent;
private AggregatorFactory[] factories; private AggregatorFactory[] factories;
private List<ReducerFactory> reducerFactories; private List<PipelineAggregatorFactory> pipelineAggregatorFactories;
public static Builder builder() { public static Builder builder() {
return new Builder(); return new Builder();
} }
private AggregatorFactories(AggregatorFactory[] factories, List<ReducerFactory> reducers) { private AggregatorFactories(AggregatorFactory[] factories, List<PipelineAggregatorFactory> pipelineAggregators) {
this.factories = factories; this.factories = factories;
this.reducerFactories = reducers; this.pipelineAggregatorFactories = pipelineAggregators;
} }
public List<Reducer> createReducers() throws IOException { public List<PipelineAggregator> createPipelineAggregators() throws IOException {
List<Reducer> reducers = new ArrayList<>(); List<PipelineAggregator> pipelineAggregators = new ArrayList<>();
for (ReducerFactory factory : this.reducerFactories) { for (PipelineAggregatorFactory factory : this.pipelineAggregatorFactories) {
reducers.add(factory.create()); pipelineAggregators.add(factory.create());
} }
return reducers; return pipelineAggregators;
} }
/** /**
@ -103,8 +103,8 @@ public class AggregatorFactories {
for (AggregatorFactory factory : factories) { for (AggregatorFactory factory : factories) {
factory.validate(); factory.validate();
} }
for (ReducerFactory factory : reducerFactories) { for (PipelineAggregatorFactory factory : pipelineAggregatorFactories) {
factory.validate(parent, factories, reducerFactories); factory.validate(parent, factories, pipelineAggregatorFactories);
} }
} }
@ -112,10 +112,10 @@ public class AggregatorFactories {
private static final AggregatorFactory[] EMPTY_FACTORIES = new AggregatorFactory[0]; private static final AggregatorFactory[] EMPTY_FACTORIES = new AggregatorFactory[0];
private static final Aggregator[] EMPTY_AGGREGATORS = new Aggregator[0]; private static final Aggregator[] EMPTY_AGGREGATORS = new Aggregator[0];
private static final List<ReducerFactory> EMPTY_REDUCERS = new ArrayList<>(); private static final List<PipelineAggregatorFactory> EMPTY_PIPELINE_AGGREGATORS = new ArrayList<>();
private Empty() { private Empty() {
super(EMPTY_FACTORIES, EMPTY_REDUCERS); super(EMPTY_FACTORIES, EMPTY_PIPELINE_AGGREGATORS);
} }
@Override @Override
@ -134,7 +134,7 @@ public class AggregatorFactories {
private final Set<String> names = new HashSet<>(); private final Set<String> names = new HashSet<>();
private final List<AggregatorFactory> factories = new ArrayList<>(); private final List<AggregatorFactory> factories = new ArrayList<>();
private final List<ReducerFactory> reducerFactories = new ArrayList<>(); private final List<PipelineAggregatorFactory> pipelineAggregatorFactories = new ArrayList<>();
public Builder addAggregator(AggregatorFactory factory) { public Builder addAggregator(AggregatorFactory factory) {
if (!names.add(factory.name)) { if (!names.add(factory.name)) {
@ -144,43 +144,43 @@ public class AggregatorFactories {
return this; return this;
} }
public Builder addReducer(ReducerFactory reducerFactory) { public Builder addPipelineAggregator(PipelineAggregatorFactory pipelineAggregatorFactory) {
this.reducerFactories.add(reducerFactory); this.pipelineAggregatorFactories.add(pipelineAggregatorFactory);
return this; return this;
} }
public AggregatorFactories build() { public AggregatorFactories build() {
if (factories.isEmpty() && reducerFactories.isEmpty()) { if (factories.isEmpty() && pipelineAggregatorFactories.isEmpty()) {
return EMPTY; return EMPTY;
} }
List<ReducerFactory> orderedReducers = resolveReducerOrder(this.reducerFactories, this.factories); List<PipelineAggregatorFactory> orderedPipelineAggregators = resolvePipelineAggregatorOrder(this.pipelineAggregatorFactories, this.factories);
return new AggregatorFactories(factories.toArray(new AggregatorFactory[factories.size()]), orderedReducers); return new AggregatorFactories(factories.toArray(new AggregatorFactory[factories.size()]), orderedPipelineAggregators);
} }
private List<ReducerFactory> resolveReducerOrder(List<ReducerFactory> reducerFactories, List<AggregatorFactory> aggFactories) { private List<PipelineAggregatorFactory> resolvePipelineAggregatorOrder(List<PipelineAggregatorFactory> pipelineAggregatorFactories, List<AggregatorFactory> aggFactories) {
Map<String, ReducerFactory> reducerFactoriesMap = new HashMap<>(); Map<String, PipelineAggregatorFactory> pipelineAggregatorFactoriesMap = new HashMap<>();
for (ReducerFactory factory : reducerFactories) { for (PipelineAggregatorFactory factory : pipelineAggregatorFactories) {
reducerFactoriesMap.put(factory.getName(), factory); pipelineAggregatorFactoriesMap.put(factory.getName(), factory);
} }
Set<String> aggFactoryNames = new HashSet<>(); Set<String> aggFactoryNames = new HashSet<>();
for (AggregatorFactory aggFactory : aggFactories) { for (AggregatorFactory aggFactory : aggFactories) {
aggFactoryNames.add(aggFactory.name); aggFactoryNames.add(aggFactory.name);
} }
List<ReducerFactory> orderedReducers = new LinkedList<>(); List<PipelineAggregatorFactory> orderedPipelineAggregators = new LinkedList<>();
List<ReducerFactory> unmarkedFactories = new ArrayList<ReducerFactory>(reducerFactories); List<PipelineAggregatorFactory> unmarkedFactories = new ArrayList<PipelineAggregatorFactory>(pipelineAggregatorFactories);
Set<ReducerFactory> temporarilyMarked = new HashSet<ReducerFactory>(); Set<PipelineAggregatorFactory> temporarilyMarked = new HashSet<PipelineAggregatorFactory>();
while (!unmarkedFactories.isEmpty()) { while (!unmarkedFactories.isEmpty()) {
ReducerFactory factory = unmarkedFactories.get(0); PipelineAggregatorFactory factory = unmarkedFactories.get(0);
resolveReducerOrder(aggFactoryNames, reducerFactoriesMap, orderedReducers, unmarkedFactories, temporarilyMarked, factory); resolvePipelineAggregatorOrder(aggFactoryNames, pipelineAggregatorFactoriesMap, orderedPipelineAggregators, unmarkedFactories, temporarilyMarked, factory);
} }
return orderedReducers; return orderedPipelineAggregators;
} }
private void resolveReducerOrder(Set<String> aggFactoryNames, Map<String, ReducerFactory> reducerFactoriesMap, private void resolvePipelineAggregatorOrder(Set<String> aggFactoryNames, Map<String, PipelineAggregatorFactory> pipelineAggregatorFactoriesMap,
List<ReducerFactory> orderedReducers, List<ReducerFactory> unmarkedFactories, Set<ReducerFactory> temporarilyMarked, List<PipelineAggregatorFactory> orderedPipelineAggregators, List<PipelineAggregatorFactory> unmarkedFactories, Set<PipelineAggregatorFactory> temporarilyMarked,
ReducerFactory factory) { PipelineAggregatorFactory factory) {
if (temporarilyMarked.contains(factory)) { if (temporarilyMarked.contains(factory)) {
throw new IllegalStateException("Cyclical dependancy found with reducer [" + factory.getName() + "]"); throw new IllegalStateException("Cyclical dependency found with pipeline aggregator [" + factory.getName() + "]");
} else if (unmarkedFactories.contains(factory)) { } else if (unmarkedFactories.contains(factory)) {
temporarilyMarked.add(factory); temporarilyMarked.add(factory);
String[] bucketsPaths = factory.getBucketsPaths(); String[] bucketsPaths = factory.getBucketsPaths();
@ -190,9 +190,9 @@ public class AggregatorFactories {
if (bucketsPath.equals("_count") || bucketsPath.equals("_key") || aggFactoryNames.contains(firstAggName)) { if (bucketsPath.equals("_count") || bucketsPath.equals("_key") || aggFactoryNames.contains(firstAggName)) {
continue; continue;
} else { } else {
ReducerFactory matchingFactory = reducerFactoriesMap.get(firstAggName); PipelineAggregatorFactory matchingFactory = pipelineAggregatorFactoriesMap.get(firstAggName);
if (matchingFactory != null) { if (matchingFactory != null) {
resolveReducerOrder(aggFactoryNames, reducerFactoriesMap, orderedReducers, unmarkedFactories, resolvePipelineAggregatorOrder(aggFactoryNames, pipelineAggregatorFactoriesMap, orderedPipelineAggregators, unmarkedFactories,
temporarilyMarked, matchingFactory); temporarilyMarked, matchingFactory);
} else { } else {
throw new IllegalStateException("No aggregation found for path [" + bucketsPath + "]"); throw new IllegalStateException("No aggregation found for path [" + bucketsPath + "]");
@ -201,7 +201,7 @@ public class AggregatorFactories {
} }
unmarkedFactories.remove(factory); unmarkedFactories.remove(factory);
temporarilyMarked.remove(factory); temporarilyMarked.remove(factory);
orderedReducers.add(factory); orderedPipelineAggregators.add(factory);
} }
} }
} }
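The `resolvePipelineAggregatorOrder` logic above is a depth-first topological sort over `buckets_path` references: each pipeline aggregator is emitted only after every aggregation it reads from, and a temporary-mark set detects cycles. A minimal standalone sketch of the same idea, using hypothetical simplified types rather than the real Elasticsearch factory classes:

```java
import java.util.*;

// Hypothetical, simplified stand-in for PipelineAggregatorFactory:
// a name plus the aggregation names it reads via buckets_path.
record Factory(String name, List<String> bucketsPaths) {}

class PipelineOrder {
    // Orders factories so every factory appears after its dependencies.
    // Throws IllegalStateException on a cycle or a dangling path,
    // mirroring resolvePipelineAggregatorOrder above.
    static List<Factory> resolve(List<Factory> factories, Set<String> aggNames) {
        Map<String, Factory> byName = new HashMap<>();
        for (Factory f : factories) byName.put(f.name(), f);
        List<Factory> ordered = new LinkedList<>();
        List<Factory> unmarked = new ArrayList<>(factories);
        Set<Factory> temporarilyMarked = new HashSet<>();
        while (!unmarked.isEmpty()) {
            visit(unmarked.get(0), aggNames, byName, ordered, unmarked, temporarilyMarked);
        }
        return ordered;
    }

    private static void visit(Factory f, Set<String> aggNames, Map<String, Factory> byName,
                              List<Factory> ordered, List<Factory> unmarked, Set<Factory> temp) {
        if (temp.contains(f)) {
            throw new IllegalStateException("Cyclical dependency found with pipeline aggregator [" + f.name() + "]");
        } else if (unmarked.contains(f)) {
            temp.add(f);
            for (String path : f.bucketsPaths()) {
                // first path element names the aggregation this factory depends on
                String first = path.split(">")[0].split("\\.")[0];
                if (path.equals("_count") || path.equals("_key") || aggNames.contains(first)) {
                    continue; // depends on a regular aggregation; nothing to order
                }
                Factory dep = byName.get(first);
                if (dep == null) {
                    throw new IllegalStateException("No aggregation found for path [" + path + "]");
                }
                visit(dep, aggNames, byName, ordered, unmarked, temp);
            }
            unmarked.remove(f);
            temp.remove(f);
            ordered.add(f);
        }
    }
}
```

For example, a moving average whose `buckets_path` points at a derivative is ordered after that derivative, while a cycle between two pipeline aggregators fails fast instead of recursing forever.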
@ -23,7 +23,7 @@ import org.apache.lucene.search.Scorer;
import org.elasticsearch.common.lease.Releasables; import org.elasticsearch.common.lease.Releasables;
import org.elasticsearch.common.util.BigArrays; import org.elasticsearch.common.util.BigArrays;
import org.elasticsearch.common.util.ObjectArray; import org.elasticsearch.common.util.ObjectArray;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.internal.SearchContext.Lifetime; import org.elasticsearch.search.internal.SearchContext.Lifetime;
@ -86,7 +86,7 @@ public abstract class AggregatorFactory {
} }
protected abstract Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket, protected abstract Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException; List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException;
/** /**
* Creates the aggregator * Creates the aggregator
@ -99,7 +99,7 @@ public abstract class AggregatorFactory {
* @return The created aggregator * @return The created aggregator
*/ */
public final Aggregator create(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket) throws IOException { public final Aggregator create(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket) throws IOException {
return createInternal(context, parent, collectsFromSingleBucket, this.factories.createReducers(), this.metaData); return createInternal(context, parent, collectsFromSingleBucket, this.factories.createPipelineAggregators(), this.metaData);
} }
public void doValidate() { public void doValidate() {
@ -24,8 +24,8 @@ import org.elasticsearch.common.collect.MapBuilder;
import org.elasticsearch.common.inject.Inject; import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.xcontent.XContentParser; import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.search.SearchParseException; import org.elasticsearch.search.SearchParseException;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.ReducerFactory; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorFactory;
import org.elasticsearch.search.internal.SearchContext; import org.elasticsearch.search.internal.SearchContext;
import java.io.IOException; import java.io.IOException;
@ -41,7 +41,7 @@ public class AggregatorParsers {
public static final Pattern VALID_AGG_NAME = Pattern.compile("[^\\[\\]>]+"); public static final Pattern VALID_AGG_NAME = Pattern.compile("[^\\[\\]>]+");
private final ImmutableMap<String, Aggregator.Parser> aggParsers; private final ImmutableMap<String, Aggregator.Parser> aggParsers;
private final ImmutableMap<String, Reducer.Parser> reducerParsers; private final ImmutableMap<String, PipelineAggregator.Parser> pipelineAggregatorParsers;
/** /**
@ -53,17 +53,17 @@ public class AggregatorParsers {
* ). * ).
*/ */
@Inject @Inject
public AggregatorParsers(Set<Aggregator.Parser> aggParsers, Set<Reducer.Parser> reducerParsers) { public AggregatorParsers(Set<Aggregator.Parser> aggParsers, Set<PipelineAggregator.Parser> pipelineAggregatorParsers) {
MapBuilder<String, Aggregator.Parser> aggParsersBuilder = MapBuilder.newMapBuilder(); MapBuilder<String, Aggregator.Parser> aggParsersBuilder = MapBuilder.newMapBuilder();
for (Aggregator.Parser parser : aggParsers) { for (Aggregator.Parser parser : aggParsers) {
aggParsersBuilder.put(parser.type(), parser); aggParsersBuilder.put(parser.type(), parser);
} }
this.aggParsers = aggParsersBuilder.immutableMap(); this.aggParsers = aggParsersBuilder.immutableMap();
MapBuilder<String, Reducer.Parser> reducerParsersBuilder = MapBuilder.newMapBuilder(); MapBuilder<String, PipelineAggregator.Parser> pipelineAggregatorParsersBuilder = MapBuilder.newMapBuilder();
for (Reducer.Parser parser : reducerParsers) { for (PipelineAggregator.Parser parser : pipelineAggregatorParsers) {
reducerParsersBuilder.put(parser.type(), parser); pipelineAggregatorParsersBuilder.put(parser.type(), parser);
} }
this.reducerParsers = reducerParsersBuilder.immutableMap(); this.pipelineAggregatorParsers = pipelineAggregatorParsersBuilder.immutableMap();
} }
/** /**
@ -77,14 +77,15 @@ public class AggregatorParsers {
} }
/** /**
* Returns the parser that is registered under the given reducer type. * Returns the parser that is registered under the given pipeline aggregator
* type.
* *
* @param type * @param type
* The reducer type * The pipeline aggregator type
* @return The parser associated with the given reducer type. * @return The parser associated with the given pipeline aggregator type.
*/ */
public Reducer.Parser reducer(String type) { public PipelineAggregator.Parser pipelineAggregator(String type) {
return reducerParsers.get(type); return pipelineAggregatorParsers.get(type);
} }
/** /**
@ -125,7 +126,7 @@ public class AggregatorParsers {
} }
AggregatorFactory aggFactory = null; AggregatorFactory aggFactory = null;
ReducerFactory reducerFactory = null; PipelineAggregatorFactory pipelineAggregatorFactory = null;
AggregatorFactories subFactories = null; AggregatorFactories subFactories = null;
Map<String, Object> metaData = null; Map<String, Object> metaData = null;
@ -161,20 +162,19 @@ public class AggregatorParsers {
throw new SearchParseException(context, "Found two aggregation type definitions in [" + aggregationName + "]: [" throw new SearchParseException(context, "Found two aggregation type definitions in [" + aggregationName + "]: ["
+ aggFactory.type + "] and [" + fieldName + "]", parser.getTokenLocation()); + aggFactory.type + "] and [" + fieldName + "]", parser.getTokenLocation());
} }
if (reducerFactory != null) { if (pipelineAggregatorFactory != null) {
// TODO we would need a .type property on reducers too for this error message?
throw new SearchParseException(context, "Found two aggregation type definitions in [" + aggregationName + "]: [" throw new SearchParseException(context, "Found two aggregation type definitions in [" + aggregationName + "]: ["
+ reducerFactory + "] and [" + fieldName + "]", parser.getTokenLocation()); + pipelineAggregatorFactory + "] and [" + fieldName + "]", parser.getTokenLocation());
} }
Aggregator.Parser aggregatorParser = parser(fieldName); Aggregator.Parser aggregatorParser = parser(fieldName);
if (aggregatorParser == null) { if (aggregatorParser == null) {
Reducer.Parser reducerParser = reducer(fieldName); PipelineAggregator.Parser pipelineAggregatorParser = pipelineAggregator(fieldName);
if (reducerParser == null) { if (pipelineAggregatorParser == null) {
throw new SearchParseException(context, "Could not find aggregator type [" + fieldName + "] in [" throw new SearchParseException(context, "Could not find aggregator type [" + fieldName + "] in ["
+ aggregationName + "]", parser.getTokenLocation()); + aggregationName + "]", parser.getTokenLocation());
} else { } else {
reducerFactory = reducerParser.parse(aggregationName, parser, context); pipelineAggregatorFactory = pipelineAggregatorParser.parse(aggregationName, parser, context);
} }
} else { } else {
aggFactory = aggregatorParser.parse(aggregationName, parser, context); aggFactory = aggregatorParser.parse(aggregationName, parser, context);
@ -182,11 +182,11 @@ public class AggregatorParsers {
} }
} }
if (aggFactory == null && reducerFactory == null) { if (aggFactory == null && pipelineAggregatorFactory == null) {
throw new SearchParseException(context, "Missing definition for aggregation [" + aggregationName + "]", throw new SearchParseException(context, "Missing definition for aggregation [" + aggregationName + "]",
parser.getTokenLocation()); parser.getTokenLocation());
} else if (aggFactory != null) { } else if (aggFactory != null) {
assert reducerFactory == null; assert pipelineAggregatorFactory == null;
if (metaData != null) { if (metaData != null) {
aggFactory.setMetaData(metaData); aggFactory.setMetaData(metaData);
} }
@ -201,12 +201,12 @@ public class AggregatorParsers {
factories.addAggregator(aggFactory); factories.addAggregator(aggFactory);
} else { } else {
assert reducerFactory != null; assert pipelineAggregatorFactory != null;
if (subFactories != null) { if (subFactories != null) {
throw new SearchParseException(context, "Aggregation [" + aggregationName + "] cannot define sub-aggregations", throw new SearchParseException(context, "Aggregation [" + aggregationName + "] cannot define sub-aggregations",
parser.getTokenLocation()); parser.getTokenLocation());
} }
factories.addReducer(reducerFactory); factories.addPipelineAggregator(pipelineAggregatorFactory);
} }
} }
@ -31,8 +31,8 @@ import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentBuilderString; import org.elasticsearch.common.xcontent.XContentBuilderString;
import org.elasticsearch.script.ScriptService; import org.elasticsearch.script.ScriptService;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.ReducerStreams; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorStreams;
import org.elasticsearch.search.aggregations.support.AggregationPath; import org.elasticsearch.search.aggregations.support.AggregationPath;
import java.io.IOException; import java.io.IOException;
@ -115,7 +115,7 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
protected Map<String, Object> metaData; protected Map<String, Object> metaData;
private List<Reducer> reducers; private List<PipelineAggregator> pipelineAggregators;
/** Constructs an uninitialized aggregation (used for serialization) **/ /** Constructs an uninitialized aggregation (used for serialization) **/
protected InternalAggregation() {} protected InternalAggregation() {}
@ -125,9 +125,9 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
* *
* @param name The name of the get. * @param name The name of the get.
*/ */
protected InternalAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) { protected InternalAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
this.name = name; this.name = name;
this.reducers = reducers; this.pipelineAggregators = pipelineAggregators;
this.metaData = metaData; this.metaData = metaData;
} }
@ -149,8 +149,8 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
*/ */
public final InternalAggregation reduce(List<InternalAggregation> aggregations, ReduceContext reduceContext) { public final InternalAggregation reduce(List<InternalAggregation> aggregations, ReduceContext reduceContext) {
InternalAggregation aggResult = doReduce(aggregations, reduceContext); InternalAggregation aggResult = doReduce(aggregations, reduceContext);
for (Reducer reducer : reducers) { for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
aggResult = reducer.reduce(aggResult, reduceContext); aggResult = pipelineAggregator.reduce(aggResult, reduceContext);
} }
return aggResult; return aggResult;
} }
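The `reduce` method above first merges the shard-level results via `doReduce`, then applies each pipeline aggregator, in the order resolved at parse time, to the merged output. A hypothetical sketch of that two-phase contract, with aggregation results reduced to plain doubles for illustration:

```java
import java.util.List;
import java.util.function.BinaryOperator;
import java.util.function.UnaryOperator;

// Hypothetical stand-ins: a shard result is just a double here, and a
// pipeline aggregator is a transform run after the primary reduce.
class ReduceSketch {
    static double reduce(List<Double> shardResults,
                         BinaryOperator<Double> doReduce,
                         List<UnaryOperator<Double>> pipelineAggregators) {
        // 1. merge the per-shard results (mirrors doReduce above)
        double result = shardResults.stream().reduce(doReduce).orElseThrow();
        // 2. then run every pipeline aggregator over the merged output
        for (UnaryOperator<Double> p : pipelineAggregators) {
            result = p.apply(result);
        }
        return result;
    }
}
```

The ordering matters: pipeline aggregators never see raw shard results, only the fully reduced aggregation tree, which is why they run in the final-reduction phase.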
@ -188,8 +188,8 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
return metaData; return metaData;
} }
public List<Reducer> reducers() { public List<PipelineAggregator> pipelineAggregators() {
return reducers; return pipelineAggregators;
} }
@Override @Override
@ -210,10 +210,10 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
public final void writeTo(StreamOutput out) throws IOException { public final void writeTo(StreamOutput out) throws IOException {
out.writeString(name); out.writeString(name);
out.writeGenericValue(metaData); out.writeGenericValue(metaData);
out.writeVInt(reducers.size()); out.writeVInt(pipelineAggregators.size());
for (Reducer reducer : reducers) { for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
out.writeBytesReference(reducer.type().stream()); out.writeBytesReference(pipelineAggregator.type().stream());
reducer.writeTo(out); pipelineAggregator.writeTo(out);
} }
doWriteTo(out); doWriteTo(out);
} }
@ -226,13 +226,13 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
metaData = in.readMap(); metaData = in.readMap();
int size = in.readVInt(); int size = in.readVInt();
if (size == 0) { if (size == 0) {
reducers = ImmutableList.of(); pipelineAggregators = ImmutableList.of();
} else { } else {
reducers = Lists.newArrayListWithCapacity(size); pipelineAggregators = Lists.newArrayListWithCapacity(size);
for (int i = 0; i < size; i++) { for (int i = 0; i < size; i++) {
BytesReference type = in.readBytesReference(); BytesReference type = in.readBytesReference();
Reducer reducer = ReducerStreams.stream(type).readResult(in); PipelineAggregator pipelineAggregator = PipelineAggregatorStreams.stream(type).readResult(in);
reducers.add(reducer); pipelineAggregators.add(pipelineAggregator);
} }
} }
doReadFrom(in); doReadFrom(in);
@ -20,7 +20,7 @@
package org.elasticsearch.search.aggregations; package org.elasticsearch.search.aggregations;
import org.elasticsearch.search.aggregations.bucket.MultiBucketsAggregation; import org.elasticsearch.search.aggregations.bucket.MultiBucketsAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
@ -31,8 +31,8 @@ public abstract class InternalMultiBucketAggregation<A extends InternalMultiBuck
public InternalMultiBucketAggregation() { public InternalMultiBucketAggregation() {
} }
public InternalMultiBucketAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) { public InternalMultiBucketAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, reducers, metaData); super(name, pipelineAggregators, metaData);
} }
/** /**
@ -20,7 +20,7 @@
package org.elasticsearch.search.aggregations; package org.elasticsearch.search.aggregations;
import org.apache.lucene.index.LeafReaderContext; import org.apache.lucene.index.LeafReaderContext;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException; import java.io.IOException;
@ -34,13 +34,13 @@ import java.util.Map;
public abstract class NonCollectingAggregator extends AggregatorBase { public abstract class NonCollectingAggregator extends AggregatorBase {
protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent, AggregatorFactories subFactories, protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent, AggregatorFactories subFactories,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, subFactories, context, parent, reducers, metaData); super(name, subFactories, context, parent, pipelineAggregators, metaData);
} }
protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers, protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent,
Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
this(name, context, parent, AggregatorFactories.EMPTY, reducers, metaData); this(name, context, parent, AggregatorFactories.EMPTY, pipelineAggregators, metaData);
} }
@Override @Override

View File

@@ -59,16 +59,16 @@ import org.elasticsearch.search.aggregations.metrics.stats.extended.InternalExte
import org.elasticsearch.search.aggregations.metrics.sum.InternalSum; import org.elasticsearch.search.aggregations.metrics.sum.InternalSum;
import org.elasticsearch.search.aggregations.metrics.tophits.InternalTopHits; import org.elasticsearch.search.aggregations.metrics.tophits.InternalTopHits;
import org.elasticsearch.search.aggregations.metrics.valuecount.InternalValueCount; import org.elasticsearch.search.aggregations.metrics.valuecount.InternalValueCount;
import org.elasticsearch.search.aggregations.reducers.InternalSimpleValue; import org.elasticsearch.search.aggregations.pipeline.InternalSimpleValue;
import org.elasticsearch.search.aggregations.reducers.bucketmetrics.InternalBucketMetricValue; import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.InternalBucketMetricValue;
import org.elasticsearch.search.aggregations.reducers.bucketmetrics.avg.AvgBucketReducer; import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.avg.AvgBucketPipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.bucketmetrics.max.MaxBucketReducer; import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.max.MaxBucketPipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.bucketmetrics.min.MinBucketReducer; import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.min.MinBucketPipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.bucketmetrics.sum.SumBucketReducer; import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.sum.SumBucketPipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.derivative.DerivativeReducer; import org.elasticsearch.search.aggregations.pipeline.derivative.DerivativePipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.derivative.InternalDerivative; import org.elasticsearch.search.aggregations.pipeline.derivative.InternalDerivative;
import org.elasticsearch.search.aggregations.reducers.movavg.MovAvgReducer; import org.elasticsearch.search.aggregations.pipeline.movavg.MovAvgPipelineAggregator;
import org.elasticsearch.search.aggregations.reducers.movavg.models.TransportMovAvgModelModule; import org.elasticsearch.search.aggregations.pipeline.movavg.models.TransportMovAvgModelModule;
/** /**
* A module that registers all the transport streams for the addAggregation * A module that registers all the transport streams for the addAggregation
@@ -117,16 +117,16 @@ public class TransportAggregationModule extends AbstractModule implements SpawnM
InternalGeoBounds.registerStream(); InternalGeoBounds.registerStream();
InternalChildren.registerStream(); InternalChildren.registerStream();
// Reducers // Pipeline Aggregations
DerivativeReducer.registerStreams(); DerivativePipelineAggregator.registerStreams();
InternalDerivative.registerStreams(); InternalDerivative.registerStreams();
InternalSimpleValue.registerStreams(); InternalSimpleValue.registerStreams();
InternalBucketMetricValue.registerStreams(); InternalBucketMetricValue.registerStreams();
MaxBucketReducer.registerStreams(); MaxBucketPipelineAggregator.registerStreams();
MinBucketReducer.registerStreams(); MinBucketPipelineAggregator.registerStreams();
AvgBucketReducer.registerStreams(); AvgBucketPipelineAggregator.registerStreams();
SumBucketReducer.registerStreams(); SumBucketPipelineAggregator.registerStreams();
MovAvgReducer.registerStreams(); MovAvgPipelineAggregator.registerStreams();
} }
@Override @Override

View File

@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.InternalAggregation; import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException; import java.io.IOException;
@@ -44,8 +44,8 @@ public abstract class BucketsAggregator extends AggregatorBase {
private IntArray docCounts; private IntArray docCounts;
public BucketsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent, public BucketsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, reducers, metaData); super(name, factories, context, parent, pipelineAggregators, metaData);
bigArrays = context.bigArrays(); bigArrays = context.bigArrays();
docCounts = bigArrays.newIntArray(1, true); docCounts = bigArrays.newIntArray(1, true);
} }

View File

@@ -23,7 +23,7 @@ import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.aggregations.InternalAggregation; import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
@@ -47,8 +47,8 @@ public abstract class InternalSingleBucketAggregation extends InternalAggregatio
* @param docCount The document count in the single bucket. * @param docCount The document count in the single bucket.
* @param aggregations The already built sub-aggregations that are associated with the bucket. * @param aggregations The already built sub-aggregations that are associated with the bucket.
*/ */
protected InternalSingleBucketAggregation(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, Map<String, Object> metaData) { protected InternalSingleBucketAggregation(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, reducers, metaData); super(name, pipelineAggregators, metaData);
this.docCount = docCount; this.docCount = docCount;
this.aggregations = aggregations; this.aggregations = aggregations;
} }

View File

@@ -20,7 +20,7 @@ package org.elasticsearch.search.aggregations.bucket;
import org.elasticsearch.search.aggregations.Aggregator; import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories; import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException; import java.io.IOException;
@@ -34,8 +34,8 @@ public abstract class SingleBucketAggregator extends BucketsAggregator {
protected SingleBucketAggregator(String name, AggregatorFactories factories, protected SingleBucketAggregator(String name, AggregatorFactories factories,
AggregationContext aggregationContext, Aggregator parent, AggregationContext aggregationContext, Aggregator parent,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
} }
} }

View File

@@ -23,7 +23,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams; import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation; import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@@ -51,9 +51,9 @@ public class InternalChildren extends InternalSingleBucketAggregation implements
public InternalChildren() { public InternalChildren() {
} }
public InternalChildren(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, public InternalChildren(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) { Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData); super(name, docCount, aggregations, pipelineAggregators, metaData);
} }
@Override @Override
@@ -63,6 +63,6 @@ public class InternalChildren extends InternalSingleBucketAggregation implements
@Override @Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) { protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalChildren(name, docCount, subAggregations, reducers(), getMetaData()); return new InternalChildren(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
} }
} }

View File

@@ -36,7 +36,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.NonCollectingAggregator; import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator; import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource; import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory; import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -74,8 +74,8 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {
public ParentToChildrenAggregator(String name, AggregatorFactories factories, AggregationContext aggregationContext, public ParentToChildrenAggregator(String name, AggregatorFactories factories, AggregationContext aggregationContext,
Aggregator parent, String parentType, Filter childFilter, Filter parentFilter, Aggregator parent, String parentType, Filter childFilter, Filter parentFilter,
ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource, ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource,
long maxOrd, List<Reducer> reducers, Map<String, Object> metaData) throws IOException { long maxOrd, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.parentType = parentType; this.parentType = parentType;
// these two filters are cached in the parser // these two filters are cached in the parser
this.childFilter = aggregationContext.searchContext().searcher().createNormalizedWeight(childFilter, false); this.childFilter = aggregationContext.searchContext().searcher().createNormalizedWeight(childFilter, false);
@@ -88,13 +88,13 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {
@Override @Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException { public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalChildren(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(), return new InternalChildren(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData()); metaData());
} }
@Override @Override
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
return new InternalChildren(name, 0, buildEmptySubAggregations(), reducers(), metaData()); return new InternalChildren(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
} }
@Override @Override
@@ -196,13 +196,13 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {
} }
@Override @Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
return new NonCollectingAggregator(name, aggregationContext, parent, reducers, metaData) { return new NonCollectingAggregator(name, aggregationContext, parent, pipelineAggregators, metaData) {
@Override @Override
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
return new InternalChildren(name, 0, buildEmptySubAggregations(), reducers(), metaData()); return new InternalChildren(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
} }
}; };
@@ -210,11 +210,11 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {
@Override @Override
protected Aggregator doCreateInternal(ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource, protected Aggregator doCreateInternal(ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource,
AggregationContext aggregationContext, Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, AggregationContext aggregationContext, Aggregator parent, boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
long maxOrd = valuesSource.globalMaxOrd(aggregationContext.searchContext().searcher(), parentType); long maxOrd = valuesSource.globalMaxOrd(aggregationContext.searchContext().searcher(), parentType);
return new ParentToChildrenAggregator(name, factories, aggregationContext, parent, parentType, childFilter, parentFilter, return new ParentToChildrenAggregator(name, factories, aggregationContext, parent, parentType, childFilter, parentFilter,
valuesSource, maxOrd, reducers, metaData); valuesSource, maxOrd, pipelineAggregators, metaData);
} }
} }

View File

@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase; import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator; import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException; import java.io.IOException;
@@ -48,9 +48,9 @@ public class FilterAggregator extends SingleBucketAggregator {
Query filter, Query filter,
AggregatorFactories factories, AggregatorFactories factories,
AggregationContext aggregationContext, AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.filter = aggregationContext.searchContext().searcher().createNormalizedWeight(filter, false); this.filter = aggregationContext.searchContext().searcher().createNormalizedWeight(filter, false);
} }
@@ -71,13 +71,13 @@ public class FilterAggregator extends SingleBucketAggregator {
@Override @Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException { public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalFilter(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(), return new InternalFilter(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData()); metaData());
} }
@Override @Override
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
return new InternalFilter(name, 0, buildEmptySubAggregations(), reducers(), metaData()); return new InternalFilter(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
} }
public static class Factory extends AggregatorFactory { public static class Factory extends AggregatorFactory {
@@ -91,8 +91,8 @@ public class FilterAggregator extends SingleBucketAggregator {
@Override @Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket, public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new FilterAggregator(name, filter, factories, context, parent, reducers, metaData); return new FilterAggregator(name, filter, factories, context, parent, pipelineAggregators, metaData);
} }
} }

View File

@@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams; import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation; import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@@ -50,8 +50,8 @@ public class InternalFilter extends InternalSingleBucketAggregation implements F
InternalFilter() {} // for serialization InternalFilter() {} // for serialization
InternalFilter(String name, long docCount, InternalAggregations subAggregations, List<Reducer> reducers, Map<String, Object> metaData) { InternalFilter(String name, long docCount, InternalAggregations subAggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, subAggregations, reducers, metaData); super(name, docCount, subAggregations, pipelineAggregators, metaData);
} }
@Override @Override
@@ -61,6 +61,6 @@ public class InternalFilter extends InternalSingleBucketAggregation implements F
@Override @Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) { protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalFilter(name, docCount, subAggregations, reducers(), getMetaData()); return new InternalFilter(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
} }
} }

View File

@@ -34,7 +34,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase; import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator; import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException; import java.io.IOException;
@@ -62,9 +62,9 @@ public class FiltersAggregator extends BucketsAggregator {
private final boolean keyed; private final boolean keyed;
public FiltersAggregator(String name, AggregatorFactories factories, List<KeyedFilter> filters, boolean keyed, AggregationContext aggregationContext, public FiltersAggregator(String name, AggregatorFactories factories, List<KeyedFilter> filters, boolean keyed, AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException { throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.keyed = keyed; this.keyed = keyed;
this.keys = new String[filters.size()]; this.keys = new String[filters.size()];
this.filters = new Weight[filters.size()]; this.filters = new Weight[filters.size()];
@@ -103,7 +103,7 @@ public class FiltersAggregator extends BucketsAggregator {
InternalFilters.Bucket bucket = new InternalFilters.Bucket(keys[i], bucketDocCount(bucketOrd), bucketAggregations(bucketOrd), keyed); InternalFilters.Bucket bucket = new InternalFilters.Bucket(keys[i], bucketDocCount(bucketOrd), bucketAggregations(bucketOrd), keyed);
buckets.add(bucket); buckets.add(bucket);
} }
return new InternalFilters(name, buckets, keyed, reducers(), metaData()); return new InternalFilters(name, buckets, keyed, pipelineAggregators(), metaData());
} }
@Override @Override
@@ -114,7 +114,7 @@ public class FiltersAggregator extends BucketsAggregator {
InternalFilters.Bucket bucket = new InternalFilters.Bucket(keys[i], 0, subAggs, keyed); InternalFilters.Bucket bucket = new InternalFilters.Bucket(keys[i], 0, subAggs, keyed);
buckets.add(bucket); buckets.add(bucket);
} }
return new InternalFilters(name, buckets, keyed, reducers(), metaData()); return new InternalFilters(name, buckets, keyed, pipelineAggregators(), metaData());
} }
final long bucketOrd(long owningBucketOrdinal, int filterOrd) { final long bucketOrd(long owningBucketOrdinal, int filterOrd) {
@@ -134,8 +134,8 @@ public class FiltersAggregator extends BucketsAggregator {
@Override @Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket, public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new FiltersAggregator(name, factories, filters, keyed, context, parent, reducers, metaData); return new FiltersAggregator(name, factories, filters, keyed, context, parent, pipelineAggregators, metaData);
} }
} }

View File

@@ -32,7 +32,7 @@ import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation.InternalBucket; import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation.InternalBucket;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext; import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams; import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
@@ -165,8 +165,8 @@ public class InternalFilters extends InternalMultiBucketAggregation<InternalFilt
public InternalFilters() {} // for serialization public InternalFilters() {} // for serialization
public InternalFilters(String name, List<Bucket> buckets, boolean keyed, List<Reducer> reducers, Map<String, Object> metaData) { public InternalFilters(String name, List<Bucket> buckets, boolean keyed, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, reducers, metaData); super(name, pipelineAggregators, metaData);
this.buckets = buckets; this.buckets = buckets;
this.keyed = keyed; this.keyed = keyed;
} }
@@ -178,7 +178,7 @@ public class InternalFilters extends InternalMultiBucketAggregation<InternalFilt
@Override @Override
public InternalFilters create(List<Bucket> buckets) { public InternalFilters create(List<Bucket> buckets) {
return new InternalFilters(this.name, buckets, this.keyed, this.reducers(), this.metaData); return new InternalFilters(this.name, buckets, this.keyed, this.pipelineAggregators(), this.metaData);
} }
@Override @Override
@@ -222,7 +222,7 @@ public class InternalFilters extends InternalMultiBucketAggregation<InternalFilt
} }
} }
InternalFilters reduced = new InternalFilters(name, new ArrayList<Bucket>(bucketsList.size()), keyed, reducers(), getMetaData()); InternalFilters reduced = new InternalFilters(name, new ArrayList<Bucket>(bucketsList.size()), keyed, pipelineAggregators(), getMetaData());
for (List<Bucket> sameRangeList : bucketsList) { for (List<Bucket> sameRangeList : bucketsList) {
reduced.buckets.add((sameRangeList.get(0)).reduce(sameRangeList, reduceContext)); reduced.buckets.add((sameRangeList.get(0)).reduce(sameRangeList, reduceContext));
} }

View File

@@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator; import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource; import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -51,9 +51,9 @@ public class GeoHashGridAggregator extends BucketsAggregator {
private final LongHash bucketOrds; private final LongHash bucketOrds;
public GeoHashGridAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, public GeoHashGridAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource,
int requiredSize, int shardSize, AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, int requiredSize, int shardSize, AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource; this.valuesSource = valuesSource;
this.requiredSize = requiredSize; this.requiredSize = requiredSize;
this.shardSize = shardSize; this.shardSize = shardSize;
@@ -129,12 +129,12 @@ public class GeoHashGridAggregator extends BucketsAggregator {
bucket.aggregations = bucketAggregations(bucket.bucketOrd); bucket.aggregations = bucketAggregations(bucket.bucketOrd);
list[i] = bucket; list[i] = bucket;
} }
return new InternalGeoHashGrid(name, requiredSize, Arrays.asList(list), reducers(), metaData()); return new InternalGeoHashGrid(name, requiredSize, Arrays.asList(list), pipelineAggregators(), metaData());
} }
@Override @Override
public InternalGeoHashGrid buildEmptyAggregation() { public InternalGeoHashGrid buildEmptyAggregation() {
return new InternalGeoHashGrid(name, requiredSize, Collections.<InternalGeoHashGrid.Bucket> emptyList(), reducers(), metaData()); return new InternalGeoHashGrid(name, requiredSize, Collections.<InternalGeoHashGrid.Bucket> emptyList(), pipelineAggregators(), metaData());
} }

View File

@@ -34,7 +34,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactory;
import org.elasticsearch.search.aggregations.InternalAggregation; import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.NonCollectingAggregator; import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.BucketUtils; import org.elasticsearch.search.aggregations.bucket.BucketUtils;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource; import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory; import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -125,11 +125,11 @@ public class GeoHashGridParser implements Aggregator.Parser {
} }
@Override @Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
final InternalAggregation aggregation = new InternalGeoHashGrid(name, requiredSize, final InternalAggregation aggregation = new InternalGeoHashGrid(name, requiredSize,
Collections.<InternalGeoHashGrid.Bucket> emptyList(), reducers, metaData); Collections.<InternalGeoHashGrid.Bucket> emptyList(), pipelineAggregators, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, reducers, metaData) { return new NonCollectingAggregator(name, aggregationContext, parent, pipelineAggregators, metaData) {
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
return aggregation; return aggregation;
} }
@ -138,13 +138,13 @@ public class GeoHashGridParser implements Aggregator.Parser {
@Override @Override
protected Aggregator doCreateInternal(final ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext, protected Aggregator doCreateInternal(final ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext,
Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) Aggregator parent, boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException { throws IOException {
if (collectsFromSingleBucket == false) { if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent); return asMultiBucketAggregator(this, aggregationContext, parent);
} }
ValuesSource.Numeric cellIdSource = new CellIdSource(valuesSource, precision); ValuesSource.Numeric cellIdSource = new CellIdSource(valuesSource, precision);
return new GeoHashGridAggregator(name, factories, cellIdSource, requiredSize, shardSize, aggregationContext, parent, reducers, return new GeoHashGridAggregator(name, factories, cellIdSource, requiredSize, shardSize, aggregationContext, parent, pipelineAggregators,
metaData); metaData);
} }


@ -32,7 +32,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation; import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext; import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams; import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
@ -171,9 +171,9 @@ public class InternalGeoHashGrid extends InternalMultiBucketAggregation<Internal
InternalGeoHashGrid() { InternalGeoHashGrid() {
} // for serialization } // for serialization
public InternalGeoHashGrid(String name, int requiredSize, Collection<Bucket> buckets, List<Reducer> reducers, public InternalGeoHashGrid(String name, int requiredSize, Collection<Bucket> buckets, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) { Map<String, Object> metaData) {
super(name, reducers, metaData); super(name, pipelineAggregators, metaData);
this.requiredSize = requiredSize; this.requiredSize = requiredSize;
this.buckets = buckets; this.buckets = buckets;
} }
@ -185,7 +185,7 @@ public class InternalGeoHashGrid extends InternalMultiBucketAggregation<Internal
@Override @Override
public InternalGeoHashGrid create(List<Bucket> buckets) { public InternalGeoHashGrid create(List<Bucket> buckets) {
return new InternalGeoHashGrid(this.name, this.requiredSize, buckets, this.reducers(), this.metaData); return new InternalGeoHashGrid(this.name, this.requiredSize, buckets, this.pipelineAggregators(), this.metaData);
} }
@Override @Override
@ -229,7 +229,7 @@ public class InternalGeoHashGrid extends InternalMultiBucketAggregation<Internal
for (int i = ordered.size() - 1; i >= 0; i--) { for (int i = ordered.size() - 1; i >= 0; i--) {
list[i] = ordered.pop(); list[i] = ordered.pop();
} }
return new InternalGeoHashGrid(getName(), requiredSize, Arrays.asList(list), reducers(), getMetaData()); return new InternalGeoHashGrid(getName(), requiredSize, Arrays.asList(list), pipelineAggregators(), getMetaData());
} }
@Override @Override


@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase; import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator; import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException; import java.io.IOException;
@ -39,9 +39,9 @@ import java.util.Map;
*/ */
public class GlobalAggregator extends SingleBucketAggregator { public class GlobalAggregator extends SingleBucketAggregator {
public GlobalAggregator(String name, AggregatorFactories subFactories, AggregationContext aggregationContext, List<Reducer> reducers, public GlobalAggregator(String name, AggregatorFactories subFactories, AggregationContext aggregationContext, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
super(name, subFactories, aggregationContext, null, reducers, metaData); super(name, subFactories, aggregationContext, null, pipelineAggregators, metaData);
} }
@Override @Override
@ -59,7 +59,7 @@ public class GlobalAggregator extends SingleBucketAggregator {
@Override @Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException { public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
assert owningBucketOrdinal == 0 : "global aggregator can only be a top level aggregator"; assert owningBucketOrdinal == 0 : "global aggregator can only be a top level aggregator";
return new InternalGlobal(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(), return new InternalGlobal(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData()); metaData());
} }
@ -76,7 +76,7 @@ public class GlobalAggregator extends SingleBucketAggregator {
@Override @Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket, public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
if (parent != null) { if (parent != null) {
throw new AggregationExecutionException("Aggregation [" + parent.name() + "] cannot have a global " + throw new AggregationExecutionException("Aggregation [" + parent.name() + "] cannot have a global " +
"sub-aggregation [" + name + "]. Global aggregations can only be defined as top level aggregations"); "sub-aggregation [" + name + "]. Global aggregations can only be defined as top level aggregations");
@ -84,7 +84,7 @@ public class GlobalAggregator extends SingleBucketAggregator {
if (collectsFromSingleBucket == false) { if (collectsFromSingleBucket == false) {
throw new IllegalStateException(); throw new IllegalStateException();
} }
return new GlobalAggregator(name, factories, context, reducers, metaData); return new GlobalAggregator(name, factories, context, pipelineAggregators, metaData);
} }
} }


@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams; import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation; import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@ -51,8 +51,8 @@ public class InternalGlobal extends InternalSingleBucketAggregation implements G
InternalGlobal() {} // for serialization InternalGlobal() {} // for serialization
InternalGlobal(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, Map<String, Object> metaData) { InternalGlobal(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData); super(name, docCount, aggregations, pipelineAggregators, metaData);
} }
@Override @Override
@ -62,6 +62,6 @@ public class InternalGlobal extends InternalSingleBucketAggregation implements G
@Override @Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) { protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalGlobal(name, docCount, subAggregations, reducers(), getMetaData()); return new InternalGlobal(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
} }
} }
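For context on why the new name fits better than "reducer": a pipeline aggregation consumes the *output* of other aggregations rather than document sets. A toy sketch of one such computation, a derivative over already-computed bucket values — illustrative only, and not the real `PipelineAggregator` contract, which is considerably richer:

```java
// Illustrative only: a toy "derivative" pipeline step over the doc counts of
// already-computed histogram buckets. Aggregation output in, aggregation
// output out -- no documents are touched.
final class DerivativeSketch {
    // Given bucket values [v0, v1, ...], returns [v1 - v0, v2 - v1, ...].
    static double[] derivative(double[] bucketValues) {
        double[] out = new double[Math.max(bucketValues.length - 1, 0)];
        for (int i = 1; i < bucketValues.length; i++) {
            out[i - 1] = bucketValues[i] - bucketValues[i - 1];
        }
        return out;
    }
}
```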


@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase; import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator; import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource; import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory; import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -62,9 +62,9 @@ public class HistogramAggregator extends BucketsAggregator {
boolean keyed, long minDocCount, @Nullable ExtendedBounds extendedBounds, boolean keyed, long minDocCount, @Nullable ExtendedBounds extendedBounds,
@Nullable ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter, @Nullable ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter,
InternalHistogram.Factory<?> histogramFactory, AggregationContext aggregationContext, InternalHistogram.Factory<?> histogramFactory, AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException { Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.rounding = rounding; this.rounding = rounding;
this.order = order; this.order = order;
this.keyed = keyed; this.keyed = keyed;
@ -130,13 +130,13 @@ public class HistogramAggregator extends BucketsAggregator {
// value source will be null for unmapped fields // value source will be null for unmapped fields
InternalHistogram.EmptyBucketInfo emptyBucketInfo = minDocCount == 0 ? new InternalHistogram.EmptyBucketInfo(rounding, buildEmptySubAggregations(), extendedBounds) : null; InternalHistogram.EmptyBucketInfo emptyBucketInfo = minDocCount == 0 ? new InternalHistogram.EmptyBucketInfo(rounding, buildEmptySubAggregations(), extendedBounds) : null;
return histogramFactory.create(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, reducers(), metaData()); return histogramFactory.create(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, pipelineAggregators(), metaData());
} }
@Override @Override
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
InternalHistogram.EmptyBucketInfo emptyBucketInfo = minDocCount == 0 ? new InternalHistogram.EmptyBucketInfo(rounding, buildEmptySubAggregations(), extendedBounds) : null; InternalHistogram.EmptyBucketInfo emptyBucketInfo = minDocCount == 0 ? new InternalHistogram.EmptyBucketInfo(rounding, buildEmptySubAggregations(), extendedBounds) : null;
return histogramFactory.create(name, Collections.emptyList(), order, minDocCount, emptyBucketInfo, formatter, keyed, reducers(), return histogramFactory.create(name, Collections.emptyList(), order, minDocCount, emptyBucketInfo, formatter, keyed, pipelineAggregators(),
metaData()); metaData());
} }
@ -172,15 +172,15 @@ public class HistogramAggregator extends BucketsAggregator {
} }
@Override @Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
return new HistogramAggregator(name, factories, rounding, order, keyed, minDocCount, null, null, config.formatter(), return new HistogramAggregator(name, factories, rounding, order, keyed, minDocCount, null, null, config.formatter(),
histogramFactory, aggregationContext, parent, reducers, metaData); histogramFactory, aggregationContext, parent, pipelineAggregators, metaData);
} }
@Override @Override
protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent, protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException { boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
if (collectsFromSingleBucket == false) { if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent); return asMultiBucketAggregator(this, aggregationContext, parent);
} }
@ -194,7 +194,7 @@ public class HistogramAggregator extends BucketsAggregator {
roundedBounds = extendedBounds.round(rounding); roundedBounds = extendedBounds.round(rounding);
} }
return new HistogramAggregator(name, factories, rounding, order, keyed, minDocCount, roundedBounds, valuesSource, return new HistogramAggregator(name, factories, rounding, order, keyed, minDocCount, roundedBounds, valuesSource,
config.formatter(), histogramFactory, aggregationContext, parent, reducers, metaData); config.formatter(), histogramFactory, aggregationContext, parent, pipelineAggregators, metaData);
} }
} }


@ -37,7 +37,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation; import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext; import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams; import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter; import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams; import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@ -243,14 +243,16 @@ public class InternalHistogram<B extends InternalHistogram.Bucket> extends Inter
} }
public InternalHistogram<B> create(String name, List<B> buckets, InternalOrder order, long minDocCount, public InternalHistogram<B> create(String name, List<B> buckets, InternalOrder order, long minDocCount,
EmptyBucketInfo emptyBucketInfo, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers, EmptyBucketInfo emptyBucketInfo, @Nullable ValueFormatter formatter, boolean keyed,
List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) { Map<String, Object> metaData) {
return new InternalHistogram<>(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, this, reducers, metaData); return new InternalHistogram<>(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, this, pipelineAggregators,
metaData);
} }
public InternalHistogram<B> create(List<B> buckets, InternalHistogram<B> prototype) { public InternalHistogram<B> create(List<B> buckets, InternalHistogram<B> prototype) {
return new InternalHistogram<>(prototype.name, buckets, prototype.order, prototype.minDocCount, prototype.emptyBucketInfo, return new InternalHistogram<>(prototype.name, buckets, prototype.order, prototype.minDocCount, prototype.emptyBucketInfo,
prototype.formatter, prototype.keyed, this, prototype.reducers(), prototype.metaData); prototype.formatter, prototype.keyed, this, prototype.pipelineAggregators(), prototype.metaData);
} }
public B createBucket(InternalAggregations aggregations, B prototype) { public B createBucket(InternalAggregations aggregations, B prototype) {
@ -284,8 +286,9 @@ public class InternalHistogram<B extends InternalHistogram.Bucket> extends Inter
InternalHistogram(String name, List<B> buckets, InternalOrder order, long minDocCount, InternalHistogram(String name, List<B> buckets, InternalOrder order, long minDocCount,
EmptyBucketInfo emptyBucketInfo, EmptyBucketInfo emptyBucketInfo,
@Nullable ValueFormatter formatter, boolean keyed, Factory<B> factory, List<Reducer> reducers, Map<String, Object> metaData) { @Nullable ValueFormatter formatter, boolean keyed, Factory<B> factory, List<PipelineAggregator> pipelineAggregators,
super(name, reducers, metaData); Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
this.buckets = buckets; this.buckets = buckets;
this.order = order; this.order = order;
assert (minDocCount == 0) == (emptyBucketInfo != null); assert (minDocCount == 0) == (emptyBucketInfo != null);
@ -470,7 +473,7 @@ public class InternalHistogram<B extends InternalHistogram.Bucket> extends Inter
CollectionUtil.introSort(reducedBuckets, order.comparator()); CollectionUtil.introSort(reducedBuckets, order.comparator());
} }
return getFactory().create(getName(), reducedBuckets, order, minDocCount, emptyBucketInfo, formatter, keyed, reducers(), return getFactory().create(getName(), reducedBuckets, order, minDocCount, emptyBucketInfo, formatter, keyed, pipelineAggregators(),
getMetaData()); getMetaData());
} }
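The histogram's reduce path merges shard-level buckets and then re-creates the result via `getFactory().create(..., pipelineAggregators(), getMetaData())`, so the pipeline aggregators survive the reduce. A simplified sketch of that merge-then-rebuild shape, using hypothetical stand-in types rather than the real ones:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in: a shard result carries bucket counts plus the names
// of the pipeline aggregators attached at build time. reduce() sums buckets
// element-wise and copies the pipeline list from the first shard result,
// mirroring how the rebuilt InternalHistogram keeps pipelineAggregators().
final class ShardHistogram {
    final List<Long> bucketCounts;
    final List<String> pipelineAggregators;

    ShardHistogram(List<Long> bucketCounts, List<String> pipelineAggregators) {
        this.bucketCounts = bucketCounts;
        this.pipelineAggregators = pipelineAggregators;
    }

    // Assumes every shard reports the same number of buckets, for brevity.
    static ShardHistogram reduce(List<ShardHistogram> shards) {
        int n = shards.get(0).bucketCounts.size();
        List<Long> merged = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            long sum = 0;
            for (ShardHistogram shard : shards) {
                sum += shard.bucketCounts.get(i);
            }
            merged.add(sum);
        }
        // The reduced result keeps the same pipeline aggregators as its inputs.
        return new ShardHistogram(merged, shards.get(0).pipelineAggregators);
    }
}
```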


@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams; import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation; import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@ -52,8 +52,8 @@ public class InternalMissing extends InternalSingleBucketAggregation implements
InternalMissing() { InternalMissing() {
} }
InternalMissing(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, Map<String, Object> metaData) { InternalMissing(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData); super(name, docCount, aggregations, pipelineAggregators, metaData);
} }
@Override @Override
@ -63,6 +63,6 @@ public class InternalMissing extends InternalSingleBucketAggregation implements
@Override @Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) { protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalMissing(name, docCount, subAggregations, reducers(), getMetaData()); return new InternalMissing(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
} }
} }


@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector; import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase; import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator; import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext; import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource; import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory; import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -44,9 +44,9 @@ public class MissingAggregator extends SingleBucketAggregator {
private final ValuesSource valuesSource; private final ValuesSource valuesSource;
public MissingAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource, public MissingAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData); super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource; this.valuesSource = valuesSource;
} }
@ -72,13 +72,13 @@ public class MissingAggregator extends SingleBucketAggregator {
@Override @Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException { public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalMissing(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(), return new InternalMissing(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData()); metaData());
} }
@Override @Override
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
return new InternalMissing(name, 0, buildEmptySubAggregations(), reducers(), metaData()); return new InternalMissing(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
} }
public static class Factory extends ValuesSourceAggregatorFactory<ValuesSource> { public static class Factory extends ValuesSourceAggregatorFactory<ValuesSource> {
@ -88,15 +88,15 @@ public class MissingAggregator extends SingleBucketAggregator {
} }
@Override @Override
protected MissingAggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, protected MissingAggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException { Map<String, Object> metaData) throws IOException {
return new MissingAggregator(name, factories, null, aggregationContext, parent, reducers, metaData); return new MissingAggregator(name, factories, null, aggregationContext, parent, pipelineAggregators, metaData);
} }
@Override @Override
protected MissingAggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent, protected MissingAggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException { boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new MissingAggregator(name, factories, valuesSource, aggregationContext, parent, reducers, metaData); return new MissingAggregator(name, factories, valuesSource, aggregationContext, parent, pipelineAggregators, metaData);
} }
} }


@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams; import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation; import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@ -51,9 +51,9 @@ public class InternalNested extends InternalSingleBucketAggregation implements N
public InternalNested() { public InternalNested() {
} }
public InternalNested(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, public InternalNested(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) { Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData); super(name, docCount, aggregations, pipelineAggregators, metaData);
} }
@Override @Override
@ -63,6 +63,6 @@ public class InternalNested extends InternalSingleBucketAggregation implements N
@Override @Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) { protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalNested(name, docCount, subAggregations, reducers(), getMetaData()); return new InternalNested(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
} }
} }


@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams; import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations; import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation; import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer; import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@ -51,9 +51,9 @@ public class InternalReverseNested extends InternalSingleBucketAggregation imple
public InternalReverseNested() { public InternalReverseNested() {
} }
public InternalReverseNested(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, public InternalReverseNested(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) { Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData); super(name, docCount, aggregations, pipelineAggregators, metaData);
} }
@Override @Override
@ -63,6 +63,6 @@ public class InternalReverseNested extends InternalSingleBucketAggregation imple
@Override @Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) { protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalReverseNested(name, docCount, subAggregations, reducers(), getMetaData()); return new InternalReverseNested(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
} }
} }


@@ -38,7 +38,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.NonCollectingAggregator;
 import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 
 import java.io.IOException;
@@ -56,8 +56,8 @@ public class NestedAggregator extends SingleBucketAggregator {
     private DocIdSetIterator childDocs;
     private BitSet parentDocs;
 
-    public NestedAggregator(String name, AggregatorFactories factories, ObjectMapper objectMapper, AggregationContext aggregationContext, Aggregator parentAggregator, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, factories, aggregationContext, parentAggregator, reducers, metaData);
+    public NestedAggregator(String name, AggregatorFactories factories, ObjectMapper objectMapper, AggregationContext aggregationContext, Aggregator parentAggregator, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, factories, aggregationContext, parentAggregator, pipelineAggregators, metaData);
         childFilter = objectMapper.nestedTypeFilter();
     }
@@ -121,13 +121,13 @@ public class NestedAggregator extends SingleBucketAggregator {
 
     @Override
     public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
-        return new InternalNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
+        return new InternalNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
                 metaData());
     }
 
     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
+        return new InternalNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
     }
 
     private static Filter findClosestNestedPath(Aggregator parent) {
@@ -152,34 +152,34 @@ public class NestedAggregator extends SingleBucketAggregator {
 
         @Override
         public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
-                List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
            if (collectsFromSingleBucket == false) {
                 return asMultiBucketAggregator(this, context, parent);
             }
             MapperService.SmartNameObjectMapper mapper = context.searchContext().smartNameObjectMapper(path);
             if (mapper == null) {
-                return new Unmapped(name, context, parent, reducers, metaData);
+                return new Unmapped(name, context, parent, pipelineAggregators, metaData);
             }
             ObjectMapper objectMapper = mapper.mapper();
             if (objectMapper == null) {
-                return new Unmapped(name, context, parent, reducers, metaData);
+                return new Unmapped(name, context, parent, pipelineAggregators, metaData);
             }
             if (!objectMapper.nested().isNested()) {
                 throw new AggregationExecutionException("[nested] nested path [" + path + "] is not nested");
             }
-            return new NestedAggregator(name, factories, objectMapper, context, parent, reducers, metaData);
+            return new NestedAggregator(name, factories, objectMapper, context, parent, pipelineAggregators, metaData);
         }
 
         private final static class Unmapped extends NonCollectingAggregator {
 
-            public Unmapped(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
+            public Unmapped(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
                     throws IOException {
-                super(name, context, parent, reducers, metaData);
+                super(name, context, parent, pipelineAggregators, metaData);
             }
 
             @Override
             public InternalAggregation buildEmptyAggregation() {
-                return new InternalNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
+                return new InternalNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
             }
         }
     }
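The rename above is purely mechanical: every aggregator that used to be constructed with a `List<Reducer>` and expose it via `reducers()` now takes a `List<PipelineAggregator>` and exposes it via `pipelineAggregators()`. A minimal, self-contained sketch of the renamed shape is below; the classes here (`AggregatorBase`, `NestedLikeAggregator`, `PipelineRenameSketch`) are simplified stand-ins for illustration, not the real Elasticsearch types.

```java
import java.util.Collections;
import java.util.List;

// Stand-in for org.elasticsearch.search.aggregations.pipeline.PipelineAggregator
// (formerly org.elasticsearch.search.aggregations.reducers.Reducer).
interface PipelineAggregator {
    String name();
}

// Mirrors the constructor/accessor pattern in the diff: an aggregator is built
// with a List<PipelineAggregator> and hands it back via pipelineAggregators().
abstract class AggregatorBase {
    private final String name;
    private final List<PipelineAggregator> pipelineAggregators;

    AggregatorBase(String name, List<PipelineAggregator> pipelineAggregators) {
        this.name = name;
        this.pipelineAggregators = pipelineAggregators;
    }

    String name() {
        return name;
    }

    // Formerly reducers() before the rename.
    List<PipelineAggregator> pipelineAggregators() {
        return pipelineAggregators;
    }
}

// Hypothetical concrete aggregator, analogous to NestedAggregator above.
class NestedLikeAggregator extends AggregatorBase {
    NestedLikeAggregator(String name, List<PipelineAggregator> pipelineAggregators) {
        super(name, pipelineAggregators);
    }
}

public class PipelineRenameSketch {
    public static void main(String[] args) {
        AggregatorBase agg = new NestedLikeAggregator("nested", Collections.emptyList());
        System.out.println(agg.name() + " carries " + agg.pipelineAggregators().size() + " pipeline aggregators");
    }
}
```

Keeping the list on the base class is why the rename touches so many files: each subclass constructor merely forwards the renamed parameter to `super(...)`.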


@@ -40,7 +40,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.NonCollectingAggregator;
 import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 
 import java.io.IOException;
@@ -55,9 +55,9 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
 
     private final BitDocIdSetFilter parentFilter;
 
     public ReverseNestedAggregator(String name, AggregatorFactories factories, ObjectMapper objectMapper,
-            AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
+            AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
             throws IOException {
-        super(name, factories, aggregationContext, parent, reducers, metaData);
+        super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
         if (objectMapper == null) {
             parentFilter = context.searchContext().bitsetFilterCache().getBitDocIdSetFilter(Queries.newNonNestedFilter());
         } else {
@@ -111,13 +111,13 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
 
     @Override
     public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
-        return new InternalReverseNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
+        return new InternalReverseNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
                 metaData());
     }
 
     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalReverseNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
+        return new InternalReverseNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
     }
 
     Filter getParentFilter() {
@@ -135,7 +135,7 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
 
         @Override
         public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
-                List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
             // Early validation
             NestedAggregator closestNestedAggregator = findClosestNestedAggregator(parent);
             if (closestNestedAggregator == null) {
@@ -147,11 +147,11 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
             if (path != null) {
                 MapperService.SmartNameObjectMapper mapper = context.searchContext().smartNameObjectMapper(path);
                 if (mapper == null) {
-                    return new Unmapped(name, context, parent, reducers, metaData);
+                    return new Unmapped(name, context, parent, pipelineAggregators, metaData);
                 }
                 objectMapper = mapper.mapper();
                 if (objectMapper == null) {
-                    return new Unmapped(name, context, parent, reducers, metaData);
+                    return new Unmapped(name, context, parent, pipelineAggregators, metaData);
                 }
                 if (!objectMapper.nested().isNested()) {
                     throw new AggregationExecutionException("[reverse_nested] nested path [" + path + "] is not nested");
@@ -159,19 +159,19 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
             } else {
                 objectMapper = null;
             }
-            return new ReverseNestedAggregator(name, factories, objectMapper, context, parent, reducers, metaData);
+            return new ReverseNestedAggregator(name, factories, objectMapper, context, parent, pipelineAggregators, metaData);
         }
 
         private final static class Unmapped extends NonCollectingAggregator {
 
-            public Unmapped(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
+            public Unmapped(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
                     throws IOException {
-                super(name, context, parent, reducers, metaData);
+                super(name, context, parent, pipelineAggregators, metaData);
             }
 
             @Override
             public InternalAggregation buildEmptyAggregation() {
-                return new InternalReverseNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
+                return new InternalReverseNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
             }
         }
     }


@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -231,9 +231,9 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
             return TYPE.name();
         }
 
-        public R create(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
+        public R create(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) {
-            return (R) new InternalRange<>(name, ranges, formatter, keyed, reducers, metaData);
+            return (R) new InternalRange<>(name, ranges, formatter, keyed, pipelineAggregators, metaData);
         }
 
         public B createBucket(String key, double from, double to, long docCount, InternalAggregations aggregations, boolean keyed,
@@ -242,7 +242,7 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
         }
 
         public R create(List<B> ranges, R prototype) {
-            return (R) new InternalRange<>(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.reducers(),
+            return (R) new InternalRange<>(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.pipelineAggregators(),
                     prototype.metaData);
         }
@@ -260,9 +260,9 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
 
     public InternalRange() {} // for serialization
 
-    public InternalRange(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
+    public InternalRange(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.ranges = ranges;
         this.formatter = formatter;
         this.keyed = keyed;
@@ -311,7 +311,7 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
         for (int i = 0; i < this.ranges.size(); ++i) {
             ranges.add((B) rangeList[i].get(0).reduce(rangeList[i], reduceContext));
         }
-        return getFactory().create(name, ranges, formatter, keyed, reducers(), getMetaData());
+        return getFactory().create(name, ranges, formatter, keyed, pipelineAggregators(), getMetaData());
     }
 
     @Override


@@ -33,7 +33,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.NonCollectingAggregator;
 import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -105,10 +105,10 @@ public class RangeAggregator extends BucketsAggregator {
             List<Range> ranges,
             boolean keyed,
             AggregationContext aggregationContext,
-            Aggregator parent, List<Reducer> reducers,
+            Aggregator parent, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, factories, aggregationContext, parent, reducers, metaData);
+        super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
         assert valuesSource != null;
         this.valuesSource = valuesSource;
         this.formatter = format != null ? format.formatter() : null;
@@ -216,7 +216,7 @@ public class RangeAggregator extends BucketsAggregator {
             buckets.add(bucket);
         }
         // value source can be null in the case of unmapped fields
-        return rangeFactory.create(name, buckets, formatter, keyed, reducers(), metaData());
+        return rangeFactory.create(name, buckets, formatter, keyed, pipelineAggregators(), metaData());
     }
 
     @Override
@@ -230,7 +230,7 @@ public class RangeAggregator extends BucketsAggregator {
             buckets.add(bucket);
         }
         // value source can be null in the case of unmapped fields
-        return rangeFactory.create(name, buckets, formatter, keyed, reducers(), metaData());
+        return rangeFactory.create(name, buckets, formatter, keyed, pipelineAggregators(), metaData());
     }
 
     private static final void sortRanges(final Range[] ranges) {
@@ -267,10 +267,10 @@ public class RangeAggregator extends BucketsAggregator {
                 ValueFormat format,
                 AggregationContext context,
                 Aggregator parent,
-                InternalRange.Factory factory, List<Reducer> reducers,
+                InternalRange.Factory factory, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            super(name, context, parent, reducers, metaData);
+            super(name, context, parent, pipelineAggregators, metaData);
             this.ranges = ranges;
             ValueParser parser = format != null ? format.parser() : ValueParser.RAW;
             for (Range range : this.ranges) {
@@ -288,7 +288,7 @@ public class RangeAggregator extends BucketsAggregator {
             for (RangeAggregator.Range range : ranges) {
                 buckets.add(factory.createBucket(range.key, range.from, range.to, 0, subAggs, keyed, formatter));
             }
-            return factory.create(name, buckets, formatter, keyed, reducers(), metaData());
+            return factory.create(name, buckets, formatter, keyed, pipelineAggregators(), metaData());
         }
     }
@@ -306,15 +306,15 @@ public class RangeAggregator extends BucketsAggregator {
         }
 
         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            return new Unmapped(name, ranges, keyed, config.format(), aggregationContext, parent, rangeFactory, reducers, metaData);
+            return new Unmapped(name, ranges, keyed, config.format(), aggregationContext, parent, rangeFactory, pipelineAggregators, metaData);
         }
 
         @Override
         protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
-                boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new RangeAggregator(name, factories, valuesSource, config.format(), rangeFactory, ranges, keyed, aggregationContext, parent, reducers, metaData);
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+            return new RangeAggregator(name, factories, valuesSource, config.format(), rangeFactory, ranges, keyed, aggregationContext, parent, pipelineAggregators, metaData);
         }
     }


@@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
 import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.joda.time.DateTime;
 import org.joda.time.DateTimeZone;
@@ -122,13 +122,13 @@ public class InternalDateRange extends InternalRange<InternalDateRange.Bucket, I
 
         @Override
         public InternalDateRange create(String name, List<InternalDateRange.Bucket> ranges, ValueFormatter formatter, boolean keyed,
-                List<Reducer> reducers, Map<String, Object> metaData) {
-            return new InternalDateRange(name, ranges, formatter, keyed, reducers, metaData);
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+            return new InternalDateRange(name, ranges, formatter, keyed, pipelineAggregators, metaData);
         }
 
         @Override
         public InternalDateRange create(List<Bucket> ranges, InternalDateRange prototype) {
-            return new InternalDateRange(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.reducers(),
+            return new InternalDateRange(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.pipelineAggregators(),
                     prototype.metaData);
         }
@@ -147,8 +147,8 @@ public class InternalDateRange extends InternalRange<InternalDateRange.Bucket, I
 
     InternalDateRange() {} // for serialization
 
     InternalDateRange(String name, List<InternalDateRange.Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed,
-            List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, ranges, formatter, keyed, reducers, metaData);
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, ranges, formatter, keyed, pipelineAggregators, metaData);
     }
 
     @Override


@@ -35,7 +35,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactory;
 import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
 import org.elasticsearch.search.aggregations.bucket.range.RangeAggregator;
 import org.elasticsearch.search.aggregations.bucket.range.RangeAggregator.Unmapped;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.GeoPointParser;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -186,18 +186,19 @@ public class GeoDistanceParser implements Aggregator.Parser {
         }
 
         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            return new Unmapped(name, ranges, keyed, null, aggregationContext, parent, rangeFactory, reducers, metaData);
+            return new Unmapped(name, ranges, keyed, null, aggregationContext, parent, rangeFactory, pipelineAggregators, metaData);
         }
 
         @Override
         protected Aggregator doCreateInternal(final ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext,
-                Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData)
+                Aggregator parent, boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators,
+                Map<String, Object> metaData)
                 throws IOException {
             DistanceSource distanceSource = new DistanceSource(valuesSource, distanceType, origin, unit);
             return new RangeAggregator(name, factories, distanceSource, null, rangeFactory, ranges, keyed, aggregationContext, parent,
-                    reducers, metaData);
+                    pipelineAggregators, metaData);
         }
 
     private static class DistanceSource extends ValuesSource.Numeric {


@@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
 import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 
 import java.io.IOException;
@@ -110,13 +110,13 @@ public class InternalGeoDistance extends InternalRange<InternalGeoDistance.Bucke
 
         @Override
         public InternalGeoDistance create(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed,
-                List<Reducer> reducers, Map<String, Object> metaData) {
-            return new InternalGeoDistance(name, ranges, formatter, keyed, reducers, metaData);
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+            return new InternalGeoDistance(name, ranges, formatter, keyed, pipelineAggregators, metaData);
         }
 
         @Override
         public InternalGeoDistance create(List<Bucket> ranges, InternalGeoDistance prototype) {
-            return new InternalGeoDistance(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.reducers(),
+            return new InternalGeoDistance(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.pipelineAggregators(),
                     prototype.metaData);
         }
@@ -134,9 +134,9 @@ public class InternalGeoDistance extends InternalRange<InternalGeoDistance.Bucke
 
     InternalGeoDistance() {} // for serialization
 
-    public InternalGeoDistance(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
+    public InternalGeoDistance(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, ranges, formatter, keyed, reducers, metaData);
+        super(name, ranges, formatter, keyed, pipelineAggregators, metaData);
     }
 
     @Override


@@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
 import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 
 import java.io.IOException;
@@ -119,13 +119,13 @@ public class InternalIPv4Range extends InternalRange<InternalIPv4Range.Bucket, I
 
         @Override
         public InternalIPv4Range create(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed,
-                List<Reducer> reducers, Map<String, Object> metaData) {
-            return new InternalIPv4Range(name, ranges, keyed, reducers, metaData);
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+            return new InternalIPv4Range(name, ranges, keyed, pipelineAggregators, metaData);
         }
 
         @Override
         public InternalIPv4Range create(List<Bucket> ranges, InternalIPv4Range prototype) {
-            return new InternalIPv4Range(prototype.name, ranges, prototype.keyed, prototype.reducers(), prototype.metaData);
+            return new InternalIPv4Range(prototype.name, ranges, prototype.keyed, prototype.pipelineAggregators(), prototype.metaData);
         }
 
         @Override
@@ -142,9 +142,9 @@ public class InternalIPv4Range extends InternalRange<InternalIPv4Range.Bucket, I
 
     public InternalIPv4Range() {} // for serialization
 
-    public InternalIPv4Range(String name, List<InternalIPv4Range.Bucket> ranges, boolean keyed, List<Reducer> reducers,
+    public InternalIPv4Range(String name, List<InternalIPv4Range.Bucket> ranges, boolean keyed, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, ranges, ValueFormatter.IPv4, keyed, reducers, metaData);
+        super(name, ranges, ValueFormatter.IPv4, keyed, pipelineAggregators, metaData);
     }
 
     @Override


@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
 import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -49,10 +49,10 @@ public class DiversifiedBytesHashSamplerAggregator extends SamplerAggregator {
 
     private int maxDocsPerValue;
 
     public DiversifiedBytesHashSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
-            AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
+            AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
             ValuesSource valuesSource,
             int maxDocsPerValue) throws IOException {
-        super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
+        super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.maxDocsPerValue = maxDocsPerValue;
     }


@@ -33,7 +33,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
 import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -48,9 +48,9 @@ public class DiversifiedMapSamplerAggregator extends SamplerAggregator {
     private BytesRefHash bucketOrds;

     public DiversifiedMapSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
-            AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
+            AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
             ValuesSource valuesSource, int maxDocsPerValue) throws IOException {
-        super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
+        super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.maxDocsPerValue = maxDocsPerValue;
         bucketOrds = new BytesRefHash(shardSize, aggregationContext.bigArrays());


@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
 import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -44,9 +44,9 @@ public class DiversifiedNumericSamplerAggregator extends SamplerAggregator {
     private int maxDocsPerValue;

     public DiversifiedNumericSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
-            AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
+            AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
             ValuesSource.Numeric valuesSource, int maxDocsPerValue) throws IOException {
-        super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
+        super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.maxDocsPerValue = maxDocsPerValue;
     }


@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
 import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -45,9 +45,9 @@ public class DiversifiedOrdinalsSamplerAggregator extends SamplerAggregator {
     private int maxDocsPerValue;

     public DiversifiedOrdinalsSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
-            AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
+            AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
             ValuesSource.Bytes.WithOrdinals.FieldData valuesSource, int maxDocsPerValue) throws IOException {
-        super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
+        super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.maxDocsPerValue = maxDocsPerValue;
     }


@@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

 import java.io.IOException;
 import java.util.List;
@@ -51,8 +51,8 @@ public class InternalSampler extends InternalSingleBucketAggregation implements
     InternalSampler() {
     } // for serialization

-    InternalSampler(String name, long docCount, InternalAggregations subAggregations, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, docCount, subAggregations, reducers, metaData);
+    InternalSampler(String name, long docCount, InternalAggregations subAggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, docCount, subAggregations, pipelineAggregators, metaData);
     }

     @Override
@@ -63,6 +63,6 @@ public class InternalSampler extends InternalSingleBucketAggregation implements
     @Override
     protected InternalSingleBucketAggregation newAggregation(String name, long docCount,
             InternalAggregations subAggregations) {
-        return new InternalSampler(name, docCount, subAggregations, reducers(), metaData);
+        return new InternalSampler(name, docCount, subAggregations, pipelineAggregators(), metaData);
     }
 }


@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.NonCollectingAggregator;
 import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
 import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
 import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSource.Numeric;
@@ -60,9 +60,11 @@ public class SamplerAggregator extends SingleBucketAggregator {
         @Override
         Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
-                AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new DiversifiedMapSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData, valuesSource,
+                AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
+                Map<String, Object> metaData) throws IOException {
+            return new DiversifiedMapSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData,
+                    valuesSource,
                     maxDocsPerValue);
         }
@@ -76,9 +78,11 @@ public class SamplerAggregator extends SingleBucketAggregator {
         @Override
         Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
-                AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new DiversifiedBytesHashSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData,
+                AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
+                Map<String, Object> metaData) throws IOException {
+            return new DiversifiedBytesHashSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators,
+                    metaData,
                     valuesSource,
                     maxDocsPerValue);
         }
@@ -93,8 +97,9 @@ public class SamplerAggregator extends SingleBucketAggregator {
         @Override
         Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
-                AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new DiversifiedOrdinalsSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData,
+                AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
+                Map<String, Object> metaData) throws IOException {
+            return new DiversifiedOrdinalsSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData,
                     (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, maxDocsPerValue);
         }
@@ -121,7 +126,7 @@ public class SamplerAggregator extends SingleBucketAggregator {
         }

         abstract Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
-                AggregationContext context, Aggregator parent, List<Reducer> reducers,
+                AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException;

         abstract boolean needsGlobalOrdinals();
@@ -137,8 +142,8 @@ public class SamplerAggregator extends SingleBucketAggregator {
     protected BestDocsDeferringCollector bdd;

     public SamplerAggregator(String name, int shardSize, AggregatorFactories factories, AggregationContext aggregationContext,
-            Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, factories, aggregationContext, parent, reducers, metaData);
+            Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
         this.shardSize = shardSize;
     }
@@ -163,13 +168,13 @@ public class SamplerAggregator extends SingleBucketAggregator {
     @Override
     public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
         runDeferredCollections(owningBucketOrdinal);
-        return new InternalSampler(name, bdd == null ? 0 : bdd.getDocCount(), bucketAggregations(owningBucketOrdinal), reducers(),
+        return new InternalSampler(name, bdd == null ? 0 : bdd.getDocCount(), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
                 metaData());
     }

     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalSampler(name, 0, buildEmptySubAggregations(), reducers(), metaData());
+        return new InternalSampler(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
     }

     public static class Factory extends AggregatorFactory {
@@ -183,12 +188,12 @@ public class SamplerAggregator extends SingleBucketAggregator {
         @Override
         public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
-                List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
             if (collectsFromSingleBucket == false) {
                 return asMultiBucketAggregator(this, context, parent);
             }
-            return new SamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData);
+            return new SamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData);
         }
     }
@@ -208,7 +213,8 @@ public class SamplerAggregator extends SingleBucketAggregator {
         @Override
         protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext context, Aggregator parent,
-                boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
             if (collectsFromSingleBucket == false) {
                 return asMultiBucketAggregator(this, context, parent);
@@ -216,7 +222,7 @@ public class SamplerAggregator extends SingleBucketAggregator {
             if (valuesSource instanceof ValuesSource.Numeric) {
-                return new DiversifiedNumericSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData,
+                return new DiversifiedNumericSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData,
                         (Numeric) valuesSource, maxDocsPerValue);
             }
@@ -234,7 +240,8 @@ public class SamplerAggregator extends SingleBucketAggregator {
                 if ((execution.needsGlobalOrdinals()) && (!(valuesSource instanceof ValuesSource.Bytes.WithOrdinals))) {
                     execution = ExecutionMode.MAP;
                 }
-                return execution.create(name, factories, shardSize, maxDocsPerValue, valuesSource, context, parent, reducers, metaData);
+                return execution.create(name, factories, shardSize, maxDocsPerValue, valuesSource, context, parent, pipelineAggregators,
+                        metaData);
             }

             throw new AggregationExecutionException("Sampler aggregation cannot be applied to field [" + config.fieldContext().field() +
@@ -242,11 +249,12 @@ public class SamplerAggregator extends SingleBucketAggregator {
         }

         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            final UnmappedSampler aggregation = new UnmappedSampler(name, reducers, metaData);
-            return new NonCollectingAggregator(name, aggregationContext, parent, factories, reducers, metaData) {
+            final UnmappedSampler aggregation = new UnmappedSampler(name, pipelineAggregators, metaData);
+            return new NonCollectingAggregator(name, aggregationContext, parent, factories, pipelineAggregators, metaData) {
                 @Override
                 public InternalAggregation buildEmptyAggregation() {
                     return aggregation;


@@ -23,7 +23,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregations;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

 import java.io.IOException;
 import java.util.List;
@@ -53,8 +53,8 @@ public class UnmappedSampler extends InternalSampler {
     UnmappedSampler() {
     }

-    public UnmappedSampler(String name, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, 0, InternalAggregations.EMPTY, reducers, metaData);
+    public UnmappedSampler(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, 0, InternalAggregations.EMPTY, pipelineAggregators, metaData);
     }

     @Override


@@ -29,7 +29,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.bucket.terms.GlobalOrdinalsStringTermsAggregator;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.internal.ContextIndexSearcher;
@@ -51,10 +51,11 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
     public GlobalOrdinalsSignificantTermsAggregator(String name, AggregatorFactories factories,
             ValuesSource.Bytes.WithOrdinals.FieldData valuesSource, BucketCountThresholds bucketCountThresholds,
             IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext, Aggregator parent,
-            SignificantTermsAggregatorFactory termsAggFactory, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            SignificantTermsAggregatorFactory termsAggFactory, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+            throws IOException {
         super(name, factories, valuesSource, null, bucketCountThresholds, includeExclude, aggregationContext, parent,
-                SubAggCollectionMode.DEPTH_FIRST, false, reducers, metaData);
+                SubAggCollectionMode.DEPTH_FIRST, false, pipelineAggregators, metaData);
         this.termsAggFactory = termsAggFactory;
     }
@@ -130,7 +131,7 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
         }
         return new SignificantStringTerms(subsetSize, supersetSize, name, bucketCountThresholds.getRequiredSize(),
-                bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), reducers(),
+                bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), pipelineAggregators(),
                 metaData());
     }
@@ -142,7 +143,7 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
         int supersetSize = topReader.numDocs();
         return new SignificantStringTerms(0, supersetSize, name, bucketCountThresholds.getRequiredSize(),
                 bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(),
-                Collections.<InternalSignificantTerms.Bucket> emptyList(), reducers(), metaData());
+                Collections.<InternalSignificantTerms.Bucket> emptyList(), pipelineAggregators(), metaData());
     }

     @Override
@@ -154,8 +155,12 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
         private final LongHash bucketOrds;

-        public WithHash(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals.FieldData valuesSource, BucketCountThresholds bucketCountThresholds, IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggFactory, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            super(name, factories, valuesSource, bucketCountThresholds, includeExclude, aggregationContext, parent, termsAggFactory, reducers, metaData);
+        public WithHash(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals.FieldData valuesSource,
+                BucketCountThresholds bucketCountThresholds, IncludeExclude.OrdinalsFilter includeExclude,
+                AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggFactory,
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+            super(name, factories, valuesSource, bucketCountThresholds, includeExclude, aggregationContext, parent, termsAggFactory,
+                    pipelineAggregators, metaData);
             bucketOrds = new LongHash(1, aggregationContext.bigArrays());
         }


@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

 import java.util.ArrayList;
 import java.util.Arrays;
@@ -125,9 +125,9 @@ public abstract class InternalSignificantTerms<A extends InternalSignificantTerm
     }

     protected InternalSignificantTerms(long subsetSize, long supersetSize, String name, int requiredSize, long minDocCount,
-            SignificanceHeuristic significanceHeuristic, List<? extends Bucket> buckets, List<Reducer> reducers,
+            SignificanceHeuristic significanceHeuristic, List<? extends Bucket> buckets, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.requiredSize = requiredSize;
         this.minDocCount = minDocCount;
         this.buckets = buckets;


@@ -18,7 +18,6 @@
  */
 package org.elasticsearch.search.aggregations.bucket.significant;

-import org.elasticsearch.Version;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
@@ -29,7 +28,7 @@ import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -169,9 +168,9 @@ public class SignificantLongTerms extends InternalSignificantTerms<SignificantLo
     public SignificantLongTerms(long subsetSize, long supersetSize, String name, @Nullable ValueFormatter formatter, int requiredSize,
             long minDocCount, SignificanceHeuristic significanceHeuristic, List<? extends InternalSignificantTerms.Bucket> buckets,
-            List<Reducer> reducers, Map<String, Object> metaData) {
-        super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, reducers, metaData);
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, pipelineAggregators, metaData);
         this.formatter = formatter;
     }
@@ -183,7 +182,7 @@ public class SignificantLongTerms extends InternalSignificantTerms<SignificantLo
     @Override
     public SignificantLongTerms create(List<SignificantLongTerms.Bucket> buckets) {
         return new SignificantLongTerms(this.subsetSize, this.supersetSize, this.name, this.formatter, this.requiredSize, this.minDocCount,
-                this.significanceHeuristic, buckets, this.reducers(), this.metaData);
+                this.significanceHeuristic, buckets, this.pipelineAggregators(), this.metaData);
     }

     @Override
@@ -197,7 +196,7 @@ public class SignificantLongTerms extends InternalSignificantTerms<SignificantLo
             List<org.elasticsearch.search.aggregations.bucket.significant.InternalSignificantTerms.Bucket> buckets,
             InternalSignificantTerms prototype) {
         return new SignificantLongTerms(subsetSize, supersetSize, prototype.getName(), ((SignificantLongTerms) prototype).formatter,
-                prototype.requiredSize, prototype.minDocCount, prototype.significanceHeuristic, buckets, prototype.reducers(),
+                prototype.requiredSize, prototype.minDocCount, prototype.significanceHeuristic, buckets, prototype.pipelineAggregators(),
                 prototype.getMetaData());
     }

View File

@@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.bucket.terms.LongTermsAggregator;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.format.ValueFormat;
@@ -48,10 +48,10 @@ public class SignificantLongTermsAggregator extends LongTermsAggregator {
     public SignificantLongTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, @Nullable ValueFormat format,
             BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext,
             Aggregator parent, SignificantTermsAggregatorFactory termsAggFactory, IncludeExclude.LongFilter includeExclude,
-            List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
         super(name, factories, valuesSource, format, null, bucketCountThresholds, aggregationContext, parent,
-                SubAggCollectionMode.DEPTH_FIRST, false, includeExclude, reducers, metaData);
+                SubAggCollectionMode.DEPTH_FIRST, false, includeExclude, pipelineAggregators, metaData);
         this.termsAggFactory = termsAggFactory;
     }
@@ -109,7 +109,7 @@ public class SignificantLongTermsAggregator extends LongTermsAggregator {
             list[i] = bucket;
         }
         return new SignificantLongTerms(subsetSize, supersetSize, name, formatter, bucketCountThresholds.getRequiredSize(),
-                bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), reducers(),
+                bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), pipelineAggregators(),
                 metaData());
     }
@@ -121,7 +121,7 @@ public class SignificantLongTermsAggregator extends LongTermsAggregator {
         int supersetSize = topReader.numDocs();
         return new SignificantLongTerms(0, supersetSize, name, formatter, bucketCountThresholds.getRequiredSize(),
                 bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(),
-                Collections.<InternalSignificantTerms.Bucket> emptyList(), reducers(), metaData());
+                Collections.<InternalSignificantTerms.Bucket> emptyList(), pipelineAggregators(), metaData());
     }
     @Override

View File

@@ -19,7 +19,6 @@
 package org.elasticsearch.search.aggregations.bucket.significant;
 import org.apache.lucene.util.BytesRef;
-import org.elasticsearch.Version;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.xcontent.XContentBuilder;
@@ -30,7 +29,7 @@ import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import java.io.IOException;
 import java.util.ArrayList;
@@ -161,9 +160,10 @@ public class SignificantStringTerms extends InternalSignificantTerms<Significant
     SignificantStringTerms() {} // for serialization
     public SignificantStringTerms(long subsetSize, long supersetSize, String name, int requiredSize, long minDocCount,
-            SignificanceHeuristic significanceHeuristic, List<? extends InternalSignificantTerms.Bucket> buckets, List<Reducer> reducers,
+            SignificanceHeuristic significanceHeuristic, List<? extends InternalSignificantTerms.Bucket> buckets,
+            List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, reducers, metaData);
+        super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, pipelineAggregators, metaData);
     }
     @Override
@@ -174,7 +174,7 @@ public class SignificantStringTerms extends InternalSignificantTerms<Significant
     @Override
     public SignificantStringTerms create(List<SignificantStringTerms.Bucket> buckets) {
         return new SignificantStringTerms(this.subsetSize, this.supersetSize, this.name, this.requiredSize, this.minDocCount,
-                this.significanceHeuristic, buckets, this.reducers(), this.metaData);
+                this.significanceHeuristic, buckets, this.pipelineAggregators(), this.metaData);
     }
     @Override
@@ -187,7 +187,7 @@ public class SignificantStringTerms extends InternalSignificantTerms<Significant
     protected SignificantStringTerms create(long subsetSize, long supersetSize, List<InternalSignificantTerms.Bucket> buckets,
             InternalSignificantTerms prototype) {
         return new SignificantStringTerms(subsetSize, supersetSize, prototype.getName(), prototype.requiredSize, prototype.minDocCount,
-                prototype.significanceHeuristic, buckets, prototype.reducers(), prototype.getMetaData());
+                prototype.significanceHeuristic, buckets, prototype.pipelineAggregators(), prototype.getMetaData());
     }
     @Override

View File

@@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.bucket.terms.StringTermsAggregator;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.internal.ContextIndexSearcher;
@@ -50,11 +50,12 @@ public class SignificantStringTermsAggregator extends StringTermsAggregator {
     public SignificantStringTermsAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource,
             BucketCountThresholds bucketCountThresholds,
             IncludeExclude.StringFilter includeExclude, AggregationContext aggregationContext, Aggregator parent,
-            SignificantTermsAggregatorFactory termsAggFactory, List<Reducer> reducers, Map<String, Object> metaData)
+            SignificantTermsAggregatorFactory termsAggFactory, List<PipelineAggregator> pipelineAggregators,
+            Map<String, Object> metaData)
             throws IOException {
         super(name, factories, valuesSource, null, bucketCountThresholds, includeExclude, aggregationContext, parent,
-                SubAggCollectionMode.DEPTH_FIRST, false, reducers, metaData);
+                SubAggCollectionMode.DEPTH_FIRST, false, pipelineAggregators, metaData);
         this.termsAggFactory = termsAggFactory;
     }
@@ -115,7 +116,7 @@ public class SignificantStringTermsAggregator extends StringTermsAggregator {
         }
         return new SignificantStringTerms(subsetSize, supersetSize, name, bucketCountThresholds.getRequiredSize(),
-                bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), reducers(),
+                bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), pipelineAggregators(),
                 metaData());
     }
@@ -127,7 +128,7 @@ public class SignificantStringTermsAggregator extends StringTermsAggregator {
         int supersetSize = topReader.numDocs();
         return new SignificantStringTerms(0, supersetSize, name, bucketCountThresholds.getRequiredSize(),
                 bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(),
-                Collections.<InternalSignificantTerms.Bucket> emptyList(), reducers(), metaData());
+                Collections.<InternalSignificantTerms.Bucket> emptyList(), pipelineAggregators(), metaData());
     }
     @Override

View File

@@ -37,7 +37,7 @@ import org.elasticsearch.search.aggregations.NonCollectingAggregator;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
 import org.elasticsearch.search.aggregations.bucket.terms.TermsAggregator;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -65,10 +65,10 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
         Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
                 TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
                 AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
-                List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
             final IncludeExclude.StringFilter filter = includeExclude == null ? null : includeExclude.convertToStringFilter();
             return new SignificantStringTermsAggregator(name, factories, valuesSource, bucketCountThresholds, filter,
-                    aggregationContext, parent, termsAggregatorFactory, reducers, metaData);
+                    aggregationContext, parent, termsAggregatorFactory, pipelineAggregators, metaData);
         }
     },
@@ -78,11 +78,13 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
         Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
                 TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
                 AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
-                List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
             ValuesSource.Bytes.WithOrdinals valueSourceWithOrdinals = (ValuesSource.Bytes.WithOrdinals) valuesSource;
             IndexSearcher indexSearcher = aggregationContext.searchContext().searcher();
             final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
-            return new GlobalOrdinalsSignificantTermsAggregator(name, factories, (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, bucketCountThresholds, filter, aggregationContext, parent, termsAggregatorFactory, reducers, metaData);
+            return new GlobalOrdinalsSignificantTermsAggregator(name, factories,
+                    (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, bucketCountThresholds, filter, aggregationContext,
+                    parent, termsAggregatorFactory, pipelineAggregators, metaData);
         }
     },
@@ -92,11 +94,12 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
         Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
                 TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
                 AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
-                List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
             final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
             return new GlobalOrdinalsSignificantTermsAggregator.WithHash(name, factories,
                     (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, bucketCountThresholds, filter,
-                    aggregationContext, parent, termsAggregatorFactory, reducers, metaData);
+                    aggregationContext,
+                    parent, termsAggregatorFactory, pipelineAggregators, metaData);
         }
     };
@@ -118,7 +121,7 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
     abstract Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
             TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
             AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
-            List<Reducer> reducers, Map<String, Object> metaData) throws IOException;
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException;
     @Override
     public String toString() {
@@ -155,11 +158,12 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
     }
     @Override
-    protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
+    protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+            List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
         final InternalAggregation aggregation = new UnmappedSignificantTerms(name, bucketCountThresholds.getRequiredSize(),
-                bucketCountThresholds.getMinDocCount(), reducers, metaData);
-        return new NonCollectingAggregator(name, aggregationContext, parent, reducers, metaData) {
+                bucketCountThresholds.getMinDocCount(), pipelineAggregators, metaData);
+        return new NonCollectingAggregator(name, aggregationContext, parent, pipelineAggregators, metaData) {
             @Override
             public InternalAggregation buildEmptyAggregation() {
                 return aggregation;
@@ -169,7 +173,8 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
     @Override
     protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent,
-            boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+            throws IOException {
         if (collectsFromSingleBucket == false) {
             return asMultiBucketAggregator(this, aggregationContext, parent);
         }
@@ -193,7 +198,7 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
         }
         assert execution != null;
         return execution.create(name, factories, valuesSource, bucketCountThresholds, includeExclude, aggregationContext, parent, this,
-                reducers, metaData);
+                pipelineAggregators, metaData);
     }
@@ -212,7 +217,7 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
             longFilter = includeExclude.convertToLongFilter();
         }
         return new SignificantLongTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(),
-                bucketCountThresholds, aggregationContext, parent, this, longFilter, reducers, metaData);
+                bucketCountThresholds, aggregationContext, parent, this, longFilter, pipelineAggregators, metaData);
     }
     throw new AggregationExecutionException("sigfnificant_terms aggregation cannot be applied to field [" + config.fieldContext().field() +

View File

@@ -25,7 +25,7 @@ import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.significant.heuristics.JLHScore;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import java.io.IOException;
 import java.util.Collections;
@@ -57,10 +57,10 @@ public class UnmappedSignificantTerms extends InternalSignificantTerms<UnmappedS
     UnmappedSignificantTerms() {} // for serialization
-    public UnmappedSignificantTerms(String name, int requiredSize, long minDocCount, List<Reducer> reducers, Map<String, Object> metaData) {
+    public UnmappedSignificantTerms(String name, int requiredSize, long minDocCount, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
         //We pass zero for index/subset sizes because for the purpose of significant term analysis
         // we assume an unmapped index's size is irrelevant to the proceedings.
-        super(0, 0, name, requiredSize, minDocCount, JLHScore.INSTANCE, BUCKETS, reducers, metaData);
+        super(0, 0, name, requiredSize, minDocCount, JLHScore.INSTANCE, BUCKETS, pipelineAggregators, metaData);
     }
     @Override
@@ -70,7 +70,7 @@ public class UnmappedSignificantTerms extends InternalSignificantTerms<UnmappedS
     @Override
     public UnmappedSignificantTerms create(List<InternalSignificantTerms.Bucket> buckets) {
-        return new UnmappedSignificantTerms(this.name, this.requiredSize, this.minDocCount, this.reducers(), this.metaData);
+        return new UnmappedSignificantTerms(this.name, this.requiredSize, this.minDocCount, this.pipelineAggregators(), this.metaData);
     }
     @Override

View File

@@ -22,7 +22,7 @@ package org.elasticsearch.search.aggregations.bucket.terms;
 import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.InternalAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import java.io.IOException;
@@ -36,8 +36,8 @@ abstract class AbstractStringTermsAggregator extends TermsAggregator {
     public AbstractStringTermsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent,
             Terms.Order order, BucketCountThresholds bucketCountThresholds, SubAggCollectionMode subAggCollectMode,
-            boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, factories, context, parent, bucketCountThresholds, order, subAggCollectMode, reducers, metaData);
+            boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, factories, context, parent, bucketCountThresholds, order, subAggCollectMode, pipelineAggregators, metaData);
         this.showTermDocCountError = showTermDocCountError;
     }
@@ -45,7 +45,7 @@ abstract class AbstractStringTermsAggregator extends TermsAggregator {
     public InternalAggregation buildEmptyAggregation() {
         return new StringTerms(name, order, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
                 bucketCountThresholds.getMinDocCount(), Collections.<InternalTerms.Bucket> emptyList(), showTermDocCountError, 0, 0,
-                reducers(), metaData());
+                pipelineAggregators(), metaData());
     }
 }

View File

@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -162,8 +162,8 @@ public class DoubleTerms extends InternalTerms<DoubleTerms, DoubleTerms.Bucket>
     public DoubleTerms(String name, Terms.Order order, @Nullable ValueFormatter formatter, int requiredSize, int shardSize,
             long minDocCount, List<? extends InternalTerms.Bucket> buckets, boolean showTermDocCountError, long docCountError,
-            long otherDocCount, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, reducers,
+            long otherDocCount, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, pipelineAggregators,
                 metaData);
         this.formatter = formatter;
     }
@@ -176,7 +176,7 @@ public class DoubleTerms extends InternalTerms<DoubleTerms, DoubleTerms.Bucket>
     @Override
     public DoubleTerms create(List<Bucket> buckets) {
         return new DoubleTerms(this.name, this.order, this.formatter, this.requiredSize, this.shardSize, this.minDocCount, buckets,
-                this.showTermDocCountError, this.docCountError, this.otherDocCount, this.reducers(), this.metaData);
+                this.showTermDocCountError, this.docCountError, this.otherDocCount, this.pipelineAggregators(), this.metaData);
     }
     @Override
@@ -189,7 +189,7 @@ public class DoubleTerms extends InternalTerms<DoubleTerms, DoubleTerms.Bucket>
     protected DoubleTerms create(String name, List<org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket> buckets,
             long docCountError, long otherDocCount, InternalTerms prototype) {
         return new DoubleTerms(name, prototype.order, ((DoubleTerms) prototype).formatter, prototype.requiredSize, prototype.shardSize,
-                prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.reducers(),
+                prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.pipelineAggregators(),
                 prototype.getMetaData());
     }

View File

@@ -26,7 +26,7 @@ import org.elasticsearch.index.fielddata.FieldData;
 import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSource.Numeric;
@@ -45,9 +45,9 @@ public class DoubleTermsAggregator extends LongTermsAggregator {
     public DoubleTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, @Nullable ValueFormat format,
             Terms.Order order, BucketCountThresholds bucketCountThresholds,
             AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError,
-            IncludeExclude.LongFilter longFilter, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            IncludeExclude.LongFilter longFilter, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
         super(name, factories, valuesSource, format, order, bucketCountThresholds, aggregationContext, parent, collectionMode,
-                showTermDocCountError, longFilter, reducers, metaData);
+                showTermDocCountError, longFilter, pipelineAggregators, metaData);
     }
     @Override
@@ -79,7 +79,7 @@ public class DoubleTermsAggregator extends LongTermsAggregator {
             buckets[i] = convertToDouble(buckets[i]);
         }
         return new DoubleTerms(terms.getName(), terms.order, terms.formatter, terms.requiredSize, terms.shardSize, terms.minDocCount,
-                Arrays.asList(buckets), terms.showTermDocCountError, terms.docCountError, terms.otherDocCount, terms.reducers(),
+                Arrays.asList(buckets), terms.showTermDocCountError, terms.docCountError, terms.otherDocCount, terms.pipelineAggregators(),
                 terms.getMetaData());
     }


@@ -44,7 +44,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket;
 import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -73,8 +73,11 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
     public GlobalOrdinalsStringTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals valuesSource,
             Terms.Order order, BucketCountThresholds bucketCountThresholds,
-            IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError, reducers,
+            IncludeExclude.OrdinalsFilter includeExclude,
+            AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError,
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError,
+                pipelineAggregators,
                 metaData);
         this.valuesSource = valuesSource;
         this.includeExclude = includeExclude;
@@ -200,7 +203,7 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
         }
         return new StringTerms(name, order, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
-                bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, reducers(),
+                bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, pipelineAggregators(),
                 metaData());
     }
@@ -266,8 +269,11 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
         public WithHash(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals.FieldData valuesSource,
                 Terms.Order order, BucketCountThresholds bucketCountThresholds, IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext,
-                Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            super(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, collectionMode, showTermDocCountError, reducers, metaData);
+                Aggregator parent, SubAggCollectionMode collectionMode,
+                boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            super(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, collectionMode,
+                    showTermDocCountError, pipelineAggregators, metaData);
             bucketOrds = new LongHash(1, aggregationContext.bigArrays());
         }
@@ -335,8 +341,12 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
         private RandomAccessOrds segmentOrds;
         public LowCardinality(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals valuesSource,
-                Terms.Order order, BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            super(name, factories, valuesSource, order, bucketCountThresholds, null, aggregationContext, parent, collectionMode, showTermDocCountError, reducers, metaData);
+                Terms.Order order,
+                BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext, Aggregator parent,
+                SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators,
+                Map<String, Object> metaData) throws IOException {
+            super(name, factories, valuesSource, order, bucketCountThresholds, null, aggregationContext, parent, collectionMode,
+                    showTermDocCountError, pipelineAggregators, metaData);
             assert factories == null || factories.count() == 0;
             this.segmentDocCounts = context.bigArrays().newIntArray(1, true);
         }


@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
 import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import java.util.ArrayList;
@@ -124,9 +124,9 @@ public abstract class InternalTerms<A extends InternalTerms, B extends InternalT
     protected InternalTerms() {} // for serialization
     protected InternalTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount,
-            List<? extends Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount, List<Reducer> reducers,
+            List<? extends Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.order = order;
         this.requiredSize = requiredSize;
         this.shardSize = shardSize;


@@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -158,8 +158,8 @@ public class LongTerms extends InternalTerms<LongTerms, LongTerms.Bucket> {
     public LongTerms(String name, Terms.Order order, @Nullable ValueFormatter formatter, int requiredSize, int shardSize, long minDocCount,
             List<? extends InternalTerms.Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount,
-            List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, reducers,
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, pipelineAggregators,
                 metaData);
         this.formatter = formatter;
     }
@@ -172,7 +172,7 @@ public class LongTerms extends InternalTerms<LongTerms, LongTerms.Bucket> {
     @Override
     public LongTerms create(List<Bucket> buckets) {
         return new LongTerms(this.name, this.order, this.formatter, this.requiredSize, this.shardSize, this.minDocCount, buckets,
-                this.showTermDocCountError, this.docCountError, this.otherDocCount, this.reducers(), this.metaData);
+                this.showTermDocCountError, this.docCountError, this.otherDocCount, this.pipelineAggregators(), this.metaData);
     }
     @Override
@@ -185,7 +185,7 @@ public class LongTerms extends InternalTerms<LongTerms, LongTerms.Bucket> {
     protected LongTerms create(String name, List<org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket> buckets,
             long docCountError, long otherDocCount, InternalTerms prototype) {
         return new LongTerms(name, prototype.order, ((LongTerms) prototype).formatter, prototype.requiredSize, prototype.shardSize,
-                prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.reducers(),
+                prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.pipelineAggregators(),
                 prototype.getMetaData());
     }


@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude.LongFilter;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.format.ValueFormat;
@@ -57,8 +57,8 @@ public class LongTermsAggregator extends TermsAggregator {
     public LongTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, @Nullable ValueFormat format,
             Terms.Order order, BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext, Aggregator parent,
             SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, IncludeExclude.LongFilter longFilter,
-            List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, factories, aggregationContext, parent, bucketCountThresholds, order, subAggCollectMode, reducers, metaData);
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, factories, aggregationContext, parent, bucketCountThresholds, order, subAggCollectMode, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.showTermDocCountError = showTermDocCountError;
         this.formatter = format != null ? format.formatter() : null;
@@ -162,7 +162,7 @@ public class LongTermsAggregator extends TermsAggregator {
         }
         return new LongTerms(name, order, formatter, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
-                bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, reducers(),
+                bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, pipelineAggregators(),
                 metaData());
     }
@@ -170,7 +170,7 @@ public class LongTermsAggregator extends TermsAggregator {
     public InternalAggregation buildEmptyAggregation() {
         return new LongTerms(name, order, formatter, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
                 bucketCountThresholds.getMinDocCount(), Collections.<InternalTerms.Bucket> emptyList(), showTermDocCountError, 0, 0,
-                reducers(), metaData());
+                pipelineAggregators(), metaData());
     }
     @Override


@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregations;
 import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
 import org.elasticsearch.search.aggregations.bucket.BucketStreams;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import java.io.IOException;
 import java.util.ArrayList;
@@ -153,8 +153,8 @@ public class StringTerms extends InternalTerms<StringTerms, StringTerms.Bucket>
     public StringTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount,
             List<? extends InternalTerms.Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount,
-            List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, reducers,
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, pipelineAggregators,
                 metaData);
     }
@@ -166,7 +166,7 @@ public class StringTerms extends InternalTerms<StringTerms, StringTerms.Bucket>
     @Override
     public StringTerms create(List<Bucket> buckets) {
         return new StringTerms(this.name, this.order, this.requiredSize, this.shardSize, this.minDocCount, buckets,
-                this.showTermDocCountError, this.docCountError, this.otherDocCount, this.reducers(), this.metaData);
+                this.showTermDocCountError, this.docCountError, this.otherDocCount, this.pipelineAggregators(), this.metaData);
     }
     @Override
@@ -178,7 +178,7 @@ public class StringTerms extends InternalTerms<StringTerms, StringTerms.Bucket>
     protected StringTerms create(String name, List<org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket> buckets,
             long docCountError, long otherDocCount, InternalTerms prototype) {
         return new StringTerms(name, prototype.order, prototype.requiredSize, prototype.shardSize, prototype.minDocCount, buckets,
-                prototype.showTermDocCountError, docCountError, otherDocCount, prototype.reducers(), prototype.getMetaData());
+                prototype.showTermDocCountError, docCountError, otherDocCount, prototype.pipelineAggregators(), prototype.getMetaData());
     }
     @Override


@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -52,10 +52,10 @@ public class StringTermsAggregator extends AbstractStringTermsAggregator {
     public StringTermsAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource,
             Terms.Order order, BucketCountThresholds bucketCountThresholds,
             IncludeExclude.StringFilter includeExclude, AggregationContext aggregationContext,
-            Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers,
+            Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError, reducers,
+        super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError, pipelineAggregators,
                 metaData);
         this.valuesSource = valuesSource;
         this.includeExclude = includeExclude;
@@ -164,7 +164,7 @@ public class StringTermsAggregator extends AbstractStringTermsAggregator {
         }
         return new StringTerms(name, order, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
-                bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, reducers(),
+                bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, pipelineAggregators(),
                 metaData());
     }
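The rename above is mechanical, but the idea behind the new name is worth pinning down: a pipeline aggregator consumes the values already produced by another aggregation rather than the underlying documents. The following sketch is illustrative only and is not the Elasticsearch API; `PipelineSketch` and `derivative` are hypothetical names mimicking the simplest pipeline computation, bucket-to-bucket differences over a histogram's counts.

```java
// Illustrative sketch, not the Elasticsearch API: a pipeline step operates on
// the per-bucket output of a prior aggregation, never on raw documents.
import java.util.ArrayList;
import java.util.List;

public class PipelineSketch {
    // bucketCounts: one value per bucket, in bucket order, as emitted by
    // some bucket aggregation (e.g. a date histogram's doc counts).
    static List<Long> derivative(List<Long> bucketCounts) {
        List<Long> diffs = new ArrayList<>();
        for (int i = 1; i < bucketCounts.size(); i++) {
            // Each output depends only on previously aggregated values.
            diffs.add(bucketCounts.get(i) - bucketCounts.get(i - 1));
        }
        return diffs;
    }
}
```

For bucket counts `[10, 14, 9]` this yields the differences `[4, -5]`, which is why every aggregator constructor in the diffs below now threads a `List<PipelineAggregator>` through to its superclass: the pipeline stage needs access to the finished aggregation output.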


@@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
 import org.elasticsearch.search.aggregations.bucket.terms.InternalOrder.Aggregation;
 import org.elasticsearch.search.aggregations.bucket.terms.InternalOrder.CompoundOrder;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.AggregationPath;
@@ -137,8 +137,8 @@ public abstract class TermsAggregator extends BucketsAggregator {
     protected final Set<Aggregator> aggsUsedForSorting = new HashSet<>();
     protected final SubAggCollectionMode collectMode;
-    public TermsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent, BucketCountThresholds bucketCountThresholds, Terms.Order order, SubAggCollectionMode collectMode, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, factories, context, parent, reducers, metaData);
+    public TermsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent, BucketCountThresholds bucketCountThresholds, Terms.Order order, SubAggCollectionMode collectMode, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, factories, context, parent, pipelineAggregators, metaData);
         this.bucketCountThresholds = bucketCountThresholds;
         this.order = InternalOrder.validate(order, this);
         this.collectMode = collectMode;


@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactories;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.NonCollectingAggregator;
 import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -40,20 +40,21 @@ import java.util.Map;
 /**
  *
  */
 public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<ValuesSource> {
     public enum ExecutionMode {
         MAP(new ParseField("map")) {
             @Override
-            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
-                    Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
+            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
+                    TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
                     AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
-                    boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+                    boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                    throws IOException {
                 final IncludeExclude.StringFilter filter = includeExclude == null ? null : includeExclude.convertToStringFilter();
-                return new StringTermsAggregator(name, factories, valuesSource, order, bucketCountThresholds, filter,
-                        aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
+                return new StringTermsAggregator(name, factories, valuesSource, order, bucketCountThresholds, filter, aggregationContext,
+                        parent, subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
             }
             @Override
@@ -65,11 +66,15 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
         GLOBAL_ORDINALS(new ParseField("global_ordinals")) {
             @Override
-            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
-                    Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
-                    AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
+                    TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
+                    AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
+                    boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                    throws IOException {
                 final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
-                return new GlobalOrdinalsStringTermsAggregator(name, factories, (ValuesSource.Bytes.WithOrdinals) valuesSource, order, bucketCountThresholds, filter, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
+                return new GlobalOrdinalsStringTermsAggregator(name, factories, (ValuesSource.Bytes.WithOrdinals) valuesSource, order,
+                        bucketCountThresholds, filter, aggregationContext, parent, subAggCollectMode, showTermDocCountError,
+                        pipelineAggregators, metaData);
             }
             @Override
@@ -81,11 +86,15 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
         GLOBAL_ORDINALS_HASH(new ParseField("global_ordinals_hash")) {
             @Override
-            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
-                    Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
-                    AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
+                    TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
+                    AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
+                    boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                    throws IOException {
                 final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
-                return new GlobalOrdinalsStringTermsAggregator.WithHash(name, factories, (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, order, bucketCountThresholds, filter, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
+                return new GlobalOrdinalsStringTermsAggregator.WithHash(name, factories,
+                        (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, order, bucketCountThresholds, filter, aggregationContext,
+                        parent, subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
             }
             @Override
@@ -96,14 +105,18 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
         GLOBAL_ORDINALS_LOW_CARDINALITY(new ParseField("global_ordinals_low_cardinality")) {
             @Override
-            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
-                    Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
+            Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
+                    TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode, AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException { boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (includeExclude != null || factories.count() > 0) { if (includeExclude != null || factories.count() > 0) {
return GLOBAL_ORDINALS.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData); return GLOBAL_ORDINALS.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude,
aggregationContext, parent, subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
} }
return new GlobalOrdinalsStringTermsAggregator.LowCardinality(name, factories, (ValuesSource.Bytes.WithOrdinals) valuesSource, order, bucketCountThresholds, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData); return new GlobalOrdinalsStringTermsAggregator.LowCardinality(name, factories,
(ValuesSource.Bytes.WithOrdinals) valuesSource, order, bucketCountThresholds, aggregationContext, parent,
subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
} }
@Override @Override
@ -127,10 +140,11 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
this.parseField = parseField; this.parseField = parseField;
} }
abstract Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, abstract Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
IncludeExclude includeExclude, AggregationContext aggregationContext, Aggregator parent, AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException; boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException;
abstract boolean needsGlobalOrdinals(); abstract boolean needsGlobalOrdinals();
@ -147,7 +161,9 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
private final TermsAggregator.BucketCountThresholds bucketCountThresholds; private final TermsAggregator.BucketCountThresholds bucketCountThresholds;
private final boolean showTermDocCountError; private final boolean showTermDocCountError;
public TermsAggregatorFactory(String name, ValuesSourceConfig config, Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude, String executionHint, SubAggCollectionMode executionMode, boolean showTermDocCountError) { public TermsAggregatorFactory(String name, ValuesSourceConfig config, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude, String executionHint,
SubAggCollectionMode executionMode, boolean showTermDocCountError) {
super(name, StringTerms.TYPE.name(), config); super(name, StringTerms.TYPE.name(), config);
this.order = order; this.order = order;
this.includeExclude = includeExclude; this.includeExclude = includeExclude;
@ -158,15 +174,16 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
} }
@Override @Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
Map<String, Object> metaData) throws IOException { List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
final InternalAggregation aggregation = new UnmappedTerms(name, order, bucketCountThresholds.getRequiredSize(), final InternalAggregation aggregation = new UnmappedTerms(name, order, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getShardSize(), bucketCountThresholds.getMinDocCount(), reducers, metaData); bucketCountThresholds.getShardSize(), bucketCountThresholds.getMinDocCount(), pipelineAggregators, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, factories, reducers, metaData) { return new NonCollectingAggregator(name, aggregationContext, parent, factories, pipelineAggregators, metaData) {
{ {
// even in the case of an unmapped aggregator, validate the order // even in the case of an unmapped aggregator, validate the order
InternalOrder.validate(order, this); InternalOrder.validate(order, this);
} }
@Override @Override
public InternalAggregation buildEmptyAggregation() { public InternalAggregation buildEmptyAggregation() {
return aggregation; return aggregation;
@ -176,7 +193,8 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
@Override @Override
protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent, protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException { boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (collectsFromSingleBucket == false) { if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent); return asMultiBucketAggregator(this, aggregationContext, parent);
} }
@ -226,12 +244,16 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
} }
assert execution != null; assert execution != null;
return execution.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, collectMode, showTermDocCountError, reducers, metaData); return execution.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext,
parent, collectMode, showTermDocCountError, pipelineAggregators, metaData);
} }
if ((includeExclude != null) && (includeExclude.isRegexBased())) { if ((includeExclude != null) && (includeExclude.isRegexBased())) {
throw new AggregationExecutionException("Aggregation [" + name + "] cannot support regular expression style include/exclude " + throw new AggregationExecutionException(
"settings as they can only be applied to string fields. Use an array of numeric values for include/exclude clauses used to filter numeric fields"); "Aggregation ["
+ name
+ "] cannot support regular expression style include/exclude "
+ "settings as they can only be applied to string fields. Use an array of numeric values for include/exclude clauses used to filter numeric fields");
} }
if (valuesSource instanceof ValuesSource.Numeric) { if (valuesSource instanceof ValuesSource.Numeric) {
@ -240,19 +262,20 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
if (includeExclude != null) { if (includeExclude != null) {
longFilter = includeExclude.convertToDoubleFilter(); longFilter = includeExclude.convertToDoubleFilter();
} }
return new DoubleTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(), return new DoubleTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(), order,
order, bucketCountThresholds, aggregationContext, parent, collectMode, bucketCountThresholds, aggregationContext, parent, collectMode, showTermDocCountError, longFilter,
showTermDocCountError, longFilter, reducers, metaData); pipelineAggregators, metaData);
} }
if (includeExclude != null) { if (includeExclude != null) {
longFilter = includeExclude.convertToLongFilter(); longFilter = includeExclude.convertToLongFilter();
} }
return new LongTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(), return new LongTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(), order,
order, bucketCountThresholds, aggregationContext, parent, collectMode, showTermDocCountError, longFilter, reducers, metaData); bucketCountThresholds, aggregationContext, parent, collectMode, showTermDocCountError, longFilter, pipelineAggregators,
metaData);
} }
throw new AggregationExecutionException("terms aggregation cannot be applied to field [" + config.fieldContext().field() + throw new AggregationExecutionException("terms aggregation cannot be applied to field [" + config.fieldContext().field()
"]. It can only be applied to numeric or string fields."); + "]. It can only be applied to numeric or string fields.");
} }
} }
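The signature changes above all thread a `List<PipelineAggregator>` (formerly `List<Reducer>`) into every aggregator the factory builds, because pipeline aggregators operate on the output of other aggregations rather than on documents. As a rough standalone illustration of that idea (hypothetical code, not the Elasticsearch API), a derivative-style pipeline step consumes the per-bucket metric values an earlier aggregation produced:

```java
import java.util.Arrays;

// Hypothetical sketch (not the Elasticsearch API): a "pipeline" step that
// consumes the output of a previous aggregation (per-bucket values) rather
// than raw documents, mirroring the Reducer -> PipelineAggregator rename.
public class DerivativeSketch {

    // Given per-bucket metric values, compute bucket-to-bucket differences,
    // the way a derivative pipeline aggregation post-processes a histogram.
    static double[] derivative(double[] bucketValues) {
        if (bucketValues.length == 0) {
            return new double[0];
        }
        double[] out = new double[bucketValues.length - 1];
        for (int i = 1; i < bucketValues.length; i++) {
            out[i - 1] = bucketValues[i] - bucketValues[i - 1];
        }
        return out;
    }

    public static void main(String[] args) {
        // e.g. a per-month sum produced by a date_histogram + sum aggregation
        double[] monthlySales = {100.0, 130.0, 125.0, 160.0};
        System.out.println(Arrays.toString(derivative(monthlySales)));
        // [30.0, -5.0, 35.0]
    }
}
```

A real `PipelineAggregator` receives the reduced aggregation tree on the coordinating node; the sketch only mirrors the shape of the computation.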
@@ -24,7 +24,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregations;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

 import java.io.IOException;
 import java.util.Collections;
@@ -56,9 +56,9 @@ public class UnmappedTerms extends InternalTerms<UnmappedTerms, InternalTerms.Bu
     UnmappedTerms() {} // for serialization

-    public UnmappedTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount, List<Reducer> reducers,
+    public UnmappedTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, order, requiredSize, shardSize, minDocCount, BUCKETS, false, 0, 0, reducers, metaData);
+        super(name, order, requiredSize, shardSize, minDocCount, BUCKETS, false, 0, 0, pipelineAggregators, metaData);
     }

     @Override
@@ -68,7 +68,7 @@ public class UnmappedTerms extends InternalTerms<UnmappedTerms, InternalTerms.Bu
     @Override
     public UnmappedTerms create(List<InternalTerms.Bucket> buckets) {
-        return new UnmappedTerms(this.name, this.order, this.requiredSize, this.shardSize, this.minDocCount, this.reducers(), this.metaData);
+        return new UnmappedTerms(this.name, this.order, this.requiredSize, this.shardSize, this.minDocCount, this.pipelineAggregators(), this.metaData);
     }

     @Override
@@ -20,7 +20,7 @@
 package org.elasticsearch.search.aggregations.metrics;

 import org.elasticsearch.search.aggregations.InternalAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

 import java.util.List;
 import java.util.Map;
@@ -29,7 +29,7 @@ public abstract class InternalMetricsAggregation extends InternalAggregation {

     protected InternalMetricsAggregation() {} // for serialization

-    protected InternalMetricsAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+    protected InternalMetricsAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, pipelineAggregators, metaData);
     }
 }
@@ -18,7 +18,7 @@
  */
 package org.elasticsearch.search.aggregations.metrics;

-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;

 import java.util.List;
@@ -35,8 +35,8 @@ public abstract class InternalNumericMetricsAggregation extends InternalMetricsA

         protected SingleValue() {}

-        protected SingleValue(String name, List<Reducer> reducers, Map<String, Object> metaData) {
-            super(name, reducers, metaData);
+        protected SingleValue(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+            super(name, pipelineAggregators, metaData);
         }

         @Override
@@ -65,8 +65,8 @@ public abstract class InternalNumericMetricsAggregation extends InternalMetricsA

         protected MultiValue() {}

-        protected MultiValue(String name, List<Reducer> reducers, Map<String, Object> metaData) {
-            super(name, reducers, metaData);
+        protected MultiValue(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+            super(name, pipelineAggregators, metaData);
         }

         public abstract double value(String name);
@@ -93,8 +93,8 @@ public abstract class InternalNumericMetricsAggregation extends InternalMetricsA

     private InternalNumericMetricsAggregation() {} // for serialization

-    private InternalNumericMetricsAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+    private InternalNumericMetricsAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, pipelineAggregators, metaData);
     }
 }
@@ -22,7 +22,7 @@ package org.elasticsearch.search.aggregations.metrics;
 import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.AggregatorBase;
 import org.elasticsearch.search.aggregations.AggregatorFactories;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;

 import java.io.IOException;
@@ -31,8 +31,8 @@ import java.util.Map;

 public abstract class MetricsAggregator extends AggregatorBase {

-    protected MetricsAggregator(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
+    protected MetricsAggregator(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, AggregatorFactories.EMPTY, context, parent, reducers, metaData);
+        super(name, AggregatorFactories.EMPTY, context, parent, pipelineAggregators, metaData);
     }
 }
@@ -19,7 +19,7 @@
 package org.elasticsearch.search.aggregations.metrics;

 import org.elasticsearch.search.aggregations.Aggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;

 import java.io.IOException;
@@ -31,16 +31,16 @@ import java.util.Map;
  */
 public abstract class NumericMetricsAggregator extends MetricsAggregator {

-    private NumericMetricsAggregator(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
-            Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+    private NumericMetricsAggregator(String name, AggregationContext context, Aggregator parent,
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, context, parent, pipelineAggregators, metaData);
     }

     public static abstract class SingleValue extends NumericMetricsAggregator {

-        protected SingleValue(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
+        protected SingleValue(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            super(name, context, parent, reducers, metaData);
+            super(name, context, parent, pipelineAggregators, metaData);
         }

         public abstract double metric(long owningBucketOrd);
@@ -48,9 +48,9 @@ public abstract class NumericMetricsAggregator extends MetricsAggregator {
     public static abstract class MultiValue extends NumericMetricsAggregator {

-        protected MultiValue(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
+        protected MultiValue(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            super(name, context, parent, reducers, metaData);
+            super(name, context, parent, pipelineAggregators, metaData);
         }

         public abstract boolean hasMetric(String name);
@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -53,8 +53,9 @@ public class AvgAggregator extends NumericMetricsAggregator.SingleValue {
     ValueFormatter formatter;

     public AvgAggregator(String name, ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter,
-            AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+            AggregationContext context,
+            Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, context, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.formatter = formatter;
         if (valuesSource != null) {
@@ -105,12 +106,12 @@ public class AvgAggregator extends NumericMetricsAggregator.SingleValue {
         if (valuesSource == null || bucket >= sums.size()) {
             return buildEmptyAggregation();
         }
-        return new InternalAvg(name, sums.get(bucket), counts.get(bucket), formatter, reducers(), metaData());
+        return new InternalAvg(name, sums.get(bucket), counts.get(bucket), formatter, pipelineAggregators(), metaData());
     }

     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalAvg(name, 0.0, 0l, formatter, reducers(), metaData());
+        return new InternalAvg(name, 0.0, 0l, formatter, pipelineAggregators(), metaData());
     }

     public static class Factory extends ValuesSourceAggregatorFactory.LeafOnly<ValuesSource.Numeric> {
@@ -120,15 +121,17 @@ public class AvgAggregator extends NumericMetricsAggregator.SingleValue {
         }

         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            return new AvgAggregator(name, null, config.formatter(), aggregationContext, parent, reducers, metaData);
+            return new AvgAggregator(name, null, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }

         @Override
         protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
-                boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new AvgAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, reducers, metaData);
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            return new AvgAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }
     }
@@ -25,7 +25,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalNumericMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -58,9 +58,9 @@ public class InternalAvg extends InternalNumericMetricsAggregation.SingleValue i
     InternalAvg() {} // for serialization

-    public InternalAvg(String name, double sum, long count, @Nullable ValueFormatter formatter, List<Reducer> reducers,
+    public InternalAvg(String name, double sum, long count, @Nullable ValueFormatter formatter, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.sum = sum;
         this.count = count;
         this.valueFormatter = formatter;
@@ -89,7 +89,7 @@ public class InternalAvg extends InternalNumericMetricsAggregation.SingleValue i
             count += ((InternalAvg) aggregation).count;
             sum += ((InternalAvg) aggregation).sum;
         }
-        return new InternalAvg(getName(), sum, count, valueFormatter, reducers(), getMetaData());
+        return new InternalAvg(getName(), sum, count, valueFormatter, pipelineAggregators(), getMetaData());
     }

     @Override
@@ -42,7 +42,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
@@ -68,8 +68,8 @@ public class CardinalityAggregator extends NumericMetricsAggregator.SingleValue
     private ValueFormatter formatter;

     public CardinalityAggregator(String name, ValuesSource valuesSource, boolean rehash, int precision, @Nullable ValueFormatter formatter,
-            AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+            AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, context, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.rehash = rehash;
         this.precision = precision;
@@ -158,12 +158,12 @@ public class CardinalityAggregator extends NumericMetricsAggregator.SingleValue
         // this Aggregator (and its HLL++ counters) is released.
         HyperLogLogPlusPlus copy = new HyperLogLogPlusPlus(precision, BigArrays.NON_RECYCLING_INSTANCE, 1);
         copy.merge(0, counts, owningBucketOrdinal);
-        return new InternalCardinality(name, copy, formatter, reducers(), metaData());
+        return new InternalCardinality(name, copy, formatter, pipelineAggregators(), metaData());
     }

     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalCardinality(name, null, formatter, reducers(), metaData());
+        return new InternalCardinality(name, null, formatter, pipelineAggregators(), metaData());
     }

     @Override
@@ -22,7 +22,7 @@ package org.elasticsearch.search.aggregations.metrics.cardinality;
 import org.elasticsearch.search.aggregations.AggregationExecutionException;
 import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -48,18 +48,18 @@ final class CardinalityAggregatorFactory extends ValuesSourceAggregatorFactory<V
     }

     @Override
-    protected Aggregator createUnmapped(AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
+    protected Aggregator createUnmapped(AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
             throws IOException {
-        return new CardinalityAggregator(name, null, true, precision(parent), config.formatter(), context, parent, reducers, metaData);
+        return new CardinalityAggregator(name, null, true, precision(parent), config.formatter(), context, parent, pipelineAggregators, metaData);
     }

     @Override
     protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext context, Aggregator parent,
-            boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
         if (!(valuesSource instanceof ValuesSource.Numeric) && !rehash) {
             throw new AggregationExecutionException("Turning off rehashing for cardinality aggregation [" + name + "] on non-numeric values in not allowed");
         }
-        return new CardinalityAggregator(name, valuesSource, rehash, precision(parent), config.formatter(), context, parent, reducers,
+        return new CardinalityAggregator(name, valuesSource, rehash, precision(parent), config.formatter(), context, parent, pipelineAggregators,
                 metaData);
     }

View File

@@ -27,7 +27,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalNumericMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -54,9 +54,9 @@ public final class InternalCardinality extends InternalNumericMetricsAggregation
     private HyperLogLogPlusPlus counts;

-    InternalCardinality(String name, HyperLogLogPlusPlus counts, @Nullable ValueFormatter formatter, List<Reducer> reducers,
+    InternalCardinality(String name, HyperLogLogPlusPlus counts, @Nullable ValueFormatter formatter, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.counts = counts;
         this.valueFormatter = formatter;
     }
@@ -108,7 +108,7 @@ public final class InternalCardinality extends InternalNumericMetricsAggregation
             if (cardinality.counts != null) {
                 if (reduced == null) {
                     reduced = new InternalCardinality(name, new HyperLogLogPlusPlus(cardinality.counts.precision(),
-                            BigArrays.NON_RECYCLING_INSTANCE, 1), this.valueFormatter, reducers(), getMetaData());
+                            BigArrays.NON_RECYCLING_INSTANCE, 1), this.valueFormatter, pipelineAggregators(), getMetaData());
                 }
                 reduced.merge(cardinality);
             }

View File

@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.metrics.MetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -52,9 +52,9 @@ public final class GeoBoundsAggregator extends MetricsAggregator {
     DoubleArray negRights;

     protected GeoBoundsAggregator(String name, AggregationContext aggregationContext, Aggregator parent,
-            ValuesSource.GeoPoint valuesSource, boolean wrapLongitude, List<Reducer> reducers,
+            ValuesSource.GeoPoint valuesSource, boolean wrapLongitude, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, aggregationContext, parent, reducers, metaData);
+        super(name, aggregationContext, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.wrapLongitude = wrapLongitude;
         if (valuesSource != null) {
@@ -152,13 +152,13 @@ public final class GeoBoundsAggregator extends MetricsAggregator {
         double posRight = posRights.get(owningBucketOrdinal);
         double negLeft = negLefts.get(owningBucketOrdinal);
         double negRight = negRights.get(owningBucketOrdinal);
-        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, reducers(), metaData());
+        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, pipelineAggregators(), metaData());
     }

     @Override
     public InternalAggregation buildEmptyAggregation() {
         return new InternalGeoBounds(name, Double.NEGATIVE_INFINITY, Double.POSITIVE_INFINITY, Double.POSITIVE_INFINITY,
-                Double.NEGATIVE_INFINITY, Double.POSITIVE_INFINITY, Double.NEGATIVE_INFINITY, wrapLongitude, reducers(), metaData());
+                Double.NEGATIVE_INFINITY, Double.POSITIVE_INFINITY, Double.NEGATIVE_INFINITY, wrapLongitude, pipelineAggregators(), metaData());
     }

     @Override
@@ -176,15 +176,16 @@ public final class GeoBoundsAggregator extends MetricsAggregator {
         }

         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
-                Map<String, Object> metaData) throws IOException {
-            return new GeoBoundsAggregator(name, aggregationContext, parent, null, wrapLongitude, reducers, metaData);
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+            return new GeoBoundsAggregator(name, aggregationContext, parent, null, wrapLongitude, pipelineAggregators, metaData);
         }

         @Override
-        protected Aggregator doCreateInternal(ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext,
-                Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new GeoBoundsAggregator(name, aggregationContext, parent, valuesSource, wrapLongitude, reducers, metaData);
+        protected Aggregator doCreateInternal(ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext, Aggregator parent,
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            return new GeoBoundsAggregator(name, aggregationContext, parent, valuesSource, wrapLongitude, pipelineAggregators, metaData);
         }
     }

View File

@@ -26,7 +26,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

 import java.io.IOException;
 import java.util.List;
@@ -57,8 +57,8 @@ public class InternalGeoBounds extends InternalMetricsAggregation implements Geo
     InternalGeoBounds(String name, double top, double bottom, double posLeft, double posRight,
             double negLeft, double negRight, boolean wrapLongitude,
-            List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, pipelineAggregators, metaData);
         this.top = top;
         this.bottom = bottom;
         this.posLeft = posLeft;
@@ -104,7 +104,7 @@ public class InternalGeoBounds extends InternalMetricsAggregation implements Geo
                 negRight = bounds.negRight;
             }
         }
-        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, reducers(), getMetaData());
+        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, pipelineAggregators(), getMetaData());
     }

     @Override

View File

@@ -25,7 +25,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalNumericMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -57,8 +57,8 @@ public class InternalMax extends InternalNumericMetricsAggregation.SingleValue i
     InternalMax() {} // for serialization

-    public InternalMax(String name, double max, @Nullable ValueFormatter formatter, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+    public InternalMax(String name, double max, @Nullable ValueFormatter formatter, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, pipelineAggregators, metaData);
         this.valueFormatter = formatter;
         this.max = max;
     }
@@ -84,7 +84,7 @@ public class InternalMax extends InternalNumericMetricsAggregation.SingleValue i
         for (InternalAggregation aggregation : aggregations) {
             max = Math.max(max, ((InternalMax) aggregation).max);
         }
-        return new InternalMax(name, max, valueFormatter, reducers(), getMetaData());
+        return new InternalMax(name, max, valueFormatter, pipelineAggregators(), getMetaData());
     }

     @Override

View File

@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -53,9 +53,10 @@ public class MaxAggregator extends NumericMetricsAggregator.SingleValue {
     DoubleArray maxes;

     public MaxAggregator(String name, ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter,
-            AggregationContext context, Aggregator parent, List<Reducer> reducers,
+            AggregationContext context,
+            Aggregator parent, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+        super(name, context, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.formatter = formatter;
         if (valuesSource != null) {
@@ -106,12 +107,12 @@ public class MaxAggregator extends NumericMetricsAggregator.SingleValue {
         if (valuesSource == null || bucket >= maxes.size()) {
             return buildEmptyAggregation();
         }
-        return new InternalMax(name, maxes.get(bucket), formatter, reducers(), metaData());
+        return new InternalMax(name, maxes.get(bucket), formatter, pipelineAggregators(), metaData());
     }

     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalMax(name, Double.NEGATIVE_INFINITY, formatter, reducers(), metaData());
+        return new InternalMax(name, Double.NEGATIVE_INFINITY, formatter, pipelineAggregators(), metaData());
     }

     public static class Factory extends ValuesSourceAggregatorFactory.LeafOnly<ValuesSource.Numeric> {
@@ -121,15 +122,16 @@ public class MaxAggregator extends NumericMetricsAggregator.SingleValue {
         }

         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
-                Map<String, Object> metaData) throws IOException {
-            return new MaxAggregator(name, null, config.formatter(), aggregationContext, parent, reducers, metaData);
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+            return new MaxAggregator(name, null, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }

         @Override
         protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
-                boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new MaxAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, reducers, metaData);
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            return new MaxAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }
     }
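Every file in this commit applies the same mechanical change: the constructor parameter, the field, and the accessor all move from the `Reducer` name to `PipelineAggregator`, and the list is threaded through `super(...)` unchanged. The sketch below illustrates that shape with hypothetical, heavily simplified stand-ins -- the real classes live in `org.elasticsearch.search.aggregations` and carry far more state; only the rename pattern is taken from the diff.

```java
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the renamed interface (was: Reducer).
interface PipelineAggregator {}

// Hypothetical stand-in for InternalAggregation: holds the list once,
// exposes it via pipelineAggregators() (was: reducers()).
abstract class InternalAggregation {
    private final String name;
    private final List<PipelineAggregator> pipelineAggregators; // was: reducers
    private final Map<String, Object> metaData;

    InternalAggregation(String name, List<PipelineAggregator> pipelineAggregators,
                        Map<String, Object> metaData) {
        this.name = name;
        this.pipelineAggregators = pipelineAggregators;
        this.metaData = metaData;
    }

    String getName() { return name; }
    List<PipelineAggregator> pipelineAggregators() { return pipelineAggregators; }
    Map<String, Object> getMetaData() { return metaData; }
}

// Each concrete aggregation in the diff follows this pattern: accept the
// renamed list in the constructor and pass it straight to super(...).
class InternalMax extends InternalAggregation {
    private final double max;

    InternalMax(String name, double max, List<PipelineAggregator> pipelineAggregators,
                Map<String, Object> metaData) {
        super(name, pipelineAggregators, metaData);
        this.max = max;
    }

    double value() { return max; }
}
```

Subclasses never store the list themselves; the base class owns it, which is why the rename touches only signatures and the `reducers()`/`pipelineAggregators()` call sites rather than any aggregation logic.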

Some files were not shown because too many files have changed in this diff.