Aggregations: Renaming reducers to Pipeline Aggregators
@@ -23,7 +23,7 @@ it is often easier to break them into two main families:
 <<search-aggregations-metrics, _Metric_>>::
 Aggregations that keep track and compute metrics over a set of documents.
 
-<<search-aggregations-reducer, _Reducer_>>::
+<<search-aggregations-pipeline, _Pipeline_>>::
 Aggregations that aggregate the output of other aggregations and their associated metrics
 
 The interesting part comes next. Since each bucket effectively defines a document set (all documents belonging to
@@ -100,6 +100,6 @@ include::aggregations/metrics.asciidoc[]
 include::aggregations/bucket.asciidoc[]
 
-include::aggregations/reducer.asciidoc[]
+include::aggregations/pipeline.asciidoc[]
 
 include::aggregations/misc.asciidoc[]
 
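To make the renamed _Pipeline_ family concrete, a request that combines a metric aggregation with a parent pipeline aggregation might look roughly like the following sketch (the `sales` index and its `date`/`price` fields are invented for illustration):

[source,js]
--------------------------------------------------
{
    "aggs": {
        "sales_per_month": {
            "date_histogram": {
                "field": "date",
                "interval": "month"
            },
            "aggs": {
                "total_sales": {
                    "sum": { "field": "price" } <1>
                },
                "sales_deriv": {
                    "derivative": {
                        "buckets_path": "total_sales" <2>
                    }
                }
            }
        }
    }
}
--------------------------------------------------
<1> A metric aggregation, computed over the documents that fall into each monthly bucket
<2> A parent pipeline aggregation, computed from the output of `total_sales` rather than from documents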
@@ -1,39 +1,39 @@
-[[search-aggregations-reducer]]
+[[search-aggregations-pipeline]]
 
-== Reducer Aggregations
+== Pipeline Aggregations
 
 coming[2.0.0]
 
 experimental[]
 
-Reducer aggregations work on the outputs produced from other aggregations rather than from document sets, adding
-information to the output tree. There are many different types of reducer, each computing different information from
+Pipeline aggregations work on the outputs produced from other aggregations rather than from document sets, adding
+information to the output tree. There are many different types of pipeline aggregation, each computing different information from
 other aggregations, but these types can be broken down into two families:
 
 _Parent_::
-A family of reducer aggregations that is provided with the output of its parent aggregation and is able
+A family of pipeline aggregations that is provided with the output of its parent aggregation and is able
 to compute new buckets or new aggregations to add to existing buckets.
 
 _Sibling_::
-Reducer aggregations that are provided with the output of a sibling aggregation and are able to compute a
+Pipeline aggregations that are provided with the output of a sibling aggregation and are able to compute a
 new aggregation which will be at the same level as the sibling aggregation.
 
-Reducer aggregations can reference the aggregations they need to perform their computation by using the `buckets_paths`
+Pipeline aggregations can reference the aggregations they need to perform their computation by using the `buckets_path`
 parameter to indicate the paths to the required metrics. The syntax for defining these paths can be found in the
 <<bucket-path-syntax, `buckets_path` Syntax>> section below.
 
-Reducer aggregations cannot have sub-aggregations but depending on the type it can reference another reducer in the `buckets_path`
-allowing reducers to be chained. For example, you can chain together two derivatives to calculate the second derivative
+Pipeline aggregations cannot have sub-aggregations, but depending on the type they can reference another pipeline aggregation in the `buckets_path`,
+allowing pipeline aggregations to be chained. For example, you can chain together two derivatives to calculate the second derivative
 (e.g. a derivative of a derivative).
 
-NOTE: Because reducer aggregations only add to the output, when chaining reducer aggregations the output of each reducer will be
-included in the final output.
+NOTE: Because pipeline aggregations only add to the output, when chaining pipeline aggregations the output of each pipeline aggregation
+will be included in the final output.
 
 [[bucket-path-syntax]]
 [float]
 === `buckets_path` Syntax
 
-Most reducers require another aggregation as their input. The input aggregation is defined via the `buckets_path`
+Most pipeline aggregations require another aggregation as their input. The input aggregation is defined via the `buckets_path`
 parameter, which follows a specific format:
 
 --------------------------------------------------
@@ -47,7 +47,7 @@ PATH := <AGG_NAME>[<AGG_SEPARATOR><AGG_NAME>]*[<METRIC_SEPARATOR
 For example, the path `"my_bucket>my_stats.avg"` points to the `avg` value in the `"my_stats"` metric, which is
 contained in the `"my_bucket"` bucket aggregation.
 
-Paths are relative from the position of the reducer; they are not absolute paths, and the path cannot go back "up" the
+Paths are relative from the position of the pipeline aggregation; they are not absolute paths, and the path cannot go back "up" the
 aggregation tree. For example, this moving average is embedded inside a date_histogram and refers to a "sibling"
 metric `"the_sum"`:
 
@@ -73,7 +73,7 @@ metric `"the_sum"`:
 <1> The metric is called `"the_sum"`
 <2> The `buckets_path` refers to the metric via a relative path `"the_sum"`
 
-`buckets_path` is also used for Sibling reducer aggregations, where the aggregation is "next" to a series of buckets
+`buckets_path` is also used for Sibling pipeline aggregations, where the aggregation is "next" to a series of buckets
 instead of embedded "inside" them. For example, the `max_bucket` aggregation uses the `buckets_path` to specify
 a metric embedded inside a sibling aggregation:
 
@@ -109,7 +109,7 @@ a metric embedded inside a sibling aggregation:
 ==== Special Paths
 
 Instead of pointing to a metric, `buckets_path` can use a special `"_count"` path. This instructs
-the reducer to use the document count as it's input. For example, a moving average can be calculated on the document
+the pipeline aggregation to use the document count as its input. For example, a moving average can be calculated on the document
 count of each bucket, instead of a specific metric:
 
 [source,js]
@@ -141,7 +141,7 @@ There are a couple of reasons why the data output by the enclosing histogram may
 on the enclosing histogram or with a query matching only a small number of documents)
 
 Where there is no data available in a bucket for a given metric it presents a problem for calculating the derivative value for both
-the current bucket and the next bucket. In the derivative reducer aggregation has a `gap policy` parameter to define what the behavior
+the current bucket and the next bucket. The derivative pipeline aggregation has a `gap_policy` parameter to define what the behavior
 should be when a gap in the data is found. There are currently two options for controlling the gap policy:
 
 _skip_::
@@ -154,9 +154,9 @@ _insert_zeros_::
 
 
 
-include::reducer/avg-bucket-aggregation.asciidoc[]
-include::reducer/derivative-aggregation.asciidoc[]
-include::reducer/max-bucket-aggregation.asciidoc[]
-include::reducer/min-bucket-aggregation.asciidoc[]
-include::reducer/sum-bucket-aggregation.asciidoc[]
-include::reducer/movavg-aggregation.asciidoc[]
+include::pipeline/avg-bucket-aggregation.asciidoc[]
+include::pipeline/derivative-aggregation.asciidoc[]
+include::pipeline/max-bucket-aggregation.asciidoc[]
+include::pipeline/min-bucket-aggregation.asciidoc[]
+include::pipeline/sum-bucket-aggregation.asciidoc[]
+include::pipeline/movavg-aggregation.asciidoc[]
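Tying the `_count` special path and the gap policy discussion together, a derivative over per-bucket document counts could be requested along these lines (index and field names are hypothetical, and the parameter spelling `gap_policy` is assumed here):

[source,js]
--------------------------------------------------
{
    "aggs": {
        "requests_per_day": {
            "date_histogram": {
                "field": "timestamp",
                "interval": "day"
            },
            "aggs": {
                "count_deriv": {
                    "derivative": {
                        "buckets_path": "_count", <1>
                        "gap_policy": "insert_zeros" <2>
                    }
                }
            }
        }
    }
}
--------------------------------------------------
<1> The special `_count` path makes the derivative operate on each bucket's document count instead of a metric
<2> Empty buckets are treated as zero instead of being skipped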
@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-avg-bucket-aggregation]]
+[[search-aggregations-pipeline-avg-bucket-aggregation]]
 === Avg Bucket Aggregation
 
-A sibling reducer aggregation which calculates the (mean) average value of a specified metric in a sibling aggregation.
+A sibling pipeline aggregation which calculates the (mean) average value of a specified metric in a sibling aggregation.
 The specified metric must be numeric and the sibling aggregation must be a multi-bucket aggregation.
 
 ==== Syntax
@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-derivative-aggregation]]
+[[search-aggregations-pipeline-derivative-aggregation]]
 === Derivative Aggregation
 
-A parent reducer aggregation which calculates the derivative of a specified metric in a parent histogram (or date_histogram)
+A parent pipeline aggregation which calculates the derivative of a specified metric in a parent histogram (or date_histogram)
 aggregation. The specified metric must be numeric and the enclosing histogram must have `min_doc_count` set to `0` (default
 for `histogram` aggregations).
 
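The `min_doc_count` requirement can be made explicit in the request; a rough sketch with a plain `histogram` (the `price` field and the interval are invented for the example):

[source,js]
--------------------------------------------------
{
    "aggs": {
        "prices": {
            "histogram": {
                "field": "price",
                "interval": 10,
                "min_doc_count": 0 <1>
            },
            "aggs": {
                "the_sum": {
                    "sum": { "field": "price" }
                },
                "sum_deriv": {
                    "derivative": { "buckets_path": "the_sum" }
                }
            }
        }
    }
}
--------------------------------------------------
<1> Empty buckets must be returned so that the derivative sees a contiguous series of buckets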
@@ -112,8 +112,8 @@ would be $/month assuming the `price` field has units of $.
 
 ==== Second Order Derivative
 
-A second order derivative can be calculated by chaining the derivative reducer aggregation onto the result of another derivative
-reducer aggregation as in the following example which will calculate both the first and the second order derivative of the total
+A second order derivative can be calculated by chaining the derivative pipeline aggregation onto the result of another derivative
+pipeline aggregation as in the following example which will calculate both the first and the second order derivative of the total
 monthly sales:
 
 [source,js]
@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-max-bucket-aggregation]]
+[[search-aggregations-pipeline-max-bucket-aggregation]]
 === Max Bucket Aggregation
 
-A sibling reducer aggregation which identifies the bucket(s) with the maximum value of a specified metric in a sibling aggregation
+A sibling pipeline aggregation which identifies the bucket(s) with the maximum value of a specified metric in a sibling aggregation
 and outputs both the value and the key(s) of the bucket(s). The specified metric must be numeric and the sibling aggregation must
 be a multi-bucket aggregation.
 
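A sketch of the sibling placement described above, using the `agg>metric` form of `buckets_path` (the `sales` data is hypothetical; the min, avg and sum bucket variants take the same shape):

[source,js]
--------------------------------------------------
{
    "aggs": {
        "sales_per_month": {
            "date_histogram": {
                "field": "date",
                "interval": "month"
            },
            "aggs": {
                "sales": {
                    "sum": { "field": "price" }
                }
            }
        },
        "max_monthly_sales": {
            "max_bucket": {
                "buckets_path": "sales_per_month>sales" <1>
            }
        }
    }
}
--------------------------------------------------
<1> Defined next to (not inside) `sales_per_month`, and reaching into it with the `agg>metric` path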
@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-min-bucket-aggregation]]
+[[search-aggregations-pipeline-min-bucket-aggregation]]
 === Min Bucket Aggregation
 
-A sibling reducer aggregation which identifies the bucket(s) with the minimum value of a specified metric in a sibling aggregation
+A sibling pipeline aggregation which identifies the bucket(s) with the minimum value of a specified metric in a sibling aggregation
 and outputs both the value and the key(s) of the bucket(s). The specified metric must be numeric and the sibling aggregation must
 be a multi-bucket aggregation.
 
@@ -1,4 +1,4 @@
-[[search-aggregations-reducers-movavg-reducer]]
+[[search-aggregations-pipeline-movavg-aggregation]]
 === Moving Average Aggregation
 
 Given an ordered series of data, the Moving Average aggregation will slide a window across the data and emit the average
@@ -109,14 +109,14 @@ track the data and only smooth out small scale fluctuations:
 
 [[movavg_10window]]
 .Moving average with window of size 10
-image::images/reducers_movavg/movavg_10window.png[]
+image::images/pipeline_movavg/movavg_10window.png[]
 
 In contrast, a `simple` moving average with larger window (`"window": 100`) will smooth out all higher-frequency fluctuations,
 leaving only low-frequency, long term trends. It also tends to "lag" behind the actual data by a substantial amount:
 
 [[movavg_100window]]
 .Moving average with window of size 100
-image::images/reducers_movavg/movavg_100window.png[]
+image::images/pipeline_movavg/movavg_100window.png[]
 
 
 ==== Linear
@@ -143,7 +143,7 @@ will closely track the data and only smooth out small scale fluctuations:
 
 [[linear_10window]]
 .Linear moving average with window of size 10
-image::images/reducers_movavg/linear_10window.png[]
+image::images/pipeline_movavg/linear_10window.png[]
 
 In contrast, a `linear` moving average with larger window (`"window": 100`) will smooth out all higher-frequency fluctuations,
 leaving only low-frequency, long term trends. It also tends to "lag" behind the actual data by a substantial amount,
@@ -151,7 +151,7 @@ although typically less than the `simple` model:
 
 [[linear_100window]]
 .Linear moving average with window of size 100
-image::images/reducers_movavg/linear_100window.png[]
+image::images/pipeline_movavg/linear_100window.png[]
 
 ==== EWMA (Exponentially Weighted)
 
@@ -181,11 +181,11 @@ The default value of `alpha` is `0.5`, and the setting accepts any float from 0-
 
 [[single_0.2alpha]]
 .Single Exponential moving average with window of size 10, alpha = 0.2
-image::images/reducers_movavg/single_0.2alpha.png[]
+image::images/pipeline_movavg/single_0.2alpha.png[]
 
 [[single_0.7alpha]]
 .Single Exponential moving average with window of size 10, alpha = 0.7
-image::images/reducers_movavg/single_0.7alpha.png[]
+image::images/pipeline_movavg/single_0.7alpha.png[]
 
 ==== Holt-Linear
 
@@ -224,11 +224,11 @@ values emphasize short-term trends. This will become more apparently when you a
 
 [[double_0.2beta]]
 .Double Exponential moving average with window of size 100, alpha = 0.5, beta = 0.2
-image::images/reducers_movavg/double_0.2beta.png[]
+image::images/pipeline_movavg/double_0.2beta.png[]
 
 [[double_0.7beta]]
 .Double Exponential moving average with window of size 100, alpha = 0.5, beta = 0.7
-image::images/reducers_movavg/double_0.7beta.png[]
+image::images/pipeline_movavg/double_0.7beta.png[]
 
 ==== Prediction
 
@@ -256,7 +256,7 @@ of the last value in the series, producing a flat:
 
 [[simple_prediction]]
 .Simple moving average with window of size 10, predict = 50
-image::images/reducers_movavg/simple_prediction.png[]
+image::images/pipeline_movavg/simple_prediction.png[]
 
 In contrast, the `holt` model can extrapolate based on local or global constant trends. If we set a high `beta`
 value, we can extrapolate based on local constant trends (in this case the predictions head down, because the data at the end
@@ -264,11 +264,11 @@ of the series was heading in a downward direction):
 
 [[double_prediction_local]]
 .Double Exponential moving average with window of size 100, predict = 20, alpha = 0.5, beta = 0.8
-image::images/reducers_movavg/double_prediction_local.png[]
+image::images/pipeline_movavg/double_prediction_local.png[]
 
 In contrast, if we choose a small `beta`, the predictions are based on the global constant trend. In this series, the
 global trend is slightly positive, so the prediction makes a sharp u-turn and begins a positive slope:
 
 [[double_prediction_global]]
 .Double Exponential moving average with window of size 100, predict = 20, alpha = 0.5, beta = 0.1
-image::images/reducers_movavg/double_prediction_global.png[]
+image::images/pipeline_movavg/double_prediction_global.png[]
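Pulling the knobs from the model sections above into one place, a Holt-Linear moving average with a prediction horizon might be written roughly as follows; the aggregation type name (`moving_avg`), the `settings` layout and the `the_sum` metric are assumptions for this sketch and should be checked against the syntax section of this page:

[source,js]
--------------------------------------------------
{
    "aggs": {
        "my_date_histo": {
            "date_histogram": {
                "field": "timestamp",
                "interval": "day"
            },
            "aggs": {
                "the_sum": {
                    "sum": { "field": "value" }
                },
                "the_movavg": {
                    "moving_avg": {
                        "buckets_path": "the_sum", <1>
                        "window": 100, <2>
                        "model": "holt", <3>
                        "settings": {
                            "alpha": 0.5,
                            "beta": 0.2 <4>
                        },
                        "predict": 20 <5>
                    }
                }
            }
        }
    }
}
--------------------------------------------------
<1> Relative path to the sibling metric inside the same bucket
<2> Size of the sliding window
<3> One of the models discussed on this page (model keys assumed to be `simple`, `linear`, `ewma`, `holt`)
<4> Model settings; a higher `beta` emphasizes the short-term trend
<5> Number of predicted buckets appended to the end of the series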
@@ -1,7 +1,7 @@
-[[search-aggregations-reducer-sum-bucket-aggregation]]
+[[search-aggregations-pipeline-sum-bucket-aggregation]]
 === Sum Bucket Aggregation
 
-A sibling reducer aggregation which calculates the sum across all bucket of a specified metric in a sibling aggregation.
+A sibling pipeline aggregation which calculates the sum across all buckets of a specified metric in a sibling aggregation.
 The specified metric must be numeric and the sibling aggregation must be a multi-bucket aggregation.
 
 ==== Syntax
[binary image diffs: the 11 moving-average chart PNGs referenced above; widths, heights and file sizes (63-72 KiB each) unchanged]
@@ -20,6 +20,7 @@
 package org.elasticsearch.action;
 
+import com.google.common.base.Preconditions;
 
 import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.action.support.PlainListenableActionFuture;
 import org.elasticsearch.client.Client;
@@ -27,7 +28,7 @@ import org.elasticsearch.client.ClusterAdminClient;
 import org.elasticsearch.client.ElasticsearchClient;
 import org.elasticsearch.client.IndicesAdminClient;
 import org.elasticsearch.common.unit.TimeValue;
-import org.elasticsearch.search.aggregations.reducers.ReducerBuilder;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorBuilder;
 import org.elasticsearch.threadpool.ThreadPool;
 
 /**
@@ -28,9 +28,9 @@ import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.index.shard.ShardId;
 import org.elasticsearch.percolator.PercolateContext;
 import org.elasticsearch.search.aggregations.InternalAggregations;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.ReducerStreams;
-import org.elasticsearch.search.aggregations.reducers.SiblingReducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorStreams;
+import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
 import org.elasticsearch.search.highlight.HighlightField;
 import org.elasticsearch.search.query.QuerySearchResult;
 
@@ -56,7 +56,7 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 private int requestedSize;
 
 private InternalAggregations aggregations;
-private List<SiblingReducer> reducers;
+private List<SiblingPipelineAggregator> pipelineAggregators;
 
 PercolateShardResponse() {
 hls = new ArrayList<>();
@@ -75,7 +75,7 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 if (result.aggregations() != null) {
 this.aggregations = (InternalAggregations) result.aggregations();
 }
-this.reducers = result.reducers();
+this.pipelineAggregators = result.pipelineAggregators();
 }
 }
 
@@ -119,8 +119,8 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 return aggregations;
 }
 
-public List<SiblingReducer> reducers() {
-return reducers;
+public List<SiblingPipelineAggregator> pipelineAggregators() {
+return pipelineAggregators;
 }
 
 public byte percolatorTypeId() {
@@ -156,14 +156,14 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 }
 aggregations = InternalAggregations.readOptionalAggregations(in);
 if (in.readBoolean()) {
-int reducersSize = in.readVInt();
-List<SiblingReducer> reducers = new ArrayList<>(reducersSize);
-for (int i = 0; i < reducersSize; i++) {
+int pipelineAggregatorsSize = in.readVInt();
+List<SiblingPipelineAggregator> pipelineAggregators = new ArrayList<>(pipelineAggregatorsSize);
+for (int i = 0; i < pipelineAggregatorsSize; i++) {
 BytesReference type = in.readBytesReference();
-Reducer reducer = ReducerStreams.stream(type).readResult(in);
-reducers.add((SiblingReducer) reducer);
+PipelineAggregator pipelineAggregator = PipelineAggregatorStreams.stream(type).readResult(in);
+pipelineAggregators.add((SiblingPipelineAggregator) pipelineAggregator);
 }
-this.reducers = reducers;
+this.pipelineAggregators = pipelineAggregators;
 }
 }
 
@@ -190,14 +190,14 @@ public class PercolateShardResponse extends BroadcastShardOperationResponse {
 }
 }
 out.writeOptionalStreamable(aggregations);
-if (reducers == null) {
+if (pipelineAggregators == null) {
 out.writeBoolean(false);
 } else {
 out.writeBoolean(true);
-out.writeVInt(reducers.size());
-for (Reducer reducer : reducers) {
-out.writeBytesReference(reducer.type().stream());
-reducer.writeTo(out);
+out.writeVInt(pipelineAggregators.size());
+for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
+out.writeBytesReference(pipelineAggregator.type().stream());
+pipelineAggregator.writeTo(out);
 }
 }
 }
@@ -86,8 +86,8 @@ import org.elasticsearch.search.aggregations.AggregationPhase;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.InternalAggregation.ReduceContext;
 import org.elasticsearch.search.aggregations.InternalAggregations;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.SiblingReducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
 import org.elasticsearch.search.highlight.HighlightField;
 import org.elasticsearch.search.highlight.HighlightPhase;
 import org.elasticsearch.search.internal.SearchContext;
@@ -852,11 +852,11 @@ public class PercolatorService extends AbstractComponent {
 }
 InternalAggregations aggregations = InternalAggregations.reduce(aggregationsList, new ReduceContext(bigArrays, scriptService));
 if (aggregations != null) {
-List<SiblingReducer> reducers = shardResults.get(0).reducers();
-if (reducers != null) {
-List<InternalAggregation> newAggs = new ArrayList<>(Lists.transform(aggregations.asList(), Reducer.AGGREGATION_TRANFORM_FUNCTION));
-for (SiblingReducer reducer : reducers) {
-InternalAggregation newAgg = reducer.doReduce(new InternalAggregations(newAggs), new ReduceContext(bigArrays,
+List<SiblingPipelineAggregator> pipelineAggregators = shardResults.get(0).pipelineAggregators();
+if (pipelineAggregators != null) {
+List<InternalAggregation> newAggs = new ArrayList<>(Lists.transform(aggregations.asList(), PipelineAggregator.AGGREGATION_TRANFORM_FUNCTION));
+for (SiblingPipelineAggregator pipelineAggregator : pipelineAggregators) {
+InternalAggregation newAgg = pipelineAggregator.doReduce(new InternalAggregations(newAggs), new ReduceContext(bigArrays,
 scriptService));
 newAggs.add(newAgg);
 }
@@ -56,14 +56,14 @@ import org.elasticsearch.search.aggregations.metrics.stats.extended.ExtendedStat
 import org.elasticsearch.search.aggregations.metrics.sum.SumParser;
 import org.elasticsearch.search.aggregations.metrics.tophits.TopHitsParser;
 import org.elasticsearch.search.aggregations.metrics.valuecount.ValueCountParser;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.avg.AvgBucketParser;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.max.MaxBucketParser;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.min.MinBucketParser;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.sum.SumBucketParser;
-import org.elasticsearch.search.aggregations.reducers.derivative.DerivativeParser;
-import org.elasticsearch.search.aggregations.reducers.movavg.MovAvgParser;
-import org.elasticsearch.search.aggregations.reducers.movavg.models.MovAvgModelModule;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.avg.AvgBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.max.MaxBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.min.MinBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.sum.SumBucketParser;
+import org.elasticsearch.search.aggregations.pipeline.derivative.DerivativeParser;
+import org.elasticsearch.search.aggregations.pipeline.movavg.MovAvgParser;
+import org.elasticsearch.search.aggregations.pipeline.movavg.models.MovAvgModelModule;
 
 import java.util.List;
 
@@ -73,7 +73,7 @@ import java.util.List;
 public class AggregationModule extends AbstractModule implements SpawnModules{
 
 private List<Class<? extends Aggregator.Parser>> aggParsers = Lists.newArrayList();
-private List<Class<? extends Reducer.Parser>> reducerParsers = Lists.newArrayList();
+private List<Class<? extends PipelineAggregator.Parser>> pipelineAggParsers = Lists.newArrayList();
 
 public AggregationModule() {
 aggParsers.add(AvgParser.class);
@@ -108,12 +108,12 @@ public class AggregationModule extends AbstractModule implements SpawnModules{
 aggParsers.add(ScriptedMetricParser.class);
 aggParsers.add(ChildrenParser.class);
 
-reducerParsers.add(DerivativeParser.class);
-reducerParsers.add(MaxBucketParser.class);
-reducerParsers.add(MinBucketParser.class);
-reducerParsers.add(AvgBucketParser.class);
-reducerParsers.add(SumBucketParser.class);
-reducerParsers.add(MovAvgParser.class);
+pipelineAggParsers.add(DerivativeParser.class);
+pipelineAggParsers.add(MaxBucketParser.class);
+pipelineAggParsers.add(MinBucketParser.class);
+pipelineAggParsers.add(AvgBucketParser.class);
+pipelineAggParsers.add(SumBucketParser.class);
+pipelineAggParsers.add(MovAvgParser.class);
 }
 
 /**
@@ -131,9 +131,9 @@ public class AggregationModule extends AbstractModule implements SpawnModules{
 for (Class<? extends Aggregator.Parser> parser : aggParsers) {
 multibinderAggParser.addBinding().to(parser);
 }
-Multibinder<Reducer.Parser> multibinderReducerParser = Multibinder.newSetBinder(binder(), Reducer.Parser.class);
-for (Class<? extends Reducer.Parser> parser : reducerParsers) {
-multibinderReducerParser.addBinding().to(parser);
+Multibinder<PipelineAggregator.Parser> multibinderPipelineAggParser = Multibinder.newSetBinder(binder(), PipelineAggregator.Parser.class);
+for (Class<? extends PipelineAggregator.Parser> parser : pipelineAggParsers) {
+multibinderPipelineAggParser.addBinding().to(parser);
 }
 bind(AggregatorParsers.class).asEagerSingleton();
 bind(AggregationParseElement.class).asEagerSingleton();
@@ -23,14 +23,13 @@ import com.google.common.collect.ImmutableMap;
 import org.apache.lucene.search.BooleanClause.Occur;
 import org.apache.lucene.search.BooleanQuery;
 import org.apache.lucene.search.Query;
-import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.search.SearchParseElement;
 import org.elasticsearch.search.SearchPhase;
 import org.elasticsearch.search.aggregations.bucket.global.GlobalAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.SiblingReducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.internal.SearchContext;
 import org.elasticsearch.search.query.QueryPhaseExecutionException;
@@ -145,19 +144,20 @@ public class AggregationPhase implements SearchPhase {
 }
 context.queryResult().aggregations(new InternalAggregations(aggregations));
 try {
-List<Reducer> reducers = context.aggregations().factories().createReducers();
-List<SiblingReducer> siblingReducers = new ArrayList<>(reducers.size());
-for (Reducer reducer : reducers) {
-if (reducer instanceof SiblingReducer) {
-siblingReducers.add((SiblingReducer) reducer);
+List<PipelineAggregator> pipelineAggregators = context.aggregations().factories().createPipelineAggregators();
+List<SiblingPipelineAggregator> siblingPipelineAggregators = new ArrayList<>(pipelineAggregators.size());
+for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
+if (pipelineAggregator instanceof SiblingPipelineAggregator) {
+siblingPipelineAggregators.add((SiblingPipelineAggregator) pipelineAggregator);
 } else {
-throw new AggregationExecutionException("Invalid reducer named [" + reducer.name() + "] of type ["
-+ reducer.type().name() + "]. Only sibling reducers are allowed at the top level");
+throw new AggregationExecutionException("Invalid pipeline aggregation named [" + pipelineAggregator.name()
++ "] of type [" + pipelineAggregator.type().name()
++ "]. Only sibling pipeline aggregations are allowed at the top level");
 }
 }
-context.queryResult().reducers(siblingReducers);
+context.queryResult().pipelineAggregators(siblingPipelineAggregators);
 } catch (IOException e) {
-throw new AggregationExecutionException("Failed to build top level reducers", e);
+throw new AggregationExecutionException("Failed to build top level pipeline aggregators", e);
 }
 
 // disable aggregations so that they don't run on next pages in case of scrolling
@@ -21,7 +21,7 @@ package org.elasticsearch.search.aggregations;
 import org.apache.lucene.index.LeafReaderContext;
 import org.elasticsearch.search.aggregations.bucket.BestBucketsDeferringCollector;
 import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.internal.SearchContext.Lifetime;
 import org.elasticsearch.search.query.QueryPhaseExecutionException;
@@ -47,7 +47,7 @@ public abstract class AggregatorBase extends Aggregator {
 
 private Map<String, Aggregator> subAggregatorbyName;
 private DeferringBucketCollector recordingWrapper;
-private final List<Reducer> reducers;
+private final List<PipelineAggregator> pipelineAggregators;
 
 /**
 * Constructs a new Aggregator.
@@ -59,9 +59,9 @@ public abstract class AggregatorBase extends Aggregator {
 * @param metaData The metaData associated with this aggregator
 */
 protected AggregatorBase(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent,
-List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
 this.name = name;
-this.reducers = reducers;
+this.pipelineAggregators = pipelineAggregators;
 this.metaData = metaData;
 this.parent = parent;
 this.context = context;
@@ -116,8 +116,8 @@ public abstract class AggregatorBase extends Aggregator {
 return this.metaData;
 }
 
-public List<Reducer> reducers() {
-return this.reducers;
+public List<PipelineAggregator> pipelineAggregators() {
+return this.pipelineAggregators;
 }
 
 /**
@@ -18,8 +18,8 @@
 */
 package org.elasticsearch.search.aggregations;
 
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.ReducerFactory;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorFactory;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.AggregationPath;
 
@@ -41,23 +41,23 @@ public class AggregatorFactories {
 
 private AggregatorFactory parent;
 private AggregatorFactory[] factories;
-private List<ReducerFactory> reducerFactories;
+private List<PipelineAggregatorFactory> pipelineAggregatorFactories;
 
 public static Builder builder() {
 return new Builder();
 }
 
-private AggregatorFactories(AggregatorFactory[] factories, List<ReducerFactory> reducers) {
+private AggregatorFactories(AggregatorFactory[] factories, List<PipelineAggregatorFactory> pipelineAggregators) {
 this.factories = factories;
-this.reducerFactories = reducers;
+this.pipelineAggregatorFactories = pipelineAggregators;
 }
 
-public List<Reducer> createReducers() throws IOException {
-List<Reducer> reducers = new ArrayList<>();
-for (ReducerFactory factory : this.reducerFactories) {
-reducers.add(factory.create());
+public List<PipelineAggregator> createPipelineAggregators() throws IOException {
+List<PipelineAggregator> pipelineAggregators = new ArrayList<>();
+for (PipelineAggregatorFactory factory : this.pipelineAggregatorFactories) {
+pipelineAggregators.add(factory.create());
 }
-return reducers;
+return pipelineAggregators;
 }
 
 /**
@@ -103,8 +103,8 @@ public class AggregatorFactories {
 for (AggregatorFactory factory : factories) {
 factory.validate();
 }
-for (ReducerFactory factory : reducerFactories) {
-factory.validate(parent, factories, reducerFactories);
+for (PipelineAggregatorFactory factory : pipelineAggregatorFactories) {
+factory.validate(parent, factories, pipelineAggregatorFactories);
 }
 }
 
@@ -112,10 +112,10 @@ public class AggregatorFactories {
 
 private static final AggregatorFactory[] EMPTY_FACTORIES = new AggregatorFactory[0];
 private static final Aggregator[] EMPTY_AGGREGATORS = new Aggregator[0];
-private static final List<ReducerFactory> EMPTY_REDUCERS = new ArrayList<>();
+private static final List<PipelineAggregatorFactory> EMPTY_PIPELINE_AGGREGATORS = new ArrayList<>();
 
 private Empty() {
-super(EMPTY_FACTORIES, EMPTY_REDUCERS);
+super(EMPTY_FACTORIES, EMPTY_PIPELINE_AGGREGATORS);
 }
 
 @Override
@@ -134,7 +134,7 @@ public class AggregatorFactories {
 
 private final Set<String> names = new HashSet<>();
 private final List<AggregatorFactory> factories = new ArrayList<>();
-private final List<ReducerFactory> reducerFactories = new ArrayList<>();
+private final List<PipelineAggregatorFactory> pipelineAggregatorFactories = new ArrayList<>();
 
 public Builder addAggregator(AggregatorFactory factory) {
 if (!names.add(factory.name)) {
@@ -144,43 +144,43 @@ public class AggregatorFactories {
 return this;
 }
 
-public Builder addReducer(ReducerFactory reducerFactory) {
-this.reducerFactories.add(reducerFactory);
+public Builder addPipelineAggregator(PipelineAggregatorFactory pipelineAggregatorFactory) {
+this.pipelineAggregatorFactories.add(pipelineAggregatorFactory);
 return this;
 }
 
 public AggregatorFactories build() {
-if (factories.isEmpty() && reducerFactories.isEmpty()) {
+if (factories.isEmpty() && pipelineAggregatorFactories.isEmpty()) {
 return EMPTY;
 }
-List<ReducerFactory> orderedReducers = resolveReducerOrder(this.reducerFactories, this.factories);
-return new AggregatorFactories(factories.toArray(new AggregatorFactory[factories.size()]), orderedReducers);
+List<PipelineAggregatorFactory> orderedpipelineAggregators = resolvePipelineAggregatorOrder(this.pipelineAggregatorFactories, this.factories);
+return new AggregatorFactories(factories.toArray(new AggregatorFactory[factories.size()]), orderedpipelineAggregators);
 }
 
-private List<ReducerFactory> resolveReducerOrder(List<ReducerFactory> reducerFactories, List<AggregatorFactory> aggFactories) {
-Map<String, ReducerFactory> reducerFactoriesMap = new HashMap<>();
-for (ReducerFactory factory : reducerFactories) {
-reducerFactoriesMap.put(factory.getName(), factory);
+private List<PipelineAggregatorFactory> resolvePipelineAggregatorOrder(List<PipelineAggregatorFactory> pipelineAggregatorFactories, List<AggregatorFactory> aggFactories) {
+Map<String, PipelineAggregatorFactory> pipelineAggregatorFactoriesMap = new HashMap<>();
+for (PipelineAggregatorFactory factory : pipelineAggregatorFactories) {
+pipelineAggregatorFactoriesMap.put(factory.getName(), factory);
 }
 Set<String> aggFactoryNames = new HashSet<>();
 for (AggregatorFactory aggFactory : aggFactories) {
 aggFactoryNames.add(aggFactory.name);
 }
-List<ReducerFactory> orderedReducers = new LinkedList<>();
-List<ReducerFactory> unmarkedFactories = new ArrayList<ReducerFactory>(reducerFactories);
-Set<ReducerFactory> temporarilyMarked = new HashSet<ReducerFactory>();
+List<PipelineAggregatorFactory> orderedPipelineAggregatorrs = new LinkedList<>();
+List<PipelineAggregatorFactory> unmarkedFactories = new ArrayList<PipelineAggregatorFactory>(pipelineAggregatorFactories);
+Set<PipelineAggregatorFactory> temporarilyMarked = new HashSet<PipelineAggregatorFactory>();
 while (!unmarkedFactories.isEmpty()) {
-ReducerFactory factory = unmarkedFactories.get(0);
-resolveReducerOrder(aggFactoryNames, reducerFactoriesMap, orderedReducers, unmarkedFactories, temporarilyMarked, factory);
+PipelineAggregatorFactory factory = unmarkedFactories.get(0);
+resolvePipelineAggregatorOrder(aggFactoryNames, pipelineAggregatorFactoriesMap, orderedPipelineAggregatorrs, unmarkedFactories, temporarilyMarked, factory);
 }
-return orderedReducers;
+return orderedPipelineAggregatorrs;
 }
 
-private void resolveReducerOrder(Set<String> aggFactoryNames, Map<String, ReducerFactory> reducerFactoriesMap,
-List<ReducerFactory> orderedReducers, List<ReducerFactory> unmarkedFactories, Set<ReducerFactory> temporarilyMarked,
-ReducerFactory factory) {
+private void resolvePipelineAggregatorOrder(Set<String> aggFactoryNames, Map<String, PipelineAggregatorFactory> pipelineAggregatorFactoriesMap,
+List<PipelineAggregatorFactory> orderedPipelineAggregators, List<PipelineAggregatorFactory> unmarkedFactories, Set<PipelineAggregatorFactory> temporarilyMarked,
+PipelineAggregatorFactory factory) {
 if (temporarilyMarked.contains(factory)) {
-throw new IllegalStateException("Cyclical dependancy found with reducer [" + factory.getName() + "]");
+throw new IllegalStateException("Cyclical dependancy found with pipeline aggregator [" + factory.getName() + "]");
 } else if (unmarkedFactories.contains(factory)) {
 temporarilyMarked.add(factory);
 String[] bucketsPaths = factory.getBucketsPaths();
@@ -190,9 +190,9 @@ public class AggregatorFactories {
 if (bucketsPath.equals("_count") || bucketsPath.equals("_key") || aggFactoryNames.contains(firstAggName)) {
 continue;
 } else {
-ReducerFactory matchingFactory = reducerFactoriesMap.get(firstAggName);
+PipelineAggregatorFactory matchingFactory = pipelineAggregatorFactoriesMap.get(firstAggName);
 if (matchingFactory != null) {
-resolveReducerOrder(aggFactoryNames, reducerFactoriesMap, orderedReducers, unmarkedFactories,
+resolvePipelineAggregatorOrder(aggFactoryNames, pipelineAggregatorFactoriesMap, orderedPipelineAggregators, unmarkedFactories,
 temporarilyMarked, matchingFactory);
 } else {
 throw new IllegalStateException("No aggregation found for path [" + bucketsPath + "]");
@@ -201,7 +201,7 @@ public class AggregatorFactories {
 }
 unmarkedFactories.remove(factory);
 temporarilyMarked.remove(factory);
-orderedReducers.add(factory);
+orderedPipelineAggregators.add(factory);
 }
 }
 }
@@ -23,7 +23,7 @@ import org.apache.lucene.search.Scorer;
 import org.elasticsearch.common.lease.Releasables;
 import org.elasticsearch.common.util.BigArrays;
 import org.elasticsearch.common.util.ObjectArray;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.internal.SearchContext.Lifetime;
 
@@ -86,7 +86,7 @@ public abstract class AggregatorFactory {
 }
 
 protected abstract Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
-List<Reducer> reducers, Map<String, Object> metaData) throws IOException;
+List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException;
 
 /**
 * Creates the aggregator
@@ -99,7 +99,7 @@ public abstract class AggregatorFactory {
 * @return The created aggregator
 */
 public final Aggregator create(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket) throws IOException {
-return createInternal(context, parent, collectsFromSingleBucket, this.factories.createReducers(), this.metaData);
+return createInternal(context, parent, collectsFromSingleBucket, this.factories.createPipelineAggregators(), this.metaData);
 }
 
 public void doValidate() {
@@ -24,8 +24,8 @@ import org.elasticsearch.common.collect.MapBuilder;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.search.SearchParseException;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.ReducerFactory;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorFactory;
 import org.elasticsearch.search.internal.SearchContext;
 
 import java.io.IOException;
@@ -41,7 +41,7 @@ public class AggregatorParsers {
 
 public static final Pattern VALID_AGG_NAME = Pattern.compile("[^\\[\\]>]+");
 private final ImmutableMap<String, Aggregator.Parser> aggParsers;
-private final ImmutableMap<String, Reducer.Parser> reducerParsers;
+private final ImmutableMap<String, PipelineAggregator.Parser> pipelineAggregatorParsers;
 
 
 /**
@@ -53,17 +53,17 @@ public class AggregatorParsers {
 * ).
 */
 @Inject
-public AggregatorParsers(Set<Aggregator.Parser> aggParsers, Set<Reducer.Parser> reducerParsers) {
+public AggregatorParsers(Set<Aggregator.Parser> aggParsers, Set<PipelineAggregator.Parser> pipelineAggregatorParsers) {
 MapBuilder<String, Aggregator.Parser> aggParsersBuilder = MapBuilder.newMapBuilder();
 for (Aggregator.Parser parser : aggParsers) {
 aggParsersBuilder.put(parser.type(), parser);
 }
 this.aggParsers = aggParsersBuilder.immutableMap();
-MapBuilder<String, Reducer.Parser> reducerParsersBuilder = MapBuilder.newMapBuilder();
-for (Reducer.Parser parser : reducerParsers) {
-reducerParsersBuilder.put(parser.type(), parser);
+MapBuilder<String, PipelineAggregator.Parser> pipelineAggregatorParsersBuilder = MapBuilder.newMapBuilder();
+for (PipelineAggregator.Parser parser : pipelineAggregatorParsers) {
+pipelineAggregatorParsersBuilder.put(parser.type(), parser);
 }
-this.reducerParsers = reducerParsersBuilder.immutableMap();
+this.pipelineAggregatorParsers = pipelineAggregatorParsersBuilder.immutableMap();
 }
 
 /**
@@ -77,14 +77,15 @@ public class AggregatorParsers {
 }
 
 /**
-* Returns the parser that is registered under the given reducer type.
+* Returns the parser that is registered under the given pipeline aggregator
+* type.
 *
 * @param type
-* The reducer type
-* @return The parser associated with the given reducer type.
+* The pipeline aggregator type
+* @return The parser associated with the given pipeline aggregator type.
 */
-public Reducer.Parser reducer(String type) {
-return reducerParsers.get(type);
+public PipelineAggregator.Parser pipelineAggregator(String type) {
+return pipelineAggregatorParsers.get(type);
 }
 
 /**
@@ -125,7 +126,7 @@ public class AggregatorParsers {
 }
 
 AggregatorFactory aggFactory = null;
-ReducerFactory reducerFactory = null;
+PipelineAggregatorFactory pipelineAggregatorFactory = null;
 AggregatorFactories subFactories = null;
 
 Map<String, Object> metaData = null;
@@ -161,20 +162,19 @@ public class AggregatorParsers {
 throw new SearchParseException(context, "Found two aggregation type definitions in [" + aggregationName + "]: ["
 + aggFactory.type + "] and [" + fieldName + "]", parser.getTokenLocation());
 }
-if (reducerFactory != null) {
-// TODO we would need a .type property on reducers too for this error message?
+if (pipelineAggregatorFactory != null) {
 throw new SearchParseException(context, "Found two aggregation type definitions in [" + aggregationName + "]: ["
-+ reducerFactory + "] and [" + fieldName + "]", parser.getTokenLocation());
++ pipelineAggregatorFactory + "] and [" + fieldName + "]", parser.getTokenLocation());
 }
 
 Aggregator.Parser aggregatorParser = parser(fieldName);
 if (aggregatorParser == null) {
-Reducer.Parser reducerParser = reducer(fieldName);
-if (reducerParser == null) {
+PipelineAggregator.Parser pipelineAggregatorParser = pipelineAggregator(fieldName);
+if (pipelineAggregatorParser == null) {
 throw new SearchParseException(context, "Could not find aggregator type [" + fieldName + "] in ["
 + aggregationName + "]", parser.getTokenLocation());
 } else {
-reducerFactory = reducerParser.parse(aggregationName, parser, context);
+pipelineAggregatorFactory = pipelineAggregatorParser.parse(aggregationName, parser, context);
 }
 } else {
 aggFactory = aggregatorParser.parse(aggregationName, parser, context);
@@ -182,11 +182,11 @@ public class AggregatorParsers {
 }
 }
 
-if (aggFactory == null && reducerFactory == null) {
+if (aggFactory == null && pipelineAggregatorFactory == null) {
 throw new SearchParseException(context, "Missing definition for aggregation [" + aggregationName + "]",
 parser.getTokenLocation());
 } else if (aggFactory != null) {
-assert reducerFactory == null;
+assert pipelineAggregatorFactory == null;
 if (metaData != null) {
 aggFactory.setMetaData(metaData);
 }
@@ -201,12 +201,12 @@ public class AggregatorParsers {
 
 factories.addAggregator(aggFactory);
 } else {
-assert reducerFactory != null;
+assert pipelineAggregatorFactory != null;
 if (subFactories != null) {
 throw new SearchParseException(context, "Aggregation [" + aggregationName + "] cannot define sub-aggregations",
 parser.getTokenLocation());
 }
-factories.addReducer(reducerFactory);
+factories.addPipelineAggregator(pipelineAggregatorFactory);
 }
 }
 
@@ -31,8 +31,8 @@ import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentBuilderString;
 import org.elasticsearch.script.ScriptService;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
-import org.elasticsearch.search.aggregations.reducers.ReducerStreams;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorStreams;
 import org.elasticsearch.search.aggregations.support.AggregationPath;
 
 import java.io.IOException;
@@ -115,7 +115,7 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
 
 protected Map<String, Object> metaData;
 
-private List<Reducer> reducers;
+private List<PipelineAggregator> pipelineAggregators;
 
 /** Constructs an un initialized addAggregation (used for serialization) **/
 protected InternalAggregation() {}
@@ -125,9 +125,9 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
 *
 * @param name The name of the get.
 */
-protected InternalAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) {
+protected InternalAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
 this.name = name;
-this.reducers = reducers;
+this.pipelineAggregators = pipelineAggregators;
 this.metaData = metaData;
 }
 
@@ -149,8 +149,8 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
 */
 public final InternalAggregation reduce(List<InternalAggregation> aggregations, ReduceContext reduceContext) {
 InternalAggregation aggResult = doReduce(aggregations, reduceContext);
-for (Reducer reducer : reducers) {
-aggResult = reducer.reduce(aggResult, reduceContext);
+for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
+aggResult = pipelineAggregator.reduce(aggResult, reduceContext);
 }
 return aggResult;
 }
@@ -188,8 +188,8 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
 return metaData;
 }
 
-public List<Reducer> reducers() {
-return reducers;
+public List<PipelineAggregator> pipelineAggregators() {
+return pipelineAggregators;
 }
 
 @Override
@@ -210,10 +210,10 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
 public final void writeTo(StreamOutput out) throws IOException {
 out.writeString(name);
 out.writeGenericValue(metaData);
-out.writeVInt(reducers.size());
-for (Reducer reducer : reducers) {
-out.writeBytesReference(reducer.type().stream());
-reducer.writeTo(out);
+out.writeVInt(pipelineAggregators.size());
+for (PipelineAggregator pipelineAggregator : pipelineAggregators) {
+out.writeBytesReference(pipelineAggregator.type().stream());
+pipelineAggregator.writeTo(out);
 }
 doWriteTo(out);
 }
@@ -226,13 +226,13 @@ public abstract class InternalAggregation implements Aggregation, ToXContent, St
 metaData = in.readMap();
 int size = in.readVInt();
 if (size == 0) {
-reducers = ImmutableList.of();
+pipelineAggregators = ImmutableList.of();
 } else {
-reducers = Lists.newArrayListWithCapacity(size);
+pipelineAggregators = Lists.newArrayListWithCapacity(size);
 for (int i = 0; i < size; i++) {
 BytesReference type = in.readBytesReference();
-Reducer reducer = ReducerStreams.stream(type).readResult(in);
-reducers.add(reducer);
+PipelineAggregator pipelineAggregator = PipelineAggregatorStreams.stream(type).readResult(in);
+pipelineAggregators.add(pipelineAggregator);
 }
 }
 doReadFrom(in);
@@ -20,7 +20,7 @@
 package org.elasticsearch.search.aggregations;
 
 import org.elasticsearch.search.aggregations.bucket.MultiBucketsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 
 import java.util.List;
 import java.util.Map;
@@ -31,8 +31,8 @@ public abstract class InternalMultiBucketAggregation<A extends InternalMultiBuck
 public InternalMultiBucketAggregation() {
 }
 
-public InternalMultiBucketAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) {
-super(name, reducers, metaData);
+public InternalMultiBucketAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+super(name, pipelineAggregators, metaData);
 }
 
 /**
@@ -20,7 +20,7 @@
 package org.elasticsearch.search.aggregations;
 
 import org.apache.lucene.index.LeafReaderContext;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 
 import java.io.IOException;
@@ -34,13 +34,13 @@ import java.util.Map;
 public abstract class NonCollectingAggregator extends AggregatorBase {
 
 protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent, AggregatorFactories subFactories,
-List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-super(name, subFactories, context, parent, reducers, metaData);
+List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+super(name, subFactories, context, parent, pipelineAggregators, metaData);
 }
 
-protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
-Map<String, Object> metaData) throws IOException {
-this(name, context, parent, AggregatorFactories.EMPTY, reducers, metaData);
+protected NonCollectingAggregator(String name, AggregationContext context, Aggregator parent,
+List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+this(name, context, parent, AggregatorFactories.EMPTY, pipelineAggregators, metaData);
 }
 
 @Override
@@ -59,16 +59,16 @@ import org.elasticsearch.search.aggregations.metrics.stats.extended.InternalExte
 import org.elasticsearch.search.aggregations.metrics.sum.InternalSum;
 import org.elasticsearch.search.aggregations.metrics.tophits.InternalTopHits;
 import org.elasticsearch.search.aggregations.metrics.valuecount.InternalValueCount;
-import org.elasticsearch.search.aggregations.reducers.InternalSimpleValue;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.InternalBucketMetricValue;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.avg.AvgBucketReducer;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.max.MaxBucketReducer;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.min.MinBucketReducer;
-import org.elasticsearch.search.aggregations.reducers.bucketmetrics.sum.SumBucketReducer;
-import org.elasticsearch.search.aggregations.reducers.derivative.DerivativeReducer;
-import org.elasticsearch.search.aggregations.reducers.derivative.InternalDerivative;
-import org.elasticsearch.search.aggregations.reducers.movavg.MovAvgReducer;
-import org.elasticsearch.search.aggregations.reducers.movavg.models.TransportMovAvgModelModule;
+import org.elasticsearch.search.aggregations.pipeline.InternalSimpleValue;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.InternalBucketMetricValue;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.avg.AvgBucketPipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.max.MaxBucketPipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.min.MinBucketPipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.bucketmetrics.sum.SumBucketPipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.derivative.DerivativePipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.derivative.InternalDerivative;
+import org.elasticsearch.search.aggregations.pipeline.movavg.MovAvgPipelineAggregator;
+import org.elasticsearch.search.aggregations.pipeline.movavg.models.TransportMovAvgModelModule;
 
 /**
 * A module that registers all the transport streams for the addAggregation
@@ -117,16 +117,16 @@ public class TransportAggregationModule extends AbstractModule implements SpawnM
 InternalGeoBounds.registerStream();
 InternalChildren.registerStream();
 
-// Reducers
-DerivativeReducer.registerStreams();
+// Pipeline Aggregations
+DerivativePipelineAggregator.registerStreams();
 InternalDerivative.registerStreams();
 InternalSimpleValue.registerStreams();
 InternalBucketMetricValue.registerStreams();
-MaxBucketReducer.registerStreams();
-MinBucketReducer.registerStreams();
-AvgBucketReducer.registerStreams();
-SumBucketReducer.registerStreams();
-MovAvgReducer.registerStreams();
+MaxBucketPipelineAggregator.registerStreams();
+MinBucketPipelineAggregator.registerStreams();
+AvgBucketPipelineAggregator.registerStreams();
+SumBucketPipelineAggregator.registerStreams();
+MovAvgPipelineAggregator.registerStreams();
 }
 
 @Override
@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;
@ -44,8 +44,8 @@ public abstract class BucketsAggregator extends AggregatorBase {
private IntArray docCounts;

public BucketsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, pipelineAggregators, metaData);
bigArrays = context.bigArrays();
docCounts = bigArrays.newIntArray(1, true);
}

@ -23,7 +23,7 @@ import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.ArrayList;
@ -47,8 +47,8 @@ public abstract class InternalSingleBucketAggregation extends InternalAggregatio
* @param docCount The document count in the single bucket.
* @param aggregations The already built sub-aggregations that are associated with the bucket.
*/
protected InternalSingleBucketAggregation(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
protected InternalSingleBucketAggregation(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
this.docCount = docCount;
this.aggregations = aggregations;
}

@ -20,7 +20,7 @@ package org.elasticsearch.search.aggregations.bucket;

import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;
@ -34,8 +34,8 @@ public abstract class SingleBucketAggregator extends BucketsAggregator {

protected SingleBucketAggregator(String name, AggregatorFactories factories,
AggregationContext aggregationContext, Aggregator parent,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
}

}

@ -23,7 +23,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;
@ -51,9 +51,9 @@ public class InternalChildren extends InternalSingleBucketAggregation implements
public InternalChildren() {
}

public InternalChildren(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers,
public InternalChildren(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData);
super(name, docCount, aggregations, pipelineAggregators, metaData);
}

@Override
@ -63,6 +63,6 @@ public class InternalChildren extends InternalSingleBucketAggregation implements

@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalChildren(name, docCount, subAggregations, reducers(), getMetaData());
return new InternalChildren(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
}
}

@ -36,7 +36,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -74,8 +74,8 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {
public ParentToChildrenAggregator(String name, AggregatorFactories factories, AggregationContext aggregationContext,
Aggregator parent, String parentType, Filter childFilter, Filter parentFilter,
ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource,
long maxOrd, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
long maxOrd, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.parentType = parentType;
// these two filters are cached in the parser
this.childFilter = aggregationContext.searchContext().searcher().createNormalizedWeight(childFilter, false);
@ -88,13 +88,13 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {

@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalChildren(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalChildren(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalChildren(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalChildren(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

@Override
@ -196,13 +196,13 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {
}

@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
return new NonCollectingAggregator(name, aggregationContext, parent, reducers, metaData) {
return new NonCollectingAggregator(name, aggregationContext, parent, pipelineAggregators, metaData) {

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalChildren(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalChildren(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

};
@ -210,11 +210,11 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator {

@Override
protected Aggregator doCreateInternal(ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource,
AggregationContext aggregationContext, Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers,
AggregationContext aggregationContext, Aggregator parent, boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
long maxOrd = valuesSource.globalMaxOrd(aggregationContext.searchContext().searcher(), parentType);
return new ParentToChildrenAggregator(name, factories, aggregationContext, parent, parentType, childFilter, parentFilter,
valuesSource, maxOrd, reducers, metaData);
valuesSource, maxOrd, pipelineAggregators, metaData);
}

}

@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;
@ -48,9 +48,9 @@ public class FilterAggregator extends SingleBucketAggregator {
Query filter,
AggregatorFactories factories,
AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers,
Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.filter = aggregationContext.searchContext().searcher().createNormalizedWeight(filter, false);
}

@ -71,13 +71,13 @@ public class FilterAggregator extends SingleBucketAggregator {

@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalFilter(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalFilter(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalFilter(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalFilter(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

public static class Factory extends AggregatorFactory {
@ -91,8 +91,8 @@ public class FilterAggregator extends SingleBucketAggregator {

@Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
return new FilterAggregator(name, filter, factories, context, parent, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new FilterAggregator(name, filter, factories, context, parent, pipelineAggregators, metaData);
}

}

@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;
@ -50,8 +50,8 @@ public class InternalFilter extends InternalSingleBucketAggregation implements F

InternalFilter() {} // for serialization

InternalFilter(String name, long docCount, InternalAggregations subAggregations, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, docCount, subAggregations, reducers, metaData);
InternalFilter(String name, long docCount, InternalAggregations subAggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, subAggregations, pipelineAggregators, metaData);
}

@Override
@ -61,6 +61,6 @@ public class InternalFilter extends InternalSingleBucketAggregation implements F

@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalFilter(name, docCount, subAggregations, reducers(), getMetaData());
return new InternalFilter(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
}
}

@ -34,7 +34,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;
@ -62,9 +62,9 @@ public class FiltersAggregator extends BucketsAggregator {
private final boolean keyed;

public FiltersAggregator(String name, AggregatorFactories factories, List<KeyedFilter> filters, boolean keyed, AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.keyed = keyed;
this.keys = new String[filters.size()];
this.filters = new Weight[filters.size()];
@ -103,7 +103,7 @@ public class FiltersAggregator extends BucketsAggregator {
InternalFilters.Bucket bucket = new InternalFilters.Bucket(keys[i], bucketDocCount(bucketOrd), bucketAggregations(bucketOrd), keyed);
buckets.add(bucket);
}
return new InternalFilters(name, buckets, keyed, reducers(), metaData());
return new InternalFilters(name, buckets, keyed, pipelineAggregators(), metaData());
}

@Override
@ -114,7 +114,7 @@ public class FiltersAggregator extends BucketsAggregator {
InternalFilters.Bucket bucket = new InternalFilters.Bucket(keys[i], 0, subAggs, keyed);
buckets.add(bucket);
}
return new InternalFilters(name, buckets, keyed, reducers(), metaData());
return new InternalFilters(name, buckets, keyed, pipelineAggregators(), metaData());
}

final long bucketOrd(long owningBucketOrdinal, int filterOrd) {
@ -134,8 +134,8 @@ public class FiltersAggregator extends BucketsAggregator {

@Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
return new FiltersAggregator(name, factories, filters, keyed, context, parent, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new FiltersAggregator(name, factories, filters, keyed, context, parent, pipelineAggregators, metaData);
}
}

@ -32,7 +32,7 @@ import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation.InternalBucket;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.ArrayList;
@ -165,8 +165,8 @@ public class InternalFilters extends InternalMultiBucketAggregation<InternalFilt

public InternalFilters() {} // for serialization

public InternalFilters(String name, List<Bucket> buckets, boolean keyed, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
public InternalFilters(String name, List<Bucket> buckets, boolean keyed, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
this.buckets = buckets;
this.keyed = keyed;
}
@ -178,7 +178,7 @@ public class InternalFilters extends InternalMultiBucketAggregation<InternalFilt

@Override
public InternalFilters create(List<Bucket> buckets) {
return new InternalFilters(this.name, buckets, this.keyed, this.reducers(), this.metaData);
return new InternalFilters(this.name, buckets, this.keyed, this.pipelineAggregators(), this.metaData);
}

@Override
@ -222,7 +222,7 @@ public class InternalFilters extends InternalMultiBucketAggregation<InternalFilt
}
}

InternalFilters reduced = new InternalFilters(name, new ArrayList<Bucket>(bucketsList.size()), keyed, reducers(), getMetaData());
InternalFilters reduced = new InternalFilters(name, new ArrayList<Bucket>(bucketsList.size()), keyed, pipelineAggregators(), getMetaData());
for (List<Bucket> sameRangeList : bucketsList) {
reduced.buckets.add((sameRangeList.get(0)).reduce(sameRangeList, reduceContext));
}

@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;

@ -51,9 +51,9 @@ public class GeoHashGridAggregator extends BucketsAggregator {
private final LongHash bucketOrds;

public GeoHashGridAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource,
int requiredSize, int shardSize, AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
int requiredSize, int shardSize, AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
this.requiredSize = requiredSize;
this.shardSize = shardSize;
@ -129,12 +129,12 @@ public class GeoHashGridAggregator extends BucketsAggregator {
bucket.aggregations = bucketAggregations(bucket.bucketOrd);
list[i] = bucket;
}
return new InternalGeoHashGrid(name, requiredSize, Arrays.asList(list), reducers(), metaData());
return new InternalGeoHashGrid(name, requiredSize, Arrays.asList(list), pipelineAggregators(), metaData());
}

@Override
public InternalGeoHashGrid buildEmptyAggregation() {
return new InternalGeoHashGrid(name, requiredSize, Collections.<InternalGeoHashGrid.Bucket> emptyList(), reducers(), metaData());
return new InternalGeoHashGrid(name, requiredSize, Collections.<InternalGeoHashGrid.Bucket> emptyList(), pipelineAggregators(), metaData());
}

@ -34,7 +34,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactory;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.BucketUtils;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -125,11 +125,11 @@ public class GeoHashGridParser implements Aggregator.Parser {
}

@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
final InternalAggregation aggregation = new InternalGeoHashGrid(name, requiredSize,
Collections.<InternalGeoHashGrid.Bucket> emptyList(), reducers, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, reducers, metaData) {
Collections.<InternalGeoHashGrid.Bucket> emptyList(), pipelineAggregators, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, pipelineAggregators, metaData) {
public InternalAggregation buildEmptyAggregation() {
return aggregation;
}
@ -138,13 +138,13 @@ public class GeoHashGridParser implements Aggregator.Parser {

@Override
protected Aggregator doCreateInternal(final ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext,
Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData)
Aggregator parent, boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent);
}
ValuesSource.Numeric cellIdSource = new CellIdSource(valuesSource, precision);
return new GeoHashGridAggregator(name, factories, cellIdSource, requiredSize, shardSize, aggregationContext, parent, reducers,
return new GeoHashGridAggregator(name, factories, cellIdSource, requiredSize, shardSize, aggregationContext, parent, pipelineAggregators,
metaData);

}

@ -32,7 +32,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.ArrayList;
@ -171,9 +171,9 @@ public class InternalGeoHashGrid extends InternalMultiBucketAggregation<Internal
InternalGeoHashGrid() {
} // for serialization

public InternalGeoHashGrid(String name, int requiredSize, Collection<Bucket> buckets, List<Reducer> reducers,
public InternalGeoHashGrid(String name, int requiredSize, Collection<Bucket> buckets, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, reducers, metaData);
super(name, pipelineAggregators, metaData);
this.requiredSize = requiredSize;
this.buckets = buckets;
}
@ -185,7 +185,7 @@ public class InternalGeoHashGrid extends InternalMultiBucketAggregation<Internal

@Override
public InternalGeoHashGrid create(List<Bucket> buckets) {
return new InternalGeoHashGrid(this.name, this.requiredSize, buckets, this.reducers(), this.metaData);
return new InternalGeoHashGrid(this.name, this.requiredSize, buckets, this.pipelineAggregators(), this.metaData);
}

@Override
@ -229,7 +229,7 @@ public class InternalGeoHashGrid extends InternalMultiBucketAggregation<Internal
for (int i = ordered.size() - 1; i >= 0; i--) {
list[i] = ordered.pop();
}
return new InternalGeoHashGrid(getName(), requiredSize, Arrays.asList(list), reducers(), getMetaData());
return new InternalGeoHashGrid(getName(), requiredSize, Arrays.asList(list), pipelineAggregators(), getMetaData());
}

@Override

@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;
@ -39,9 +39,9 @@ import java.util.Map;
*/
public class GlobalAggregator extends SingleBucketAggregator {

public GlobalAggregator(String name, AggregatorFactories subFactories, AggregationContext aggregationContext, List<Reducer> reducers,
public GlobalAggregator(String name, AggregatorFactories subFactories, AggregationContext aggregationContext, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, subFactories, aggregationContext, null, reducers, metaData);
super(name, subFactories, aggregationContext, null, pipelineAggregators, metaData);
}

@Override
@ -59,7 +59,7 @@ public class GlobalAggregator extends SingleBucketAggregator {
@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
assert owningBucketOrdinal == 0 : "global aggregator can only be a top level aggregator";
return new InternalGlobal(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalGlobal(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@ -76,7 +76,7 @@ public class GlobalAggregator extends SingleBucketAggregator {

@Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
if (parent != null) {
throw new AggregationExecutionException("Aggregation [" + parent.name() + "] cannot have a global " +
"sub-aggregation [" + name + "]. Global aggregations can only be defined as top level aggregations");
@ -84,7 +84,7 @@ public class GlobalAggregator extends SingleBucketAggregator {
if (collectsFromSingleBucket == false) {
throw new IllegalStateException();
}
return new GlobalAggregator(name, factories, context, reducers, metaData);
return new GlobalAggregator(name, factories, context, pipelineAggregators, metaData);
}

}

@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;
@ -51,8 +51,8 @@ public class InternalGlobal extends InternalSingleBucketAggregation implements G

InternalGlobal() {} // for serialization

InternalGlobal(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData);
InternalGlobal(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, aggregations, pipelineAggregators, metaData);
}

@Override
@ -62,6 +62,6 @@ public class InternalGlobal extends InternalSingleBucketAggregation implements G

@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalGlobal(name, docCount, subAggregations, reducers(), getMetaData());
return new InternalGlobal(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
}
}

@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -62,9 +62,9 @@ public class HistogramAggregator extends BucketsAggregator {
boolean keyed, long minDocCount, @Nullable ExtendedBounds extendedBounds,
@Nullable ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter,
InternalHistogram.Factory<?> histogramFactory, AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {

super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.rounding = rounding;
this.order = order;
this.keyed = keyed;
@ -130,13 +130,13 @@ public class HistogramAggregator extends BucketsAggregator {

// value source will be null for unmapped fields
InternalHistogram.EmptyBucketInfo emptyBucketInfo = minDocCount == 0 ? new InternalHistogram.EmptyBucketInfo(rounding, buildEmptySubAggregations(), extendedBounds) : null;
return histogramFactory.create(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, reducers(), metaData());
return histogramFactory.create(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, pipelineAggregators(), metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
InternalHistogram.EmptyBucketInfo emptyBucketInfo = minDocCount == 0 ? new InternalHistogram.EmptyBucketInfo(rounding, buildEmptySubAggregations(), extendedBounds) : null;
return histogramFactory.create(name, Collections.emptyList(), order, minDocCount, emptyBucketInfo, formatter, keyed, reducers(),
return histogramFactory.create(name, Collections.emptyList(), order, minDocCount, emptyBucketInfo, formatter, keyed, pipelineAggregators(),
metaData());
}

@ -172,15 +172,15 @@ public class HistogramAggregator extends BucketsAggregator {
}

@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
return new HistogramAggregator(name, factories, rounding, order, keyed, minDocCount, null, null, config.formatter(),
histogramFactory, aggregationContext, parent, reducers, metaData);
histogramFactory, aggregationContext, parent, pipelineAggregators, metaData);
}

@Override
protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent);
}
@ -194,7 +194,7 @@ public class HistogramAggregator extends BucketsAggregator {
roundedBounds = extendedBounds.round(rounding);
}
return new HistogramAggregator(name, factories, rounding, order, keyed, minDocCount, roundedBounds, valuesSource,
config.formatter(), histogramFactory, aggregationContext, parent, reducers, metaData);
config.formatter(), histogramFactory, aggregationContext, parent, pipelineAggregators, metaData);
}

}

@ -37,7 +37,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;

@ -243,14 +243,16 @@ public class InternalHistogram<B extends InternalHistogram.Bucket> extends Inter
}

public InternalHistogram<B> create(String name, List<B> buckets, InternalOrder order, long minDocCount,
EmptyBucketInfo emptyBucketInfo, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
EmptyBucketInfo emptyBucketInfo, @Nullable ValueFormatter formatter, boolean keyed,
List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
return new InternalHistogram<>(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, this, reducers, metaData);
return new InternalHistogram<>(name, buckets, order, minDocCount, emptyBucketInfo, formatter, keyed, this, pipelineAggregators,
metaData);
}

public InternalHistogram<B> create(List<B> buckets, InternalHistogram<B> prototype) {
return new InternalHistogram<>(prototype.name, buckets, prototype.order, prototype.minDocCount, prototype.emptyBucketInfo,
prototype.formatter, prototype.keyed, this, prototype.reducers(), prototype.metaData);
prototype.formatter, prototype.keyed, this, prototype.pipelineAggregators(), prototype.metaData);
}

public B createBucket(InternalAggregations aggregations, B prototype) {
@ -284,8 +286,9 @@ public class InternalHistogram<B extends InternalHistogram.Bucket> extends Inter

InternalHistogram(String name, List<B> buckets, InternalOrder order, long minDocCount,
EmptyBucketInfo emptyBucketInfo,
@Nullable ValueFormatter formatter, boolean keyed, Factory<B> factory, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
@Nullable ValueFormatter formatter, boolean keyed, Factory<B> factory, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
this.buckets = buckets;
this.order = order;
assert (minDocCount == 0) == (emptyBucketInfo != null);
@ -470,7 +473,7 @@ public class InternalHistogram<B extends InternalHistogram.Bucket> extends Inter
CollectionUtil.introSort(reducedBuckets, order.comparator());
}

return getFactory().create(getName(), reducedBuckets, order, minDocCount, emptyBucketInfo, formatter, keyed, reducers(),
return getFactory().create(getName(), reducedBuckets, order, minDocCount, emptyBucketInfo, formatter, keyed, pipelineAggregators(),
getMetaData());
}

@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;
@ -52,8 +52,8 @@ public class InternalMissing extends InternalSingleBucketAggregation implements
InternalMissing() {
}

InternalMissing(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData);
InternalMissing(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, aggregations, pipelineAggregators, metaData);
}

@Override
@ -63,6 +63,6 @@ public class InternalMissing extends InternalSingleBucketAggregation implements

@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalMissing(name, docCount, subAggregations, reducers(), getMetaData());
return new InternalMissing(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
}
}

@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@ -44,9 +44,9 @@ public class MissingAggregator extends SingleBucketAggregator {
private final ValuesSource valuesSource;

public MissingAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
}

@ -72,13 +72,13 @@ public class MissingAggregator extends SingleBucketAggregator {

@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalMissing(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalMissing(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalMissing(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalMissing(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

public static class Factory extends ValuesSourceAggregatorFactory<ValuesSource> {
@ -88,15 +88,15 @@ public class MissingAggregator extends SingleBucketAggregator {
}

@Override
protected MissingAggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected MissingAggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
return new MissingAggregator(name, factories, null, aggregationContext, parent, reducers, metaData);
return new MissingAggregator(name, factories, null, aggregationContext, parent, pipelineAggregators, metaData);
}

@Override
protected MissingAggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
return new MissingAggregator(name, factories, valuesSource, aggregationContext, parent, reducers, metaData);
boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new MissingAggregator(name, factories, valuesSource, aggregationContext, parent, pipelineAggregators, metaData);
}
}

@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;
@ -51,9 +51,9 @@ public class InternalNested extends InternalSingleBucketAggregation implements N
public InternalNested() {
}

public InternalNested(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers,
public InternalNested(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData);
super(name, docCount, aggregations, pipelineAggregators, metaData);
}

@Override
@ -63,6 +63,6 @@ public class InternalNested extends InternalSingleBucketAggregation implements N

@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalNested(name, docCount, subAggregations, reducers(), getMetaData());
return new InternalNested(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
}
}

@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;
@ -51,9 +51,9 @@ public class InternalReverseNested extends InternalSingleBucketAggregation imple
public InternalReverseNested() {
}

public InternalReverseNested(String name, long docCount, InternalAggregations aggregations, List<Reducer> reducers,
public InternalReverseNested(String name, long docCount, InternalAggregations aggregations, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, docCount, aggregations, reducers, metaData);
super(name, docCount, aggregations, pipelineAggregators, metaData);
}

@Override
@ -63,6 +63,6 @@ public class InternalReverseNested extends InternalSingleBucketAggregation imple

@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount, InternalAggregations subAggregations) {
return new InternalReverseNested(name, docCount, subAggregations, reducers(), getMetaData());
return new InternalReverseNested(name, docCount, subAggregations, pipelineAggregators(), getMetaData());
}
}

@ -38,7 +38,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;
@ -56,8 +56,8 @@ public class NestedAggregator extends SingleBucketAggregator {
private DocIdSetIterator childDocs;
private BitSet parentDocs;

public NestedAggregator(String name, AggregatorFactories factories, ObjectMapper objectMapper, AggregationContext aggregationContext, Aggregator parentAggregator, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parentAggregator, reducers, metaData);
public NestedAggregator(String name, AggregatorFactories factories, ObjectMapper objectMapper, AggregationContext aggregationContext, Aggregator parentAggregator, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parentAggregator, pipelineAggregators, metaData);
childFilter = objectMapper.nestedTypeFilter();
}

@ -121,13 +121,13 @@ public class NestedAggregator extends SingleBucketAggregator {

@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

private static Filter findClosestNestedPath(Aggregator parent) {
@ -152,34 +152,34 @@ public class NestedAggregator extends SingleBucketAggregator {

@Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, context, parent);
}
MapperService.SmartNameObjectMapper mapper = context.searchContext().smartNameObjectMapper(path);
if (mapper == null) {
return new Unmapped(name, context, parent, reducers, metaData);
return new Unmapped(name, context, parent, pipelineAggregators, metaData);
}
ObjectMapper objectMapper = mapper.mapper();
if (objectMapper == null) {
return new Unmapped(name, context, parent, reducers, metaData);
return new Unmapped(name, context, parent, pipelineAggregators, metaData);
}
if (!objectMapper.nested().isNested()) {
throw new AggregationExecutionException("[nested] nested path [" + path + "] is not nested");
}
return new NestedAggregator(name, factories, objectMapper, context, parent, reducers, metaData);
return new NestedAggregator(name, factories, objectMapper, context, parent, pipelineAggregators, metaData);
}

private final static class Unmapped extends NonCollectingAggregator {

public Unmapped(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
public Unmapped(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
super(name, context, parent, reducers, metaData);
super(name, context, parent, pipelineAggregators, metaData);
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}
}
}

@ -40,7 +40,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;

import java.io.IOException;

@ -55,9 +55,9 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
private final BitDocIdSetFilter parentFilter;

public ReverseNestedAggregator(String name, AggregatorFactories factories, ObjectMapper objectMapper,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
if (objectMapper == null) {
parentFilter = context.searchContext().bitsetFilterCache().getBitDocIdSetFilter(Queries.newNonNestedFilter());
} else {

@ -111,13 +111,13 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {

@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
return new InternalReverseNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalReverseNested(name, bucketDocCount(owningBucketOrdinal), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalReverseNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalReverseNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

Filter getParentFilter() {

@ -135,7 +135,7 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {

@Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
// Early validation
NestedAggregator closestNestedAggregator = findClosestNestedAggregator(parent);
if (closestNestedAggregator == null) {

@ -147,11 +147,11 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
if (path != null) {
MapperService.SmartNameObjectMapper mapper = context.searchContext().smartNameObjectMapper(path);
if (mapper == null) {
return new Unmapped(name, context, parent, reducers, metaData);
return new Unmapped(name, context, parent, pipelineAggregators, metaData);
}
objectMapper = mapper.mapper();
if (objectMapper == null) {
return new Unmapped(name, context, parent, reducers, metaData);
return new Unmapped(name, context, parent, pipelineAggregators, metaData);
}
if (!objectMapper.nested().isNested()) {
throw new AggregationExecutionException("[reverse_nested] nested path [" + path + "] is not nested");

@ -159,19 +159,19 @@ public class ReverseNestedAggregator extends SingleBucketAggregator {
} else {
objectMapper = null;
}
return new ReverseNestedAggregator(name, factories, objectMapper, context, parent, reducers, metaData);
return new ReverseNestedAggregator(name, factories, objectMapper, context, parent, pipelineAggregators, metaData);
}

private final static class Unmapped extends NonCollectingAggregator {

public Unmapped(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
public Unmapped(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
super(name, context, parent, reducers, metaData);
super(name, context, parent, pipelineAggregators, metaData);
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalReverseNested(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalReverseNested(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}
}
}

@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;

@ -231,9 +231,9 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
return TYPE.name();
}

public R create(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
public R create(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
return (R) new InternalRange<>(name, ranges, formatter, keyed, reducers, metaData);
return (R) new InternalRange<>(name, ranges, formatter, keyed, pipelineAggregators, metaData);
}

public B createBucket(String key, double from, double to, long docCount, InternalAggregations aggregations, boolean keyed,

@ -242,7 +242,7 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
}

public R create(List<B> ranges, R prototype) {
return (R) new InternalRange<>(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.reducers(),
return (R) new InternalRange<>(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.pipelineAggregators(),
prototype.metaData);
}

@ -260,9 +260,9 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan

public InternalRange() {} // for serialization

public InternalRange(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
public InternalRange(String name, List<B> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, reducers, metaData);
super(name, pipelineAggregators, metaData);
this.ranges = ranges;
this.formatter = formatter;
this.keyed = keyed;

@ -311,7 +311,7 @@ public class InternalRange<B extends InternalRange.Bucket, R extends InternalRan
for (int i = 0; i < this.ranges.size(); ++i) {
ranges.add((B) rangeList[i].get(0).reduce(rangeList[i], reduceContext));
}
return getFactory().create(name, ranges, formatter, keyed, reducers(), getMetaData());
return getFactory().create(name, ranges, formatter, keyed, pipelineAggregators(), getMetaData());
}

@Override

@ -33,7 +33,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;

@ -105,10 +105,10 @@ public class RangeAggregator extends BucketsAggregator {
List<Range> ranges,
boolean keyed,
AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers,
Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {

super(name, factories, aggregationContext, parent, reducers, metaData);
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
assert valuesSource != null;
this.valuesSource = valuesSource;
this.formatter = format != null ? format.formatter() : null;

@ -216,7 +216,7 @@ public class RangeAggregator extends BucketsAggregator {
buckets.add(bucket);
}
// value source can be null in the case of unmapped fields
return rangeFactory.create(name, buckets, formatter, keyed, reducers(), metaData());
return rangeFactory.create(name, buckets, formatter, keyed, pipelineAggregators(), metaData());
}

@Override

@ -230,7 +230,7 @@ public class RangeAggregator extends BucketsAggregator {
buckets.add(bucket);
}
// value source can be null in the case of unmapped fields
return rangeFactory.create(name, buckets, formatter, keyed, reducers(), metaData());
return rangeFactory.create(name, buckets, formatter, keyed, pipelineAggregators(), metaData());
}

private static final void sortRanges(final Range[] ranges) {

@ -267,10 +267,10 @@ public class RangeAggregator extends BucketsAggregator {
ValueFormat format,
AggregationContext context,
Aggregator parent,
InternalRange.Factory factory, List<Reducer> reducers,
InternalRange.Factory factory, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {

super(name, context, parent, reducers, metaData);
super(name, context, parent, pipelineAggregators, metaData);
this.ranges = ranges;
ValueParser parser = format != null ? format.parser() : ValueParser.RAW;
for (Range range : this.ranges) {

@ -288,7 +288,7 @@ public class RangeAggregator extends BucketsAggregator {
for (RangeAggregator.Range range : ranges) {
buckets.add(factory.createBucket(range.key, range.from, range.to, 0, subAggs, keyed, formatter));
}
return factory.create(name, buckets, formatter, keyed, reducers(), metaData());
return factory.create(name, buckets, formatter, keyed, pipelineAggregators(), metaData());
}
}

@ -306,15 +306,15 @@ public class RangeAggregator extends BucketsAggregator {
}

@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
return new Unmapped(name, ranges, keyed, config.format(), aggregationContext, parent, rangeFactory, reducers, metaData);
return new Unmapped(name, ranges, keyed, config.format(), aggregationContext, parent, rangeFactory, pipelineAggregators, metaData);
}

@Override
protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
return new RangeAggregator(name, factories, valuesSource, config.format(), rangeFactory, ranges, keyed, aggregationContext, parent, reducers, metaData);
boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
return new RangeAggregator(name, factories, valuesSource, config.format(), rangeFactory, ranges, keyed, aggregationContext, parent, pipelineAggregators, metaData);
}
}

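The range factory hunks just shown have two creation paths, `createUnmapped(...)` and `doCreateInternal(...)`, and both now receive the same `List<PipelineAggregator>` and forward it unchanged. A compact, hypothetical sketch of that factory shape follows; all types and names are simplified stand-ins for illustration, not the Elasticsearch classes.

import java.util.Collections;
import java.util.List;
import java.util.Map;

public class RangeFactorySketch {
    interface PipelineAggregator {}
    interface Aggregator {}

    static class MappedAggregator implements Aggregator {
        MappedAggregator(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {}
    }
    static class UnmappedAggregator implements Aggregator {
        UnmappedAggregator(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {}
    }

    // mirrors the createUnmapped(...) path: no values source, but the list still travels along
    static Aggregator createUnmapped(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
        return new UnmappedAggregator(name, pipelineAggregators, metaData);
    }

    // mirrors the doCreateInternal(...) path: the mapped path forwards the very same list
    static Aggregator doCreateInternal(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
        return new MappedAggregator(name, pipelineAggregators, metaData);
    }

    public static void main(String[] args) {
        List<PipelineAggregator> none = Collections.emptyList();
        System.out.println(createUnmapped("price_ranges", none, Collections.<String, Object>emptyMap()).getClass().getSimpleName());
        System.out.println(doCreateInternal("price_ranges", none, Collections.<String, Object>emptyMap()).getClass().getSimpleName());
    }
}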
@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

@ -122,13 +122,13 @@ public class InternalDateRange extends InternalRange<InternalDateRange.Bucket, I

@Override
public InternalDateRange create(String name, List<InternalDateRange.Bucket> ranges, ValueFormatter formatter, boolean keyed,
List<Reducer> reducers, Map<String, Object> metaData) {
return new InternalDateRange(name, ranges, formatter, keyed, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
return new InternalDateRange(name, ranges, formatter, keyed, pipelineAggregators, metaData);
}

@Override
public InternalDateRange create(List<Bucket> ranges, InternalDateRange prototype) {
return new InternalDateRange(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.reducers(),
return new InternalDateRange(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.pipelineAggregators(),
prototype.metaData);
}

@ -147,8 +147,8 @@ public class InternalDateRange extends InternalRange<InternalDateRange.Bucket, I
InternalDateRange() {} // for serialization

InternalDateRange(String name, List<InternalDateRange.Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed,
List<Reducer> reducers, Map<String, Object> metaData) {
super(name, ranges, formatter, keyed, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, ranges, formatter, keyed, pipelineAggregators, metaData);
}

@Override

@ -35,7 +35,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactory;
import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
import org.elasticsearch.search.aggregations.bucket.range.RangeAggregator;
import org.elasticsearch.search.aggregations.bucket.range.RangeAggregator.Unmapped;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.GeoPointParser;
import org.elasticsearch.search.aggregations.support.ValuesSource;

@ -186,18 +186,19 @@ public class GeoDistanceParser implements Aggregator.Parser {
}

@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
return new Unmapped(name, ranges, keyed, null, aggregationContext, parent, rangeFactory, reducers, metaData);
return new Unmapped(name, ranges, keyed, null, aggregationContext, parent, rangeFactory, pipelineAggregators, metaData);
}

@Override
protected Aggregator doCreateInternal(final ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext,
Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData)
Aggregator parent, boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData)
throws IOException {
DistanceSource distanceSource = new DistanceSource(valuesSource, distanceType, origin, unit);
return new RangeAggregator(name, factories, distanceSource, null, rangeFactory, ranges, keyed, aggregationContext, parent,
reducers, metaData);
pipelineAggregators, metaData);
}

private static class DistanceSource extends ValuesSource.Numeric {

@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;

import java.io.IOException;

@ -110,13 +110,13 @@ public class InternalGeoDistance extends InternalRange<InternalGeoDistance.Bucke

@Override
public InternalGeoDistance create(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed,
List<Reducer> reducers, Map<String, Object> metaData) {
return new InternalGeoDistance(name, ranges, formatter, keyed, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
return new InternalGeoDistance(name, ranges, formatter, keyed, pipelineAggregators, metaData);
}

@Override
public InternalGeoDistance create(List<Bucket> ranges, InternalGeoDistance prototype) {
return new InternalGeoDistance(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.reducers(),
return new InternalGeoDistance(prototype.name, ranges, prototype.formatter, prototype.keyed, prototype.pipelineAggregators(),
prototype.metaData);
}

@ -134,9 +134,9 @@ public class InternalGeoDistance extends InternalRange<InternalGeoDistance.Bucke

InternalGeoDistance() {} // for serialization

public InternalGeoDistance(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<Reducer> reducers,
public InternalGeoDistance(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, ranges, formatter, keyed, reducers, metaData);
super(name, ranges, formatter, keyed, pipelineAggregators, metaData);
}

@Override

@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.bucket.range.InternalRange;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;

import java.io.IOException;

@ -119,13 +119,13 @@ public class InternalIPv4Range extends InternalRange<InternalIPv4Range.Bucket, I

@Override
public InternalIPv4Range create(String name, List<Bucket> ranges, @Nullable ValueFormatter formatter, boolean keyed,
List<Reducer> reducers, Map<String, Object> metaData) {
return new InternalIPv4Range(name, ranges, keyed, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
return new InternalIPv4Range(name, ranges, keyed, pipelineAggregators, metaData);
}

@Override
public InternalIPv4Range create(List<Bucket> ranges, InternalIPv4Range prototype) {
return new InternalIPv4Range(prototype.name, ranges, prototype.keyed, prototype.reducers(), prototype.metaData);
return new InternalIPv4Range(prototype.name, ranges, prototype.keyed, prototype.pipelineAggregators(), prototype.metaData);
}

@Override

@ -142,9 +142,9 @@ public class InternalIPv4Range extends InternalRange<InternalIPv4Range.Bucket, I

public InternalIPv4Range() {} // for serialization

public InternalIPv4Range(String name, List<InternalIPv4Range.Bucket> ranges, boolean keyed, List<Reducer> reducers,
public InternalIPv4Range(String name, List<InternalIPv4Range.Bucket> ranges, boolean keyed, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, ranges, ValueFormatter.IPv4, keyed, reducers, metaData);
super(name, ranges, ValueFormatter.IPv4, keyed, pipelineAggregators, metaData);
}

@Override

@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;

@ -49,10 +49,10 @@ public class DiversifiedBytesHashSamplerAggregator extends SamplerAggregator {
private int maxDocsPerValue;

public DiversifiedBytesHashSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
ValuesSource valuesSource,
int maxDocsPerValue) throws IOException {
super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
this.maxDocsPerValue = maxDocsPerValue;
}

@ -33,7 +33,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;

@ -48,9 +48,9 @@ public class DiversifiedMapSamplerAggregator extends SamplerAggregator {
private BytesRefHash bucketOrds;

public DiversifiedMapSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
ValuesSource valuesSource, int maxDocsPerValue) throws IOException {
super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
this.maxDocsPerValue = maxDocsPerValue;
bucketOrds = new BytesRefHash(shardSize, aggregationContext.bigArrays());

@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;

@ -44,9 +44,9 @@ public class DiversifiedNumericSamplerAggregator extends SamplerAggregator {
private int maxDocsPerValue;

public DiversifiedNumericSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
ValuesSource.Numeric valuesSource, int maxDocsPerValue) throws IOException {
super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
this.maxDocsPerValue = maxDocsPerValue;
}

@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;

@ -45,9 +45,9 @@ public class DiversifiedOrdinalsSamplerAggregator extends SamplerAggregator {
private int maxDocsPerValue;

public DiversifiedOrdinalsSamplerAggregator(String name, int shardSize, AggregatorFactories factories,
AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData,
AggregationContext aggregationContext, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData,
ValuesSource.Bytes.WithOrdinals.FieldData valuesSource, int maxDocsPerValue) throws IOException {
super(name, shardSize, factories, aggregationContext, parent, reducers, metaData);
super(name, shardSize, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
this.maxDocsPerValue = maxDocsPerValue;
}

@ -22,7 +22,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.InternalSingleBucketAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;

@ -51,8 +51,8 @@ public class InternalSampler extends InternalSingleBucketAggregation implements
InternalSampler() {
} // for serialization

InternalSampler(String name, long docCount, InternalAggregations subAggregations, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, docCount, subAggregations, reducers, metaData);
InternalSampler(String name, long docCount, InternalAggregations subAggregations, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, docCount, subAggregations, pipelineAggregators, metaData);
}

@Override

@ -63,6 +63,6 @@ public class InternalSampler extends InternalSingleBucketAggregation implements
@Override
protected InternalSingleBucketAggregation newAggregation(String name, long docCount,
InternalAggregations subAggregations) {
return new InternalSampler(name, docCount, subAggregations, reducers(), metaData);
return new InternalSampler(name, docCount, subAggregations, pipelineAggregators(), metaData);
}
}

@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.BestDocsDeferringCollector;
import org.elasticsearch.search.aggregations.bucket.DeferringBucketCollector;
import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSource.Numeric;

@ -60,9 +60,11 @@ public class SamplerAggregator extends SingleBucketAggregator {

@Override
Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {

return new DiversifiedMapSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData, valuesSource,
return new DiversifiedMapSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData,
valuesSource,
maxDocsPerValue);
}

@ -76,9 +78,11 @@ public class SamplerAggregator extends SingleBucketAggregator {

@Override
Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {

return new DiversifiedBytesHashSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData,
return new DiversifiedBytesHashSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators,
metaData,
valuesSource,
maxDocsPerValue);
}

@ -93,8 +97,9 @@ public class SamplerAggregator extends SingleBucketAggregator {

@Override
Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
return new DiversifiedOrdinalsSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData,
AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
return new DiversifiedOrdinalsSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData,
(ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, maxDocsPerValue);
}

@ -121,7 +126,7 @@ public class SamplerAggregator extends SingleBucketAggregator {
}

abstract Aggregator create(String name, AggregatorFactories factories, int shardSize, int maxDocsPerValue, ValuesSource valuesSource,
AggregationContext context, Aggregator parent, List<Reducer> reducers,
AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException;

abstract boolean needsGlobalOrdinals();

@ -137,8 +142,8 @@ public class SamplerAggregator extends SingleBucketAggregator {
protected BestDocsDeferringCollector bdd;

public SamplerAggregator(String name, int shardSize, AggregatorFactories factories, AggregationContext aggregationContext,
Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, reducers, metaData);
Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, pipelineAggregators, metaData);
this.shardSize = shardSize;
}

@ -163,13 +168,13 @@ public class SamplerAggregator extends SingleBucketAggregator {
@Override
public InternalAggregation buildAggregation(long owningBucketOrdinal) throws IOException {
runDeferredCollections(owningBucketOrdinal);
return new InternalSampler(name, bdd == null ? 0 : bdd.getDocCount(), bucketAggregations(owningBucketOrdinal), reducers(),
return new InternalSampler(name, bdd == null ? 0 : bdd.getDocCount(), bucketAggregations(owningBucketOrdinal), pipelineAggregators(),
metaData());
}

@Override
public InternalAggregation buildEmptyAggregation() {
return new InternalSampler(name, 0, buildEmptySubAggregations(), reducers(), metaData());
return new InternalSampler(name, 0, buildEmptySubAggregations(), pipelineAggregators(), metaData());
}

public static class Factory extends AggregatorFactory {

@ -183,12 +188,12 @@ public class SamplerAggregator extends SingleBucketAggregator {

@Override
public Aggregator createInternal(AggregationContext context, Aggregator parent, boolean collectsFromSingleBucket,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {

if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, context, parent);
}
return new SamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData);
return new SamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData);
}

}

@ -208,7 +213,8 @@ public class SamplerAggregator extends SingleBucketAggregator {

@Override
protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext context, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {

if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, context, parent);

@ -216,7 +222,7 @@ public class SamplerAggregator extends SingleBucketAggregator {

if (valuesSource instanceof ValuesSource.Numeric) {
return new DiversifiedNumericSamplerAggregator(name, shardSize, factories, context, parent, reducers, metaData,
return new DiversifiedNumericSamplerAggregator(name, shardSize, factories, context, parent, pipelineAggregators, metaData,
(Numeric) valuesSource, maxDocsPerValue);
}

@ -234,7 +240,8 @@ public class SamplerAggregator extends SingleBucketAggregator {
if ((execution.needsGlobalOrdinals()) && (!(valuesSource instanceof ValuesSource.Bytes.WithOrdinals))) {
execution = ExecutionMode.MAP;
}
return execution.create(name, factories, shardSize, maxDocsPerValue, valuesSource, context, parent, reducers, metaData);
return execution.create(name, factories, shardSize, maxDocsPerValue, valuesSource, context, parent, pipelineAggregators,
metaData);
}

throw new AggregationExecutionException("Sampler aggregation cannot be applied to field [" + config.fieldContext().field() +

@ -242,11 +249,12 @@ public class SamplerAggregator extends SingleBucketAggregator {
}

@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
final UnmappedSampler aggregation = new UnmappedSampler(name, reducers, metaData);
final UnmappedSampler aggregation = new UnmappedSampler(name, pipelineAggregators, metaData);

return new NonCollectingAggregator(name, aggregationContext, parent, factories, reducers, metaData) {
return new NonCollectingAggregator(name, aggregationContext, parent, factories, pipelineAggregators, metaData) {
@Override
public InternalAggregation buildEmptyAggregation() {
return aggregation;

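The SamplerAggregator hunks above route creation through an execution-mode enum whose constants each override an abstract create(...) method, so the pipeline aggregator list is just one more parameter handed straight through to whichever concrete aggregator the mode picks. Below is a hypothetical, self-contained sketch of that shape; the types are simplified stand-ins and the create(...) method returns a description string in place of a real aggregator.

import java.util.Collections;
import java.util.List;

public class ExecutionModeSketch {
    interface PipelineAggregator {}

    enum Mode {
        MAP {
            @Override
            String create(String name, List<PipelineAggregator> pipelineAggregators) {
                // stands in for constructing a map-based sampler with the forwarded list
                return "map sampler for " + name + " (" + pipelineAggregators.size() + " pipeline aggs)";
            }
        },
        GLOBAL_ORDINALS {
            @Override
            String create(String name, List<PipelineAggregator> pipelineAggregators) {
                // stands in for constructing an ordinals-based sampler with the forwarded list
                return "ordinals sampler for " + name + " (" + pipelineAggregators.size() + " pipeline aggs)";
            }
        };

        abstract String create(String name, List<PipelineAggregator> pipelineAggregators);
    }

    public static void main(String[] args) {
        System.out.println(Mode.MAP.create("diversified", Collections.<PipelineAggregator>emptyList()));
    }
}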
@ -23,7 +23,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.List;

@ -53,8 +53,8 @@ public class UnmappedSampler extends InternalSampler {
UnmappedSampler() {
}

public UnmappedSampler(String name, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, 0, InternalAggregations.EMPTY, reducers, metaData);
public UnmappedSampler(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, 0, InternalAggregations.EMPTY, pipelineAggregators, metaData);
}

@Override

@ -29,7 +29,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.terms.GlobalOrdinalsStringTermsAggregator;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.internal.ContextIndexSearcher;

@ -51,10 +51,11 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
public GlobalOrdinalsSignificantTermsAggregator(String name, AggregatorFactories factories,
ValuesSource.Bytes.WithOrdinals.FieldData valuesSource, BucketCountThresholds bucketCountThresholds,
IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext, Aggregator parent,
SignificantTermsAggregatorFactory termsAggFactory, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
SignificantTermsAggregatorFactory termsAggFactory, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {

super(name, factories, valuesSource, null, bucketCountThresholds, includeExclude, aggregationContext, parent,
SubAggCollectionMode.DEPTH_FIRST, false, reducers, metaData);
SubAggCollectionMode.DEPTH_FIRST, false, pipelineAggregators, metaData);
this.termsAggFactory = termsAggFactory;
}

@ -130,7 +131,7 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
}

return new SignificantStringTerms(subsetSize, supersetSize, name, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), reducers(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), pipelineAggregators(),
metaData());
}

@ -142,7 +143,7 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri
int supersetSize = topReader.numDocs();
return new SignificantStringTerms(0, supersetSize, name, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(),
Collections.<InternalSignificantTerms.Bucket> emptyList(), reducers(), metaData());
Collections.<InternalSignificantTerms.Bucket> emptyList(), pipelineAggregators(), metaData());
}

@Override

@ -154,8 +155,12 @@ public class GlobalOrdinalsSignificantTermsAggregator extends GlobalOrdinalsStri

private final LongHash bucketOrds;

public WithHash(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals.FieldData valuesSource, BucketCountThresholds bucketCountThresholds, IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggFactory, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, valuesSource, bucketCountThresholds, includeExclude, aggregationContext, parent, termsAggFactory, reducers, metaData);
public WithHash(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals.FieldData valuesSource,
BucketCountThresholds bucketCountThresholds, IncludeExclude.OrdinalsFilter includeExclude,
AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggFactory,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, valuesSource, bucketCountThresholds, includeExclude, aggregationContext, parent, termsAggFactory,
pipelineAggregators, metaData);
bucketOrds = new LongHash(1, aggregationContext.bigArrays());
}

@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.util.ArrayList;
import java.util.Arrays;

@ -125,9 +125,9 @@ public abstract class InternalSignificantTerms<A extends InternalSignificantTerm
}

protected InternalSignificantTerms(long subsetSize, long supersetSize, String name, int requiredSize, long minDocCount,
SignificanceHeuristic significanceHeuristic, List<? extends Bucket> buckets, List<Reducer> reducers,
SignificanceHeuristic significanceHeuristic, List<? extends Bucket> buckets, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, reducers, metaData);
super(name, pipelineAggregators, metaData);
this.requiredSize = requiredSize;
this.minDocCount = minDocCount;
this.buckets = buckets;

@ -18,7 +18,6 @@
*/
package org.elasticsearch.search.aggregations.bucket.significant;

import org.elasticsearch.Version;
import org.elasticsearch.common.Nullable;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;

@ -29,7 +28,7 @@ import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;

@ -169,9 +168,9 @@ public class SignificantLongTerms extends InternalSignificantTerms<SignificantLo

public SignificantLongTerms(long subsetSize, long supersetSize, String name, @Nullable ValueFormatter formatter, int requiredSize,
long minDocCount, SignificanceHeuristic significanceHeuristic, List<? extends InternalSignificantTerms.Bucket> buckets,
List<Reducer> reducers, Map<String, Object> metaData) {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {

super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, reducers, metaData);
super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, pipelineAggregators, metaData);
this.formatter = formatter;
}

@ -183,7 +182,7 @@ public class SignificantLongTerms extends InternalSignificantTerms<SignificantLo
@Override
public SignificantLongTerms create(List<SignificantLongTerms.Bucket> buckets) {
return new SignificantLongTerms(this.subsetSize, this.supersetSize, this.name, this.formatter, this.requiredSize, this.minDocCount,
this.significanceHeuristic, buckets, this.reducers(), this.metaData);
this.significanceHeuristic, buckets, this.pipelineAggregators(), this.metaData);
}

@Override

@ -197,7 +196,7 @@ public class SignificantLongTerms extends InternalSignificantTerms<SignificantLo
List<org.elasticsearch.search.aggregations.bucket.significant.InternalSignificantTerms.Bucket> buckets,
InternalSignificantTerms prototype) {
return new SignificantLongTerms(subsetSize, supersetSize, prototype.getName(), ((SignificantLongTerms) prototype).formatter,
prototype.requiredSize, prototype.minDocCount, prototype.significanceHeuristic, buckets, prototype.reducers(),
prototype.requiredSize, prototype.minDocCount, prototype.significanceHeuristic, buckets, prototype.pipelineAggregators(),
prototype.getMetaData());
}

@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.terms.LongTermsAggregator;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.format.ValueFormat;

@ -48,10 +48,10 @@ public class SignificantLongTermsAggregator extends LongTermsAggregator {
public SignificantLongTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, @Nullable ValueFormat format,
BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext,
Aggregator parent, SignificantTermsAggregatorFactory termsAggFactory, IncludeExclude.LongFilter includeExclude,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {

super(name, factories, valuesSource, format, null, bucketCountThresholds, aggregationContext, parent,
SubAggCollectionMode.DEPTH_FIRST, false, includeExclude, reducers, metaData);
SubAggCollectionMode.DEPTH_FIRST, false, includeExclude, pipelineAggregators, metaData);
this.termsAggFactory = termsAggFactory;
}

@ -109,7 +109,7 @@ public class SignificantLongTermsAggregator extends LongTermsAggregator {
list[i] = bucket;
}
return new SignificantLongTerms(subsetSize, supersetSize, name, formatter, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), reducers(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), pipelineAggregators(),
metaData());
}

@ -121,7 +121,7 @@ public class SignificantLongTermsAggregator extends LongTermsAggregator {
int supersetSize = topReader.numDocs();
return new SignificantLongTerms(0, supersetSize, name, formatter, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(),
Collections.<InternalSignificantTerms.Bucket> emptyList(), reducers(), metaData());
Collections.<InternalSignificantTerms.Bucket> emptyList(), pipelineAggregators(), metaData());
}

@Override

@ -19,7 +19,6 @@
package org.elasticsearch.search.aggregations.bucket.significant;

import org.apache.lucene.util.BytesRef;
import org.elasticsearch.Version;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.XContentBuilder;

@ -30,7 +29,7 @@ import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;

import java.io.IOException;
import java.util.ArrayList;

@ -161,9 +160,10 @@ public class SignificantStringTerms extends InternalSignificantTerms<Significant
SignificantStringTerms() {} // for serialization

public SignificantStringTerms(long subsetSize, long supersetSize, String name, int requiredSize, long minDocCount,
SignificanceHeuristic significanceHeuristic, List<? extends InternalSignificantTerms.Bucket> buckets, List<Reducer> reducers,
SignificanceHeuristic significanceHeuristic, List<? extends InternalSignificantTerms.Bucket> buckets,
List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, reducers, metaData);
super(subsetSize, supersetSize, name, requiredSize, minDocCount, significanceHeuristic, buckets, pipelineAggregators, metaData);
}

@Override

@ -174,7 +174,7 @@ public class SignificantStringTerms extends InternalSignificantTerms<Significant
@Override
public SignificantStringTerms create(List<SignificantStringTerms.Bucket> buckets) {
return new SignificantStringTerms(this.subsetSize, this.supersetSize, this.name, this.requiredSize, this.minDocCount,
this.significanceHeuristic, buckets, this.reducers(), this.metaData);
this.significanceHeuristic, buckets, this.pipelineAggregators(), this.metaData);
}

@Override

@ -187,7 +187,7 @@ public class SignificantStringTerms extends InternalSignificantTerms<Significant
protected SignificantStringTerms create(long subsetSize, long supersetSize, List<InternalSignificantTerms.Bucket> buckets,
InternalSignificantTerms prototype) {
return new SignificantStringTerms(subsetSize, supersetSize, prototype.getName(), prototype.requiredSize, prototype.minDocCount,
prototype.significanceHeuristic, buckets, prototype.reducers(), prototype.getMetaData());
prototype.significanceHeuristic, buckets, prototype.pipelineAggregators(), prototype.getMetaData());
}

@Override

@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.terms.StringTermsAggregator;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.internal.ContextIndexSearcher;

@ -50,11 +50,12 @@ public class SignificantStringTermsAggregator extends StringTermsAggregator {
public SignificantStringTermsAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource,
BucketCountThresholds bucketCountThresholds,
IncludeExclude.StringFilter includeExclude, AggregationContext aggregationContext, Aggregator parent,
SignificantTermsAggregatorFactory termsAggFactory, List<Reducer> reducers, Map<String, Object> metaData)
SignificantTermsAggregatorFactory termsAggFactory, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData)
throws IOException {

super(name, factories, valuesSource, null, bucketCountThresholds, includeExclude, aggregationContext, parent,
SubAggCollectionMode.DEPTH_FIRST, false, reducers, metaData);
SubAggCollectionMode.DEPTH_FIRST, false, pipelineAggregators, metaData);
this.termsAggFactory = termsAggFactory;
}

@ -115,7 +116,7 @@ public class SignificantStringTermsAggregator extends StringTermsAggregator {
}

return new SignificantStringTerms(subsetSize, supersetSize, name, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), reducers(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(), Arrays.asList(list), pipelineAggregators(),
metaData());
}

@ -127,7 +128,7 @@ public class SignificantStringTermsAggregator extends StringTermsAggregator {
int supersetSize = topReader.numDocs();
return new SignificantStringTerms(0, supersetSize, name, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), termsAggFactory.getSignificanceHeuristic(),
Collections.<InternalSignificantTerms.Bucket> emptyList(), reducers(), metaData());
Collections.<InternalSignificantTerms.Bucket> emptyList(), pipelineAggregators(), metaData());
}

@Override

@@ -37,7 +37,7 @@ import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristic;
import org.elasticsearch.search.aggregations.bucket.terms.TermsAggregator;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -65,10 +65,10 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
final IncludeExclude.StringFilter filter = includeExclude == null ? null : includeExclude.convertToStringFilter();
return new SignificantStringTermsAggregator(name, factories, valuesSource, bucketCountThresholds, filter,
aggregationContext, parent, termsAggregatorFactory, reducers, metaData);
aggregationContext, parent, termsAggregatorFactory, pipelineAggregators, metaData);
}
},
@@ -78,11 +78,13 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
ValuesSource.Bytes.WithOrdinals valueSourceWithOrdinals = (ValuesSource.Bytes.WithOrdinals) valuesSource;
IndexSearcher indexSearcher = aggregationContext.searchContext().searcher();
final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
return new GlobalOrdinalsSignificantTermsAggregator(name, factories, (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, bucketCountThresholds, filter, aggregationContext, parent, termsAggregatorFactory, reducers, metaData);
return new GlobalOrdinalsSignificantTermsAggregator(name, factories,
(ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, bucketCountThresholds, filter, aggregationContext,
parent, termsAggregatorFactory, pipelineAggregators, metaData);
}
},
@@ -92,11 +94,12 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
return new GlobalOrdinalsSignificantTermsAggregator.WithHash(name, factories,
(ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, bucketCountThresholds, filter,
aggregationContext, parent, termsAggregatorFactory, reducers, metaData);
aggregationContext,
parent, termsAggregatorFactory, pipelineAggregators, metaData);
}
};
@@ -118,7 +121,7 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
abstract Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SignificantTermsAggregatorFactory termsAggregatorFactory,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException;
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException;
@Override
public String toString() {
@@ -155,11 +158,12 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
}
@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
final InternalAggregation aggregation = new UnmappedSignificantTerms(name, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getMinDocCount(), reducers, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, reducers, metaData) {
bucketCountThresholds.getMinDocCount(), pipelineAggregators, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, pipelineAggregators, metaData) {
@Override
public InternalAggregation buildEmptyAggregation() {
return aggregation;
@@ -169,7 +173,8 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
@Override
protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent);
}
@@ -193,7 +198,7 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
}
assert execution != null;
return execution.create(name, factories, valuesSource, bucketCountThresholds, includeExclude, aggregationContext, parent, this,
reducers, metaData);
pipelineAggregators, metaData);
}
@@ -212,7 +217,7 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFac
longFilter = includeExclude.convertToLongFilter();
}
return new SignificantLongTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(),
bucketCountThresholds, aggregationContext, parent, this, longFilter, reducers, metaData);
bucketCountThresholds, aggregationContext, parent, this, longFilter, pipelineAggregators, metaData);
}
throw new AggregationExecutionException("sigfnificant_terms aggregation cannot be applied to field [" + config.fieldContext().field() +
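
The hunks above all apply the same mechanical change: an aggregator or factory that used to accept `List<Reducer> reducers` now accepts `List<PipelineAggregator> pipelineAggregators` and threads it straight through to its superclass. The following is only an illustrative sketch of that pattern with made-up stub types (`AggregatorBase`, `SketchTermsAggregator`), not the real Elasticsearch classes or signatures beyond what the diff itself shows.

[source,java]
----
// Hypothetical stand-ins that mirror the renamed constructor-threading pattern.
import java.util.Collections;
import java.util.List;
import java.util.Map;

interface PipelineAggregator {}            // stand-in for the renamed interface

abstract class AggregatorBase {
    private final List<PipelineAggregator> pipelineAggregators;

    protected AggregatorBase(String name, List<PipelineAggregator> pipelineAggregators,
            Map<String, Object> metaData) {
        this.pipelineAggregators = pipelineAggregators;
    }

    // Accessor renamed from reducers() to pipelineAggregators(), as in the diff.
    protected List<PipelineAggregator> pipelineAggregators() {
        return pipelineAggregators;
    }
}

class SketchTermsAggregator extends AggregatorBase {
    // The parameter is renamed and forwarded to super(...); that is the whole change
    // applied to each aggregator constructor in this PR.
    SketchTermsAggregator(String name, List<PipelineAggregator> pipelineAggregators,
            Map<String, Object> metaData) {
        super(name, pipelineAggregators, metaData);
    }
}

class Demo {
    public static void main(String[] args) {
        SketchTermsAggregator agg = new SketchTermsAggregator("terms",
                Collections.<PipelineAggregator>emptyList(), Collections.<String, Object>emptyMap());
        System.out.println(agg.pipelineAggregators().isEmpty()); // prints: true
    }
}
----
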
@@ -25,7 +25,7 @@ import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.significant.heuristics.JLHScore;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException;
import java.util.Collections;
@@ -57,10 +57,10 @@ public class UnmappedSignificantTerms extends InternalSignificantTerms<UnmappedS
UnmappedSignificantTerms() {} // for serialization
public UnmappedSignificantTerms(String name, int requiredSize, long minDocCount, List<Reducer> reducers, Map<String, Object> metaData) {
public UnmappedSignificantTerms(String name, int requiredSize, long minDocCount, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
//We pass zero for index/subset sizes because for the purpose of significant term analysis
// we assume an unmapped index's size is irrelevant to the proceedings.
super(0, 0, name, requiredSize, minDocCount, JLHScore.INSTANCE, BUCKETS, reducers, metaData);
super(0, 0, name, requiredSize, minDocCount, JLHScore.INSTANCE, BUCKETS, pipelineAggregators, metaData);
}
@Override
@@ -70,7 +70,7 @@ public class UnmappedSignificantTerms extends InternalSignificantTerms<UnmappedS
@Override
public UnmappedSignificantTerms create(List<InternalSignificantTerms.Bucket> buckets) {
return new UnmappedSignificantTerms(this.name, this.requiredSize, this.minDocCount, this.reducers(), this.metaData);
return new UnmappedSignificantTerms(this.name, this.requiredSize, this.minDocCount, this.pipelineAggregators(), this.metaData);
}
@Override
@@ -22,7 +22,7 @@ package org.elasticsearch.search.aggregations.bucket.terms;
import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException;
@@ -36,8 +36,8 @@ abstract class AbstractStringTermsAggregator extends TermsAggregator {
public AbstractStringTermsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent,
Terms.Order order, BucketCountThresholds bucketCountThresholds, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, bucketCountThresholds, order, subAggCollectMode, reducers, metaData);
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, bucketCountThresholds, order, subAggCollectMode, pipelineAggregators, metaData);
this.showTermDocCountError = showTermDocCountError;
}
@@ -45,7 +45,7 @@ abstract class AbstractStringTermsAggregator extends TermsAggregator {
public InternalAggregation buildEmptyAggregation() {
return new StringTerms(name, order, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
bucketCountThresholds.getMinDocCount(), Collections.<InternalTerms.Bucket> emptyList(), showTermDocCountError, 0, 0,
reducers(), metaData());
pipelineAggregators(), metaData());
}
}
@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -162,8 +162,8 @@ public class DoubleTerms extends InternalTerms<DoubleTerms, DoubleTerms.Bucket>
public DoubleTerms(String name, Terms.Order order, @Nullable ValueFormatter formatter, int requiredSize, int shardSize,
long minDocCount, List<? extends InternalTerms.Bucket> buckets, boolean showTermDocCountError, long docCountError,
long otherDocCount, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, reducers,
long otherDocCount, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, pipelineAggregators,
metaData);
this.formatter = formatter;
}
@@ -176,7 +176,7 @@ public class DoubleTerms extends InternalTerms<DoubleTerms, DoubleTerms.Bucket>
@Override
public DoubleTerms create(List<Bucket> buckets) {
return new DoubleTerms(this.name, this.order, this.formatter, this.requiredSize, this.shardSize, this.minDocCount, buckets,
this.showTermDocCountError, this.docCountError, this.otherDocCount, this.reducers(), this.metaData);
this.showTermDocCountError, this.docCountError, this.otherDocCount, this.pipelineAggregators(), this.metaData);
}
@Override
@@ -189,7 +189,7 @@ public class DoubleTerms extends InternalTerms<DoubleTerms, DoubleTerms.Bucket>
protected DoubleTerms create(String name, List<org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket> buckets,
long docCountError, long otherDocCount, InternalTerms prototype) {
return new DoubleTerms(name, prototype.order, ((DoubleTerms) prototype).formatter, prototype.requiredSize, prototype.shardSize,
prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.reducers(),
prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.pipelineAggregators(),
prototype.getMetaData());
}
@@ -26,7 +26,7 @@ import org.elasticsearch.index.fielddata.FieldData;
import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSource.Numeric;
@@ -45,9 +45,9 @@ public class DoubleTermsAggregator extends LongTermsAggregator {
public DoubleTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, @Nullable ValueFormat format,
Terms.Order order, BucketCountThresholds bucketCountThresholds,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError,
IncludeExclude.LongFilter longFilter, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
IncludeExclude.LongFilter longFilter, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, valuesSource, format, order, bucketCountThresholds, aggregationContext, parent, collectionMode,
showTermDocCountError, longFilter, reducers, metaData);
showTermDocCountError, longFilter, pipelineAggregators, metaData);
}
@Override
@@ -79,7 +79,7 @@ public class DoubleTermsAggregator extends LongTermsAggregator {
buckets[i] = convertToDouble(buckets[i]);
}
return new DoubleTerms(terms.getName(), terms.order, terms.formatter, terms.requiredSize, terms.shardSize, terms.minDocCount,
Arrays.asList(buckets), terms.showTermDocCountError, terms.docCountError, terms.otherDocCount, terms.reducers(),
Arrays.asList(buckets), terms.showTermDocCountError, terms.docCountError, terms.otherDocCount, terms.pipelineAggregators(),
terms.getMetaData());
}
@@ -44,7 +44,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket;
import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -73,8 +73,11 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
public GlobalOrdinalsStringTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals valuesSource,
Terms.Order order, BucketCountThresholds bucketCountThresholds,
IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError, reducers,
IncludeExclude.OrdinalsFilter includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError,
pipelineAggregators,
metaData);
this.valuesSource = valuesSource;
this.includeExclude = includeExclude;
@@ -200,7 +203,7 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
}
return new StringTerms(name, order, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, reducers(),
bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, pipelineAggregators(),
metaData());
}
@@ -266,8 +269,11 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
public WithHash(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals.FieldData valuesSource,
Terms.Order order, BucketCountThresholds bucketCountThresholds, IncludeExclude.OrdinalsFilter includeExclude, AggregationContext aggregationContext,
Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, collectionMode, showTermDocCountError, reducers, metaData);
Aggregator parent, SubAggCollectionMode collectionMode,
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
super(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, collectionMode,
showTermDocCountError, pipelineAggregators, metaData);
bucketOrds = new LongHash(1, aggregationContext.bigArrays());
}
@@ -335,8 +341,12 @@ public class GlobalOrdinalsStringTermsAggregator extends AbstractStringTermsAggr
private RandomAccessOrds segmentOrds;
public LowCardinality(String name, AggregatorFactories factories, ValuesSource.Bytes.WithOrdinals valuesSource,
Terms.Order order, BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, valuesSource, order, bucketCountThresholds, null, aggregationContext, parent, collectionMode, showTermDocCountError, reducers, metaData);
Terms.Order order,
BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext, Aggregator parent,
SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, factories, valuesSource, order, bucketCountThresholds, null, aggregationContext, parent, collectionMode,
showTermDocCountError, pipelineAggregators, metaData);
assert factories == null || factories.count() == 0;
this.segmentDocCounts = context.bigArrays().newIntArray(1, true);
}
@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import java.util.ArrayList;
@@ -124,9 +124,9 @@ public abstract class InternalTerms<A extends InternalTerms, B extends InternalT
protected InternalTerms() {} // for serialization
protected InternalTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount,
List<? extends Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount, List<Reducer> reducers,
List<? extends Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, reducers, metaData);
super(name, pipelineAggregators, metaData);
this.order = order;
this.requiredSize = requiredSize;
this.shardSize = shardSize;
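
On the result side, the `InternalTerms` hunk above shows the same shape: the internal aggregation stores the renamed list via its super constructor and every copy-style `create(buckets)` forwards `pipelineAggregators()` unchanged. A minimal sketch of that pattern, again with hypothetical stub types rather than the real Elasticsearch classes:

[source,java]
----
// Stub types only; the real base class also handles streams, reduction, etc.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

interface PipelineAggregator {}

abstract class InternalAggregationSketch {
    protected final String name;
    private final List<PipelineAggregator> pipelineAggregators;
    protected final Map<String, Object> metaData;

    protected InternalAggregationSketch(String name, List<PipelineAggregator> pipelineAggregators,
            Map<String, Object> metaData) {
        this.name = name;
        this.pipelineAggregators = pipelineAggregators;
        this.metaData = metaData;
    }

    protected List<PipelineAggregator> pipelineAggregators() {
        return pipelineAggregators;
    }
}

class TermsSketch extends InternalAggregationSketch {
    private final List<String> buckets;

    TermsSketch(String name, List<String> buckets, List<PipelineAggregator> pipelineAggregators,
            Map<String, Object> metaData) {
        super(name, pipelineAggregators, metaData);
        this.buckets = buckets;
    }

    // Copy-style factory: new bucket list, same pipeline aggregators and metadata,
    // mirroring DoubleTerms/LongTerms/StringTerms.create(buckets) in the hunks here.
    TermsSketch create(List<String> newBuckets) {
        return new TermsSketch(this.name, new ArrayList<>(newBuckets), this.pipelineAggregators(), this.metaData);
    }
}
----
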
@@ -26,7 +26,7 @@ import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
@@ -158,8 +158,8 @@ public class LongTerms extends InternalTerms<LongTerms, LongTerms.Bucket> {
public LongTerms(String name, Terms.Order order, @Nullable ValueFormatter formatter, int requiredSize, int shardSize, long minDocCount,
List<? extends InternalTerms.Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount,
List<Reducer> reducers, Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, reducers,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, pipelineAggregators,
metaData);
this.formatter = formatter;
}
@@ -172,7 +172,7 @@ public class LongTerms extends InternalTerms<LongTerms, LongTerms.Bucket> {
@Override
public LongTerms create(List<Bucket> buckets) {
return new LongTerms(this.name, this.order, this.formatter, this.requiredSize, this.shardSize, this.minDocCount, buckets,
this.showTermDocCountError, this.docCountError, this.otherDocCount, this.reducers(), this.metaData);
this.showTermDocCountError, this.docCountError, this.otherDocCount, this.pipelineAggregators(), this.metaData);
}
@Override
@@ -185,7 +185,7 @@ public class LongTerms extends InternalTerms<LongTerms, LongTerms.Bucket> {
protected LongTerms create(String name, List<org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket> buckets,
long docCountError, long otherDocCount, InternalTerms prototype) {
return new LongTerms(name, prototype.order, ((LongTerms) prototype).formatter, prototype.requiredSize, prototype.shardSize,
prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.reducers(),
prototype.minDocCount, buckets, prototype.showTermDocCountError, docCountError, otherDocCount, prototype.pipelineAggregators(),
prototype.getMetaData());
}
@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude.LongFilter;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.format.ValueFormat;
@@ -57,8 +57,8 @@ public class LongTermsAggregator extends TermsAggregator {
public LongTermsAggregator(String name, AggregatorFactories factories, ValuesSource.Numeric valuesSource, @Nullable ValueFormat format,
Terms.Order order, BucketCountThresholds bucketCountThresholds, AggregationContext aggregationContext, Aggregator parent,
SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, IncludeExclude.LongFilter longFilter,
List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, bucketCountThresholds, order, subAggCollectMode, reducers, metaData);
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, bucketCountThresholds, order, subAggCollectMode, pipelineAggregators, metaData);
this.valuesSource = valuesSource;
this.showTermDocCountError = showTermDocCountError;
this.formatter = format != null ? format.formatter() : null;
@@ -162,7 +162,7 @@ public class LongTermsAggregator extends TermsAggregator {
}
return new LongTerms(name, order, formatter, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, reducers(),
bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, pipelineAggregators(),
metaData());
}
@@ -170,7 +170,7 @@ public class LongTermsAggregator extends TermsAggregator {
public InternalAggregation buildEmptyAggregation() {
return new LongTerms(name, order, formatter, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
bucketCountThresholds.getMinDocCount(), Collections.<InternalTerms.Bucket> emptyList(), showTermDocCountError, 0, 0,
reducers(), metaData());
pipelineAggregators(), metaData());
}
@Override
@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.bucket.BucketStreamContext;
import org.elasticsearch.search.aggregations.bucket.BucketStreams;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException;
import java.util.ArrayList;
@@ -153,8 +153,8 @@ public class StringTerms extends InternalTerms<StringTerms, StringTerms.Bucket>
public StringTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount,
List<? extends InternalTerms.Bucket> buckets, boolean showTermDocCountError, long docCountError, long otherDocCount,
List<Reducer> reducers, Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, reducers,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, buckets, showTermDocCountError, docCountError, otherDocCount, pipelineAggregators,
metaData);
}
@@ -166,7 +166,7 @@ public class StringTerms extends InternalTerms<StringTerms, StringTerms.Bucket>
@Override
public StringTerms create(List<Bucket> buckets) {
return new StringTerms(this.name, this.order, this.requiredSize, this.shardSize, this.minDocCount, buckets,
this.showTermDocCountError, this.docCountError, this.otherDocCount, this.reducers(), this.metaData);
this.showTermDocCountError, this.docCountError, this.otherDocCount, this.pipelineAggregators(), this.metaData);
}
@Override
@@ -178,7 +178,7 @@ public class StringTerms extends InternalTerms<StringTerms, StringTerms.Bucket>
protected StringTerms create(String name, List<org.elasticsearch.search.aggregations.bucket.terms.InternalTerms.Bucket> buckets,
long docCountError, long otherDocCount, InternalTerms prototype) {
return new StringTerms(name, prototype.order, prototype.requiredSize, prototype.shardSize, prototype.minDocCount, buckets,
prototype.showTermDocCountError, docCountError, otherDocCount, prototype.reducers(), prototype.getMetaData());
prototype.showTermDocCountError, docCountError, otherDocCount, prototype.pipelineAggregators(), prototype.getMetaData());
}
@Override
@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.bucket.terms.support.BucketPriorityQueue;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
@@ -52,10 +52,10 @@ public class StringTermsAggregator extends AbstractStringTermsAggregator {
public StringTermsAggregator(String name, AggregatorFactories factories, ValuesSource valuesSource,
Terms.Order order, BucketCountThresholds bucketCountThresholds,
IncludeExclude.StringFilter includeExclude, AggregationContext aggregationContext,
Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<Reducer> reducers,
Aggregator parent, SubAggCollectionMode collectionMode, boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError, reducers,
super(name, factories, aggregationContext, parent, order, bucketCountThresholds, collectionMode, showTermDocCountError, pipelineAggregators,
metaData);
this.valuesSource = valuesSource;
this.includeExclude = includeExclude;
@@ -164,7 +164,7 @@ public class StringTermsAggregator extends AbstractStringTermsAggregator {
}
return new StringTerms(name, order, bucketCountThresholds.getRequiredSize(), bucketCountThresholds.getShardSize(),
bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, reducers(),
bucketCountThresholds.getMinDocCount(), Arrays.asList(list), showTermDocCountError, 0, otherDocCount, pipelineAggregators(),
metaData());
}
@@ -28,7 +28,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.bucket.BucketsAggregator;
import org.elasticsearch.search.aggregations.bucket.terms.InternalOrder.Aggregation;
import org.elasticsearch.search.aggregations.bucket.terms.InternalOrder.CompoundOrder;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.AggregationPath;
@@ -137,8 +137,8 @@ public abstract class TermsAggregator extends BucketsAggregator {
protected final Set<Aggregator> aggsUsedForSorting = new HashSet<>();
protected final SubAggCollectionMode collectMode;
public TermsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent, BucketCountThresholds bucketCountThresholds, Terms.Order order, SubAggCollectionMode collectMode, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, reducers, metaData);
public TermsAggregator(String name, AggregatorFactories factories, AggregationContext context, Aggregator parent, BucketCountThresholds bucketCountThresholds, Terms.Order order, SubAggCollectionMode collectMode, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, factories, context, parent, pipelineAggregators, metaData);
this.bucketCountThresholds = bucketCountThresholds;
this.order = InternalOrder.validate(order, this);
this.collectMode = collectMode;
@@ -27,7 +27,7 @@ import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.NonCollectingAggregator;
import org.elasticsearch.search.aggregations.bucket.terms.support.IncludeExclude;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -40,20 +40,21 @@ import java.util.Map;
/**
*
*/
public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<ValuesSource> {
public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<ValuesSource> {
public enum ExecutionMode {
MAP(new ParseField("map")) {
@Override
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
final IncludeExclude.StringFilter filter = includeExclude == null ? null : includeExclude.convertToStringFilter();
return new StringTermsAggregator(name, factories, valuesSource, order, bucketCountThresholds, filter,
aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
return new StringTermsAggregator(name, factories, valuesSource, order, bucketCountThresholds, filter, aggregationContext,
parent, subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
}
@Override
@@ -65,11 +66,15 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
GLOBAL_ORDINALS(new ParseField("global_ordinals")) {
@Override
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
return new GlobalOrdinalsStringTermsAggregator(name, factories, (ValuesSource.Bytes.WithOrdinals) valuesSource, order, bucketCountThresholds, filter, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
return new GlobalOrdinalsStringTermsAggregator(name, factories, (ValuesSource.Bytes.WithOrdinals) valuesSource, order,
bucketCountThresholds, filter, aggregationContext, parent, subAggCollectMode, showTermDocCountError,
pipelineAggregators, metaData);
}
@Override
@@ -81,11 +86,15 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
GLOBAL_ORDINALS_HASH(new ParseField("global_ordinals_hash")) {
@Override
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
final IncludeExclude.OrdinalsFilter filter = includeExclude == null ? null : includeExclude.convertToOrdinalsFilter();
return new GlobalOrdinalsStringTermsAggregator.WithHash(name, factories, (ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, order, bucketCountThresholds, filter, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
return new GlobalOrdinalsStringTermsAggregator.WithHash(name, factories,
(ValuesSource.Bytes.WithOrdinals.FieldData) valuesSource, order, bucketCountThresholds, filter, aggregationContext,
parent, subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
}
@Override
@@ -96,14 +105,18 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
GLOBAL_ORDINALS_LOW_CARDINALITY(new ParseField("global_ordinals_low_cardinality")) {
@Override
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (includeExclude != null || factories.count() > 0) {
return GLOBAL_ORDINALS.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
return GLOBAL_ORDINALS.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude,
aggregationContext, parent, subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
}
return new GlobalOrdinalsStringTermsAggregator.LowCardinality(name, factories, (ValuesSource.Bytes.WithOrdinals) valuesSource, order, bucketCountThresholds, aggregationContext, parent, subAggCollectMode, showTermDocCountError, reducers, metaData);
return new GlobalOrdinalsStringTermsAggregator.LowCardinality(name, factories,
(ValuesSource.Bytes.WithOrdinals) valuesSource, order, bucketCountThresholds, aggregationContext, parent,
subAggCollectMode, showTermDocCountError, pipelineAggregators, metaData);
}
@Override
@@ -127,10 +140,11 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
this.parseField = parseField;
}
abstract Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource,
Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds,
IncludeExclude includeExclude, AggregationContext aggregationContext, Aggregator parent,
SubAggCollectionMode subAggCollectMode, boolean showTermDocCountError, List<Reducer> reducers, Map<String, Object> metaData) throws IOException;
abstract Aggregator create(String name, AggregatorFactories factories, ValuesSource valuesSource, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude,
AggregationContext aggregationContext, Aggregator parent, SubAggCollectionMode subAggCollectMode,
boolean showTermDocCountError, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException;
abstract boolean needsGlobalOrdinals();
@@ -147,7 +161,9 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
private final TermsAggregator.BucketCountThresholds bucketCountThresholds;
private final boolean showTermDocCountError;
public TermsAggregatorFactory(String name, ValuesSourceConfig config, Terms.Order order, TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude, String executionHint, SubAggCollectionMode executionMode, boolean showTermDocCountError) {
public TermsAggregatorFactory(String name, ValuesSourceConfig config, Terms.Order order,
TermsAggregator.BucketCountThresholds bucketCountThresholds, IncludeExclude includeExclude, String executionHint,
SubAggCollectionMode executionMode, boolean showTermDocCountError) {
super(name, StringTerms.TYPE.name(), config);
this.order = order;
this.includeExclude = includeExclude;
@@ -158,15 +174,16 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
}
@Override
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
Map<String, Object> metaData) throws IOException {
protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
final InternalAggregation aggregation = new UnmappedTerms(name, order, bucketCountThresholds.getRequiredSize(),
bucketCountThresholds.getShardSize(), bucketCountThresholds.getMinDocCount(), reducers, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, factories, reducers, metaData) {
bucketCountThresholds.getShardSize(), bucketCountThresholds.getMinDocCount(), pipelineAggregators, metaData);
return new NonCollectingAggregator(name, aggregationContext, parent, factories, pipelineAggregators, metaData) {
{
// even in the case of an unmapped aggregator, validate the order
InternalOrder.validate(order, this);
}
@Override
public InternalAggregation buildEmptyAggregation() {
return aggregation;
@@ -176,7 +193,8 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
@Override
protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext aggregationContext, Aggregator parent,
boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, aggregationContext, parent);
}
@@ -226,12 +244,16 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
}
assert execution != null;
return execution.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext, parent, collectMode, showTermDocCountError, reducers, metaData);
return execution.create(name, factories, valuesSource, order, bucketCountThresholds, includeExclude, aggregationContext,
parent, collectMode, showTermDocCountError, pipelineAggregators, metaData);
}
if ((includeExclude != null) && (includeExclude.isRegexBased())) {
throw new AggregationExecutionException("Aggregation [" + name + "] cannot support regular expression style include/exclude " +
"settings as they can only be applied to string fields. Use an array of numeric values for include/exclude clauses used to filter numeric fields");
throw new AggregationExecutionException(
"Aggregation ["
+ name
+ "] cannot support regular expression style include/exclude "
+ "settings as they can only be applied to string fields. Use an array of numeric values for include/exclude clauses used to filter numeric fields");
}
if (valuesSource instanceof ValuesSource.Numeric) {
@@ -240,19 +262,20 @@ public class TermsAggregatorFactory extends ValuesSourceAggregatorFactory<Values
if (includeExclude != null) {
longFilter = includeExclude.convertToDoubleFilter();
}
return new DoubleTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(),
order, bucketCountThresholds, aggregationContext, parent, collectMode,
showTermDocCountError, longFilter, reducers, metaData);
return new DoubleTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(), order,
bucketCountThresholds, aggregationContext, parent, collectMode, showTermDocCountError, longFilter,
pipelineAggregators, metaData);
}
if (includeExclude != null) {
longFilter = includeExclude.convertToLongFilter();
}
return new LongTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(),
order, bucketCountThresholds, aggregationContext, parent, collectMode, showTermDocCountError, longFilter, reducers, metaData);
return new LongTermsAggregator(name, factories, (ValuesSource.Numeric) valuesSource, config.format(), order,
bucketCountThresholds, aggregationContext, parent, collectMode, showTermDocCountError, longFilter, pipelineAggregators,
metaData);
}
throw new AggregationExecutionException("terms aggregation cannot be applied to field [" + config.fieldContext().field() +
"]. It can only be applied to numeric or string fields.");
throw new AggregationExecutionException("terms aggregation cannot be applied to field [" + config.fieldContext().field()
+ "]. It can only be applied to numeric or string fields.");
}
}
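
The `TermsAggregatorFactory` hunks above touch an enum of execution strategies whose `create(...)` methods all take the renamed list. A compact, purely hypothetical sketch of that enum-strategy shape (stub `Aggregator` and `PipelineAggregator` interfaces, not the real factory) may make the structure easier to see:

[source,java]
----
// Stub types only; the real ExecutionMode also wires value sources, ordinals, etc.
import java.util.List;
import java.util.Map;

interface PipelineAggregator {}
interface Aggregator {}

enum ExecutionModeSketch {
    MAP {
        @Override
        Aggregator create(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
            // A map-based strategy would build a hash-backed aggregator here; the renamed
            // list is simply forwarded to whatever it constructs.
            return new Aggregator() {};
        }
    },
    GLOBAL_ORDINALS {
        @Override
        Aggregator create(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
            return new Aggregator() {};
        }
    };

    // Every strategy receives the same renamed parameter, so the factory's call site
    // changes in exactly one way: reducers becomes pipelineAggregators.
    abstract Aggregator create(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData);
}
----
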
@@ -24,7 +24,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.aggregations.AggregationStreams;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.InternalAggregations;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.io.IOException;
import java.util.Collections;
@@ -56,9 +56,9 @@ public class UnmappedTerms extends InternalTerms<UnmappedTerms, InternalTerms.Bu
UnmappedTerms() {} // for serialization
public UnmappedTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount, List<Reducer> reducers,
public UnmappedTerms(String name, Terms.Order order, int requiredSize, int shardSize, long minDocCount, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) {
super(name, order, requiredSize, shardSize, minDocCount, BUCKETS, false, 0, 0, reducers, metaData);
super(name, order, requiredSize, shardSize, minDocCount, BUCKETS, false, 0, 0, pipelineAggregators, metaData);
}
@Override
@@ -68,7 +68,7 @@ public class UnmappedTerms extends InternalTerms<UnmappedTerms, InternalTerms.Bu
@Override
public UnmappedTerms create(List<InternalTerms.Bucket> buckets) {
return new UnmappedTerms(this.name, this.order, this.requiredSize, this.shardSize, this.minDocCount, this.reducers(), this.metaData);
return new UnmappedTerms(this.name, this.order, this.requiredSize, this.shardSize, this.minDocCount, this.pipelineAggregators(), this.metaData);
}
@Override
@@ -20,7 +20,7 @@
package org.elasticsearch.search.aggregations.metrics;
import org.elasticsearch.search.aggregations.InternalAggregation;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import java.util.List;
import java.util.Map;
@@ -29,7 +29,7 @@ public abstract class InternalMetricsAggregation extends InternalAggregation {
protected InternalMetricsAggregation() {} // for serialization
protected InternalMetricsAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
protected InternalMetricsAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
}
}
@@ -18,7 +18,7 @@
*/
package org.elasticsearch.search.aggregations.metrics;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
import java.util.List;
@@ -35,8 +35,8 @@ public abstract class InternalNumericMetricsAggregation extends InternalMetricsA
protected SingleValue() {}
protected SingleValue(String name, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
protected SingleValue(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
}
@Override
@@ -65,8 +65,8 @@ public abstract class InternalNumericMetricsAggregation extends InternalMetricsA
protected MultiValue() {}
protected MultiValue(String name, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
protected MultiValue(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
}
public abstract double value(String name);
@@ -93,8 +93,8 @@ public abstract class InternalNumericMetricsAggregation extends InternalMetricsA
private InternalNumericMetricsAggregation() {} // for serialization
private InternalNumericMetricsAggregation(String name, List<Reducer> reducers, Map<String, Object> metaData) {
super(name, reducers, metaData);
private InternalNumericMetricsAggregation(String name, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
super(name, pipelineAggregators, metaData);
}
}
|
@ -22,7 +22,7 @@ package org.elasticsearch.search.aggregations.metrics;
|
|||
import org.elasticsearch.search.aggregations.Aggregator;
|
||||
import org.elasticsearch.search.aggregations.AggregatorBase;
|
||||
import org.elasticsearch.search.aggregations.AggregatorFactories;
|
||||
import org.elasticsearch.search.aggregations.reducers.Reducer;
|
||||
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
|
||||
import org.elasticsearch.search.aggregations.support.AggregationContext;
|
||||
|
||||
import java.io.IOException;
|
||||
|
@ -31,8 +31,8 @@ import java.util.Map;
|
|||
|
||||
public abstract class MetricsAggregator extends AggregatorBase {
|
||||
|
||||
protected MetricsAggregator(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
|
||||
protected MetricsAggregator(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
|
||||
Map<String, Object> metaData) throws IOException {
|
||||
super(name, AggregatorFactories.EMPTY, context, parent, reducers, metaData);
|
||||
super(name, AggregatorFactories.EMPTY, context, parent, pipelineAggregators, metaData);
|
||||
}
|
||||
}
|
||||
|
|
|
@@ -19,7 +19,7 @@
package org.elasticsearch.search.aggregations.metrics;
import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.reducers.Reducer;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
import org.elasticsearch.search.aggregations.support.AggregationContext;
import java.io.IOException;
@@ -31,16 +31,16 @@ import java.util.Map;
*/
public abstract class NumericMetricsAggregator extends MetricsAggregator {
private NumericMetricsAggregator(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
Map<String, Object> metaData) throws IOException {
super(name, context, parent, reducers, metaData);
private NumericMetricsAggregator(String name, AggregationContext context, Aggregator parent,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
super(name, context, parent, pipelineAggregators, metaData);
}
public static abstract class SingleValue extends NumericMetricsAggregator {
protected SingleValue(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
protected SingleValue(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, context, parent, reducers, metaData);
super(name, context, parent, pipelineAggregators, metaData);
}
public abstract double metric(long owningBucketOrd);
@@ -48,9 +48,9 @@ public abstract class NumericMetricsAggregator extends MetricsAggregator {
public static abstract class MultiValue extends NumericMetricsAggregator {
protected MultiValue(String name, AggregationContext context, Aggregator parent, List<Reducer> reducers,
protected MultiValue(String name, AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators,
Map<String, Object> metaData) throws IOException {
super(name, context, parent, reducers, metaData);
super(name, context, parent, pipelineAggregators, metaData);
}
public abstract boolean hasMetric(String name);
@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -53,8 +53,9 @@ public class AvgAggregator extends NumericMetricsAggregator.SingleValue {
     ValueFormatter formatter;
 
     public AvgAggregator(String name, ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter,
-            AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+            AggregationContext context,
+            Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, context, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.formatter = formatter;
         if (valuesSource != null) {
@@ -105,12 +106,12 @@ public class AvgAggregator extends NumericMetricsAggregator.SingleValue {
         if (valuesSource == null || bucket >= sums.size()) {
             return buildEmptyAggregation();
         }
-        return new InternalAvg(name, sums.get(bucket), counts.get(bucket), formatter, reducers(), metaData());
+        return new InternalAvg(name, sums.get(bucket), counts.get(bucket), formatter, pipelineAggregators(), metaData());
     }
 
     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalAvg(name, 0.0, 0l, formatter, reducers(), metaData());
+        return new InternalAvg(name, 0.0, 0l, formatter, pipelineAggregators(), metaData());
     }
 
     public static class Factory extends ValuesSourceAggregatorFactory.LeafOnly<ValuesSource.Numeric> {
@@ -120,15 +121,17 @@ public class AvgAggregator extends NumericMetricsAggregator.SingleValue {
         }
 
         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators,
                 Map<String, Object> metaData) throws IOException {
-            return new AvgAggregator(name, null, config.formatter(), aggregationContext, parent, reducers, metaData);
+            return new AvgAggregator(name, null, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }
 
         @Override
         protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
-                boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new AvgAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, reducers, metaData);
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            return new AvgAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }
     }

@@ -25,7 +25,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalNumericMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
 
@@ -58,9 +58,9 @@ public class InternalAvg extends InternalNumericMetricsAggregation.SingleValue i
 
     InternalAvg() {} // for serialization
 
-    public InternalAvg(String name, double sum, long count, @Nullable ValueFormatter formatter, List<Reducer> reducers,
+    public InternalAvg(String name, double sum, long count, @Nullable ValueFormatter formatter, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.sum = sum;
         this.count = count;
         this.valueFormatter = formatter;
@@ -89,7 +89,7 @@ public class InternalAvg extends InternalNumericMetricsAggregation.SingleValue i
             count += ((InternalAvg) aggregation).count;
             sum += ((InternalAvg) aggregation).sum;
         }
-        return new InternalAvg(getName(), sum, count, valueFormatter, reducers(), getMetaData());
+        return new InternalAvg(getName(), sum, count, valueFormatter, pipelineAggregators(), getMetaData());
     }
 
     @Override

@@ -42,7 +42,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
@@ -68,8 +68,8 @@ public class CardinalityAggregator extends NumericMetricsAggregator.SingleValue
     private ValueFormatter formatter;
 
     public CardinalityAggregator(String name, ValuesSource valuesSource, boolean rehash, int precision, @Nullable ValueFormatter formatter,
-            AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+            AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+        super(name, context, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.rehash = rehash;
         this.precision = precision;
@@ -158,12 +158,12 @@ public class CardinalityAggregator extends NumericMetricsAggregator.SingleValue
         // this Aggregator (and its HLL++ counters) is released.
         HyperLogLogPlusPlus copy = new HyperLogLogPlusPlus(precision, BigArrays.NON_RECYCLING_INSTANCE, 1);
         copy.merge(0, counts, owningBucketOrdinal);
-        return new InternalCardinality(name, copy, formatter, reducers(), metaData());
+        return new InternalCardinality(name, copy, formatter, pipelineAggregators(), metaData());
     }
 
     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalCardinality(name, null, formatter, reducers(), metaData());
+        return new InternalCardinality(name, null, formatter, pipelineAggregators(), metaData());
     }
 
     @Override

@@ -22,7 +22,7 @@ package org.elasticsearch.search.aggregations.metrics.cardinality;
 import org.elasticsearch.search.aggregations.AggregationExecutionException;
 import org.elasticsearch.search.aggregations.Aggregator;
 import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -48,18 +48,18 @@ final class CardinalityAggregatorFactory extends ValuesSourceAggregatorFactory<V
     }
 
     @Override
-    protected Aggregator createUnmapped(AggregationContext context, Aggregator parent, List<Reducer> reducers, Map<String, Object> metaData)
+    protected Aggregator createUnmapped(AggregationContext context, Aggregator parent, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
             throws IOException {
-        return new CardinalityAggregator(name, null, true, precision(parent), config.formatter(), context, parent, reducers, metaData);
+        return new CardinalityAggregator(name, null, true, precision(parent), config.formatter(), context, parent, pipelineAggregators, metaData);
     }
 
     @Override
     protected Aggregator doCreateInternal(ValuesSource valuesSource, AggregationContext context, Aggregator parent,
-            boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
+            boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
         if (!(valuesSource instanceof ValuesSource.Numeric) && !rehash) {
             throw new AggregationExecutionException("Turning off rehashing for cardinality aggregation [" + name + "] on non-numeric values in not allowed");
         }
-        return new CardinalityAggregator(name, valuesSource, rehash, precision(parent), config.formatter(), context, parent, reducers,
+        return new CardinalityAggregator(name, valuesSource, rehash, precision(parent), config.formatter(), context, parent, pipelineAggregators,
                 metaData);
     }

@@ -27,7 +27,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalNumericMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
 
@@ -54,9 +54,9 @@ public final class InternalCardinality extends InternalNumericMetricsAggregation
 
     private HyperLogLogPlusPlus counts;
 
-    InternalCardinality(String name, HyperLogLogPlusPlus counts, @Nullable ValueFormatter formatter, List<Reducer> reducers,
+    InternalCardinality(String name, HyperLogLogPlusPlus counts, @Nullable ValueFormatter formatter, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+        super(name, pipelineAggregators, metaData);
         this.counts = counts;
         this.valueFormatter = formatter;
     }
@@ -108,7 +108,7 @@ public final class InternalCardinality extends InternalNumericMetricsAggregation
             if (cardinality.counts != null) {
                 if (reduced == null) {
                     reduced = new InternalCardinality(name, new HyperLogLogPlusPlus(cardinality.counts.precision(),
-                            BigArrays.NON_RECYCLING_INSTANCE, 1), this.valueFormatter, reducers(), getMetaData());
+                            BigArrays.NON_RECYCLING_INSTANCE, 1), this.valueFormatter, pipelineAggregators(), getMetaData());
                 }
                 reduced.merge(cardinality);
             }

@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.metrics.MetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -52,9 +52,9 @@ public final class GeoBoundsAggregator extends MetricsAggregator {
     DoubleArray negRights;
 
     protected GeoBoundsAggregator(String name, AggregationContext aggregationContext, Aggregator parent,
-            ValuesSource.GeoPoint valuesSource, boolean wrapLongitude, List<Reducer> reducers,
+            ValuesSource.GeoPoint valuesSource, boolean wrapLongitude, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, aggregationContext, parent, reducers, metaData);
+        super(name, aggregationContext, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.wrapLongitude = wrapLongitude;
         if (valuesSource != null) {
@@ -152,13 +152,13 @@ public final class GeoBoundsAggregator extends MetricsAggregator {
         double posRight = posRights.get(owningBucketOrdinal);
         double negLeft = negLefts.get(owningBucketOrdinal);
         double negRight = negRights.get(owningBucketOrdinal);
-        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, reducers(), metaData());
+        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, pipelineAggregators(), metaData());
     }
 
     @Override
     public InternalAggregation buildEmptyAggregation() {
         return new InternalGeoBounds(name, Double.NEGATIVE_INFINITY, Double.POSITIVE_INFINITY, Double.POSITIVE_INFINITY,
-                Double.NEGATIVE_INFINITY, Double.POSITIVE_INFINITY, Double.NEGATIVE_INFINITY, wrapLongitude, reducers(), metaData());
+                Double.NEGATIVE_INFINITY, Double.POSITIVE_INFINITY, Double.NEGATIVE_INFINITY, wrapLongitude, pipelineAggregators(), metaData());
     }
 
     @Override
@@ -176,15 +176,16 @@ public final class GeoBoundsAggregator extends MetricsAggregator {
         }
 
         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
-                Map<String, Object> metaData) throws IOException {
-            return new GeoBoundsAggregator(name, aggregationContext, parent, null, wrapLongitude, reducers, metaData);
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+            return new GeoBoundsAggregator(name, aggregationContext, parent, null, wrapLongitude, pipelineAggregators, metaData);
         }
 
         @Override
-        protected Aggregator doCreateInternal(ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext,
-                Aggregator parent, boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new GeoBoundsAggregator(name, aggregationContext, parent, valuesSource, wrapLongitude, reducers, metaData);
+        protected Aggregator doCreateInternal(ValuesSource.GeoPoint valuesSource, AggregationContext aggregationContext, Aggregator parent,
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            return new GeoBoundsAggregator(name, aggregationContext, parent, valuesSource, wrapLongitude, pipelineAggregators, metaData);
         }
 
     }

@@ -26,7 +26,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 
 import java.io.IOException;
 import java.util.List;
@@ -57,8 +57,8 @@ public class InternalGeoBounds extends InternalMetricsAggregation implements Geo
 
     InternalGeoBounds(String name, double top, double bottom, double posLeft, double posRight,
             double negLeft, double negRight, boolean wrapLongitude,
-            List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+            List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, pipelineAggregators, metaData);
         this.top = top;
         this.bottom = bottom;
         this.posLeft = posLeft;
@@ -104,7 +104,7 @@ public class InternalGeoBounds extends InternalMetricsAggregation implements Geo
                 negRight = bounds.negRight;
             }
         }
-        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, reducers(), getMetaData());
+        return new InternalGeoBounds(name, top, bottom, posLeft, posRight, negLeft, negRight, wrapLongitude, pipelineAggregators(), getMetaData());
     }
 
     @Override

@@ -25,7 +25,7 @@ import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.search.aggregations.AggregationStreams;
 import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.metrics.InternalNumericMetricsAggregation;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatter;
 import org.elasticsearch.search.aggregations.support.format.ValueFormatterStreams;
 
@@ -57,8 +57,8 @@ public class InternalMax extends InternalNumericMetricsAggregation.SingleValue i
 
     InternalMax() {} // for serialization
 
-    public InternalMax(String name, double max, @Nullable ValueFormatter formatter, List<Reducer> reducers, Map<String, Object> metaData) {
-        super(name, reducers, metaData);
+    public InternalMax(String name, double max, @Nullable ValueFormatter formatter, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) {
+        super(name, pipelineAggregators, metaData);
         this.valueFormatter = formatter;
         this.max = max;
     }
@@ -84,7 +84,7 @@ public class InternalMax extends InternalNumericMetricsAggregation.SingleValue i
         for (InternalAggregation aggregation : aggregations) {
             max = Math.max(max, ((InternalMax) aggregation).max);
         }
-        return new InternalMax(name, max, valueFormatter, reducers(), getMetaData());
+        return new InternalMax(name, max, valueFormatter, pipelineAggregators(), getMetaData());
     }
 
     @Override

@@ -31,7 +31,7 @@ import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.LeafBucketCollector;
 import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
 import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
-import org.elasticsearch.search.aggregations.reducers.Reducer;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.ValuesSource;
 import org.elasticsearch.search.aggregations.support.ValuesSourceAggregatorFactory;
@@ -53,9 +53,10 @@ public class MaxAggregator extends NumericMetricsAggregator.SingleValue {
     DoubleArray maxes;
 
     public MaxAggregator(String name, ValuesSource.Numeric valuesSource, @Nullable ValueFormatter formatter,
-            AggregationContext context, Aggregator parent, List<Reducer> reducers,
+            AggregationContext context,
+            Aggregator parent, List<PipelineAggregator> pipelineAggregators,
             Map<String, Object> metaData) throws IOException {
-        super(name, context, parent, reducers, metaData);
+        super(name, context, parent, pipelineAggregators, metaData);
         this.valuesSource = valuesSource;
         this.formatter = formatter;
         if (valuesSource != null) {
@@ -106,12 +107,12 @@ public class MaxAggregator extends NumericMetricsAggregator.SingleValue {
         if (valuesSource == null || bucket >= maxes.size()) {
             return buildEmptyAggregation();
         }
-        return new InternalMax(name, maxes.get(bucket), formatter, reducers(), metaData());
+        return new InternalMax(name, maxes.get(bucket), formatter, pipelineAggregators(), metaData());
     }
 
     @Override
     public InternalAggregation buildEmptyAggregation() {
-        return new InternalMax(name, Double.NEGATIVE_INFINITY, formatter, reducers(), metaData());
+        return new InternalMax(name, Double.NEGATIVE_INFINITY, formatter, pipelineAggregators(), metaData());
    }
 
     public static class Factory extends ValuesSourceAggregatorFactory.LeafOnly<ValuesSource.Numeric> {
@@ -121,15 +122,16 @@ public class MaxAggregator extends NumericMetricsAggregator.SingleValue {
         }
 
         @Override
-        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent, List<Reducer> reducers,
-                Map<String, Object> metaData) throws IOException {
-            return new MaxAggregator(name, null, config.formatter(), aggregationContext, parent, reducers, metaData);
+        protected Aggregator createUnmapped(AggregationContext aggregationContext, Aggregator parent,
+                List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException {
+            return new MaxAggregator(name, null, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }
 
         @Override
         protected Aggregator doCreateInternal(ValuesSource.Numeric valuesSource, AggregationContext aggregationContext, Aggregator parent,
-                boolean collectsFromSingleBucket, List<Reducer> reducers, Map<String, Object> metaData) throws IOException {
-            return new MaxAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, reducers, metaData);
+                boolean collectsFromSingleBucket, List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
+                throws IOException {
+            return new MaxAggregator(name, valuesSource, config.formatter(), aggregationContext, parent, pipelineAggregators, metaData);
         }
     }

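Every hunk above applies the same mechanical rename, and the same edit is needed in any out-of-tree aggregator built on these base classes. Below is a minimal, signature-level sketch of that migration; `MyCustomAggregator` is a hypothetical class used only for illustration, its remaining members (collection and result building) are omitted, and the superclass constructor it forwards to is the one shown in the NumericMetricsAggregator hunk above.

[source,java]
----
// Hypothetical subclass, shown only to illustrate the Reducer -> PipelineAggregator rename.
import java.io.IOException;
import java.util.List;
import java.util.Map;

import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.metrics.NumericMetricsAggregator;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator; // was: ...reducers.Reducer
import org.elasticsearch.search.aggregations.support.AggregationContext;

public abstract class MyCustomAggregator extends NumericMetricsAggregator.SingleValue {

    protected MyCustomAggregator(String name, AggregationContext context, Aggregator parent,
            List<PipelineAggregator> pipelineAggregators, // was: List<Reducer> reducers
            Map<String, Object> metaData) throws IOException {
        super(name, context, parent, pipelineAggregators, metaData); // was: super(..., reducers, metaData)
    }

    // buildAggregation(long) and buildEmptyAggregation() implementations now pass
    // pipelineAggregators() instead of reducers() into their Internal* results,
    // exactly as in the Avg, Max, Cardinality and GeoBounds hunks above.
}
----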