mirror of https://github.com/honeymoose/OpenSearch.git
synced 2025-02-09 22:45:04 +00:00

Updated docs for 3.0.0-beta

parent 53f316b540
commit dc018cf622
@@ -2,8 +2,6 @@
 
 == Pipeline Aggregations
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 Pipeline aggregations work on the outputs produced from other aggregations rather than from document sets, adding
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-avg-bucket-aggregation]]
 === Avg Bucket Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A sibling pipeline aggregation which calculates the (mean) average value of a specified metric in a sibling aggregation.
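The avg bucket result described above is simply the arithmetic mean taken over values a sibling aggregation has already produced, one per bucket. A minimal Python sketch of that semantics (illustrative only; in OpenSearch/Elasticsearch this is configured via a `buckets_path` in the search request, not client-side code, and the metric names below are made up):

```python
def avg_bucket(bucket_values):
    """Mean of a metric across all buckets of a sibling aggregation."""
    if not bucket_values:
        return None  # no buckets -> no value
    return sum(bucket_values) / len(bucket_values)

# e.g. monthly sums produced by a date_histogram + sum sub-aggregation
monthly_sales = [550.0, 60.0, 375.0]
average = avg_bucket(monthly_sales)
```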
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-bucket-script-aggregation]]
 === Bucket Script Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A parent pipeline aggregation which executes a script which can perform per bucket computations on specified metrics
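The per-bucket script computation above can be sketched as: for each bucket, bind the named metrics as script parameters, then evaluate the script. A hedged Python model (the bucket shapes and names here are hypothetical, and the real feature takes a Painless/Groovy script plus a `buckets_path` map):

```python
def bucket_script(buckets, buckets_path, script):
    """Evaluate a script per bucket, with the named metrics bound as params."""
    out = []
    for bucket in buckets:
        params = {name: bucket[path] for name, path in buckets_path.items()}
        out.append(script(params))
    return out

# hypothetical buckets: t-shirt sales as a percentage of total sales
buckets = [{"total": 200.0, "tshirts": 50.0}, {"total": 80.0, "tshirts": 60.0}]
pct = bucket_script(buckets, {"t": "tshirts", "tot": "total"},
                    lambda p: p["t"] / p["tot"] * 100)
```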
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-bucket-selector-aggregation]]
 === Bucket Selector Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A parent pipeline aggregation which executes a script which determines whether the current bucket will be retained
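The bucket selector differs from the bucket script only in what the script's result is used for: a truthy result keeps the bucket, a falsy one drops it. A small illustrative sketch (bucket fields are invented for the example):

```python
def bucket_selector(buckets, buckets_path, predicate):
    """Keep only the buckets for which the script/predicate returns True."""
    kept = []
    for bucket in buckets:
        params = {name: bucket[path] for name, path in buckets_path.items()}
        if predicate(params):
            kept.append(bucket)
    return kept

# hypothetical: drop months whose total sales fall below a threshold
buckets = [{"key": "2015-01", "sales": 550.0}, {"key": "2015-02", "sales": 60.0}]
busy_months = bucket_selector(buckets, {"s": "sales"}, lambda p: p["s"] > 100)
```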
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-cumulative-sum-aggregation]]
 === Cumulative Sum Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A parent pipeline aggregation which calculates the cumulative sum of a specified metric in a parent histogram (or date_histogram)
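The cumulative sum is a running total over the metric values of the parent histogram's buckets, in bucket order. In Python terms (a sketch of the arithmetic, not the API):

```python
from itertools import accumulate

def cumulative_sum(bucket_values):
    """Running total of a metric across the buckets of a parent histogram."""
    return list(accumulate(bucket_values))

# each bucket now carries the sum of itself and all earlier buckets
totals = cumulative_sum([10.0, 20.0, 5.0])  # -> [10.0, 30.0, 35.0]
```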
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-derivative-aggregation]]
 === Derivative Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A parent pipeline aggregation which calculates the derivative of a specified metric in a parent histogram (or date_histogram)
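On a histogram with uniform bucket widths, the (first) derivative is just the difference between each bucket's metric and its predecessor's. A sketch of that computation (modelling the missing first value as `None`; the real aggregation simply emits no derivative for the first bucket):

```python
def derivative(bucket_values):
    """First difference between consecutive histogram buckets.

    The first bucket has no predecessor, so no derivative exists for it.
    """
    return [None] + [b - a for a, b in zip(bucket_values, bucket_values[1:])]

diffs = derivative([100.0, 120.0, 90.0])  # -> [None, 20.0, -30.0]
```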
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-extended-stats-bucket-aggregation]]
 === Extended Stats Bucket Aggregation
 
-coming[2.1.0]
-
 experimental[]
 
 A sibling pipeline aggregation which calculates a variety of stats across all bucket of a specified metric in a sibling aggregation.
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-max-bucket-aggregation]]
 === Max Bucket Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A sibling pipeline aggregation which identifies the bucket(s) with the maximum value of a specified metric in a sibling aggregation
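Note the plural "bucket(s)" in the description: several buckets can tie for the maximum, so the result carries a list of keys plus the single top value. A sketch (bucket shapes invented for the example):

```python
def max_bucket(buckets, metric):
    """Key(s) and value of the bucket(s) holding the maximum of a metric;
    a list of keys is returned because ties are possible."""
    top = max(bucket[metric] for bucket in buckets)
    keys = [bucket["key"] for bucket in buckets if bucket[metric] == top]
    return {"keys": keys, "value": top}

buckets = [{"key": "jan", "sales": 550.0},
           {"key": "feb", "sales": 60.0},
           {"key": "mar", "sales": 550.0}]
result = max_bucket(buckets, "sales")  # ties: both "jan" and "mar" reported
```

The min bucket aggregation documented in the next hunk is the mirror image, with `min` in place of `max`.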
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-min-bucket-aggregation]]
 === Min Bucket Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A sibling pipeline aggregation which identifies the bucket(s) with the minimum value of a specified metric in a sibling aggregation
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-movavg-aggregation]]
 === Moving Average Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 Given an ordered series of data, the Moving Average aggregation will slide a window across the data and emit the average
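"Slide a window across the data and emit the average" can be sketched directly. The simple variant below averages over however many values are available during the warm-up; real implementations differ in how they treat positions before the window fills, and the aggregation also offers weighted models (linear, EWMA, etc.) not shown here:

```python
def moving_avg(values, window):
    """Emit, for each position, the mean of the last `window` values
    (fewer during warm-up, one common convention among several)."""
    out = []
    for i in range(len(values)):
        win = values[max(0, i - window + 1): i + 1]
        out.append(sum(win) / len(win))
    return out

smoothed = moving_avg([1.0, 2.0, 3.0, 4.0], window=2)  # -> [1.0, 1.5, 2.5, 3.5]
```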
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-percentiles-bucket-aggregation]]
 === Percentiles Bucket Aggregation
 
-coming[2.1.0]
-
 experimental[]
 
 A sibling pipeline aggregation which calculates percentiles across all bucket of a specified metric in a sibling aggregation.
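Percentiles across bucket values amount to sorting the values and reading off ranks. The sketch below uses a nearest-rank convention, which is only one of several interpolation schemes; the actual aggregation's scheme may differ:

```python
def percentiles_bucket(bucket_values, percents=(25.0, 50.0, 75.0)):
    """Percentiles over all bucket values, nearest-rank convention."""
    ordered = sorted(bucket_values)
    results = {}
    for p in percents:
        idx = int(round(p / 100.0 * (len(ordered) - 1)))
        results[p] = ordered[idx]
    return results

p = percentiles_bucket([10.0, 40.0, 20.0, 30.0])
```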
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-serialdiff-aggregation]]
 === Serial Differencing Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 Serial differencing is a technique where values in a time series are subtracted from itself at
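"Subtracted from itself at" a fixed lag means each value has the value `lag` positions earlier subtracted from it: lag 1 removes a constant trend, while a lag equal to the seasonal period removes that seasonality. A sketch of the arithmetic (the first `lag` positions have no result and are simply omitted here):

```python
def serial_diff(values, lag=1):
    """Subtract from each value the value `lag` positions earlier."""
    return [values[i] - values[i - lag] for i in range(lag, len(values))]

daily = [10.0, 12.0, 11.0, 13.0]
diffed = serial_diff(daily, lag=1)  # -> [2.0, -1.0, 2.0]
```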
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-stats-bucket-aggregation]]
 === Stats Bucket Aggregation
 
-coming[2.1.0]
-
 experimental[]
 
 A sibling pipeline aggregation which calculates a variety of stats across all bucket of a specified metric in a sibling aggregation.
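The "variety of stats" for the stats bucket family is the usual count/min/max/sum/avg over the sibling aggregation's bucket values (the extended variant adds measures such as variance and standard deviation, omitted here). A sketch:

```python
def stats_bucket(bucket_values):
    """count/min/max/sum/avg over all bucket values of a sibling aggregation."""
    count = len(bucket_values)
    total = sum(bucket_values)
    return {
        "count": count,
        "min": min(bucket_values),
        "max": max(bucket_values),
        "sum": total,
        "avg": total / count,
    }

stats = stats_bucket([550.0, 60.0, 375.0])
```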
@@ -1,8 +1,6 @@
 [[search-aggregations-pipeline-sum-bucket-aggregation]]
 === Sum Bucket Aggregation
 
-coming[2.0.0-beta1]
-
 experimental[]
 
 A sibling pipeline aggregation which calculates the sum across all bucket of a specified metric in a sibling aggregation.
@@ -131,8 +131,6 @@ operation based on the `_parent` / `_routing` mapping.
 [[bulk-timestamp]]
 === Timestamp
 
-deprecated[2.0.0,The `_timestamp` field is deprecated. Instead, use a normal <<date,`date`>> field and set its value explicitly]
-
 Each bulk item can include the timestamp value using the
 `_timestamp`/`timestamp` field. It automatically follows the behavior of
 the index operation based on the `_timestamp` mapping.
@@ -141,8 +139,6 @@ the index operation based on the `_timestamp` mapping.
 [[bulk-ttl]]
 === TTL
 
-deprecated[2.0.0,The current `_ttl` implementation is deprecated and will be replaced with a different implementation in a future version]
-
 Each bulk item can include the ttl value using the `_ttl`/`ttl` field.
 It automatically follows the behavior of the index operation based on
 the `_ttl` mapping.
@@ -257,8 +257,6 @@ specified using the `routing` parameter.
 [[index-timestamp]]
 === Timestamp
 
-deprecated[2.0.0,The `_timestamp` field is deprecated. Instead, use a normal <<date,`date`>> field and set its value explicitly]
-
 A document can be indexed with a `timestamp` associated with it. The
 `timestamp` value of a document can be set using the `timestamp`
 parameter. For example:
@@ -281,8 +279,6 @@ page>>.
 [[index-ttl]]
 === TTL
 
-deprecated[2.0.0,The current `_ttl` implementation is deprecated and will be replaced with a different implementation in a future version]
-
 
 A document can be indexed with a `ttl` (time to live) associated with
 it. Expired documents will be expunged automatically. The expiration
@@ -81,8 +81,6 @@ omit :
 [float]
 ==== Distributed frequencies
 
-coming[2.0.0-beta1]
-
 Setting `dfs` to `true` (default is `false`) will return the term statistics
 or the field statistics of the entire index, and not just at the shard. Use it
 with caution as distributed frequencies can have a serious performance impact.
@@ -90,8 +88,6 @@ with caution as distributed frequencies can have a serious performance impact.
 [float]
 ==== Terms Filtering
 
-coming[2.0.0-beta1]
-
 With the parameter `filter`, the terms returned could also be filtered based
 on their tf-idf scores. This could be useful in order find out a good
 characteristic vector of a document. This feature works in a similar manner to
@@ -1,8 +1,8 @@
 [[elasticsearch-reference]]
 = Elasticsearch Reference
 
-:version: 2.0.0-beta1
-:branch: 2.0
+:version: 3.0.0-beta1
+:branch: 3.0
 :jdk: 1.8.0_25
 :defguide: https://www.elastic.co/guide/en/elasticsearch/guide/current
 :plugins: https://www.elastic.co/guide/en/elasticsearch/plugins/master
@@ -16,8 +16,6 @@ curl -XGET 'localhost:9200/_analyze' -d '
 }'
 --------------------------------------------------
 
-coming[2.0.0-beta1, body based parameters were added in 2.0.0]
-
 If text parameter is provided as array of strings, it is analyzed as a multi-valued field.
 
 [source,js]
@@ -29,8 +27,6 @@ curl -XGET 'localhost:9200/_analyze' -d '
 }'
 --------------------------------------------------
 
-coming[2.0.0-beta1, body based parameters were added in 2.0.0]
-
 Or by building a custom transient analyzer out of tokenizers,
 token filters and char filters. Token filters can use the shorter 'filters'
 parameter name:
@@ -53,8 +49,6 @@ curl -XGET 'localhost:9200/_analyze' -d '
 }'
 --------------------------------------------------
 
-coming[2.0.0-beta1, body based parameters were added in 2.0.0]
-
 It can also run against a specific index:
 
 [source,js]
@@ -78,8 +72,6 @@ curl -XGET 'localhost:9200/test/_analyze' -d '
 }'
 --------------------------------------------------
 
-coming[2.0.0-beta1, body based parameters were added in 2.0.0]
-
 Also, the analyzer can be derived based on a field mapping, for example:
 
 [source,js]
@@ -91,8 +83,6 @@ curl -XGET 'localhost:9200/test/_analyze' -d '
 }'
 --------------------------------------------------
 
-coming[2.0.0-beta1, body based parameters were added in 2.0.0]
-
 Will cause the analysis to happen based on the analyzer configured in the
 mapping for `obj1.field1` (and if not, the default index analyzer).
 
@@ -1,8 +1,6 @@
 [[mapping-parent-field]]
 === `_parent` field
 
-added[2.0.0-beta1,The parent-child implementation has been completely rewritten. It is advisable to reindex any 1.x indices which use parent-child to take advantage of the new optimizations]
-
 A parent-child relationship can be established between documents in the same
 index by making one mapping type the parent of another:
 
@@ -1,8 +1,6 @@
 [[mapping-timestamp-field]]
 === `_timestamp` field
 
-deprecated[2.0.0,The `_timestamp` field is deprecated. Instead, use a normal <<date,`date`>> field and set its value explicitly]
-
 The `_timestamp` field, when enabled, allows a timestamp to be indexed and
 stored with a document. The timestamp may be specified manually, generated
 automatically, or set to a default value:
@@ -1,8 +1,6 @@
 [[mapping-ttl-field]]
 === `_ttl` field
 
-deprecated[2.0.0,The current `_ttl` implementation is deprecated and will be replaced with a different implementation in a future version]
-
 Some types of documents, such as session data or special offers, come with an
 expiration date. The `_ttl` field allows you to specify the minimum time a
 document should live, after which time the document is deleted automatically.
@@ -121,7 +121,7 @@ The following settings are supported:
 using size value notation, i.e. 1g, 10m, 5k. Defaults to `null` (unlimited chunk size).
 `max_restore_bytes_per_sec`:: Throttles per node restore rate. Defaults to `40mb` per second.
 `max_snapshot_bytes_per_sec`:: Throttles per node snapshot rate. Defaults to `40mb` per second.
-`readonly`:: Makes repository read-only. coming[2.1.0] Defaults to `false`.
+`readonly`:: Makes repository read-only. Defaults to `false`.
 
 [float]
 ===== Read-only URL Repository
@@ -259,7 +259,7 @@ GET /_snapshot/my_backup/_all
 -----------------------------------
 // AUTOSENSE
 
-coming[2.0.0-beta1] A currently running snapshot can be retrieved using the following command:
+A currently running snapshot can be retrieved using the following command:
 
 [source,sh]
 -----------------------------------
@@ -149,7 +149,7 @@ input, the other one for term selection and for query formation.
 ==== Document Input Parameters
 
 [horizontal]
-`like`:: coming[2.0.0-beta1]
+`like`::
 The only *required* parameter of the MLT query is `like` and follows a
 versatile syntax, in which the user can specify free form text and/or a single
 or multiple documents (see examples above). The syntax to specify documents is
@@ -162,7 +162,7 @@ follows a similar syntax to the `per_field_analyzer` parameter of the
 Additionally, to provide documents not necessarily present in the index,
 <<docs-termvectors-artificial-doc,artificial documents>> are also supported.
 
-`unlike`:: coming[2.0.0-beta1]
+`unlike`::
 The `unlike` parameter is used in conjunction with `like` in order not to
 select terms found in a chosen set of documents. In other words, we could ask
 for documents `like: "Apple"`, but `unlike: "cake crumble tree"`. The syntax
@@ -172,10 +172,10 @@ is the same as `like`.
 A list of fields to fetch and analyze the text from. Defaults to the `_all`
 field for free text and to all possible fields for document inputs.
 
-`like_text`:: deprecated[2.0.0-beta1,Replaced by `like`]
+`like_text`::
 The text to find documents like it.
 
-`ids` or `docs`:: deprecated[2.0.0-beta1,Replaced by `like`]
+`ids` or `docs`::
 A list of documents following the same syntax as the <<docs-multi-get,Multi GET API>>.
 
 [float]
@@ -63,8 +63,6 @@ curl -XGET <1> 'localhost:9200/_search/scroll' <2> -d'
 '
 --------------------------------------------------
 
-coming[2.0.0-beta1, body based parameters were added in 2.0.0]
-
 <1> `GET` or `POST` can be used.
 <2> The URL should not include the `index` or `type` name -- these
 are specified in the original `search` request instead.
@@ -151,8 +149,6 @@ curl -XDELETE localhost:9200/_search/scroll -d '
 }'
 ---------------------------------------
 
-coming[2.0.0-beta1, Body based parameters were added in 2.0.0]
-
 Multiple scroll IDs can be passed as array:
 
 [source,js]
@@ -163,8 +159,6 @@ curl -XDELETE localhost:9200/_search/scroll -d '
 }'
 ---------------------------------------
 
-coming[2.0.0-beta1, Body based parameters were added in 2.0.0]
-
 All search contexts can be cleared with the `_all` parameter:
 
 [source,js]