Uses `my-data-stream` in place of `logs` for data stream examples. This provides a more intuitive experience for users who copy and paste their own values into snippets.

parent 721198c29e
commit ff4ea4720a
@@ -28,7 +28,7 @@ mappings and change <<index-modules-settings,dynamic index settings>>. See
 ////
 [source,console]
 ----
-PUT /_ilm/policy/logs_policy
+PUT /_ilm/policy/my-data-stream-policy
 {
   "policy": {
     "phases": {
@@ -49,23 +49,23 @@ PUT /_ilm/policy/logs_policy
   }
 }

-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { }
 }

-PUT /_index_template/new_logs_data_stream
+PUT /_index_template/new-data-stream-template
 {
-  "index_patterns": [ "new_logs*" ],
+  "index_patterns": [ "new-data-stream*" ],
   "data_stream": { }
 }

-PUT /_data_stream/logs
+PUT /_data_stream/my-data-stream

-POST /logs/_rollover/
+POST /my-data-stream/_rollover/

-PUT /_data_stream/new_logs
+PUT /_data_stream/new-data-stream
 ----
 // TESTSETUP

@@ -75,7 +75,7 @@ DELETE /_data_stream/*

 DELETE /_index_template/*

-DELETE /_ilm/policy/logs_policy
+DELETE /_ilm/policy/my-data-stream-policy
 ----
 // TEARDOWN
 ////
@@ -90,17 +90,17 @@ To add a mapping for a new field to a data stream, following these steps:
 field mapping is added to future backing indices created for the stream.
 +
 --
-For example, `logs_data_stream` is an existing index template used by the `logs`
-data stream.
+For example, `my-data-stream-template` is an existing index template used by
+`my-data-stream`.

 The following <<indices-templates,put index template>> request adds a mapping
 for a new field, `message`, to the template.

 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { },
   "template": {
     "mappings": {
@@ -122,11 +122,11 @@ backing indices, including the write index.
 +
 --
 The following put mapping API request adds the new `message` field mapping to
-the `logs` data stream.
+`my-data-stream`.

 [source,console]
 ----
-PUT /logs/_mapping
+PUT /my-data-stream/_mapping
 {
   "properties": {
     "message": {
@@ -142,12 +142,12 @@ To add the mapping only to the stream's write index, set the put mapping API's
 +
 --
 The following put mapping request adds the new `message` field mapping only to
-the `logs` stream's write index. The new field mapping is not added to the
-stream's other backing indices.
+`my-data-stream`'s write index. The new field mapping is not added to
+the stream's other backing indices.

 [source,console]
 ----
-PUT /logs/_mapping?write_index_only=true
+PUT /my-data-stream/_mapping?write_index_only=true
 {
   "properties": {
     "message": {
@@ -171,8 +171,8 @@ existing field, follow these steps:
 field mapping is added to future backing indices created for the stream.
 +
 --
-For example, `logs_data_stream` is an existing index template used by the `logs`
-data stream.
+For example, `my-data-stream-template` is an existing index template used by
+`my-data-stream`.

 The following <<indices-templates,put index template>> request changes the
 argument for the `host.ip` field's <<ignore-malformed,`ignore_malformed`>>
@@ -180,9 +180,9 @@ mapping parameter to `true`.

 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { },
   "template": {
     "mappings": {
@@ -208,13 +208,13 @@ to the data stream. By default, this applies the changes to the stream's
 existing backing indices, including the write index.
 +
 --
-The following <<indices-put-mapping,put mapping API>> request targets the `logs`
-data stream. The request changes the argument for the `host.ip` field's
-`ignore_malformed` mapping parameter to `true`.
+The following <<indices-put-mapping,put mapping API>> request targets
+`my-data-stream`. The request changes the argument for the `host.ip`
+field's `ignore_malformed` mapping parameter to `true`.

 [source,console]
 ----
-PUT /logs/_mapping
+PUT /my-data-stream/_mapping
 {
   "properties": {
     "host": {
@@ -230,17 +230,17 @@ PUT /logs/_mapping
 ----
 --
 +
-To apply the mapping changes only to the stream's write index, set the put mapping API's
-`write_index_only` query parameter to `true`.
+To apply the mapping changes only to the stream's write index, set the put
+mapping API's `write_index_only` query parameter to `true`.
 +
 --
 The following put mapping request changes the `host.ip` field's mapping only for
-the `logs` stream's write index. The change is not applied to the stream's other
-backing indices.
+`my-data-stream`'s write index. The change is not applied to the
+stream's other backing indices.

 [source,console]
 ----
-PUT /logs/_mapping?write_index_only=true
+PUT /my-data-stream/_mapping?write_index_only=true
 {
   "properties": {
     "host": {
@@ -276,17 +276,17 @@ follow these steps:
 applied to future backing indices created for the stream.
 +
 --
-For example, `logs_data_stream` is an existing index template used by the `logs`
-data stream.
+For example, `my-data-stream-template` is an existing index template used by
+`my-data-stream`.

 The following <<indices-templates,put index template>> request changes the
 template's `index.refresh_interval` index setting to `30s` (30 seconds).

 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { },
   "template": {
     "settings": {
@@ -304,11 +304,11 @@ the stream's existing backing indices, including the write index.
 +
 --
 The following update index settings API request updates the
-`index.refresh_interval` setting for the `logs` data stream.
+`index.refresh_interval` setting for `my-data-stream`.

 [source,console]
 ----
-PUT /logs/_settings
+PUT /my-data-stream/_settings
 {
   "index": {
     "refresh_interval": "30s"
@@ -329,17 +329,17 @@ To apply a new static setting to future backing indices, update the index
 template used by the data stream. The setting is automatically applied to any
 backing index created after the update.

-For example, `logs_data_stream` is an existing index template used by the `logs`
-data stream.
+For example, `my-data-stream-template` is an existing index template used by
+`my-data-stream`.

 The following <<indices-templates,put index template API>> requests adds new
 `sort.field` and `sort.order index` settings to the template.

 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { },
   "template": {
     "settings": {
@@ -386,12 +386,12 @@ existing indices, index aliases, or data streams. If so, you should consider
 using another name or pattern.
 --
 The following resolve index API request checks for any existing indices, index
-aliases, or data streams that start with `new_logs`. If not, the `new_logs*`
-wildcard pattern can be used to create a new data stream.
+aliases, or data streams that start with `new-data-stream`. If not, the
+`new-data-stream*` wildcard pattern can be used to create a new data stream.

 [source,console]
 ----
-GET /_resolve/index/new_logs*
+GET /_resolve/index/new-data-stream*
 ----

 The API returns the following response, indicating no existing targets match
@@ -421,25 +421,26 @@ TIP: If you are only adding or changing a few things, we recommend you create a
 new template by copying an existing one and modifying it as needed.
 +
 --
-For example, `logs_data_stream` is an existing index template used by the
-`logs` data stream.
+For example, `my-data-stream-template` is an existing index template used by
+`my-data-stream`.

-The following <<indices-templates,put index template API>> request creates
-a new index template, `new_logs_data_stream`. `new_logs_data_stream`
-uses the `logs_data_stream` template as its basis, with the following changes:
+The following <<indices-templates,put index template API>> request creates a new
+index template, `new-data-stream-template`. `new-data-stream-template`
+uses `my-data-stream-template` as its basis, with the following
+changes:

 * The `index_patterns` wildcard pattern matches any index or data stream
-starting with `new_logs`.
+starting with `new-data-stream`.
 * The `@timestamp` field mapping uses the `date_nanos` field data type rather
 than the `date` data type.
 * The template includes `sort.field` and `sort.order` index settings, which were
-not in the original `logs_data_stream` template.
+not in the original `my-data-stream-template` template.

 [source,console]
 ----
-PUT /_index_template/new_logs_data_stream
+PUT /_index_template/new-data-stream-template
 {
-  "index_patterns": [ "new_logs*" ],
+  "index_patterns": [ "new-data-stream*" ],
   "data_stream": { },
   "template": {
     "mappings": {
@@ -481,16 +482,16 @@ to retain such a backing index until you are ready to delete its newest data.
 ====
 +
 --
-The following create data stream API request targets `new_logs`, which matches
-the wildcard pattern for the `new_logs_data_stream` template. Because no
-existing index or data stream uses this name, this request creates the
-`new_logs` data stream.
+The following create data stream API request targets `new-data-stream`, which
+matches the wildcard pattern for `new-data-stream-template`.
+Because no existing index or data stream uses this name, this request creates
+the `new-data-stream` data stream.

 [source,console]
 ----
-PUT /_data_stream/new_logs
+PUT /_data_stream/new-data-stream
 ----
-// TEST[s/new_logs/new_logs_two/]
+// TEST[s/new-data-stream/new-data-stream-two/]
 --

 . If you do not want to mix new and old data in your new data stream, pause the
@@ -527,46 +528,46 @@ individual backing indices as the source. You can use the
 indices.
 +
 --
-You plan to reindex data from the `logs` data stream into the newly created
-`new_logs` data stream. However, you want to submit a separate reindex request
-for each backing index in the `logs` data stream, starting with the oldest
-backing index. This preserves the order in which the data was originally
-indexed.
+For example, you plan to reindex data from `my-data-stream` into
+`new-data-stream`. However, you want to submit a separate reindex request for
+each backing index in `my-data-stream`, starting with the oldest backing index.
+This preserves the order in which the data was originally indexed.

-The following get data stream API request retrieves information about the `logs`
-data stream, including a list of its backing indices.
+The following get data stream API request retrieves information about
+`my-data-stream`, including a list of its backing indices.

 [source,console]
 ----
-GET /_data_stream/logs
+GET /_data_stream/my-data-stream
 ----

 The API returns the following response. Note the `indices` property contains an
 array of the stream's current backing indices. The first item in the array
-contains information about the stream's oldest backing index, `.ds-logs-000001`.
+contains information about the stream's oldest backing index,
+`.ds-my-data-stream-000001`.

 [source,console-result]
 ----
 {
   "data_streams": [
     {
-      "name": "logs",
+      "name": "my-data-stream",
       "timestamp_field": {
         "name": "@timestamp"
       },
       "indices": [
         {
-          "index_name": ".ds-logs-000001", <1>
+          "index_name": ".ds-my-data-stream-000001", <1>
           "index_uuid": "Gpdiyq8sRuK9WuthvAdFbw"
         },
         {
-          "index_name": ".ds-logs-000002",
+          "index_name": ".ds-my-data-stream-000002",
           "index_uuid": "_eEfRrFHS9OyhqWntkgHAQ"
         }
       ],
       "generation": 2,
       "status": "GREEN",
-      "template": "logs_data_stream"
+      "template": "my-data-stream-template"
     }
   ]
 }
|
@ -575,22 +576,23 @@ contains information about the stream's oldest backing index, `.ds-logs-000001`.
|
|||
// TESTRESPONSE[s/"index_uuid": "_eEfRrFHS9OyhqWntkgHAQ"/"index_uuid": $body.data_streams.0.indices.1.index_uuid/]
|
||||
// TESTRESPONSE[s/"status": "GREEN"/"status": "YELLOW"/]
|
||||
|
||||
<1> First item in the `indices` array for the `logs` data stream. This item
|
||||
contains information about the stream's oldest backing index, `.ds-logs-000001`.
|
||||
<1> First item in the `indices` array for `my-data-stream`. This
|
||||
item contains information about the stream's oldest backing index,
|
||||
`.ds-my-data-stream-000001`.
|
||||
|
||||
The following <<docs-reindex,reindex API>> request copies documents from
|
||||
`.ds-logs-000001` to the `new_logs` data stream. Note the request's `op_type` is
|
||||
`create`.
|
||||
`.ds-my-data-stream-000001` to `new-data-stream`. Note the request's `op_type`
|
||||
is `create`.
|
||||
|
||||
[source,console]
|
||||
----
|
||||
POST /_reindex
|
||||
{
|
||||
"source": {
|
||||
"index": ".ds-logs-000001"
|
||||
"index": ".ds-my-data-stream-000001"
|
||||
},
|
||||
"dest": {
|
||||
"index": "new_logs",
|
||||
"index": "new-data-stream",
|
||||
"op_type": "create"
|
||||
}
|
||||
}
|
||||
|
@@ -601,9 +603,9 @@ You can also use a query to reindex only a subset of documents with each
 request.
 +
 --
-The following <<docs-reindex,reindex API>> request copies documents from the
-`logs` data stream to the `new_logs` data stream. The request uses a
-<<query-dsl-range-query,`range` query>> to only reindex documents with a
+The following <<docs-reindex,reindex API>> request copies documents from
+`my-data-stream` to `new-data-stream`. The request
+uses a <<query-dsl-range-query,`range` query>> to only reindex documents with a
 timestamp within the last week. Note the request's `op_type` is `create`.

 [source,console]
@@ -611,7 +613,7 @@ timestamp within the last week. Note the request's `op_type` is `create`.
 POST /_reindex
 {
   "source": {
-    "index": "logs",
+    "index": "my-data-stream",
     "query": {
       "range": {
         "@timestamp": {
@@ -622,7 +624,7 @@ POST /_reindex
     }
   },
   "dest": {
-    "index": "new_logs",
+    "index": "new-data-stream",
     "op_type": "create"
   }
 }
@@ -656,11 +658,11 @@ data stream, you can safely remove the old stream.
 +
 --
 The following <<indices-delete-data-stream,delete data stream API>> request
-deletes the `logs` data stream. This request also deletes the stream's backing
-indices and any data they contain.
+deletes `my-data-stream`. This request also deletes the stream's
+backing indices and any data they contain.

 [source,console]
 ----
-DELETE /_data_stream/logs
+DELETE /_data_stream/my-data-stream
 ----
 --
@@ -90,9 +90,9 @@ convention:
 .ds-<data-stream>-<generation>
 ----

-For example, the `web_server_logs` data stream has a generation of `34`. The
+For example, the `web-server-logs` data stream has a generation of `34`. The
 most recently created backing index for this data stream is named
-`.ds-web_server_logs-000034`.
+`.ds-web-server-logs-000034`.

 Because the generation increments with each new backing index, backing indices
 with a higher generation contain more recent data. Backing indices with a lower
@@ -65,15 +65,15 @@ image::images/ilm/create-policy.png[Index Lifecycle Policies page]
 You can also create a policy using the <<ilm-put-lifecycle,create lifecycle
 policy API>>.

-The following request configures the `logs_policy` lifecycle policy. The
-`logs_policy` policy uses the <<ilm-rollover,`rollover` action>> to create a
+The following request configures the `my-data-stream-policy` lifecycle policy.
+The policy uses the <<ilm-rollover,`rollover` action>> to create a
 new <<data-stream-write-index,write index>> for the data stream when the current
 one reaches 25GB in size. The policy also deletes backing indices 30 days after
 their rollover.

 [source,console]
 ----
-PUT /_ilm/policy/logs_policy
+PUT /_ilm/policy/my-data-stream-policy
 {
   "policy": {
     "phases": {
@@ -139,19 +139,19 @@ template API>>. The template must include a `data_stream` object with an empty
 body (`{ }`). This object indicates the template is used exclusively for data
 streams.

-The following request configures the `logs_data_stream` index template. Because
-no field mapping is specified, the `@timestamp` field uses the `date` field data
-type by default.
+The following request configures the `my-data-stream-template` index template.
+Because no field mapping is specified, the `@timestamp` field uses the `date`
+field data type by default.

 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { },
   "template": {
     "settings": {
-      "index.lifecycle.name": "logs_policy"
+      "index.lifecycle.name": "my-data-stream-policy"
     }
   }
 }
@@ -162,9 +162,9 @@ Alternatively, the following template maps `@timestamp` as a `date_nanos` field.

 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { },
   "template": {
     "mappings": {
@@ -173,7 +173,7 @@ PUT /_index_template/logs_data_stream
       }
     },
     "settings": {
-      "index.lifecycle.name": "logs_policy"
+      "index.lifecycle.name": "my-data-stream-policy"
     }
   }
 }
@@ -210,14 +210,14 @@ uses the target name as the name for the stream.
 NOTE: Data streams support only specific types of indexing requests. See
 <<add-documents-to-a-data-stream>>.

-The following <<docs-index_,index API>> request targets `logs`, which matches
-the wildcard pattern for the `logs_data_stream` template. Because no existing
-index or data stream uses this name, this request creates the `logs` data stream
-and indexes the document to it.
+The following <<docs-index_,index API>> request targets `my-data-stream`, which
+matches the wildcard pattern for `my-data-stream-template`. Because
+no existing index or data stream uses this name, this request creates the
+`my-data-stream` data stream and indexes the document to it.

 [source,console]
 ----
-POST /logs/_doc/
+POST /my-data-stream/_doc/
 {
   "@timestamp": "2020-12-06T11:04:05.000Z",
   "user": {
@@ -229,13 +229,13 @@ POST /logs/_doc/
 // TEST[continued]

 The API returns the following response. Note the `_index` property contains
-`.ds-logs-000001`, indicating the document was indexed to the write index of the
-new `logs` data stream.
+`.ds-my-data-stream-000001`, indicating the document was indexed to the write
+index of the new data stream.

 [source,console-result]
 ----
 {
-  "_index": ".ds-logs-000001",
+  "_index": ".ds-my-data-stream-000001",
   "_id": "qecQmXIBT4jB8tq1nG0j",
   "_type": "_doc",
   "_version": 1,
@@ -259,14 +259,14 @@ You can use the <<indices-create-data-stream,create data stream API>> to
 manually create a data stream. The name of the data stream must match the name
 or wildcard pattern defined in the template's `index_patterns` property.

-The following create data stream request
-targets `logs_alt`, which matches the wildcard pattern for the
-`logs_data_stream` template. Because no existing index or data stream uses this
-name, this request creates the `logs_alt` data stream.
+The following create data stream request targets `my-data-stream-alt`, which
+matches the wildcard pattern for `my-data-stream-template`. Because
+no existing index or data stream uses this name, this request creates the
+`my-data-stream-alt` data stream.

 [source,console]
 ----
-PUT /_data_stream/logs_alt
+PUT /_data_stream/my-data-stream-alt
 ----
 // TEST[continued]
@@ -292,50 +292,50 @@ the following information about one or more data streams:
 * The current {ilm-init} lifecycle policy in the stream's matching index
 template

-The following get data stream API request retrieves information about the
-`logs` data stream.
+The following get data stream API request retrieves information about
+`my-data-stream`.

 ////
 [source,console]
 ----
-POST /logs/_rollover/
+POST /my-data-stream/_rollover/
 ----
 // TEST[continued]
 ////

 [source,console]
 ----
-GET /_data_stream/logs
+GET /_data_stream/my-data-stream
 ----
 // TEST[continued]

 The API returns the following response. Note the `indices` property contains an
 array of the stream's current backing indices. The last item in this array
-contains information about the stream's write index, `.ds-logs-000002`.
+contains information about the stream's write index, `.ds-my-data-stream-000002`.

 [source,console-result]
 ----
 {
   "data_streams": [
     {
-      "name": "logs",
+      "name": "my-data-stream",
       "timestamp_field": {
         "name": "@timestamp"
       },
       "indices": [
         {
-          "index_name": ".ds-logs-000001",
+          "index_name": ".ds-my-data-stream-000001",
           "index_uuid": "krR78LfvTOe6gr5dj2_1xQ"
         },
         {
-          "index_name": ".ds-logs-000002", <1>
+          "index_name": ".ds-my-data-stream-000002", <1>
           "index_uuid": "C6LWyNJHQWmA08aQGvqRkA"
         }
       ],
       "generation": 2,
       "status": "GREEN",
-      "template": "logs_data_stream",
-      "ilm_policy": "logs_policy"
+      "template": "my-data-stream-template",
+      "ilm_policy": "my-data-stream-policy"
     }
   ]
 }
@@ -344,8 +344,9 @@ contains information about the stream's write index, `.ds-logs-000002`.
 // TESTRESPONSE[s/"index_uuid": "C6LWyNJHQWmA08aQGvqRkA"/"index_uuid": $body.data_streams.0.indices.1.index_uuid/]
 // TESTRESPONSE[s/"status": "GREEN"/"status": "YELLOW"/]

-<1> Last item in the `indices` array for the `logs` data stream. This item
-contains information about the stream's current write index, `.ds-logs-000002`.
+<1> Last item in the `indices` array for `my-data-stream`. This
+item contains information about the stream's current write index,
+`.ds-my-data-stream-000002`.

 [discrete]
 [[secure-a-data-stream]]
@@ -368,12 +369,12 @@ image::images/data-streams/data-streams-list.png[Data Streams tab]

 You can also use the the <<indices-delete-data-stream,delete data stream API>>
 to delete a data stream. The following delete data stream API request deletes
-the `logs` data stream. This request also deletes the stream's backing indices
-and any data they contain.
+`my-data-stream`. This request also deletes the stream's backing
+indices and any data they contain.

 [source,console]
 ----
-DELETE /_data_stream/logs
+DELETE /_data_stream/my-data-stream
 ----
 // TEST[continued]
@@ -382,7 +383,7 @@ DELETE /_data_stream/logs
 ----
 DELETE /_data_stream/*
 DELETE /_index_template/*
-DELETE /_ilm/policy/logs_policy
+DELETE /_ilm/policy/my-data-stream-policy
 ----
 // TEST[continued]
 ////
@@ -18,19 +18,19 @@ the following:
 ////
 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { }
 }

-PUT /_data_stream/logs
+PUT /_data_stream/my-data-stream

-POST /logs/_rollover/
+POST /my-data-stream/_rollover/

-POST /logs/_rollover/
+POST /my-data-stream/_rollover/

-PUT /logs/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
+PUT /my-data-stream/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
 {
   "@timestamp": "2020-12-07T11:06:07.000Z",
   "user": {
@@ -39,7 +39,7 @@ PUT /logs/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
   "message": "Login successful"
 }

-PUT /_data_stream/logs_alt
+PUT /_data_stream/my-data-stream-alt
 ----
 // TESTSETUP
@@ -84,12 +84,11 @@ to a data stream.

 NOTE: The `op_type` parameter defaults to `create` when adding new documents.

-The following index API request adds a new document to the `logs` data
-stream.
+The following index API request adds a new document to `my-data-stream`.

 [source,console]
 ----
-POST /logs/_doc/
+POST /my-data-stream/_doc/
 {
   "@timestamp": "2020-12-07T11:06:07.000Z",
   "user": {
@@ -115,11 +114,11 @@ stream in a single request. Each action in the bulk request must use the
 NOTE: Data streams do not support other bulk actions, such as `index`.

 The following bulk API request adds several new documents to
-the `logs` data stream. Note that only the `create` action is used.
+`my-data-stream`. Only the `create` action is used.

 [source,console]
 ----
-PUT /logs/_bulk?refresh
+PUT /my-data-stream/_bulk?refresh
 {"create":{ }}
 { "@timestamp": "2020-12-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
 {"create":{ }}
@@ -156,7 +155,7 @@ PUT /_ingest/pipeline/lowercase_message_field
 ----
 // TEST[continued]

-The following index API request adds a new document to the `logs` data stream.
+The following index API request adds a new document to `my-data-stream`.

 The request includes a `?pipeline=lowercase_message_field` query parameter.
 This parameter indicates {es} should use the `lowercase_message_field` pipeline
@@ -167,7 +166,7 @@ During pre-processing, the pipeline changes the letter case of the document's

 [source,console]
 ----
-POST /logs/_doc?pipeline=lowercase_message_field
+POST /my-data-stream/_doc?pipeline=lowercase_message_field
 {
   "@timestamp": "2020-12-08T11:12:01.000Z",
   "user": {
@@ -199,13 +198,13 @@ The following search APIs support data streams:
 * <<search-field-caps, Field capabilities>>
 * <<eql-search-api, EQL search>>

-The following <<search-search,search API>> request searches the `logs` data
-stream for documents with a timestamp between today and yesterday that also have
+The following <<search-search,search API>> request searches `my-data-stream`
+for documents with a timestamp between today and yesterday that also have
 `message` value of `login successful`.

 [source,console]
 ----
-GET /logs/_search
+GET /my-data-stream/_search
 {
   "query": {
     "bool": {
@@ -230,12 +229,12 @@ GET /logs/_search
 You can use a comma-separated list or wildcard (`*`) expression to search
 multiple data streams, indices, and index aliases in the same request.

-The following request searches the `logs` and `logs_alt` data streams, which are
-specified as a comma-separated list in the request path.
+The following request searches `my-data-stream` and `my-data-stream-alt`,
+which are specified as a comma-separated list in the request path.

 [source,console]
 ----
-GET /logs,logs_alt/_search
+GET /my-data-stream,my-data-stream-alt/_search
 {
   "query": {
     "match": {
@@ -244,12 +243,12 @@ GET /logs,logs_alt/_search
   }
 }
 ----
-The following request uses the `logs*` wildcard expression to search any data
-stream, index, or index alias beginning with `logs`.
+The following request uses the `my-data-stream*` wildcard expression to search any data
+stream, index, or index alias beginning with `my-data-stream`.

 [source,console]
 ----
-GET /logs*/_search
+GET /my-data-stream*/_search
 {
   "query": {
     "match": {
@@ -288,12 +287,12 @@ statistics for one or more data streams. These statistics include:
 .*Example*
 [%collapsible]
 ====
-The following data stream stats API request retrieves statistics for the
-`logs` data stream.
+The following data stream stats API request retrieves statistics for
+`my-data-stream`.

 [source,console]
 ----
-GET /_data_stream/logs/_stats?human=true
+GET /_data_stream/my-data-stream/_stats?human=true
 ----

 The API returns the following response.
@@ -312,7 +311,7 @@ The API returns the following response.
     "total_store_size_bytes": 624,
     "data_streams": [
       {
-        "data_stream": "logs",
+        "data_stream": "my-data-stream",
         "backing_indices": 3,
         "store_size": "624b",
         "store_size_bytes": 624,
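The `human=true` parameter in the stats request above adds human-readable fields such as `store_size` alongside the raw byte counts. A rough Python sketch of that kind of formatting (the thresholds and unit labels here are simplified illustrations, not Elasticsearch's exact rules):

```python
def human_size(num_bytes: int) -> str:
    # Illustrative byte formatting in the spirit of `?human=true`.
    size = float(num_bytes)
    for unit in ("b", "kb", "mb", "gb"):
        if size < 1024 or unit == "gb":
            # Whole bytes print without a decimal part, e.g. "624b".
            return f"{int(size)}{unit}" if unit == "b" else f"{size:.1f}{unit}"
        size /= 1024

print(human_size(624))   # 624b
print(human_size(1536))  # 1.5kb
```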
@@ -346,11 +345,11 @@ manually perform a rollover. This can be useful if you want to
 to the stream's write index after updating a data stream's template.

 The following <<indices-rollover-index,rollover API>> request submits a manual
-rollover request for the `logs` data stream.
+rollover request for `my-data-stream`.

 [source,console]
 ----
-POST /logs/_rollover/
+POST /my-data-stream/_rollover/
 ----

 [discrete]
@@ -370,50 +369,50 @@ You also can conveniently re-open all closed backing indices for a data stream
 by sending an open request directly to the stream.

 The following <<cat-indices,cat indices>> API request retrieves the status for
-the `logs` data stream's backing indices.
+`my-data-stream`'s backing indices.

 ////
 [source,console]
 ----
-POST /.ds-logs-000001,.ds-logs-000002/_close/
+POST /.ds-my-data-stream-000001,.ds-my-data-stream-000002/_close/
 ----
 ////

 [source,console]
 ----
-GET /_cat/indices/logs?v&s=index&h=index,status
+GET /_cat/indices/my-data-stream?v&s=index&h=index,status
 ----
 // TEST[continued]

-The API returns the following response. The response indicates the `logs` data
-stream contains two closed backing indices: `.ds-logs-000001` and
-`.ds-logs-000002`.
+The API returns the following response. The response indicates
+`my-data-stream` contains two closed backing indices:
+`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

 [source,txt]
 ----
-index           status
-.ds-logs-000001 close
-.ds-logs-000002 close
-.ds-logs-000003 open
+index                     status
+.ds-my-data-stream-000001 close
+.ds-my-data-stream-000002 close
+.ds-my-data-stream-000003 open
 ----
 // TESTRESPONSE[non_json]

 The following <<indices-open-close,open API>> request re-opens any closed
-backing indices for the `logs` data stream, including `.ds-logs-000001` and
-`.ds-logs-000002`.
+backing indices for `my-data-stream`, including
+`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

 [source,console]
 ----
-POST /logs/_open/
+POST /my-data-stream/_open/
 ----
 // TEST[continued]

-You can resubmit the original cat indices API request to verify the
-`.ds-logs-000001` and `.ds-logs-000002` backing indices were re-opened.
+You can resubmit the original cat indices API request to verify
+`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002` were re-opened.

 [source,console]
 ----
-GET /_cat/indices/logs?v&s=index&h=index,status
+GET /_cat/indices/my-data-stream?v&s=index&h=index,status
 ----
 // TEST[continued]

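The backing index names throughout these hunks follow the `.ds-<data-stream>-<six-digit generation>` pattern (e.g. `.ds-my-data-stream-000003`). A small Python sketch of that naming scheme (the helper name is hypothetical):

```python
def backing_index(stream: str, generation: int) -> str:
    # Backing indices in this doc version are named
    # `.ds-<stream>-<generation>`, with the generation zero-padded to 6 digits.
    return f".ds-{stream}-{generation:06d}"

print(backing_index("my-data-stream", 3))  # .ds-my-data-stream-000003
```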
@@ -421,10 +420,10 @@ The API returns the following response.

 [source,txt]
 ----
-index           status
-.ds-logs-000001 open
-.ds-logs-000002 open
-.ds-logs-000003 open
+index                     status
+.ds-my-data-stream-000001 open
+.ds-my-data-stream-000002 open
+.ds-my-data-stream-000003 open
 ----
 // TESTRESPONSE[non_json]

@@ -461,8 +460,8 @@ write index, we recommend you update the <<create-a-data-stream-template,data
 stream's template>> and perform a <<manually-roll-over-a-data-stream,rollover>>.

 The following reindex request copies documents from the `archive` index alias to
-the existing `logs` data stream. Because the destination is a data stream, the
-request's `op_type` is `create`.
+`my-data-stream`. Because the destination is a data
+stream, the request's `op_type` is `create`.

 ////
 [source,console]
@@ -495,7 +494,7 @@ POST /_reindex
     "index": "archive"
   },
   "dest": {
-    "index": "logs",
+    "index": "my-data-stream",
     "op_type": "create"
   }
 }
@@ -505,16 +504,16 @@ POST /_reindex
 You can also reindex documents from a data stream to an index, index
 alias, or data stream.

-The following reindex request copies documents from the `logs` data stream
-to the existing `archive` index alias. Because the destination is not a data
-stream, the `op_type` does not need to be specified.
+The following reindex request copies documents from `my-data-stream`
+to the existing `archive` index alias. Because the destination is not a
+data stream, the `op_type` does not need to be specified.

 [source,console]
 ----
 POST /_reindex
 {
   "source": {
-    "index": "logs"
+    "index": "my-data-stream"
   },
   "dest": {
     "index": "archive"
@@ -540,14 +539,14 @@ data stream. These prohibited requests include:
 Instead, you can use the <<docs-update-by-query,update by query API>> to update
 documents in a data stream that matches a provided query.

-The following update by query request updates documents in the `logs` data
-stream with a `user.id` of `l7gk7f82`. The request uses a
+The following update by query request updates documents in `my-data-stream`
+with a `user.id` of `l7gk7f82`. The request uses a
 <<modules-scripting-using,script>> to assign matching documents a new `user.id`
 value of `XgdX0NoX`.

 [source,console]
 ----
-POST /logs/_update_by_query
+POST /my-data-stream/_update_by_query
 {
   "query": {
     "match": {
@@ -577,12 +576,12 @@ prohibited requests include:
 Instead, you can use the <<docs-delete-by-query,delete by query API>> to delete
 documents in a data stream that matches a provided query.

-The following delete by query request deletes documents in the `logs` data
-stream with a `user.id` of `vlb44hny`.
+The following delete by query request deletes documents in `my-data-stream`
+with a `user.id` of `vlb44hny`.

 [source,console]
 ----
-POST /logs/_delete_by_query
+POST /my-data-stream/_delete_by_query
 {
   "query": {
     "match": {
@@ -609,9 +608,9 @@ If you want to update a document, you must also get its current
 You can use a <<search-a-data-stream,search request>> to retrieve this
 information.

-The following search request retrieves documents in the `logs` data stream with
-a `user.id` of `yWIumJd7`. By default, this search returns the document ID and
-backing index for any matching documents.
+The following search request retrieves documents in `my-data-stream`
+with a `user.id` of `yWIumJd7`. By default, this search returns the
+document ID and backing index for any matching documents.

 The request includes a `"seq_no_primary_term": true` argument. This means the
 search also returns the sequence number and primary term for any matching
@@ -619,7 +618,7 @@ documents.

 [source,console]
 ----
-GET /logs/_search
+GET /my-data-stream/_search
 {
   "seq_no_primary_term": true,
   "query": {
@@ -652,7 +651,7 @@ information for any documents matching the search.
     "max_score": 0.2876821,
     "hits": [
       {
-        "_index": ".ds-logs-000003", <1>
+        "_index": ".ds-my-data-stream-000003", <1>
         "_type": "_doc",
         "_id": "bfspvnIBr7VVZlfp2lqX", <2>
         "_seq_no": 0, <3>
@@ -683,9 +682,9 @@ You can use an <<docs-index_,index API>> request to update an individual
 document. To prevent an accidental overwrite, this request must include valid
 `if_seq_no` and `if_primary_term` arguments.

-The following index API request updates an existing document in the `logs` data
-stream. The request targets document ID `bfspvnIBr7VVZlfp2lqX` in the
-`.ds-logs-000003` backing index.
+The following index API request updates an existing document in
+`my-data-stream`. The request targets document ID
+`bfspvnIBr7VVZlfp2lqX` in the `.ds-my-data-stream-000003` backing index.

 The request also includes the current sequence number and primary term in the
 respective `if_seq_no` and `if_primary_term` query parameters. The request body
@@ -693,7 +692,7 @@ contains a new JSON source for the document.

 [source,console]
 ----
-PUT /.ds-logs-000003/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=0&if_primary_term=1
+PUT /.ds-my-data-stream-000003/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=0&if_primary_term=1
 {
   "@timestamp": "2020-12-07T11:06:07.000Z",
   "user": {
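The `if_seq_no` and `if_primary_term` parameters in the request above implement optimistic concurrency control: the write succeeds only if the stored document still carries the expected sequence number and primary term. A minimal in-memory Python sketch of that check (an illustration, not the Elasticsearch implementation):

```python
class VersionConflict(Exception):
    """Raised when a conditional write loses a concurrent-update race."""

class Doc:
    def __init__(self, source, seq_no=0, primary_term=1):
        self.source = source
        self.seq_no = seq_no
        self.primary_term = primary_term

def conditional_update(doc, new_source, if_seq_no, if_primary_term):
    # Apply the write only if both values still match the stored document,
    # mirroring the `if_seq_no`/`if_primary_term` precondition.
    if (doc.seq_no, doc.primary_term) != (if_seq_no, if_primary_term):
        raise VersionConflict("document was changed by another writer")
    doc.source = new_source
    doc.seq_no += 1  # every successful write bumps the sequence number
    return doc

doc = Doc({"user": {"id": "yWIumJd7"}})
conditional_update(doc, {"user": {"id": "8a4f500d"}}, if_seq_no=0, if_primary_term=1)
print(doc.seq_no)  # 1
```

A second write that reuses the stale `if_seq_no=0` would raise `VersionConflict`, which is the behavior the query parameters guard against.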
@@ -706,13 +705,13 @@ PUT /.ds-logs-000003/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=0&if_primary_term=1
 You use the <<docs-delete,delete API>> to delete individual documents. Deletion
 requests do not require a sequence number or primary term.

-The following index API request deletes an existing document in the `logs` data
-stream. The request targets document ID `bfspvnIBr7VVZlfp2lqX` in the
-`.ds-logs-000003` backing index.
+The following index API request deletes an existing document in
+`my-data-stream`. The request targets document ID
+`bfspvnIBr7VVZlfp2lqX` in the `.ds-my-data-stream-000003` backing index.

 [source,console]
 ----
-DELETE /.ds-logs-000003/_doc/bfspvnIBr7VVZlfp2lqX
+DELETE /.ds-my-data-stream-000003/_doc/bfspvnIBr7VVZlfp2lqX
 ----

 You can use the <<docs-bulk,bulk API>> to delete or update multiple documents in
@@ -723,17 +722,17 @@ If the action type is `index`, the action must include valid
 arguments.

 The following bulk API request uses an `index` action to update an existing
-document in the `logs` data stream.
+document in `my-data-stream`.

 The `index` action targets document ID `bfspvnIBr7VVZlfp2lqX` in the
-`.ds-logs-000003` backing index. The action also includes the current sequence
-number and primary term in the respective `if_seq_no` and `if_primary_term`
-parameters.
+`.ds-my-data-stream-000003` backing index. The action also includes the current
+sequence number and primary term in the respective `if_seq_no` and
+`if_primary_term` parameters.

 [source,console]
 ----
 PUT /_bulk?refresh
-{ "index": { "_index": ".ds-logs-000003", "_id": "bfspvnIBr7VVZlfp2lqX", "if_seq_no": 0, "if_primary_term": 1 } }
+{ "index": { "_index": ".ds-my-data-stream-000003", "_id": "bfspvnIBr7VVZlfp2lqX", "if_seq_no": 0, "if_primary_term": 1 } }
 { "@timestamp": "2020-12-07T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
 ----

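Each `index` action in the bulk request above is a pair of NDJSON lines: the action metadata, then the document source, each terminated by a newline. A small Python sketch that builds one such pair (the helper name is hypothetical):

```python
import json

def bulk_index_pair(index, doc_id, if_seq_no, if_primary_term, source):
    # One `index` action = action-metadata line + document-source line,
    # each newline-terminated, matching the shape of the bulk request body.
    action = {"index": {"_index": index, "_id": doc_id,
                        "if_seq_no": if_seq_no, "if_primary_term": if_primary_term}}
    return json.dumps(action) + "\n" + json.dumps(source) + "\n"

body = bulk_index_pair(
    ".ds-my-data-stream-000003", "bfspvnIBr7VVZlfp2lqX", 0, 1,
    {"@timestamp": "2020-12-07T11:06:07.000Z",
     "user": {"id": "8a4f500d"},
     "message": "Login successful"},
)
print(body, end="")
```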
Binary file not shown.
Before: 16 KiB | After: 54 KiB
@@ -257,7 +257,7 @@ POST /my-data-stream/_rollover <2>
 <1> Creates a data stream called `my-data-stream` with one initial backing index
 named `my-data-stream-000001`.
 <2> This request creates a new backing index, `my-data-stream-000002`, and adds
-it as the write index for the `my-data-stream` data stream if the current
+it as the write index for `my-data-stream` if the current
 write index meets at least one of the following conditions:
 +
 --
@@ -17,37 +17,38 @@ to control access to a data stream. Any role or user granted privileges to a
 data stream are automatically granted the same privileges to its backing
 indices.

-`logs` is a data stream that consists of two backing indices: `.ds-logs-000001`
-and `.ds-logs-000002`.
+For example, `my-data-stream` consists of two backing indices:
+`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

-A user is granted the `read` privilege to the `logs` data stream.
+A user is granted the `read` privilege to `my-data-stream`.

 [source,js]
 --------------------------------------------------
 {
-  "names" : [ "logs" ],
+  "names" : [ "my-data-stream" ],
   "privileges" : [ "read" ]
 }
 --------------------------------------------------
 // NOTCONSOLE

 Because the user is automatically granted the same privileges to the stream's
-backing indices, the user can retrieve a document directly from `.ds-logs-000002`:
+backing indices, the user can retrieve a document directly from
+`.ds-my-data-stream-000002`:

 ////
 [source,console]
 ----
-PUT /_index_template/logs_data_stream
+PUT /_index_template/my-data-stream-template
 {
-  "index_patterns": [ "logs*" ],
+  "index_patterns": [ "my-data-stream*" ],
   "data_stream": { }
 }

-PUT /_data_stream/logs
+PUT /_data_stream/my-data-stream

-POST /logs/_rollover/
+POST /my-data-stream/_rollover/

-PUT /logs/_create/2?refresh=wait_for
+PUT /my-data-stream/_create/2?refresh=wait_for
 {
   "@timestamp": "2020-12-07T11:06:07.000Z"
 }
@@ -56,21 +57,21 @@ PUT /logs/_create/2?refresh=wait_for

 [source,console]
 ----
-GET /.ds-logs-000002/_doc/2
+GET /.ds-my-data-stream-000002/_doc/2
 ----
 // TEST[continued]

-Later the `logs` data stream <<manually-roll-over-a-data-stream,rolls over>>.
-This creates a new backing index: `.ds-logs-000003`. Because the user still has
-the `read` privilege for the `logs` data stream, the user can retrieve documents
-directly from `.ds-logs-000003`:
+Later `my-data-stream` <<manually-roll-over-a-data-stream,rolls over>>. This
+creates a new backing index: `.ds-my-data-stream-000003`. Because the user still
+has the `read` privilege for `my-data-stream`, the user can retrieve
+documents directly from `.ds-my-data-stream-000003`:

 ////
 [source,console]
 ----
-POST /logs/_rollover/
+POST /my-data-stream/_rollover/

-PUT /logs/_create/2?refresh=wait_for
+PUT /my-data-stream/_create/2?refresh=wait_for
 {
   "@timestamp": "2020-12-07T11:06:07.000Z"
 }
@@ -80,7 +81,7 @@ PUT /logs/_create/2?refresh=wait_for

 [source,console]
 ----
-GET /.ds-logs-000003/_doc/2
+GET /.ds-my-data-stream-000003/_doc/2
 ----
 // TEST[continued]