[DOCS] Update example data stream names (#60783) (#60820)

Uses `my-data-stream` in place of `logs` for data stream examples.
This provides a more intuitive experience for users that copy/paste
their own values into snippets.
James Rodewig 2020-08-06 09:38:35 -04:00 committed by GitHub
parent 721198c29e
commit ff4ea4720a
7 changed files with 231 additions and 228 deletions

View File

@@ -28,7 +28,7 @@ mappings and change <<index-modules-settings,dynamic index settings>>. See
////
[source,console]
----
PUT /_ilm/policy/my-data-stream-policy
{
  "policy": {
    "phases": {
@@ -49,23 +49,23 @@ PUT /_ilm/policy/logs_policy
  }
}

PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { }
}

PUT /_index_template/new-data-stream-template
{
  "index_patterns": [ "new-data-stream*" ],
  "data_stream": { }
}

PUT /_data_stream/my-data-stream

POST /my-data-stream/_rollover/

PUT /_data_stream/new-data-stream
----
// TESTSETUP
@@ -75,7 +75,7 @@ DELETE /_data_stream/*
DELETE /_index_template/*
DELETE /_ilm/policy/my-data-stream-policy
----
// TEARDOWN
////
@@ -90,17 +90,17 @@ To add a mapping for a new field to a data stream, follow these steps:
field mapping is added to future backing indices created for the stream.
+
--
For example, `my-data-stream-template` is an existing index template used by
`my-data-stream`.

The following <<indices-templates,put index template>> request adds a mapping
for a new field, `message`, to the template.

[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { },
  "template": {
    "mappings": {
@@ -122,11 +122,11 @@ backing indices, including the write index.
+
--
The following put mapping API request adds the new `message` field mapping to
`my-data-stream`.

[source,console]
----
PUT /my-data-stream/_mapping
{
  "properties": {
    "message": {
@@ -142,12 +142,12 @@ To add the mapping only to the stream's write index, set the put mapping API's
+
--
The following put mapping request adds the new `message` field mapping only to
`my-data-stream`'s write index. The new field mapping is not added to
the stream's other backing indices.

[source,console]
----
PUT /my-data-stream/_mapping?write_index_only=true
{
  "properties": {
    "message": {
@@ -171,8 +171,8 @@ existing field, follow these steps:
field mapping is added to future backing indices created for the stream.
+
--
For example, `my-data-stream-template` is an existing index template used by
`my-data-stream`.

The following <<indices-templates,put index template>> request changes the
argument for the `host.ip` field's <<ignore-malformed,`ignore_malformed`>>
@@ -180,9 +180,9 @@ mapping parameter to `true`.
[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { },
  "template": {
    "mappings": {
@@ -208,13 +208,13 @@ to the data stream. By default, this applies the changes to the stream's
existing backing indices, including the write index.
+
--
The following <<indices-put-mapping,put mapping API>> request targets
`my-data-stream`. The request changes the argument for the `host.ip`
field's `ignore_malformed` mapping parameter to `true`.

[source,console]
----
PUT /my-data-stream/_mapping
{
  "properties": {
    "host": {
@@ -230,17 +230,17 @@ PUT /logs/_mapping
----
--
+
To apply the mapping changes only to the stream's write index, set the put
mapping API's `write_index_only` query parameter to `true`.
+
--
The following put mapping request changes the `host.ip` field's mapping only for
`my-data-stream`'s write index. The change is not applied to the
stream's other backing indices.

[source,console]
----
PUT /my-data-stream/_mapping?write_index_only=true
{
  "properties": {
    "host": {
@@ -276,17 +276,17 @@ follow these steps:
applied to future backing indices created for the stream.
+
--
For example, `my-data-stream-template` is an existing index template used by
`my-data-stream`.

The following <<indices-templates,put index template>> request changes the
template's `index.refresh_interval` index setting to `30s` (30 seconds).

[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { },
  "template": {
    "settings": {
@@ -304,11 +304,11 @@ the stream's existing backing indices, including the write index.
+
--
The following update index settings API request updates the
`index.refresh_interval` setting for `my-data-stream`.

[source,console]
----
PUT /my-data-stream/_settings
{
  "index": {
    "refresh_interval": "30s"
@@ -329,17 +329,17 @@ To apply a new static setting to future backing indices, update the index
template used by the data stream. The setting is automatically applied to any
backing index created after the update.

For example, `my-data-stream-template` is an existing index template used by
`my-data-stream`.

The following <<indices-templates,put index template API>> request adds new
`sort.field` and `sort.order` index settings to the template.

[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { },
  "template": {
    "settings": {
@@ -386,12 +386,12 @@ existing indices, index aliases, or data streams. If so, you should consider
using another name or pattern.
--
The following resolve index API request checks for any existing indices, index
aliases, or data streams that start with `new-data-stream`. If not, the
`new-data-stream*` wildcard pattern can be used to create a new data stream.

[source,console]
----
GET /_resolve/index/new-data-stream*
----
The API returns the following response, indicating no existing targets match The API returns the following response, indicating no existing targets match
@@ -421,25 +421,26 @@ TIP: If you are only adding or changing a few things, we recommend you create a
new template by copying an existing one and modifying it as needed.
+
--
For example, `my-data-stream-template` is an existing index template used by
`my-data-stream`.

The following <<indices-templates,put index template API>> request creates a new
index template, `new-data-stream-template`. `new-data-stream-template`
uses `my-data-stream-template` as its basis, with the following
changes:

* The `index_patterns` wildcard pattern matches any index or data stream
starting with `new-data-stream`.
* The `@timestamp` field mapping uses the `date_nanos` field data type rather
than the `date` data type.
* The template includes `sort.field` and `sort.order` index settings, which were
not in the original `my-data-stream-template`.

[source,console]
----
PUT /_index_template/new-data-stream-template
{
  "index_patterns": [ "new-data-stream*" ],
  "data_stream": { },
  "template": {
    "mappings": {
@@ -481,16 +482,16 @@ to retain such a backing index until you are ready to delete its newest data.
====
+
--
The following create data stream API request targets `new-data-stream`, which
matches the wildcard pattern for `new-data-stream-template`.
Because no existing index or data stream uses this name, this request creates
the `new-data-stream` data stream.

[source,console]
----
PUT /_data_stream/new-data-stream
----
// TEST[s/new-data-stream/new-data-stream-two/]
--
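The name-to-template matching above can be sketched client-side. This is a minimal illustration (a hypothetical helper, not an Elasticsearch API) using Python's `fnmatch` to show which `index_patterns` wildcard a new stream name matches; note that `fnmatch` also interprets `?` and `[...]`, while these index patterns only use `*`.

```python
from fnmatch import fnmatch

def matching_patterns(name, index_patterns):
    """Return the wildcard patterns from index_patterns that name matches."""
    return [pattern for pattern in index_patterns if fnmatch(name, pattern)]

# "new-data-stream" matches only new-data-stream-template's pattern, so a
# create data stream request for that name resolves to that template.
patterns = ["my-data-stream*", "new-data-stream*"]
print(matching_patterns("new-data-stream", patterns))  # ['new-data-stream*']
```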
. If you do not want to mix new and old data in your new data stream, pause the . If you do not want to mix new and old data in your new data stream, pause the
@@ -527,46 +528,46 @@ individual backing indices as the source. You can use the
indices.
+
--
For example, you plan to reindex data from `my-data-stream` into
`new-data-stream`. However, you want to submit a separate reindex request for
each backing index in `my-data-stream`, starting with the oldest backing index.
This preserves the order in which the data was originally indexed.

The following get data stream API request retrieves information about
`my-data-stream`, including a list of its backing indices.

[source,console]
----
GET /_data_stream/my-data-stream
----

The API returns the following response. Note the `indices` property contains an
array of the stream's current backing indices. The first item in the array
contains information about the stream's oldest backing index,
`.ds-my-data-stream-000001`.
[source,console-result]
----
{
  "data_streams": [
    {
      "name": "my-data-stream",
      "timestamp_field": {
        "name": "@timestamp"
      },
      "indices": [
        {
          "index_name": ".ds-my-data-stream-000001", <1>
          "index_uuid": "Gpdiyq8sRuK9WuthvAdFbw"
        },
        {
          "index_name": ".ds-my-data-stream-000002",
          "index_uuid": "_eEfRrFHS9OyhqWntkgHAQ"
        }
      ],
      "generation": 2,
      "status": "GREEN",
      "template": "my-data-stream-template"
    }
  ]
}
@@ -575,22 +576,23 @@ contains information about the stream's oldest backing index, `.ds-logs-000001`.
// TESTRESPONSE[s/"index_uuid": "_eEfRrFHS9OyhqWntkgHAQ"/"index_uuid": $body.data_streams.0.indices.1.index_uuid/]
// TESTRESPONSE[s/"status": "GREEN"/"status": "YELLOW"/]
<1> First item in the `indices` array for `my-data-stream`. This
item contains information about the stream's oldest backing index,
`.ds-my-data-stream-000001`.
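The oldest-first ordering described above can be sketched in Python. This is a hypothetical helper (not part of any Elasticsearch client library) that sorts backing index names by the numeric generation suffix and builds one reindex request body per backing index:

```python
def reindex_requests(backing_indices, dest):
    """Build one reindex request body per backing index, oldest generation
    first. The generation is the numeric suffix after the last hyphen, so
    splitting from the right is safe even when the stream name has hyphens."""
    ordered = sorted(backing_indices, key=lambda name: int(name.rsplit("-", 1)[-1]))
    return [
        {"source": {"index": name},
         "dest": {"index": dest, "op_type": "create"}}
        for name in ordered
    ]

bodies = reindex_requests(
    [".ds-my-data-stream-000002", ".ds-my-data-stream-000001"],
    "new-data-stream",
)
# The oldest backing index (generation 1) is reindexed first.
print(bodies[0]["source"]["index"])  # .ds-my-data-stream-000001
```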
The following <<docs-reindex,reindex API>> request copies documents from
`.ds-my-data-stream-000001` to `new-data-stream`. Note the request's `op_type`
is `create`.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": ".ds-my-data-stream-000001"
  },
  "dest": {
    "index": "new-data-stream",
    "op_type": "create"
  }
}
@@ -601,9 +603,9 @@ You can also use a query to reindex only a subset of documents with each
request.
+
--
The following <<docs-reindex,reindex API>> request copies documents from
`my-data-stream` to `new-data-stream`. The request
uses a <<query-dsl-range-query,`range` query>> to only reindex documents with a
timestamp within the last week. Note the request's `op_type` is `create`.

[source,console]
@@ -611,7 +613,7 @@ timestamp within the last week. Note the request's `op_type` is `create`.
POST /_reindex
{
  "source": {
    "index": "my-data-stream",
    "query": {
      "range": {
        "@timestamp": {
@@ -622,7 +624,7 @@ POST /_reindex
    }
  },
  "dest": {
    "index": "new-data-stream",
    "op_type": "create"
  }
}
@@ -656,11 +658,11 @@ data stream, you can safely remove the old stream.
+
--
The following <<indices-delete-data-stream,delete data stream API>> request
deletes `my-data-stream`. This request also deletes the stream's
backing indices and any data they contain.

[source,console]
----
DELETE /_data_stream/my-data-stream
----
--

View File

@@ -90,9 +90,9 @@ convention:
.ds-<data-stream>-<generation>
----

For example, the `web-server-logs` data stream has a generation of `34`. The
most recently created backing index for this data stream is named
`.ds-web-server-logs-000034`.
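The convention can be expressed as a one-line formatter. A minimal sketch, assuming the six-digit zero-padded generation shown in the example above:

```python
def backing_index_name(stream, generation):
    """Format a backing index name as .ds-<data-stream>-<generation>,
    zero-padding the generation to six digits."""
    return f".ds-{stream}-{generation:06d}"

print(backing_index_name("web-server-logs", 34))  # .ds-web-server-logs-000034
```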
Because the generation increments with each new backing index, backing indices
with a higher generation contain more recent data. Backing indices with a lower

View File

@@ -65,15 +65,15 @@ image::images/ilm/create-policy.png[Index Lifecycle Policies page]
You can also create a policy using the <<ilm-put-lifecycle,create lifecycle
policy API>>.

The following request configures the `my-data-stream-policy` lifecycle policy.
The policy uses the <<ilm-rollover,`rollover` action>> to create a
new <<data-stream-write-index,write index>> for the data stream when the current
one reaches 25GB in size. The policy also deletes backing indices 30 days after
their rollover.

[source,console]
----
PUT /_ilm/policy/my-data-stream-policy
{
  "policy": {
    "phases": {
@@ -139,19 +139,19 @@ template API>>. The template must include a `data_stream` object with an empty
body (`{ }`). This object indicates the template is used exclusively for data
streams.

The following request configures the `my-data-stream-template` index template.
Because no field mapping is specified, the `@timestamp` field uses the `date`
field data type by default.

[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { },
  "template": {
    "settings": {
      "index.lifecycle.name": "my-data-stream-policy"
    }
  }
}
@@ -162,9 +162,9 @@ Alternatively, the following template maps `@timestamp` as a `date_nanos` field.
[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { },
  "template": {
    "mappings": {
@@ -173,7 +173,7 @@ PUT /_index_template/logs_data_stream
      }
    },
    "settings": {
      "index.lifecycle.name": "my-data-stream-policy"
    }
  }
}
@@ -210,14 +210,14 @@ uses the target name as the name for the stream.
NOTE: Data streams support only specific types of indexing requests. See
<<add-documents-to-a-data-stream>>.

The following <<docs-index_,index API>> request targets `my-data-stream`, which
matches the wildcard pattern for `my-data-stream-template`. Because
no existing index or data stream uses this name, this request creates the
`my-data-stream` data stream and indexes the document to it.

[source,console]
----
POST /my-data-stream/_doc/
{
  "@timestamp": "2020-12-06T11:04:05.000Z",
  "user": {
@@ -229,13 +229,13 @@ POST /logs/_doc/
// TEST[continued]

The API returns the following response. Note the `_index` property contains
`.ds-my-data-stream-000001`, indicating the document was indexed to the write
index of the new data stream.

[source,console-result]
----
{
  "_index": ".ds-my-data-stream-000001",
  "_id": "qecQmXIBT4jB8tq1nG0j",
  "_type": "_doc",
  "_version": 1,
@@ -259,14 +259,14 @@ You can use the <<indices-create-data-stream,create data stream API>> to
manually create a data stream. The name of the data stream must match the name
or wildcard pattern defined in the template's `index_patterns` property.

The following create data stream request targets `my-data-stream-alt`, which
matches the wildcard pattern for `my-data-stream-template`. Because
no existing index or data stream uses this name, this request creates the
`my-data-stream-alt` data stream.

[source,console]
----
PUT /_data_stream/my-data-stream-alt
----
// TEST[continued]
@@ -292,50 +292,50 @@ the following information about one or more data streams:
* The current {ilm-init} lifecycle policy in the stream's matching index
template

The following get data stream API request retrieves information about
`my-data-stream`.

////
[source,console]
----
POST /my-data-stream/_rollover/
----
// TEST[continued]
////

[source,console]
----
GET /_data_stream/my-data-stream
----
// TEST[continued]

The API returns the following response. Note the `indices` property contains an
array of the stream's current backing indices. The last item in this array
contains information about the stream's write index,
`.ds-my-data-stream-000002`.
[source,console-result]
----
{
  "data_streams": [
    {
      "name": "my-data-stream",
      "timestamp_field": {
        "name": "@timestamp"
      },
      "indices": [
        {
          "index_name": ".ds-my-data-stream-000001",
          "index_uuid": "krR78LfvTOe6gr5dj2_1xQ"
        },
        {
          "index_name": ".ds-my-data-stream-000002", <1>
          "index_uuid": "C6LWyNJHQWmA08aQGvqRkA"
        }
      ],
      "generation": 2,
      "status": "GREEN",
      "template": "my-data-stream-template",
      "ilm_policy": "my-data-stream-policy"
    }
  ]
}
@@ -344,8 +344,9 @@ contains information about the stream's write index, `.ds-logs-000002`.
// TESTRESPONSE[s/"index_uuid": "C6LWyNJHQWmA08aQGvqRkA"/"index_uuid": $body.data_streams.0.indices.1.index_uuid/]
// TESTRESPONSE[s/"status": "GREEN"/"status": "YELLOW"/]
<1> Last item in the `indices` array for `my-data-stream`. This
item contains information about the stream's current write index,
`.ds-my-data-stream-000002`.
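Picking the write index out of a parsed get data stream response can be sketched as follows. This is a hypothetical helper operating on one entry of the `data_streams` array shown above; it relies only on the documented fact that the write index is the last item in `indices`.

```python
def write_index(data_stream):
    """Return the write index of a data stream from a get data stream API
    response entry: the last item in the indices array."""
    return data_stream["indices"][-1]["index_name"]

# Shape mirrors the response above (UUIDs omitted for brevity).
stream = {
    "name": "my-data-stream",
    "indices": [
        {"index_name": ".ds-my-data-stream-000001"},
        {"index_name": ".ds-my-data-stream-000002"},
    ],
    "generation": 2,
}
print(write_index(stream))  # .ds-my-data-stream-000002
```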
[discrete]
[[secure-a-data-stream]]
@@ -368,12 +369,12 @@ image::images/data-streams/data-streams-list.png[Data Streams tab]
You can also use the <<indices-delete-data-stream,delete data stream API>>
to delete a data stream. The following delete data stream API request deletes
`my-data-stream`. This request also deletes the stream's backing
indices and any data they contain.

[source,console]
----
DELETE /_data_stream/my-data-stream
----
// TEST[continued]
@@ -382,7 +383,7 @@ DELETE /_data_stream/logs
----
DELETE /_data_stream/*
DELETE /_index_template/*
DELETE /_ilm/policy/my-data-stream-policy
----
// TEST[continued]
////

View File

@@ -18,19 +18,19 @@ the following:
////
[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { }
}

PUT /_data_stream/my-data-stream

POST /my-data-stream/_rollover/

POST /my-data-stream/_rollover/

PUT /my-data-stream/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
@@ -39,7 +39,7 @@ PUT /logs/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
  "message": "Login successful"
}

PUT /_data_stream/my-data-stream-alt
----
// TESTSETUP
@@ -84,12 +84,11 @@ to a data stream.
NOTE: The `op_type` parameter defaults to `create` when adding new documents.

The following index API request adds a new document to `my-data-stream`.

[source,console]
----
POST /my-data-stream/_doc/
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
@@ -115,11 +114,11 @@ stream in a single request. Each action in the bulk request must use the
NOTE: Data streams do not support other bulk actions, such as `index`.

The following bulk API request adds several new documents to
`my-data-stream`. Only the `create` action is used.

[source,console]
----
PUT /my-data-stream/_bulk?refresh
{"create":{ }}
{ "@timestamp": "2020-12-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
{"create":{ }}
@@ -156,7 +155,7 @@ PUT /_ingest/pipeline/lowercase_message_field
----
// TEST[continued]

The following index API request adds a new document to `my-data-stream`.
The request includes a `?pipeline=lowercase_message_field` query parameter.
This parameter indicates {es} should use the `lowercase_message_field` pipeline
@@ -167,7 +166,7 @@ During pre-processing, the pipeline changes the letter case of the document's
[source,console]
----
POST /my-data-stream/_doc?pipeline=lowercase_message_field
{
  "@timestamp": "2020-12-08T11:12:01.000Z",
  "user": {
@@ -199,13 +198,13 @@ The following search APIs support data streams:
* <<search-field-caps, Field capabilities>>
* <<eql-search-api, EQL search>>

The following <<search-search,search API>> request searches `my-data-stream`
for documents with a timestamp between today and yesterday that also have
a `message` value of `login successful`.

[source,console]
----
GET /my-data-stream/_search
{
  "query": {
    "bool": {
@ -230,12 +229,12 @@ GET /logs/_search
You can use a comma-separated list or wildcard (`*`) expression to search You can use a comma-separated list or wildcard (`*`) expression to search
multiple data streams, indices, and index aliases in the same request. multiple data streams, indices, and index aliases in the same request.
The following request searches the `logs` and `logs_alt` data streams, which are The following request searches `my-data-stream` and `my-data-stream-alt`,
specified as a comma-separated list in the request path. which are specified as a comma-separated list in the request path.
[source,console] [source,console]
---- ----
GET /logs,logs_alt/_search GET /my-data-stream,my-data-stream-alt/_search
{ {
"query": { "query": {
"match": { "match": {
@ -244,12 +243,12 @@ GET /logs,logs_alt/_search
} }
} }
---- ----
The following request uses the `logs*` wildcard expression to search any data The following request uses the `my-data-stream*` wildcard expression to search any data
stream, index, or index alias beginning with `logs`. stream, index, or index alias beginning with `my-data-stream`.
[source,console] [source,console]
---- ----
GET /logs*/_search GET /my-data-stream*/_search
{ {
"query": { "query": {
"match": { "match": {
@@ -288,12 +287,12 @@ statistics for one or more data streams. These statistics include:

.*Example*
[%collapsible]
====
The following data stream stats API request retrieves statistics for
`my-data-stream`.

[source,console]
----
GET /_data_stream/my-data-stream/_stats?human=true
----

The API returns the following response.

@@ -312,7 +311,7 @@ The API returns the following response.

    "total_store_size_bytes": 624,
    "data_streams": [
      {
        "data_stream": "my-data-stream",
        "backing_indices": 3,
        "store_size": "624b",
        "store_size_bytes": 624,
@@ -346,11 +345,11 @@ manually perform a rollover. This can be useful if you want to

to the stream's write index after updating a data stream's template.

The following <<indices-rollover-index,rollover API>> request submits a manual
rollover request for `my-data-stream`.

[source,console]
----
POST /my-data-stream/_rollover/
----
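If you only want the rollover to occur when the current write index meets
certain criteria, you can pass `conditions` in the rollover request body. The
condition values below are illustrative assumptions, not recommendations.

[source,console]
----
POST /my-data-stream/_rollover
{
  "conditions": {
    "max_age": "7d",
    "max_docs": 1000
  }
}
----
// TEST[skip: illustrative example]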
[discrete]

@@ -370,50 +369,50 @@ You also can conveniently re-open all closed backing indices for a data stream

by sending an open request directly to the stream.

The following <<cat-indices,cat indices>> API request retrieves the status for
`my-data-stream`'s backing indices.

////
[source,console]
----
POST /.ds-my-data-stream-000001,.ds-my-data-stream-000002/_close/
----
////

[source,console]
----
GET /_cat/indices/my-data-stream?v&s=index&h=index,status
----
// TEST[continued]

The API returns the following response. The response indicates
`my-data-stream` contains two closed backing indices:
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

[source,txt]
----
index                     status
.ds-my-data-stream-000001 close
.ds-my-data-stream-000002 close
.ds-my-data-stream-000003 open
----
// TESTRESPONSE[non_json]
The following <<indices-open-close,open API>> request re-opens any closed
backing indices for `my-data-stream`, including
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

[source,console]
----
POST /my-data-stream/_open/
----
// TEST[continued]

You can resubmit the original cat indices API request to verify
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002` were re-opened.

[source,console]
----
GET /_cat/indices/my-data-stream?v&s=index&h=index,status
----
// TEST[continued]

@@ -421,10 +420,10 @@ The API returns the following response.

[source,txt]
----
index                     status
.ds-my-data-stream-000001 open
.ds-my-data-stream-000002 open
.ds-my-data-stream-000003 open
----
// TESTRESPONSE[non_json]
@@ -461,8 +460,8 @@ write index, we recommend you update the <<create-a-data-stream-template,data

stream's template>> and perform a <<manually-roll-over-a-data-stream,rollover>>.

The following reindex request copies documents from the `archive` index alias to
`my-data-stream`. Because the destination is a data stream, the request's
`op_type` is `create`.

////
[source,console]

@@ -495,7 +494,7 @@ POST /_reindex

    "index": "archive"
  },
  "dest": {
    "index": "my-data-stream",
    "op_type": "create"
  }
}
@@ -505,16 +504,16 @@ POST /_reindex

You can also reindex documents from a data stream to an index, index
alias, or data stream.

The following reindex request copies documents from `my-data-stream`
to the existing `archive` index alias. Because the destination is not a
data stream, the `op_type` does not need to be specified.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "my-data-stream"
  },
  "dest": {
    "index": "archive"
@@ -540,14 +539,14 @@ data stream. These prohibited requests include:

Instead, you can use the <<docs-update-by-query,update by query API>> to update
documents in a data stream that match a provided query.

The following update by query request updates documents in `my-data-stream`
with a `user.id` of `l7gk7f82`. The request uses a
<<modules-scripting-using,script>> to assign matching documents a new `user.id`
value of `XgdX0NoX`.

[source,console]
----
POST /my-data-stream/_update_by_query
{
  "query": {
    "match": {
@@ -577,12 +576,12 @@ prohibited requests include:

Instead, you can use the <<docs-delete-by-query,delete by query API>> to delete
documents in a data stream that match a provided query.

The following delete by query request deletes documents in `my-data-stream`
with a `user.id` of `vlb44hny`.

[source,console]
----
POST /my-data-stream/_delete_by_query
{
  "query": {
    "match": {
@@ -609,9 +608,9 @@ If you want to update a document, you must also get its current

You can use a <<search-a-data-stream,search request>> to retrieve this
information.

The following search request retrieves documents in `my-data-stream`
with a `user.id` of `yWIumJd7`. By default, this search returns the
document ID and backing index for any matching documents.

The request includes a `"seq_no_primary_term": true` argument. This means the
search also returns the sequence number and primary term for any matching

@@ -619,7 +618,7 @@ documents.

[source,console]
----
GET /my-data-stream/_search
{
  "seq_no_primary_term": true,
  "query": {

@@ -652,7 +651,7 @@ information for any documents matching the search.

    "max_score": 0.2876821,
    "hits": [
      {
        "_index": ".ds-my-data-stream-000003", <1>
        "_type": "_doc",
        "_id": "bfspvnIBr7VVZlfp2lqX", <2>
        "_seq_no": 0, <3>
@@ -683,9 +682,9 @@ You can use an <<docs-index_,index API>> request to update an individual

document. To prevent an accidental overwrite, this request must include valid
`if_seq_no` and `if_primary_term` arguments.

The following index API request updates an existing document in
`my-data-stream`. The request targets document ID
`bfspvnIBr7VVZlfp2lqX` in the `.ds-my-data-stream-000003` backing index.

The request also includes the current sequence number and primary term in the
respective `if_seq_no` and `if_primary_term` query parameters. The request body

@@ -693,7 +692,7 @@ contains a new JSON source for the document.

[source,console]
----
PUT /.ds-my-data-stream-000003/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=0&if_primary_term=1
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {

@@ -706,13 +705,13 @@ PUT /.ds-logs-000003/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=0&if_primary_term=1
You use the <<docs-delete,delete API>> to delete individual documents. Deletion
requests do not require a sequence number or primary term.

The following delete API request deletes an existing document in
`my-data-stream`. The request targets document ID
`bfspvnIBr7VVZlfp2lqX` in the `.ds-my-data-stream-000003` backing index.

[source,console]
----
DELETE /.ds-my-data-stream-000003/_doc/bfspvnIBr7VVZlfp2lqX
----
You can use the <<docs-bulk,bulk API>> to delete or update multiple documents in

@@ -723,17 +722,17 @@ If the action type is `index`, the action must include valid

arguments.

The following bulk API request uses an `index` action to update an existing
document in `my-data-stream`.

The `index` action targets document ID `bfspvnIBr7VVZlfp2lqX` in the
`.ds-my-data-stream-000003` backing index. The action also includes the current
sequence number and primary term in the respective `if_seq_no` and
`if_primary_term` parameters.

[source,console]
----
PUT /_bulk?refresh
{ "index": { "_index": ".ds-my-data-stream-000003", "_id": "bfspvnIBr7VVZlfp2lqX", "if_seq_no": 0, "if_primary_term": 1 } }
{ "@timestamp": "2020-12-07T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
----
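A `delete` action in the same bulk request format removes a document by ID
from its backing index; deletions do not require `if_seq_no` or
`if_primary_term`. This sketch reuses the document ID and backing index from
the preceding example.

[source,console]
----
PUT /_bulk?refresh
{ "delete": { "_index": ".ds-my-data-stream-000003", "_id": "bfspvnIBr7VVZlfp2lqX" } }
----
// TEST[skip: illustrative example]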

Binary image file changed (16 KiB before, 54 KiB after; not shown).

@@ -257,7 +257,7 @@ POST /my-data-stream/_rollover <2>

<1> Creates a data stream called `my-data-stream` with one initial backing index
named `my-data-stream-000001`.
<2> This request creates a new backing index, `my-data-stream-000002`, and adds
it as the write index for `my-data-stream` if the current
write index meets at least one of the following conditions:
+
--


@@ -17,37 +17,38 @@ to control access to a data stream. Any role or user granted privileges to a

data stream is automatically granted the same privileges to its backing
indices.

For example, `my-data-stream` consists of two backing indices:
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

A user is granted the `read` privilege to `my-data-stream`.

[source,js]
--------------------------------------------------
{
  "names" : [ "my-data-stream" ],
  "privileges" : [ "read" ]
}
--------------------------------------------------
// NOTCONSOLE

Because the user is automatically granted the same privileges to the stream's
backing indices, the user can retrieve a document directly from
`.ds-my-data-stream-000002`:
////
[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { }
}

PUT /_data_stream/my-data-stream

POST /my-data-stream/_rollover/

PUT /my-data-stream/_create/2?refresh=wait_for
{
  "@timestamp": "2020-12-07T11:06:07.000Z"
}

@@ -56,21 +57,21 @@ PUT /logs/_create/2?refresh=wait_for

[source,console]
----
GET /.ds-my-data-stream-000002/_doc/2
----
// TEST[continued]
Later `my-data-stream` <<manually-roll-over-a-data-stream,rolls over>>. This
creates a new backing index: `.ds-my-data-stream-000003`. Because the user still
has the `read` privilege for `my-data-stream`, the user can retrieve
documents directly from `.ds-my-data-stream-000003`:

////
[source,console]
----
POST /my-data-stream/_rollover/

PUT /my-data-stream/_create/2?refresh=wait_for
{
  "@timestamp": "2020-12-07T11:06:07.000Z"
}

@@ -80,7 +81,7 @@ PUT /logs/_create/2?refresh=wait_for

[source,console]
----
GET /.ds-my-data-stream-000003/_doc/2
----
// TEST[continued]