[[use-a-data-stream]]
== Use a data stream

After you <<set-up-a-data-stream,set up a data stream>>, you can do
the following:

* <<add-documents-to-a-data-stream>>
* <<search-a-data-stream>>
* <<manually-roll-over-a-data-stream>>
* <<reindex-with-a-data-stream>>

////
[source,console]
----
PUT /_index_template/logs_data_stream
{
  "index_patterns": [ "logs*" ],
  "data_stream": {
    "timestamp_field": "@timestamp"
  },
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date"
        }
      }
    }
  }
}

PUT /_data_stream/logs
----
////

[discrete]
[[add-documents-to-a-data-stream]]
=== Add documents to a data stream

You can add documents to a data stream using the following requests:

* An <<docs-index_,index API>> request with an
<<docs-index-api-op_type,`op_type`>> set to `create`. Specify the data
stream's name in place of an index name.
+
--
NOTE: The `op_type` parameter defaults to `create` when the request doesn't
specify a document ID.

.*Example: Index API request*
[%collapsible]
====
The following <<docs-index_,index API>> request adds a new document to the
`logs` data stream.

[source,console]
----
POST /logs/_doc/
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----
// TEST[continued]
====
--

* A <<docs-bulk,bulk API>> request using the `create` action. Specify the data
stream's name in place of an index name.
+
--
NOTE: Data streams do not support other bulk actions, such as `index`.

.*Example: Bulk API request*
[%collapsible]
====
The following <<docs-bulk,bulk API>> request adds several new documents to
the `logs` data stream. Note that only the `create` action is used.

[source,console]
----
PUT /logs/_bulk?refresh
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z", "user": { "id": "l7gk7f82" }, "message": "Logout successful" }
----
// TEST[continued]
====
--
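
The bulk body above is newline-delimited JSON: each `create` action line is
followed by its document source, and the whole body ends with a newline. As an
illustration (not part of the Elasticsearch docs), a client could assemble such
a body like this; the `build_bulk_body` helper name is hypothetical:

```python
import json

def build_bulk_body(data_stream, docs):
    """Assemble an NDJSON bulk body that uses only `create` actions,
    as required when the target is a data stream."""
    lines = []
    for doc in docs:
        # Action line: `create` against the data stream's name.
        lines.append(json.dumps({"create": {"_index": data_stream}}))
        # Source line: the document itself.
        lines.append(json.dumps(doc))
    # The bulk API requires the body to end with a newline character.
    return "\n".join(lines) + "\n"

body = build_bulk_body("logs", [
    {"@timestamp": "2020-12-08T11:04:05.000Z",
     "user": {"id": "vlb44hny"},
     "message": "Login attempt failed"},
])
```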

[discrete]
[[search-a-data-stream]]
=== Search a data stream

The following search APIs support data streams:

* <<search-search, Search>>
* <<async-search, Async search>>
* <<search-multi-search, Multi search>>
* <<search-field-caps, Field capabilities>>
////
* <<eql-search-api, EQL search>>
////

.*Example*
[%collapsible]
====
The following <<search-search,search API>> request searches the `logs` data
stream for documents with a timestamp between yesterday and today that also
have a `message` value of `login successful`.

[source,console]
----
GET /logs/_search
{
  "query": {
    "bool": {
      "must": {
        "range": {
          "@timestamp": {
            "gte": "now-1d/d",
            "lt": "now/d"
          }
        }
      },
      "should": {
        "match": {
          "message": "login successful"
        }
      }
    }
  }
}
----
// TEST[continued]
====
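
The `now-1d/d` and `now/d` values in the example above use date math: `/d`
rounds down to the start of the day, so the range covers all of yesterday. A
rough equivalent in Python, purely as an illustration of the rounding (this is
not how Elasticsearch implements date math, and `yesterday_window` is a
hypothetical helper):

```python
from datetime import datetime, timedelta, timezone

def yesterday_window(now=None):
    """Return (gte, lt) datetimes equivalent to now-1d/d and now/d:
    `/d` rounds down to the start of the day, so the window spans
    all of yesterday (UTC)."""
    now = now or datetime.now(timezone.utc)
    start_of_today = now.replace(hour=0, minute=0, second=0, microsecond=0)
    return start_of_today - timedelta(days=1), start_of_today

gte, lt = yesterday_window(datetime(2020, 12, 9, 11, 6, 7, tzinfo=timezone.utc))
```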

[discrete]
[[manually-roll-over-a-data-stream]]
=== Manually roll over a data stream

A rollover creates a new backing index for a data stream. This new backing index
becomes the stream's <<data-stream-write-index,write index>> and increments
the stream's <<data-streams-generation,generation>>.

In most cases, we recommend using <<index-lifecycle-management,{ilm-init}>> to
automate rollovers for data streams. This lets you automatically roll over the
current write index when it meets specified criteria, such as a maximum age or
size.

However, you can also use the <<indices-rollover-index,rollover API>> to
manually perform a rollover. This can be useful if you want to
<<data-streams-change-mappings-and-settings,apply mapping or setting changes>>
to the stream's write index after updating a data stream's template.

.*Example*
[%collapsible]
====
The following <<indices-rollover-index,rollover API>> request manually rolls
over the `logs` data stream.

[source,console]
----
POST /logs/_rollover/
{
  "conditions": {
    "max_docs": "1"
  }
}
----
// TEST[continued]
====
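
Rollover conditions such as `max_docs` are evaluated server-side by
Elasticsearch; the sketch below only illustrates the decision logic of the
`conditions` object, with hypothetical `doc_count` and `index_age` inputs:

```python
from datetime import timedelta

def should_roll_over(doc_count, index_age, max_docs=None, max_age=None):
    """Illustrative only: a rollover is triggered when ANY configured
    condition is met, mirroring the rollover API's `conditions` object."""
    if max_docs is not None and doc_count >= max_docs:
        return True
    if max_age is not None and index_age >= max_age:
        return True
    return False

# With "max_docs": 1, a single indexed document satisfies the condition.
triggered = should_roll_over(doc_count=1, index_age=timedelta(hours=2), max_docs=1)
```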

[discrete]
[[reindex-with-a-data-stream]]
=== Reindex with a data stream

You can use the <<docs-reindex,reindex API>> to copy documents to a data stream
from an existing index, index alias, or data stream.

A reindex copies documents from a _source_ to a _destination_. The source and
destination can be any pre-existing index, index alias, or data stream. However,
the source and destination must be different. You cannot reindex a data stream
into itself.

Because data streams are <<data-streams-append-only,append-only>>, a reindex
request to a data stream destination must have an `op_type` of `create`. This
means a reindex can only add new documents to a data stream. It cannot update
existing documents in the data stream destination.

A reindex can be used to:

* Convert an existing index alias and collection of time-based indices into a
data stream.

* Apply a new or updated <<create-a-data-stream-template,composable template>>
by reindexing an existing data stream into a new one. This applies mapping
and setting changes in the template to each document and backing index of the
data stream destination. See
<<data-streams-use-reindex-to-change-mappings-settings>>.

TIP: If you only want to update the mappings or settings of a data stream's
write index, we recommend you update the <<create-a-data-stream-template,data
stream's template>> and perform a <<manually-roll-over-a-data-stream,rollover>>.

.*Example*
[%collapsible]
====
The following reindex request copies documents from the `archive` index alias to
the existing `logs` data stream. Because the destination is a data stream, the
request's `op_type` is `create`.

////
[source,console]
----
PUT /_bulk?refresh=wait_for
{"create":{"_index" : "archive_1"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }

POST /_aliases
{
  "actions" : [
    { "add" : { "index" : "archive_1", "alias" : "archive" } },
    { "add" : { "index" : "archive_2", "alias" : "archive", "is_write_index" : true} }
  ]
}
----
// TEST[continued]
////

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "archive"
  },
  "dest": {
    "index": "logs",
    "op_type": "create"
  }
}
----
// TEST[continued]
====

You can also reindex documents from a data stream to an index, index
alias, or data stream.

.*Example*
[%collapsible]
====
The following reindex request copies documents from the `logs` data stream
to the existing `archive` index alias. Because the destination is not a data
stream, the `op_type` does not need to be specified.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "logs"
  },
  "dest": {
    "index": "archive"
  }
}
----
// TEST[continued]
====

////
[source,console]
----
DELETE /_data_stream/logs

DELETE /_index_template/logs_data_stream
----
// TEST[continued]
////