Data Prepper ToC Update (#2514)
* Creating PR with first file.
* Adding newly created files to PR.
* Reorganized files and added appropriate metadata to map ToC correctly.
* Moved Authoring pipelines page.
* Minor ToC updates.
* Minor ToC updates to Sources section for Data Prepper.
* Updated Buffers section under Data Prepper.
* Minor update to otelmetricssource.
* Restructured ToC in Processors section for Data Prepper.
* Minor filename change.
* Adjustments to metadata in ToC.
* Minor edit.
* Fixed nav order in metadata.
* Minor edit.
* Minor update to metadata for ToC.
* Adjustments to ToC order.
* Minor adjustments to ToC metadata.
* Minor adjustments to Sinks section.
* Adjustments to high-level ToC.
* Minor adjustment to Pipelines.md.
* Minor update.
* Slight reorganization. Removed two placeholder pages for now.
* Removed a page and replaced it with pipelines content.
* Minor changes/additions to content for placeholder pages.
* Minor update to page link.
* Minor adjustments to ToC metadata.
* Minor edits.
* Removed /clients from redirects to correct nav order.
* Minor edits.
* Minor adjustments to ToC metadata.
* Minor adjustments.
* Minor adjustment to metadata.
* ToC link fixes.
* Changed page name.
* Corrected references to Peer Forwarder.
* Renamed Data Prepper folder.
* Minor updates to phrasing and capitalization.
* Minor phrasing update.
* Minor phrasing update.
* Minor change.
* Minor change to change S3 Source to S3Source.
* Updated references to peer forwarder and changed capitalization.
* Updated capitalization for peer forwarder.
* Made edits based on doc review feedback.
* Update to one word.

---------
Signed-off-by: carolxob <carolxob@amazon.com>
Signed-off-by: Naarcha-AWS <naarcha@amazon.com>
Co-authored-by: Naarcha-AWS <naarcha@amazon.com>
This commit is contained in:
parent
d7e8cdedd1
commit
0249991f76
@@ -0,0 +1,12 @@
+---
+layout: default
+title: Common use cases
+has_children: true
+nav_order: 15
+redirect_from:
+  - /data-prepper/common-use-cases/
+---
+
+# Common use cases
+
+You can use Data Prepper for several different purposes, including trace analytics, log analytics, Amazon S3 log analytics, and metrics ingestion.
@@ -1,6 +1,7 @@
 ---
 layout: default
 title: Log analytics
+parent: Common use cases
 nav_order: 15
 ---
 
@@ -3,8 +3,6 @@ layout: default
 title: Getting started
 nav_order: 5
 redirect_from:
-  - /clients/data-prepper/getting-started/
-  - /data-prepper/get-started/
   - /clients/data-prepper/get-started/
 ---
 
@@ -44,7 +42,7 @@ You will configure two files:
 Depending on your use case, we have a few different guides to configuring Data Prepper.
 
 * [Trace Analytics](https://github.com/opensearch-project/data-prepper/blob/main/docs/trace_analytics.md)
-* [Log Ingestion](https://github.com/opensearch-project/data-prepper/blob/main/docs/log_analytics.md): Learn how to set up Data Prepper for log observability.
+* [Log Analytics]({{site.url}}{{site.baseurl}}/data-prepper/common-use-cases/log-analytics/): Learn how to set up Data Prepper for log observability.
 * [Simple Pipeline](https://github.com/opensearch-project/data-prepper/blob/main/docs/simple_pipelines.md): Learn the basics of Data Prepper pipelines with some simple configurations.
 
 ## 3. Defining a pipeline
@@ -71,7 +69,7 @@ docker run --name data-prepper \
   opensearchproject/data-prepper:latest
 ```
 
-This sample pipeline configuration above demonstrates a simple pipeline with a source (`random`) sending data to a sink (`stdout`). For more examples and details about more advanced pipeline configurations, see [Pipelines]({{site.url}}{{site.baseurl}}/clients/data-prepper/pipelines).
+The preceding example demonstrates a simple pipeline with a source (`random`) sending data to a sink (`stdout`). For detailed examples of more advanced pipeline configurations, see [Pipelines]({{site.url}}{{site.baseurl}}/clients/data-prepper/pipelines/).
 
 After starting Data Prepper, you should see log output and some UUIDs after a few seconds:
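For context, the simple `random`-to-`stdout` pipeline referenced in this hunk might be written in `pipelines.yaml` roughly as follows; the pipeline name and the `workers`/`delay` settings are illustrative, not taken from this PR:

```yml
simple-sample-pipeline:
  workers: 2          # number of worker threads processing events
  delay: "5000"       # milliseconds the source waits between reads
  source:
    random:           # emits random UUID strings as events
  sink:
    - stdout:         # prints each event to standard output
```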
@@ -5,8 +5,6 @@ nav_order: 1
 has_children: false
 has_toc: false
 redirect_from:
-  - /clients/tools/data-prepper/
-  - /clients/data-prepper/
   - /clients/data-prepper/index/
 ---
 
@@ -1,10 +1,8 @@
 ---
 layout: default
 title: Configuring Data Prepper
-has_children: true
-nav_order: 100
-redirect_from:
-  - /clients/data-prepper/data-prepper-reference/
+parent: Managing Data Prepper
+nav_order: 10
 ---
 
 # Configuring Data Prepper
@@ -31,15 +29,15 @@ peer_forwarder | No | Object | Peer forwarder configurations. See [Peer forwarde
 
 The following section details various configuration options for peer forwarder.
 
-#### General options for peer forwarder
+#### General options for peer forwarding
 
 Option | Required | Type | Description
 :--- | :--- | :--- | :---
-port | No | Integer | The port number peer forwarder server is running on. Valid options are between 0 and 65535. Defaults is 4994.
+port | No | Integer | The peer forwarding server port. Valid options are between 0 and 65535. Default is 4994.
-request_timeout | No | Integer | Request timeout in milliseconds for peer forwarder HTTP server. Default is 10000.
+request_timeout | No | Integer | Request timeout for the peer forwarder HTTP server in milliseconds. Default is 10000.
-server_thread_count | No | Integer | Number of threads used by peer forwarder server. Default is 200.
+server_thread_count | No | Integer | Number of threads used by the peer forwarder server. Default is 200.
-client_thread_count | No | Integer | Number of threads used by peer forwarder client. Default is 200.
+client_thread_count | No | Integer | Number of threads used by the peer forwarder client. Default is 200.
-max_connection_count | No | Integer | Maximum number of open connections for peer forwarder server. Default is 500.
+max_connection_count | No | Integer | Maximum number of open connections for the peer forwarder server. Default is 500.
 max_pending_requests | No | Integer | Maximum number of allowed tasks in ScheduledThreadPool work queue. Default is 1024.
 discovery_mode | No | String | Peer discovery mode to use. Valid options are `local_node`, `static`, `dns`, or `aws_cloud_map`. Defaults to `local_node`, which processes events locally.
 static_endpoints | Conditionally | List | A list containing endpoints of all Data Prepper instances. Required if `discovery_mode` is set to static.
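Putting the options above together, a `peer_forwarder` block in the Data Prepper configuration file using static discovery might look like the following sketch; the endpoint hostnames are placeholders:

```yml
peer_forwarder:
  port: 4994                # peer forwarder server port (the default)
  request_timeout: 10000    # HTTP server request timeout, in milliseconds
  server_thread_count: 200
  max_connection_count: 500
  discovery_mode: static    # use a fixed list of peers instead of DNS or Cloud Map
  static_endpoints: ["data-prepper-1.example.com", "data-prepper-2.example.com"]
```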
@@ -1,7 +1,8 @@
 ---
 layout: default
 title: Configuring Log4j
-nav_order: 25
+parent: Managing Data Prepper
+nav_order: 20
 ---
 
 # Configuring Log4j
@@ -1,7 +1,8 @@
 ---
 layout: default
 title: Core APIs
-nav_order: 20
+parent: Managing Data Prepper
+nav_order: 15
 ---
 
 # Core APIs
@@ -0,0 +1,10 @@
+---
+layout: default
+title: Managing Data Prepper
+has_children: true
+nav_order: 20
+---
+
+# Managing Data Prepper
+
+You can perform administrator functions for Data Prepper, including system configuration, interacting with core APIs, Log4j configuration, and monitoring. You can set up peer forwarding to coordinate multiple Data Prepper nodes when using stateful aggregation.
@@ -1,7 +1,8 @@
 ---
 layout: default
 title: Monitoring
-nav_order: 33
+parent: Managing Data Prepper
+nav_order: 25
 ---
 
 # Monitoring Data Prepper with metrics
@@ -1,7 +1,7 @@
 ---
 layout: default
 title: Migrating from Open Distro
-nav_order: 35
+nav_order: 30
 ---
 
 # Migrating from Open Distro
@@ -1,7 +1,7 @@
 ---
 layout: default
 title: Migrating from Logstash
-nav_order: 30
+nav_order: 25
 redirect_from:
   - /data-prepper/configure-logstash-data-prepper/
 ---
@@ -10,7 +10,7 @@ redirect_from:
 
 You can run Data Prepper with a Logstash configuration.
 
-As mentioned in the [Getting started]({{site.url}}{{site.baseurl}}/data-prepper/get-started/) guide, you'll need to configure Data Prepper with a pipeline using a `pipelines.yaml` file.
+As mentioned in [Getting started with Data Prepper]({{site.url}}{{site.baseurl}}/data-prepper/getting-started/), you'll need to configure Data Prepper with a pipeline using a `pipelines.yaml` file.
 
 Alternatively, if you have an existing Logstash configuration, you can use `logstash.conf` to configure Data Prepper instead of `pipelines.yaml`.
 
@@ -28,7 +28,7 @@ As of the Data Prepper 1.2 release, the following plugins from the Logstash conf
 
 ## Running Data Prepper with a Logstash configuration
 
-1. To install Data Prepper's Docker image, see the Installing Data Prepper in [Get Started]({{site.url}}{{site.baseurl}}/data-prepper/getting-started#1-installing-data-prepper).
+1. To install Data Prepper's Docker image, see Installing Data Prepper in [Getting Started]({{site.url}}{{site.baseurl}}/data-prepper/getting-started#1-installing-data-prepper).
 
 2. Run the Docker image installed in Step 1 by supplying your `logstash.conf` configuration.
 
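Step 2 above might look like the following sketch; the container path used for the mounted Logstash configuration is an assumption, so check the image documentation for the exact mount point:

```
docker run --name data-prepper \
    -v ${PWD}/logstash.conf:/usr/share/data-prepper/pipelines/pipelines.conf \
    opensearchproject/data-prepper:latest
```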
@@ -2,7 +2,7 @@
 layout: default
 title: Bounded blocking
 parent: Buffers
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 50
 ---
 
@@ -1,9 +1,9 @@
 ---
 layout: default
 title: Buffers
-parent: Configuring Data Prepper
+parent: Pipelines
 has_children: true
-nav_order: 50
+nav_order: 20
 ---
 
 # Buffers
@@ -2,7 +2,7 @@
 layout: default
 title: add_entries
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: aggregate
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: copy_values
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: csv
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: date
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: delete_entries
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: drop_events
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -14,7 +14,7 @@ Drops all the events that are passed into this processor.
 
 Option | Required | Type | Description
 :--- | :--- | :--- | :---
-drop_when | Yes | String | Accepts a Data Prepper Expression string following the [Data Prepper Expression Syntax](https://github.com/opensearch-project/data-prepper/blob/main/docs/expression_syntax.md). Configuring `drop_events` with `drop_when: true` drops all the events received.
+drop_when | Yes | String | Accepts a Data Prepper Expression string following the [Data Prepper Expression Syntax]({{site.url}}{{site.baseurl}}/data-prepper/pipelines/expression-syntax/). Configuring `drop_events` with `drop_when: true` drops all the events received.
 handle_failed_events | No | Enum | Specifies how exceptions are handled when an exception occurs while evaluating an event. Default value is `drop`, which drops the event so it doesn't get sent to OpenSearch. Available options are `drop`, `drop_silently`, `skip`, and `skip_silently`. For more information, see [handle_failed_events](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/drop-events-processor#handle_failed_events).
 
 <!---## Configuration
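As a sketch, a pipeline entry using `drop_events` with the options above might look like the following; the `/loglevel` field and its value are hypothetical examples of a Data Prepper expression:

```yml
log-pipeline:
  source:
    http:
  processor:
    - drop_events:
        drop_when: '/loglevel == "DEBUG"'   # drop events whose loglevel field is DEBUG (field name is hypothetical)
        handle_failed_events: drop          # drop events that throw an exception during evaluation
  sink:
    - stdout:
```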
@@ -2,7 +2,7 @@
 layout: default
 title: grok
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: json
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: key_value
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: lowercase_string
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: otel_trace_raw
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,8 +2,8 @@
 layout: default
 title: Processors
 has_children: true
-parent: Configuring Data Prepper
+parent: Pipelines
-nav_order: 100
+nav_order: 25
 ---
 
 # Processors
@@ -13,10 +13,6 @@ Processors perform some action on your data: filter, transform, enrich, etc.
 Prior to Data Prepper 1.3, Processors were named Preppers. Starting in Data Prepper 1.3, the term Prepper is deprecated in favor of Processor. Data Prepper will continue to support the term "Prepper" until 2.0, when it will be removed.
 {: .note }
 
-
-
-
-
 ## copy_values
 
 Copy values within an event. `copy_values` is part of the [mutate event](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/mutate-event-processors#mutate-event-processors) processors.
@@ -2,7 +2,7 @@
 layout: default
 title: rename_keys
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 44
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: routes
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 
 ---
@@ -2,7 +2,7 @@
 layout: default
 title: service_map_stateful
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: service_map_stateful
 parent: sinks
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: split_string
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: string_converter
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: substitute_string
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: trim_string
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: uppercase_string
 parent: Processors
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: file sink
 parent: Sinks
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: OpenSearch sink
 parent: Sinks
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: Pipeline sink
 parent: Sinks
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -1,9 +1,9 @@
 ---
 layout: default
 title: Sinks
-parent: Configuring Data Prepper
+parent: Pipelines
 has_children: true
-nav_order: 44
+nav_order: 30
 ---
 
 # Sinks
@@ -2,7 +2,7 @@
 layout: default
 title: stdout sink
 parent: Sinks
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 45
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: http_source
 parent: Sources
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 5
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: otel_metrics_source
 parent: Sources
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 10
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: otel_trace_source source
 parent: Sources
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 15
 ---
 
@@ -2,7 +2,7 @@
 layout: default
 title: s3
 parent: Sources
-grand_parent: Configuring Data Prepper
+grand_parent: Pipelines
 nav_order: 20
 ---
 
@@ -23,11 +23,11 @@ on_error | No | String | Determines how to handle errors in Amazon SQS. Can be
 buffer_timeout | No | Duration | The timeout for writing events to the Data Prepper buffer. Any events that the S3Source cannot write to the buffer in this time will be discarded. Default is 10 seconds.
 records_to_accumulate | No | Integer | The number of messages that accumulate before writing to the buffer. Default is 100.
 metadata_root_key | No | String | Base key for adding S3 metadata to each Event. The metadata includes the key and bucket for each S3 object. Defaults to `s3/`.
-disable_bucket_ownership_validation | No | Boolean | If `true`, then the S3 Source will not attempt to validate that the bucket is owned by the expected account. The only expected account is the same account that owns the SQS queue. Defaults to `false`.
+disable_bucket_ownership_validation | No | Boolean | If `true`, the S3Source will not attempt to validate that the bucket is owned by the expected account. The expected account is the same account that owns the SQS queue. Defaults to `false`.
 
 ## sqs
 
-The following are configure usage of Amazon SQS in the S3 Source plugin.
+The following parameters allow you to configure usage of Amazon SQS in the S3Source plugin.
 
 Option | Required | Type | Description
 :--- | :--- | :--- | :---
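Combining the options above with an `sqs` block, a hypothetical `s3` source entry might look like the following sketch; the queue URL, account ID, and region are placeholders:

```yml
s3-log-pipeline:
  source:
    s3:
      notification_type: sqs                 # receive S3 event notifications through Amazon SQS
      buffer_timeout: 10s                    # discard events not written to the buffer in time
      records_to_accumulate: 100             # batch size before writing to the buffer
      sqs:
        queue_url: "https://sqs.us-east-1.amazonaws.com/123456789012/s3-notifications"
      aws:
        region: "us-east-1"
  sink:
    - stdout:
```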
@@ -1,9 +1,9 @@
 ---
 layout: default
 title: Sources
-parent: Configuring Data Prepper
+parent: Pipelines
 has_children: true
-nav_order: 42
+nav_order: 15
 ---
 
 # Sources
@@ -1,7 +1,8 @@
 ---
 layout: default
 title: Expression syntax
-nav_order: 40
+parent: Pipelines
+nav_order: 12
 ---
 
 # Expression syntax
@@ -1,8 +1,8 @@
 ---
 layout: default
 title: Pipeline options
-parent: Configuring Data Prepper
+parent: Pipelines
-nav_order: 41
+nav_order: 11
 ---
 
 # Pipeline options
@@ -1,8 +1,10 @@
 ---
 layout: default
 title: Pipelines
+has_children: true
 nav_order: 10
 redirect_from:
+  - /data-prepper/pipelines/
   - /clients/data-prepper/pipelines/
 ---
 
@@ -77,7 +79,12 @@ conditional-routing-sample-pipeline:
 
 ## Examples
 
-This section provides some pipeline examples that you can use to start creating your own pipelines. For more information, see [Data Prepper configuration reference]({{site.url}}{{site.baseurl}}/clients/data-prepper/data-prepper-reference/) guide.
+This section provides some pipeline examples that you can use to start creating your own pipelines. For more pipeline configurations, select from the following options for each component:
+
+- [Buffers]({{site.url}}{{site.baseurl}}/data-prepper/pipelines/configuration/buffers/buffers/)
+- [Processors]({{site.url}}{{site.baseurl}}/data-prepper/pipelines/configuration/processors/processors/)
+- [Sinks]({{site.url}}{{site.baseurl}}/data-prepper/pipelines/configuration/sinks/sinks/)
+- [Sources]({{site.url}}{{site.baseurl}}/data-prepper/pipelines/configuration/sources/sources/)
 
 The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
 
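The component links in the hunk above point to the configuration pages for each pipeline element. For orientation, a minimal pipeline wiring one of each together might look like the following sketch (the names and values are illustrative, not part of this PR):

```yml
simple-sample-pipeline:
  workers: 2               # number of worker threads (illustrative)
  delay: 5000              # milliseconds between source reads (illustrative)
  source:
    random:                # generates random UUID strings, useful for testing
  buffer:
    bounded_blocking:
      buffer_size: 1024    # maximum number of records held in memory
      batch_size: 256      # maximum number of records returned per read
  sink:
    - stdout:              # write processed events to standard output
```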
@@ -213,10 +220,7 @@ metrics-pipeline:
 
 ### S3 log ingestion pipeline
 
-The following example demonstrates how to use the S3 Source and Grok Processor plugins to process unstructured log data
-from [Amazon Simple Storage Service](https://aws.amazon.com/s3/) (Amazon S3). This example uses Application Load
-Balancer logs. As the Application Load Balancer writes logs to S3, S3 creates notifications in Amazon SQS. Data Prepper
-reads those notifications and reads the S3 objects to get the log data and process it.
+The following example demonstrates how to use the S3Source and Grok Processor plugins to process unstructured log data from [Amazon Simple Storage Service](https://aws.amazon.com/s3/) (Amazon S3). This example uses application load balancer logs. As the application load balancer writes logs to S3, S3 creates notifications in Amazon SQS. Data Prepper monitors those notifications and reads the S3 objects to get the log data and process it.
 
 ```yml
 log-pipeline:
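The `log-pipeline` block is truncated in the hunk above. A hedged sketch of the kind of S3-to-OpenSearch pipeline the paragraph describes, assuming the documented `s3` source and `grok` processor options (the queue URL, Region, hosts, and index name are placeholders):

```yml
log-pipeline:
  source:
    s3:
      notification_type: sqs       # consume S3 event notifications through SQS
      codec:
        newline:                   # treat each line of the object as one event
      sqs:
        queue_url: "https://sqs.us-east-1.amazonaws.com/123456789012/s3-log-queue"  # placeholder
      aws:
        region: "us-east-1"        # placeholder
  processor:
    - grok:
        match:
          log: [ "%{COMMONAPACHELOG}" ]   # substitute a pattern matching your load balancer logs
  sink:
    - opensearch:
        hosts: [ "https://localhost:9200" ]   # placeholder
        index: s3_logs                        # placeholder
```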
@@ -293,13 +297,13 @@ docker run --name data-prepper \
 opensearchproject/data-prepper:latest
 ```
 
-## Configure the peer forwarder
+## Configure peer forwarder
 
 Data Prepper provides an HTTP service to forward Events between Data Prepper nodes for aggregation. This is required for operating Data Prepper in a clustered deployment. Currently, peer forwarding is supported in `aggregate`, `service_map_stateful`, and `otel_trace_raw` processors. Peer forwarder groups events based on the identification keys provided by the processors. For `service_map_stateful` and `otel_trace_raw` it's `traceId` by default and can not be configured. For `aggregate` processor, it is configurable using `identification_keys` option.
 
-Peer forwarder supports peer discovery through one of three options: a static list, a DNS record lookup , or AWS Cloud Map. This option can be configured using `discovery_mode` option. Peer forwarder also supports SSL for verification and encrytion, and mTLS for mutual authentication in peer forwarding service.
+Peer forwarder supports peer discovery through one of three options: a static list, a DNS record lookup , or AWS Cloud Map. Peer discovery can be configured using `discovery_mode` option. Peer forwarder also supports SSL for verification and encryption, and mTLS for mutual authentication in a peer forwarding service.
 
-To configure the peer forwarder, add configuration options to `data-prepper-config.yaml` mentioned in the previous [Configure the Data Prepper server](#configure-the-data-prepper-server) section:
+To configure peer forwarder, add configuration options to `data-prepper-config.yaml` mentioned in the [Configure the Data Prepper server](#configure-the-data-prepper-server) section:
 
 ```yml
 peer_forwarder:
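The `peer_forwarder` block is cut off at the end of the hunk above. A minimal sketch of the configuration the surrounding text describes, assuming static discovery and the documented SSL options (hostnames and file paths are placeholders):

```yml
peer_forwarder:
  discovery_mode: static           # alternatives: dns, aws_cloud_map
  static_endpoints: ["data-prepper-node-1", "data-prepper-node-2"]  # placeholder hostnames
  ssl: true                        # encrypt traffic between nodes
  ssl_certificate_file: "/usr/share/data-prepper/config/peer.crt"   # placeholder path
  ssl_key_file: "/usr/share/data-prepper/config/peer.key"           # placeholder path
```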
@@ -10,7 +10,7 @@ Log ingestion provides a way to transform unstructured log data into structured
 
 ## Get started with log ingestion
 
-OpenSearch Log Ingestion consists of three components---[Data Prepper]({{site.url}}{{site.baseurl}}/clients/data-prepper/index/), [OpenSearch]({{site.url}}{{site.baseurl}}/) and [OpenSearch Dashboards]({{site.url}}{{site.baseurl}}/)---that fit into the OpenSearch ecosystem. The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
+OpenSearch Log Ingestion consists of three components---[Data Prepper]({{site.url}}{{site.baseurl}}/clients/data-prepper/index/), [OpenSearch]({{site.url}}{{site.baseurl}}/quickstart/) and [OpenSearch Dashboards]({{site.url}}{{site.baseurl}}/dashboards/index/)---that fit into the OpenSearch ecosystem. The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
 
 ### Basic flow of data
 
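The basic flow the hunk above introduces (log shipper, then Data Prepper, then OpenSearch) could be realized by a single pipeline such as the following sketch; the port, grok pattern, hosts, and index name are illustrative placeholders:

```yml
log-pipeline:
  source:
    http:
      port: 2021                   # a log shipper such as Fluent Bit posts log batches here (placeholder port)
  processor:
    - grok:
        match:
          log: [ "%{COMMONAPACHELOG}" ]   # substitute a pattern for your log format
  sink:
    - opensearch:
        hosts: [ "https://localhost:9200" ]   # placeholder
        index: application_logs               # placeholder
```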