---
layout: default
title: Pipelines
parent: Data Prepper
nav_order: 2
---
# Pipelines
![Data Prepper Pipeline]({{site.url}}{{site.baseurl}}/images/data-prepper-pipeline.png)
To use Data Prepper, you define pipelines in a configuration YAML file. Each pipeline is a combination of a source, a buffer, zero or more processors, and one or more sinks. For example:
```yml
simple-sample-pipeline:
  workers: 2 # the number of workers
  delay: 5000 # in milliseconds, how long workers wait between read attempts
  source:
    random:
  buffer:
    bounded_blocking:
      buffer_size: 1024 # max number of records the buffer accepts
      batch_size: 256 # max number of records the buffer drains after each read
  processor:
    - string_converter:
        upper_case: true
  sink:
    - stdout:
```
- Sources define where your data comes from. In this case, the source is a random UUID generator (`random`).
- Buffers store data as it passes through the pipeline.
By default, Data Prepper uses its one and only buffer, the `bounded_blocking` buffer, so you can omit this section unless you developed a custom buffer or need to tune the buffer settings.
- Processors perform some action on your data: filter, transform, enrich, etc.
You can have multiple processors, which run sequentially from top to bottom, not in parallel. The `string_converter` processor transforms the strings by making them uppercase.
- Sinks define where your data goes. In this case, the sink is stdout.
Starting with Data Prepper 2.0, you can define pipelines across multiple configuration YAML files, where each file contains the configuration for one or more pipelines. This gives you more freedom to organize and chain complex pipeline configurations. For Data Prepper to load your pipeline configuration properly, place your configuration YAML files in the `pipelines` folder under your application's home directory (for example, `/usr/share/data-prepper`).
{: .note }
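For example, a larger configuration might be split across two files in the `pipelines` folder, with one pipeline feeding the other by name. The following is a minimal sketch: the file names and pipeline names are hypothetical, while the `pipelines` folder location and the `pipeline` source/sink syntax come from the examples in this document.

```yml
# pipelines/ingest-pipeline.yaml (hypothetical file name)
ingest-pipeline:
  source:
    http:
  sink:
    - pipeline:
        name: "convert-pipeline"

# pipelines/convert-pipeline.yaml (hypothetical file name)
convert-pipeline:
  source:
    pipeline:
      name: "ingest-pipeline"
  processor:
    - string_converter:
        upper_case: true
  sink:
    - stdout:
```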
## Conditional routing
Pipelines also support **conditional routing**, which allows you to route Events to different sinks based on specific conditions. To add conditional routing to a pipeline, specify a list of named routes under the `route` component and add specific routes to sinks under the `routes` property. Any sink with the `routes` property only accepts Events that match at least one of the routing conditions.
In the following example, `application-logs` is a named route with a condition set to `/log_type == "application"`. The route uses [Data Prepper expressions](https://github.com/opensearch-project/data-prepper/tree/main/examples) to define the conditions. Data Prepper routes only the Events that satisfy the condition to the first OpenSearch sink. By default, Data Prepper routes all Events to any sink that does not define `routes`. In the example, all Events route into the third OpenSearch sink.
```yml
conditional-routing-sample-pipeline:
  source:
    http:
  processor:
  route:
    - application-logs: '/log_type == "application"'
    - http-logs: '/log_type == "apache"'
  sink:
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        index: application_logs
        routes: [application-logs]
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        index: http_logs
        routes: [http-logs]
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        index: all_logs
```
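To try the routing conditions, you can post a few test Events to the `http` source. The following `curl` call is a sketch that assumes the source listens on its default port (2021) and default path (`/log/ingest`) and that each Event carries a `log_type` field for the route conditions to evaluate:

```bash
# Send one "application" Event and one "apache" Event to the http source.
# Port 2021 and the /log/ingest path are the http source defaults (assumption).
curl -s -X POST "http://localhost:2021/log/ingest" \
  -H "Content-Type: application/json" \
  -d '[{"log_type": "application", "message": "app started"},
       {"log_type": "apache", "message": "GET /index.html"}]'
```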
## Examples
This section provides some pipeline examples that you can use to start creating your own pipelines. For more information, see the [Data Prepper configuration reference]({{site.url}}{{site.baseurl}}/clients/data-prepper/data-prepper-reference/) guide.
The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
### Log ingestion pipeline
The following example demonstrates how to use the HTTP Source and Grok Processor plugins to process unstructured log data.
```yml
log-pipeline:
  source:
    http:
      ssl: false
  processor:
    - grok:
        match:
          log: [ "%{COMMONAPACHELOG}" ]
  sink:
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        insecure: true
        username: admin
        password: admin
        index: apache_logs
```
This example uses weak security. We strongly recommend securing all plugins that open external ports in production environments.
{: .note}
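As a quick sanity check, you could send a single Apache-style log line to this pipeline. The sketch below assumes the `http` source default port (2021) and default `/log/ingest` path; the field name `log` matches the key used in the `grok` processor above:

```bash
# Post one Apache common-format log line for the grok processor to parse
curl -s -X POST "http://localhost:2021/log/ingest" \
  -H "Content-Type: application/json" \
  -d '[{"log": "127.0.0.1 user-identifier frank [10/Oct/2000:13:55:36 -0700] \"GET /apache_pb.gif HTTP/1.0\" 200 2326"}]'
```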
### Trace analytics pipeline
The following example demonstrates how to build a pipeline that supports the [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability-plugin/trace/ta-dashboards/). This pipeline takes data from the OpenTelemetry Collector and uses two other pipelines as sinks. These two separate pipelines index trace documents and service map documents for the dashboard plugin.
Starting with Data Prepper 2.0, the `otel_trace_raw_prepper` processor is no longer supported because of changes to the internal Data Prepper data model. Use the `otel_trace_raw` processor instead.
```yml
entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  buffer:
    bounded_blocking:
      buffer_size: 10240
      batch_size: 160
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  buffer:
    bounded_blocking:
      buffer_size: 10240
      batch_size: 160
  processor:
    - otel_trace_raw:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        insecure: true
        username: admin
        password: admin
        index_type: trace-analytics-raw
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  buffer:
    bounded_blocking:
      buffer_size: 10240
      batch_size: 160
  processor:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        insecure: true
        username: admin
        password: admin
        index_type: trace-analytics-service-map
```
To maintain similar ingestion throughput and latency, scale the `buffer_size` and `batch_size` by the estimated maximum batch size in the client request payload.
{: .tip}
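To feed this pipeline, an OpenTelemetry Collector needs an exporter that points at the `otel_trace_source`. The following Collector snippet is a sketch; it assumes the Data Prepper host is reachable as `data-prepper` and that the source listens on its default port, 21890, with SSL disabled as in the example above:

```yml
receivers:
  otlp:
    protocols:
      grpc:

exporters:
  otlp/data-prepper:
    endpoint: data-prepper:21890
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/data-prepper]
```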
### Metrics pipeline
Data Prepper supports metrics ingestion using OTel. It currently supports the following metric types:
* Gauge
* Sum
* Summary
* Histogram
Other types are not supported. Data Prepper drops all other metric types, such as Exponential Histogram. Additionally, Data Prepper does not support Scope instrumentation.
To set up a metrics pipeline:
```yml
metrics-pipeline:
  source:
    otel_metrics_source:
  processor:
    - otel_metrics_raw_processor:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        username: admin
        password: admin
```
### S3 log ingestion pipeline
The following example demonstrates how to use the S3 Source and Grok Processor plugins to process unstructured log data from [Amazon Simple Storage Service](https://aws.amazon.com/s3/) (Amazon S3). This example uses Application Load Balancer logs. As the Application Load Balancer writes logs to S3, S3 creates notifications in Amazon SQS. Data Prepper reads those notifications, retrieves the corresponding S3 objects, and processes the log data.
```yml
log-pipeline:
  source:
    s3:
      notification_type: "sqs"
      compression: "gzip"
      codec:
        newline:
      sqs:
        queue_url: "https://sqs.us-east-1.amazonaws.com/12345678910/ApplicationLoadBalancer"
      aws:
        region: "us-east-1"
        sts_role_arn: "arn:aws:iam::12345678910:role/Data-Prepper"
  processor:
    - grok:
        match:
          message: ["%{DATA:type} %{TIMESTAMP_ISO8601:time} %{DATA:elb} %{DATA:client} %{DATA:target} %{BASE10NUM:request_processing_time} %{DATA:target_processing_time} %{BASE10NUM:response_processing_time} %{BASE10NUM:elb_status_code} %{DATA:target_status_code} %{BASE10NUM:received_bytes} %{BASE10NUM:sent_bytes} \"%{DATA:request}\" \"%{DATA:user_agent}\" %{DATA:ssl_cipher} %{DATA:ssl_protocol} %{DATA:target_group_arn} \"%{DATA:trace_id}\" \"%{DATA:domain_name}\" \"%{DATA:chosen_cert_arn}\" %{DATA:matched_rule_priority} %{TIMESTAMP_ISO8601:request_creation_time} \"%{DATA:actions_executed}\" \"%{DATA:redirect_url}\" \"%{DATA:error_reason}\" \"%{DATA:target_list}\" \"%{DATA:target_status_code_list}\" \"%{DATA:classification}\" \"%{DATA:classification_reason}"]
    - grok:
        match:
          request: ["(%{NOTSPACE:http_method})? (%{NOTSPACE:http_uri})? (%{NOTSPACE:http_version})?"]
    - grok:
        match:
          http_uri: ["(%{WORD:protocol})?(://)?(%{IPORHOST:domain})?(:)?(%{INT:http_port})?(%{GREEDYDATA:request_uri})?"]
    - date:
        from_time_received: true
        destination: "@timestamp"
  sink:
    - opensearch:
        hosts: [ "https://localhost:9200" ]
        username: "admin"
        password: "admin"
        index: alb_logs
```
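For the `sqs` notification type to receive anything, the S3 bucket must publish object-created events to the queue. The following AWS CLI sketch shows one way to set that up; the bucket name, queue ARN, and `notification.json` file name are placeholders, and the queue policy must separately allow S3 to send messages:

```bash
# Create a notification configuration that sends ObjectCreated events to the SQS queue
cat > notification.json <<'EOF'
{
  "QueueConfigurations": [
    {
      "QueueArn": "arn:aws:sqs:us-east-1:12345678910:ApplicationLoadBalancer",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
EOF

# Apply the configuration to the bucket that receives the ALB logs (placeholder name)
aws s3api put-bucket-notification-configuration \
  --bucket my-alb-logs-bucket \
  --notification-configuration file://notification.json
```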
## Migrating from Logstash
Data Prepper supports Logstash configuration files for a limited set of plugins. To use a Logstash configuration, mount it into the Data Prepper Docker container as the pipeline configuration:
```bash
docker run --name data-prepper \
-v /full/path/to/logstash.conf:/usr/share/data-prepper/pipelines/pipelines.conf \
opensearchproject/opensearch-data-prepper:latest
```
This feature is limited by the feature parity of Data Prepper. As of the Data Prepper 1.2 release, the following plugins from the Logstash configuration are supported (a sample Logstash configuration follows the list):
- HTTP Input plugin
- Grok Filter plugin
- Elasticsearch Output plugin
- Amazon Elasticsearch Output plugin
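The following `logstash.conf` is a minimal sketch built only from the plugins listed above; the port, index name, and credentials are placeholders, and Data Prepper converts each plugin to its own equivalent when it loads the file:

```conf
input {
  http {
    port => 8080
  }
}

filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "admin"
    password => "admin"
    index => "apache_logs"
  }
}
```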
## Configure the Data Prepper server
Data Prepper itself provides administrative HTTP endpoints, such as `/list` to list pipelines and `/metrics/prometheus` to provide Prometheus-compatible metrics data. The port that serves these endpoints has a TLS configuration and is specified in a separate YAML file. By default, these endpoints are secured by the Data Prepper Docker images. We strongly recommend providing your own configuration file to secure production environments. Here is an example `data-prepper-config.yaml`:
```yml
ssl: true
keyStoreFilePath: "/usr/share/data-prepper/keystore.jks"
keyStorePassword: "password"
privateKeyPassword: "other_password"
serverPort: 1234
```
To configure the Data Prepper server, run Data Prepper with the additional YAML file:
```bash
docker run --name data-prepper \
-v /full/path/to/my-pipelines.yaml:/usr/share/data-prepper/pipelines/my-pipelines.yaml \
-v /full/path/to/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml \
opensearchproject/data-prepper:latest
```
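Once the server is running, you can check the administrative endpoints. The calls below are a sketch that assumes the `serverPort: 1234` and TLS settings from the example `data-prepper-config.yaml` above, and that the port is published from the container (for example, with `-p 1234:1234`); `-k` skips certificate verification and is only appropriate for local testing:

```bash
# List the running pipelines
curl -k https://localhost:1234/list

# Scrape Prometheus-compatible metrics
curl -k https://localhost:1234/metrics/prometheus
```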
## Configure the peer forwarder
Data Prepper provides an HTTP service to forward Events between Data Prepper nodes for aggregation. This is required for operating Data Prepper in a clustered deployment. Currently, peer forwarding is supported in the `aggregate`, `service_map_stateful`, and `otel_trace_raw` processors. Peer forwarder groups Events based on the identification keys provided by the processors. For `service_map_stateful` and `otel_trace_raw`, the key is `traceId` by default and cannot be configured. For the `aggregate` processor, the key is configurable using the `identification_keys` option.
Peer forwarder supports peer discovery through one of three options: a static list, a DNS record lookup, or AWS Cloud Map. You can configure the discovery method using the `discovery_mode` option. Peer forwarder also supports SSL for verification and encryption and mutual TLS (mTLS) for mutual authentication in the peer forwarding service.
To configure the peer forwarder, add configuration options to the `data-prepper-config.yaml` file described in the previous [Configure the Data Prepper server](#configure-the-data-prepper-server) section:
```yml
peer_forwarder:
  discovery_mode: dns
  domain_name: "data-prepper-cluster.my-domain.net"
  ssl: true
  ssl_certificate_file: "<cert-file-path>"
  ssl_key_file: "<private-key-file-path>"
  authentication:
    mutual_tls:
```
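For a small, fixed-size cluster, you can use the static discovery mode instead of DNS. The following sketch assumes two placeholder hostnames that resolve to the other Data Prepper nodes:

```yml
peer_forwarder:
  discovery_mode: static
  static_endpoints: ["data-prepper-node-1.example.com", "data-prepper-node-2.example.com"]
```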