Merge pull request #400 from opensearch-project/data-prepper-move

Moving data prepper to clients and tools
Keith Chan 2022-02-09 13:06:35 -08:00 committed by GitHub
commit 2555f1ba1c
24 changed files with 32 additions and 32 deletions

View File

@@ -1,7 +1,7 @@
 ---
 layout: default
 title: Agents and ingestion tools
-nav_order: 100
+nav_order: 140
 has_children: false
 has_toc: false
 redirect_from:

View File

@@ -7,7 +7,7 @@ nav_order: 3
 # Data Prepper configuration reference
-This page lists all supported Data Prepper server, sources, buffers, preppers, and sinks, along with their associated options. For example configuration files, see [Data Prepper]({{site.url}}{{site.baseurl}}/observability/data-prepper/pipelines/).
+This page lists all supported Data Prepper server, sources, buffers, preppers, and sinks, along with their associated options. For example configuration files, see [Data Prepper]({{site.url}}{{site.baseurl}}/clients/data-prepper/pipelines/).
 ## Data Prepper server options
@@ -49,7 +49,7 @@ max_connection_count | No | Integer | The maximum allowed number of open connect
 ssl | No | Boolean | Enables connections to the OTel source port over TLS/SSL. Defaults to `true`.
 sslKeyCertChainFile | Conditionally | String | File-system path or AWS S3 path to the security certificate (e.g. `"config/demo-data-prepper.crt"` or `"s3://my-secrets-bucket/demo-data-prepper.crt"`). Required if ssl is set to `true`.
 sslKeyFile | Conditionally | String | File-system path or AWS S3 path to the security key (e.g. `"config/demo-data-prepper.key"` or `"s3://my-secrets-bucket/demo-data-prepper.key"`). Required if ssl is set to `true`.
-useAcmCertForSSL | No | Boolean, enables TLS/SSL using certificate and private key from AWS Certificate Manager (ACM). Default is `false`.
+useAcmCertForSSL | No | Boolean | Whether to enable TLS/SSL using certificate and private key from AWS Certificate Manager (ACM). Default is `false`.
 acmCertificateArn | Conditionally | String | Represents the ACM certificate ARN. ACM certificates take precedence over S3 or local file system certificates. Required if `useAcmCertForSSL` is set to `true`.
 awsRegion | Conditionally | String | Represents the AWS region to use for ACM or S3. Required if `useAcmCertForSSL` is set to `true` or if `sslKeyCertChainFile` and `sslKeyFile` are AWS S3 paths.
 authentication | No | Object | An authentication configuration. By default, this runs an unauthenticated server. This uses pluggable authentication for HTTPS. To use basic authentication, define the `http_basic` plugin with a `username` and `password`. To provide custom authentication, use or create a plugin that implements [GrpcAuthenticationProvider](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/armeria-common/src/main/java/com/amazon/dataprepper/armeria/authentication/GrpcAuthenticationProvider.java).
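For review context, here is a minimal sketch of how the TLS and basic-authentication options in the table above might be combined, assuming they sit under an `otel_trace_source` source; the pipeline name, sink, file paths, and credentials are illustrative and not part of this commit:

```yml
# Hypothetical pipeline snippet combining the options documented above.
entry-pipeline:
  source:
    otel_trace_source:
      ssl: true                                            # TLS on the OTel source port (defaults to true)
      sslKeyCertChainFile: "config/demo-data-prepper.crt"  # certificate path from the table's example
      sslKeyFile: "config/demo-data-prepper.key"           # key path from the table's example
      authentication:
        http_basic:                                        # pluggable basic authentication
          username: "my-user"                              # placeholder credentials
          password: "my-password"
  sink:
    - stdout:
```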

View File

@@ -41,7 +41,7 @@ docker run --name data-prepper \
 opensearchproject/opensearch-data-prepper:latest
 ```
-The sample pipeline configuration above demonstrates a simple pipeline with a source (`random`) sending data to a sink (`stdout`). For more examples and details on advanced pipeline configurations, see [Pipelines]({{site.url}}{{site.baseurl}}/observability/data-prepper/pipelines).
+The sample pipeline configuration above demonstrates a simple pipeline with a source (`random`) sending data to a sink (`stdout`). For more examples and details on advanced pipeline configurations, see [Pipelines]({{site.url}}{{site.baseurl}}/clients/data-prepper/pipelines).
 After starting Data Prepper, you should see log output and some UUIDs after a few seconds:
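The excerpt cuts off before the log output. The pipeline being run is the source-to-sink example the hunk describes; a minimal sketch of such a configuration (the name matches the `simple-sample-pipeline` that appears later in this commit) would be:

```yml
# Minimal pipeline: the random source emits UUIDs, the stdout sink prints them.
simple-sample-pipeline:
  source:
    random:
  sink:
    - stdout:
```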

View File

@@ -1,7 +1,7 @@
 ---
 layout: default
 title: Data Prepper
-nav_order: 80
+nav_order: 120
 has_children: true
 has_toc: false
 ---
@@ -10,6 +10,6 @@ has_toc: false
 Data Prepper is a server-side data collector capable of filtering, enriching, transforming, normalizing, and aggregating data for downstream analytics and visualization.
-Data Prepper lets users build custom pipelines to improve the operational view of applications. Two common uses for Data Prepper are trace and log analytics. [Trace analytics]({{site.url}}{{site.baseurl}}/observability/trace/index/) can help you visualize the flow of events and identify performance problems, and [log analytics]({{site.url}}{{site.baseurl}}/observability/log-analytics/) can improve searching and analyzing, and provide insights into your application.
+Data Prepper lets users build custom pipelines to improve the operational view of applications. Two common uses for Data Prepper are trace and log analytics. [Trace analytics]({{site.url}}{{site.baseurl}}/observability-plugin/trace/index/) can help you visualize the flow of events and identify performance problems, and [log analytics]({{site.url}}{{site.baseurl}}/observability-plugin/log-analytics/) can improve searching and analyzing, and provide insights into your application.
-To get started building your own custom pipelines with Data Prepper, see the [Get Started]({{site.url}}{{site.baseurl}}/observability/data-prepper/get-started/) guide.
+To get started building your own custom pipelines with Data Prepper, see the [Get Started]({{site.url}}{{site.baseurl}}/clients/data-prepper/get-started/) guide.

View File

@@ -42,7 +42,7 @@ simple-sample-pipeline:
 ## Examples
-This section provides some pipeline examples that you can use to start creating your own pipelines. For more information, see the [Data Prepper configuration reference]({{site.url}}{{site.baseurl}}/observability/data-prepper/data-prepper-reference/) guide.
+This section provides some pipeline examples that you can use to start creating your own pipelines. For more information, see the [Data Prepper configuration reference]({{site.url}}{{site.baseurl}}/clients/data-prepper/data-prepper-reference/) guide.
 The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
@@ -73,7 +73,7 @@ This example uses weak security. We strongly recommend securing all plugins whic
 ### Trace Analytics pipeline
-The following example demonstrates how to build a pipeline that supports the [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability/trace/ta-dashboards/). This pipeline takes data from the OpenTelemetry Collector and uses two other pipelines as sinks. These two separate pipelines index the trace and service map documents for the dashboard plugin.
+The following example demonstrates how to build a pipeline that supports the [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability-plugin/trace/ta-dashboards/). This pipeline takes data from the OpenTelemetry Collector and uses two other pipelines as sinks. These two separate pipelines index the trace and service map documents for the dashboard plugin.
 ```yml
 entry-pipeline:
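The diff excerpt ends inside the example pipeline. For context, here is a sketch of the kind of trace analytics pipeline the text describes, with plugin names taken from the Data Prepper sample applications; the hosts are placeholders and exact option names may differ by release:

```yml
entry-pipeline:
  source:
    otel_trace_source:
      ssl: false                      # demo only; enable TLS in production
  sink:
    - pipeline:
        name: "raw-pipeline"          # fan out to the two indexing pipelines
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:         # builds the raw trace documents
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        trace_analytics_raw: true
service-map-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:           # builds the service map documents
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        trace_analytics_service_map: true
```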

View File

@@ -1,7 +1,7 @@
 ---
 layout: default
 title: JavaScript client
-nav_order: 90
+nav_order: 100
 ---
 # JavaScript client

View File

@@ -49,7 +49,7 @@ collections:
   replication-plugin:
     permalink: /:collection/:path/
     output: true
-  observability:
+  observability-plugin:
     permalink: /:collection/:path/
     output: true
   monitoring-plugins:
@@ -91,8 +91,8 @@ just_the_docs:
     replication-plugin:
       name: Replication plugin
      nav_fold: true
-    observability:
-      name: Observability
+    observability-plugin:
+      name: Observability plugin
       nav_fold: true
     monitoring-plugins:
       name: Monitoring plugins

View File

@@ -6,7 +6,7 @@ nav_order: 10
 # Event analytics
-Event analytics in observability is where you can use [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability/ppl/index) (PPL) queries to build and view different visualizations of your data.
+Event analytics in observability is where you can use [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugin/ppl/index) (PPL) queries to build and view different visualizations of your data.
 ## Get started with event analytics
@@ -24,10 +24,10 @@ source = opensearch_dashboards_sample_data_logs | fields host | stats count()
 By default, Dashboards shows results from the last 15 minutes of your data. To see data from a different timeframe, use the date and time selector.
-For more information about building PPL queries, see [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability/ppl/index).
+For more information about building PPL queries, see [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugin/ppl/index).
 ## Save a visualization
-After Dashboards generates a visualization, you must save it if you want to return to it at a later time or if you want to add it to an [operational panel]({{site.url}}{{site.baseurl}}/observability/operational-panels).
+After Dashboards generates a visualization, you must save it if you want to return to it at a later time or if you want to add it to an [operational panel]({{site.url}}{{site.baseurl}}/observability-plugin/operational-panels).
 To save a visualization, expand the save dropdown menu next to **Run**, enter a name for your visualization, then choose **Save**. You can reopen any saved visualizations on the event analytics page.

View File

@@ -4,8 +4,8 @@ title: About Observability
 nav_order: 1
 has_children: false
 redirect_from:
-  - /observability/
-  - /observability/
+  - /observability-plugin/
+  - /observability-plugin/
 ---
 # About Observability
@@ -16,13 +16,13 @@ Observability is collection of plugins and applications that let you visualize d
 Your experience of exploring data might differ, but if you're new to exploring data to create visualizations, we recommend trying a workflow like the following:
-1. Explore data over a certain timeframe using [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability/ppl/index).
-2. Use [event analytics]({{site.url}}{{site.baseurl}}/observability/event-analytics) to turn data-driven events into visualizations.
+1. Explore data over a certain timeframe using [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugin/ppl/index).
+2. Use [event analytics]({{site.url}}{{site.baseurl}}/observability-plugin/event-analytics) to turn data-driven events into visualizations.
 ![Sample Event Analytics View]({{site.url}}{{site.baseurl}}/images/event-analytics.png)
-3. Create [operational panels]({{site.url}}{{site.baseurl}}/observability/operational-panels) and add visualizations to compare data the way you like.
+3. Create [operational panels]({{site.url}}{{site.baseurl}}/observability-plugin/operational-panels) and add visualizations to compare data the way you like.
 ![Sample Operational Panel View]({{site.url}}{{site.baseurl}}/images/operational-panel.png)
-4. Use [log analytics]({{site.url}}{{site.baseurl}}/observability/log-analytics) to transform unstructured log data.
-5. Use [trace analytics]({{site.url}}{{site.baseurl}}/observability/trace/index) to create traces and dive deep into your data.
+4. Use [log analytics]({{site.url}}{{site.baseurl}}/observability-plugin/log-analytics) to transform unstructured log data.
+5. Use [trace analytics]({{site.url}}{{site.baseurl}}/observability-plugin/trace/index) to create traces and dive deep into your data.
 ![Sample Trace Analytics View]({{site.url}}{{site.baseurl}}/images/observability-trace.png)
-6. Leverage [notebooks]({{site.url}}{{site.baseurl}}/observability/notebooks) to combine different visualizations and code blocks that you can share with team members.
+6. Leverage [notebooks]({{site.url}}{{site.baseurl}}/observability-plugin/notebooks) to combine different visualizations and code blocks that you can share with team members.
 ![Sample Notebooks View]({{site.url}}{{site.baseurl}}/images/notebooks.png)

View File

@@ -10,7 +10,7 @@ Log ingestion provides a way to transform unstructured log data into structured
 ## Get started with log ingestion
-OpenSearch Log Ingestion consists of three components---[Data Prepper]({{site.url}}{{site.baseurl}}/observability/data-prepper/index/), [OpenSearch]({{site.url}}{{site.baseurl}}/), and [OpenSearch Dashboards]({{site.url}}{{site.baseurl}}/)---that fit into the OpenSearch ecosystem. The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
+OpenSearch Log Ingestion consists of three components---[Data Prepper]({{site.url}}{{site.baseurl}}/clients/data-prepper/index/), [OpenSearch]({{site.url}}{{site.baseurl}}/), and [OpenSearch Dashboards]({{site.url}}{{site.baseurl}}/)---that fit into the OpenSearch ecosystem. The Data Prepper repository has several [sample applications](https://github.com/opensearch-project/data-prepper/tree/main/examples) to help you get started.
 ### Basic flow of data
@@ -20,7 +20,7 @@ OpenSearch Log Ingestion consists of three components---[Data Prepper]({{site.ur
 (In the [example](#example) below, [FluentBit](https://docs.fluentbit.io/manual/) is used as a log collector that collects log data from a file and sends the log data to Data Prepper.)
-2. [Data Prepper]({{site.url}}{{site.baseurl}}/observability/data-prepper/index/) receives the log data, transforms the data into a structured format, and indexes it on an OpenSearch cluster.
+2. [Data Prepper]({{site.url}}{{site.baseurl}}/clients/data-prepper/index/) receives the log data, transforms the data into a structured format, and indexes it on an OpenSearch cluster.
 3. The data can then be explored through OpenSearch search queries or the **Discover** page in OpenSearch Dashboards.
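As a companion to this flow, here is a rough sketch of the kind of Data Prepper log pipeline the sample applications set up; the port, grok pattern, index name, and endpoint are placeholders, and note that newer Data Prepper releases use `processor` where this page's terminology says prepper:

```yml
log-pipeline:
  source:
    http:                        # receives log entries, e.g. from FluentBit
      port: 2021                 # placeholder port
  prepper:
    - grok:                      # parses unstructured lines into fields
        match:
          log: ["%{COMMONAPACHELOG}"]
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        index: apache_logs       # placeholder index name
```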

View File

@@ -6,7 +6,7 @@ nav_order: 30
 # Operational panels
-Operational panels in OpenSearch Dashboards are collections of visualizations generated using [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability/ppl/index) (PPL) queries.
+Operational panels in OpenSearch Dashboards are collections of visualizations generated using [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugin/ppl/index) (PPL) queries.
 ## Get started with operational panels
@@ -16,7 +16,7 @@ If you want to start using operational panels without adding any data, expand th
 To create an operational panel and add visualizations:
-1. From the **Add Visualization** dropdown menu, choose **Select Existing Visualization** or **Create New Visualization**, which takes you to the [event analytics]({{site.url}}{{site.baseurl}}/observability/event-analytics) explorer, where you can use PPL to create visualizations.
+1. From the **Add Visualization** dropdown menu, choose **Select Existing Visualization** or **Create New Visualization**, which takes you to the [event analytics]({{site.url}}{{site.baseurl}}/observability-plugin/event-analytics) explorer, where you can use PPL to create visualizations.
 1. If you're adding existing visualizations, choose a visualization from the dropdown menu.
 1. Choose **Add**.

View File

@@ -19,9 +19,9 @@ OpenSearch Trace Analytics consists of two components---Data Prepper and the Tra
 1. The [OpenTelemetry Collector](https://opentelemetry.io/docs/collector/getting-started/) receives data from the application and formats it into OpenTelemetry data.
-1. [Data Prepper]({{site.url}}{{site.baseurl}}/observability/data-prepper/index/) processes the OpenTelemetry data, transforms it for use in OpenSearch, and indexes it on an OpenSearch cluster.
-1. The [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability/trace/ta-dashboards/) displays the data in near real-time as a series of charts and tables, with an emphasis on service architecture, latency, error rate, and throughput.
+1. [Data Prepper]({{site.url}}{{site.baseurl}}/clients/data-prepper/index/) processes the OpenTelemetry data, transforms it for use in OpenSearch, and indexes it on an OpenSearch cluster.
+1. The [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability-plugin/trace/ta-dashboards/) displays the data in near real-time as a series of charts and tables, with an emphasis on service architecture, latency, error rate, and throughput.
 ## Jaeger HotROD
@@ -78,4 +78,4 @@ curl -X GET -u 'admin:admin' -k 'https://localhost:9200/otel-v1-apm-span-000001/
 Navigate to `http://localhost:5601` in a web browser and choose **Trace Analytics**. You can see the results of your single click in the Jaeger HotROD web interface: the number of traces per API and HTTP method, latency trends, a color-coded map of the service architecture, and a list of trace IDs that you can use to drill down on individual operations.
-If you don't see your trace, adjust the timeframe in OpenSearch Dashboards. For more information on using the plugin, see [OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability/trace/ta-dashboards/).
+If you don't see your trace, adjust the timeframe in OpenSearch Dashboards. For more information on using the plugin, see [OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability-plugin/trace/ta-dashboards/).

View File

@@ -262,4 +262,4 @@ You can use wildcards to delete more than one data stream.
 We recommend deleting data from a data stream using an ISM policy.
-You can also use [asynchronous search]({{site.url}}{{site.baseurl}}/search-plugins/async/index/), [SQL]({{site.url}}{{site.baseurl}}/search-plugins/sql/index/), and [PPL]({{site.url}}{{site.baseurl}}/observability/ppl/index/) to query your data stream directly. You can also use the security plugin to define granular permissions on the data stream name.
+You can also use [asynchronous search]({{site.url}}{{site.baseurl}}/search-plugins/async/index/), [SQL]({{site.url}}{{site.baseurl}}/search-plugins/sql/index/), and [PPL]({{site.url}}{{site.baseurl}}/observability-plugin/ppl/index/) to query your data stream directly. You can also use the security plugin to define granular permissions on the data stream name.