Updates the images used for Data Prepper use cases to use images with OpenSearch icons and typography. (#3844)

Signed-off-by: David Venable <dlv@amazon.com>
David Venable 2023-04-27 14:50:50 -05:00 committed by GitHub
parent 4251fb04bd
commit 3baac55d9f
10 changed files with 4 additions and 4 deletions


@@ -11,7 +11,7 @@ Data Prepper is an extendable, configurable, and scalable solution for log inges
The following image shows all of the components used for log analytics with Fluent Bit, Data Prepper, and OpenSearch.
-![Log analytics component]({{site.url}}{{site.baseurl}}/images/data-prepper/log-analytics/log-analytics-components.png)
+![Log analytics component]({{site.url}}{{site.baseurl}}/images/data-prepper/log-analytics/log-analytics-components.jpg)
In the application environment, run Fluent Bit. Fluent Bit can be containerized through Kubernetes, Docker, or Amazon Elastic Container Service (Amazon ECS). You can also run Fluent Bit as an agent on Amazon Elastic Compute Cloud (Amazon EC2). Configure the [Fluent Bit http output plugin](https://docs.fluentbit.io/manual/pipeline/outputs/http) to export log data to Data Prepper. Then deploy Data Prepper as an intermediate component and configure it to send the enriched log data to your OpenSearch cluster. From there, use OpenSearch Dashboards to perform more intensive visualization and analysis.
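For illustration, a Fluent Bit output along the following lines forwards log records to Data Prepper over HTTP. This is a minimal sketch using Fluent Bit's YAML configuration format; the input path and Data Prepper host name are placeholders, and port 2021 with the `/log/ingest` path reflects the HTTP source's defaults.

```yaml
# fluent-bit.yaml -- minimal sketch; host and log path are placeholders
service:
  flush: 5
pipeline:
  inputs:
    - name: tail
      path: /var/log/app/*.log              # placeholder application log path
  outputs:
    - name: http                             # Fluent Bit http output plugin
      match: "*"
      host: data-prepper.example.internal    # placeholder Data Prepper host
      port: 2021                             # Data Prepper HTTP source default port
      uri: /log/ingest                       # Data Prepper HTTP source default path
      format: json
```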
@@ -19,7 +19,7 @@ In the application environment, run Fluent Bit. Fluent Bit can be containerized
Log analytics pipelines in Data Prepper are extremely customizable. The following image shows a simple pipeline.
-![Log analytics component]({{site.url}}{{site.baseurl}}/images/data-prepper/log-analytics/log-ingestion-fluent-bit-data-prepper.png)
+![Log analytics component]({{site.url}}{{site.baseurl}}/images/data-prepper/log-analytics/log-ingestion-pipeline.jpg)
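A pipeline similar to the one pictured can be sketched in `pipelines.yaml` as follows; the grok pattern, cluster endpoint, index name, and credentials are illustrative placeholders.

```yaml
# pipelines.yaml -- minimal log analytics sketch; values are placeholders
log-pipeline:
  source:
    http:
      port: 2021                       # receives JSON log batches from Fluent Bit
  processor:
    - grok:
        match:
          log: ["%{COMMONAPACHELOG}"]  # parse Apache-style access logs; substitute your own pattern
  sink:
    - opensearch:
        hosts: ["https://opensearch.example.internal:9200"]   # placeholder cluster endpoint
        username: admin                # placeholder credentials
        password: admin
        index: application_logs        # placeholder index name
```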
### HTTP source


@@ -15,7 +15,7 @@ When using Data Prepper as a server-side component to collect trace data, you ca
The following flowchart illustrates the trace analytics workflow, from running OpenTelemetry Collector to using OpenSearch Dashboards for visualization.
-<img src="{{site.url}}{{site.baseurl}}/images/data-prepper/trace-analytics/trace-analytics-components.png" alt="Trace analytics component overview">{: .img-fluid}
+<img src="{{site.url}}{{site.baseurl}}/images/data-prepper/trace-analytics/trace-analytics-components.jpg" alt="Trace analytics component overview">{: .img-fluid}
To monitor trace analytics, you need to set up the following components in your service environment:
- Add **instrumentation** to your application so it can generate telemetry data and send it to an OpenTelemetry collector.
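As a sketch of the collector piece, an OpenTelemetry Collector configuration along these lines receives OTLP spans from instrumented applications and exports them to Data Prepper; the Data Prepper host is a placeholder, port 21890 is the OTel trace source's default, and TLS is assumed to be disabled on that source.

```yaml
# otel-collector-config.yaml -- minimal sketch; host name is a placeholder
receivers:
  otlp:
    protocols:
      grpc:                            # accept OTLP/gRPC spans from instrumented services

exporters:
  otlp/data-prepper:
    endpoint: data-prepper.example.internal:21890   # Data Prepper OTel trace source default port
    tls:
      insecure: true                   # assumes SSL is disabled on the Data Prepper source

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/data-prepper]
```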
@@ -27,7 +27,7 @@ To monitor trace analytics, you need to set up the following components in your
To monitor trace analytics in Data Prepper, we provide three pipelines: `entry-pipeline`, `raw-trace-pipeline`, and `service-map-pipeline`. The following image provides an overview of how the pipelines work together to monitor trace analytics.
-<img src="{{site.url}}{{site.baseurl}}/images/data-prepper/trace-analytics/trace-analytics-feature.jpg" alt="Trace analytics pipeline overview">{: .img-fluid}
+<img src="{{site.url}}{{site.baseurl}}/images/data-prepper/trace-analytics/trace-analytics-pipeline.jpg" alt="Trace analytics pipeline overview">{: .img-fluid}
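The three pipelines can be wired together roughly as follows. This is an abbreviated sketch that omits authentication and tuning settings; the cluster endpoints are placeholders, and processor names can vary across Data Prepper versions.

```yaml
# pipelines.yaml -- abbreviated trace analytics sketch; endpoints are placeholders
entry-pipeline:
  source:
    otel_trace_source:
      ssl: false                       # assumes a plaintext listener for local testing
  sink:
    - pipeline:
        name: "raw-trace-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-trace-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  processor:
    - otel_trace_raw:                  # processor name may differ by version
  sink:
    - opensearch:
        hosts: ["https://opensearch.example.internal:9200"]
        index_type: trace-analytics-raw
service-map-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  processor:
    - service_map_stateful:            # processor name may differ by version
  sink:
    - opensearch:
        hosts: ["https://opensearch.example.internal:9200"]
        index_type: trace-analytics-service-map
```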
### OpenTelemetry trace source

Eight binary image files changed (contents not shown): four images removed (41 KiB, 35 KiB, 42 KiB, and 44 KiB) and four images added (33 KiB, 70 KiB, 36 KiB, and 84 KiB).