Merge pull request #281 from opensearch-project/observability

Added observability plugin
Keith Chan 2021-11-23 09:45:29 -08:00 committed by GitHub
commit a5039bd90a
25 changed files with 157 additions and 69 deletions

View File

@ -48,6 +48,9 @@ collections:
replication-plugin:
permalink: /:collection/:path/
output: true
observability-plugins:
permalink: /:collection/:path/
output: true
monitoring-plugins:
permalink: /:collection/:path/
output: true
@ -87,6 +90,9 @@ just_the_docs:
replication-plugin:
name: Replication plugin
nav_fold: true
observability-plugins:
name: Observability plugins
nav_fold: true
monitoring-plugins:
name: Monitoring plugins
nav_fold: true

View File

@ -0,0 +1,33 @@
---
layout: default
title: Event analytics
nav_order: 10
---
# Event analytics
Event analytics in observability lets you use [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugins/ppl/index) (PPL) queries to build and view different visualizations of your data.
## Get started with event analytics
To get started, choose **Observability** in OpenSearch Dashboards, and then choose **Event analytics**. If you want to start exploring without adding any of your own data, choose **Add sample Events Data**, and Dashboards adds some sample visualizations you can interact with.
## Build a query
To generate custom visualizations, you must first specify a PPL query. OpenSearch Dashboards then automatically creates a visualization based on the results of your query.
For example, the following PPL query returns a count of how many host addresses are currently in your data.
```
source = opensearch_dashboards_sample_data_logs | fields host | stats count()
```
By default, Dashboards shows results from the last 15 minutes of your data. To see data from a different timeframe, use the date and time selector.
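If you want a visualization to show how a value changes over time rather than a single total, you can group results into time buckets. The following query is a sketch only: it assumes the sample logs index has a `timestamp` field and that your version of PPL supports the `span()` grouping function.
```
source = opensearch_dashboards_sample_data_logs | stats count() by span(timestamp, 1h)
```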
For more information about building PPL queries, see [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugins/ppl/index).
## Save a visualization
After Dashboards generates a visualization, you must save it if you want to return to it at a later time or if you want to add it to an [operational panel]({{site.url}}{{site.baseurl}}/observability-plugins/operational-panels).
To save a visualization, expand the save dropdown menu next to **Run**, enter a name for your visualization, then choose **Save**. You can reopen any saved visualizations on the event analytics page.

View File

@ -0,0 +1,26 @@
---
layout: default
title: About Observability
nav_order: 1
has_children: false
redirect_from:
- /observability-plugins/
---
# About Observability
OpenSearch Dashboards
{: .label .label-yellow }
Observability is a collection of plugins that let you visualize data-driven events by using Piped Processing Language (PPL) to explore, discover, and query data stored in OpenSearch.
Your workflow might differ, but if you're new to exploring data and creating visualizations, we recommend trying a workflow like the following:
1. Explore data over a certain timeframe using [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugins/ppl/index).
1. Use [event analytics]({{site.url}}{{site.baseurl}}/observability-plugins/event-analytics) to turn data-driven events into visualizations.
![Sample Event Analytics View]({{site.url}}{{site.baseurl}}/images/event-analytics.png)
1. Create [operational panels]({{site.url}}{{site.baseurl}}/observability-plugins/operational-panels) and add visualizations to compare data the way you like.
![Sample Operational Panel View]({{site.url}}{{site.baseurl}}/images/operational-panel.png)
1. Use [trace analytics]({{site.url}}{{site.baseurl}}/observability-plugins/trace/index) to create traces and dive deep into your data.
![Sample Trace Analytics View]({{site.url}}{{site.baseurl}}/images/observability-trace.png)
1. Leverage [notebooks]({{site.url}}{{site.baseurl}}/observability-plugins/notebooks) to combine different visualizations and code blocks that you can share with team members.
![Sample Notebooks View]({{site.url}}{{site.baseurl}}/images/notebooks.png)

View File

@ -0,0 +1,25 @@
---
layout: default
title: Operational panels
nav_order: 30
---
# Operational panels
Operational panels in OpenSearch Dashboards are collections of visualizations generated using [Piped Processing Language]({{site.url}}{{site.baseurl}}/observability-plugins/ppl/index) (PPL) queries.
## Get started with operational panels
If you want to start using operational panels without adding any data, expand the **Action** menu, choose **Add samples**, and Dashboards adds a set of operational panels with saved visualizations for you to explore.
## Create an operational panel
To create an operational panel and add visualizations:
1. From the **Add Visualization** dropdown menu, choose **Select Existing Visualization** or **Create New Visualization**. The latter takes you to the [event analytics]({{site.url}}{{site.baseurl}}/observability-plugins/event-analytics) explorer, where you can use PPL to create visualizations.
1. If you're adding an existing visualization, choose it from the dropdown menu.
1. Choose **Add**.
![Sample operational panel]({{site.url}}{{site.baseurl}}/images/operational-panel.png)
To search for a particular visualization in your operational panels, use PPL queries to search for data you've already added to your panel.
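For example, a query like the following narrows a panel's results to server errors per host. This is a sketch only: it assumes the sample logs index used in event analytics, with fields named `response` and `host`.
```
source = opensearch_dashboards_sample_data_logs | where response = '503' | stats count() by host
```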

View File

@ -47,7 +47,7 @@ search source=accounts;
| account_number | firstname | address | balance | gender | city | employer | state | age | email | lastname |
:--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- | :---
| 1 | Amber | 880 Holmes Lane | 39225 | M | Brogan | Pyrami | IL | 32 | amberduke@pyrami.com | Duke
| 6 | Hattie | 671 Bristol Street | 5686 | M | Dante | Netagy | TN | 36 | hattiebond@netagy.com | Bond
| 13 | Nanette | 789 Madison Street | 32838 | F | Nogal | Quility | VA | 28 | null | Bates
| 18 | Dale | 467 Hutchinson Court | 4180 | M | Orick | null | MD | 33 | daleadams@boink.com | Adams
@ -80,7 +80,7 @@ Field | Description | Type | Required | Default
`int` | Retain the specified number of duplicate events for each combination. The number must be greater than 0. If you do not specify a number, only the first occurring event is kept and all other duplicates are removed from the results. | `string` | No | 1
`keepempty` | If true, keep the document if any field in the field list has a null value or is missing. | `nested list of objects` | No | False
`consecutive` | If true, remove only consecutive events with duplicate combinations of values. | `Boolean` | No | False
`field-list` | Specify a comma-delimited field list. At least one field is required. | `string` or comma-separated list of strings | Yes | -
`field-list` | Specify a comma-delimited field list. At least one field is required. | `String` or comma-separated list of strings | Yes | -
*Example 1*: Dedup by one field
@ -90,7 +90,7 @@ To remove duplicate documents with the same gender:
search source=accounts | dedup gender | fields account_number, gender;
```
| account_number | gender
:--- | :--- |
1 | M
13 | F
@ -104,7 +104,7 @@ To keep two duplicate documents with the same gender:
search source=accounts | dedup 2 gender | fields account_number, gender;
```
| account_number | gender
:--- | :--- |
1 | M
6 | M
@ -145,7 +145,7 @@ To remove duplicates of consecutive documents:
search source=accounts | dedup gender consecutive=true | fields account_number, gender;
```
| account_number | gender
:--- | :--- |
1 | M
13 | F
@ -176,9 +176,9 @@ search source=accounts | eval doubleAge = age * 2 | fields age, doubleAge;
| age | doubleAge
:--- | :--- |
32 | 64
36 | 72
28 | 56
33 | 66
*Example 2*: Overwrite the existing field
@ -191,10 +191,10 @@ search source=accounts | eval age = age + 1 | fields age;
| age
:--- |
| 33
| 37
| 29
| 34
*Example 3*: Create a new field with a field defined with the `eval` command
@ -206,10 +206,10 @@ search source=accounts | eval doubleAge = age * 2, ddAge = doubleAge * 2 | field
| age | doubleAge | ddAge
:--- | :--- | :--- |
| 32 | 64 | 128
| 36 | 72 | 144
| 28 | 56 | 112
| 33 | 66 | 132
## fields
@ -234,11 +234,11 @@ To get `account_number`, `firstname`, and `lastname` fields from a search result
search source=accounts | fields account_number, firstname, lastname;
```
| account_number | firstname | lastname
:--- | :--- | :--- |
| 1 | Amber | Duke
| 6 | Hattie | Bond
| 13 | Nanette | Bates
| 18 | Dale | Adams
*Example 2*: Remove specified fields from a search result
@ -251,10 +251,10 @@ search source=accounts | fields account_number, firstname, lastname | fields - a
| firstname | lastname
:--- | :--- |
| Amber | Duke
| Hattie | Bond
| Nanette | Bates
| Dale | Adams
## rename
@ -281,9 +281,9 @@ search source=accounts | rename account_number as an | fields an;
| an
:--- |
| 1
| 6
| 13
| 18
*Example 2*: Rename multiple fields
@ -296,10 +296,10 @@ search source=accounts | rename account_number as an, employer as emp | fields a
| an | emp
:--- | :--- |
| 1 | Pyrami
| 6 | Netagy
| 13 | Quility
| 18 | null
## sort
@ -327,9 +327,9 @@ search source=accounts | sort age | fields account_number, age;
| account_number | age |
:--- | :--- |
| 13 | 28
| 1 | 32
| 18 | 33
| 6 | 36
*Example 2*: Sort by one field and return all results
@ -342,9 +342,9 @@ search source=accounts | sort 0 age | fields account_number, age;
| account_number | age |
:--- | :--- |
| 13 | 28
| 1 | 32
| 18 | 33
| 6 | 36
*Example 3*: Sort by one field in descending order
@ -357,9 +357,9 @@ search source=accounts | sort - age | fields account_number, age;
| account_number | age |
:--- | :--- |
| 6 | 36
| 18 | 33
| 1 | 32
| 13 | 28
*Example 4*: Specify the number of sorted documents to return
@ -372,8 +372,8 @@ search source=accounts | sort 2 age | fields account_number, age;
| account_number | age |
:--- | :--- |
| 13 | 28
| 1 | 32
*Example 5*: Sort by multiple fields
@ -385,9 +385,9 @@ search source=accounts | sort + gender, - age | fields account_number, gender, a
| account_number | gender | age |
:--- | :--- | :--- |
| 13 | F | 28
| 6 | M | 36
| 18 | M | 33
| 1 | M | 32
## stats
@ -438,7 +438,7 @@ search source=accounts | stats avg(age) by gender;
| gender | avg(age)
:--- | :--- |
| F | 28.0
| M | 33.666666666666664
*Example 3*: Calculate the average and sum of a field by group
@ -451,7 +451,7 @@ search source=accounts | stats avg(age), sum(age) by gender;
| gender | avg(age) | sum(age)
:--- | :--- | :--- |
| F | 28 | 28
| M | 33.666666666666664 | 101
*Example 4*: Calculate the maximum value of a field
@ -464,7 +464,7 @@ search source=accounts | stats max(age);
| max(age)
:--- |
| 36
*Example 5*: Calculate the maximum and minimum value of a field by group
@ -476,7 +476,7 @@ search source=accounts | stats max(age), min(age) by gender;
| gender | min(age) | max(age)
:--- | :--- | :--- |
| F | 28 | 28
| M | 32 | 36
## where
@ -503,7 +503,7 @@ search source=accounts | where account_number=1 or gender=\"F\" | fields account
| account_number | gender
:--- | :--- |
| 1 | M
| 13 | F
## head
@ -573,7 +573,7 @@ search source=accounts | rare gender;
| gender
:--- |
| F
| M
*Example 2*: Find the least common values grouped by gender
@ -586,7 +586,7 @@ search source=accounts | rare age by gender;
| gender | age
:--- | :--- |
| F | 28
| M | 32
| M | 33
@ -616,7 +616,7 @@ search source=accounts | top gender;
| gender
:--- |
| M
| F
*Example 2*: Find the most common value in a field
@ -629,7 +629,7 @@ search source=accounts | top 1 gender;
| gender
:--- |
| M
*Example 3*: Find the most common values grouped by gender
@ -641,5 +641,5 @@ search source=accounts | top 1 age by gender;
| gender | age
:--- | :--- |
| F | 28
| M | 32

View File

@ -1,7 +1,7 @@
---
layout: default
title: Piped processing language
nav_order: 42
nav_order: 40
has_children: true
has_toc: false
redirect_from:
@ -52,9 +52,9 @@ search source=accounts
firstname | lastname |
:--- | :--- |
Amber | Duke
Hattie | Bond
Nanette | Bates
Dale | Adams
![PPL query workbench]({{site.url}}{{site.baseurl}}/images/ppl.png)

View File

@ -7,7 +7,7 @@ nav_order: 25
# Data Prepper configuration reference
This page lists all supported Data Prepper sources, buffers, preppers, and sinks, along with their associated options. For example configuration files, see [Data Prepper]({{site.url}}{{site.baseurl}}/monitoring-plugins/trace/data-prepper/).
This page lists all supported Data Prepper sources, buffers, preppers, and sinks, along with their associated options. For example configuration files, see [Data Prepper]({{site.url}}{{site.baseurl}}/observability-plugins/trace/data-prepper/).
## Data Prepper server options

View File

@ -105,7 +105,7 @@ service-map-pipeline:
trace_analytics_service_map: true
```
To learn more, see the [Data Prepper configuration reference]({{site.url}}{{site.baseurl}}/monitoring-plugins/trace/data-prepper-reference/).
To learn more, see the [Data Prepper configuration reference]({{site.url}}{{site.baseurl}}/observability-plugins/trace/data-prepper-reference/).
## Configure the Data Prepper server
Data Prepper itself provides administrative HTTP endpoints such as `/list` to list pipelines and `/metrics/prometheus` to provide Prometheus-compatible metrics data. The port which serves these endpoints, as well as TLS configuration, is specified by a separate YAML file. Example:
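A minimal sketch of that file follows. The option names assume Data Prepper 1.x; the port, keystore path, and passwords shown are placeholders, not values from this page.
```
ssl: true
keyStoreFilePath: /usr/share/data-prepper/keystore.p12  # placeholder path
keyStorePassword: "changeit"  # placeholder
privateKeyPassword: "changeit"  # placeholder
serverPort: 4900
```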

View File

@ -20,9 +20,9 @@ OpenSearch Trace Analytics consists of two components---Data Prepper and the Tra
1. The [OpenTelemetry Collector](https://opentelemetry.io/docs/collector/getting-started/) receives data from the application and formats it into OpenTelemetry data.
1. [Data Prepper]({{site.url}}{{site.baseurl}}/monitoring-plugins/trace/data-prepper/) processes the OpenTelemetry data, transforms it for use in OpenSearch, and indexes it on an OpenSearch cluster.
1. [Data Prepper]({{site.url}}{{site.baseurl}}/observability-plugins/trace/data-prepper/) processes the OpenTelemetry data, transforms it for use in OpenSearch, and indexes it on an OpenSearch cluster.
1. The [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/monitoring-plugins/trace/ta-dashboards/) displays the data in near real-time as a series of charts and tables, with an emphasis on service architecture, latency, error rate, and throughput.
1. The [Trace Analytics OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability-plugins/trace/ta-dashboards/) displays the data in near real-time as a series of charts and tables, with an emphasis on service architecture, latency, error rate, and throughput.
## Jaeger HotROD
@ -80,4 +80,4 @@ curl -X GET -u 'admin:admin' -k 'https://localhost:9200/otel-v1-apm-span-000001/
Navigate to `http://localhost:5601` in a web browser and choose **Trace Analytics**. You can see the results of your single click in the Jaeger HotROD web interface: the number of traces per API and HTTP method, latency trends, a color-coded map of the service architecture, and a list of trace IDs that you can use to drill down on individual operations.
If you don't see your trace, adjust the timeframe in OpenSearch Dashboards. For more information on using the plugin, see [OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/monitoring-plugins/trace/ta-dashboards/).
If you don't see your trace, adjust the timeframe in OpenSearch Dashboards. For more information on using the plugin, see [OpenSearch Dashboards plugin]({{site.url}}{{site.baseurl}}/observability-plugins/trace/ta-dashboards/).

View File

@ -1,11 +1,9 @@
---
layout: default
title: Trace analytics
nav_order: 48
nav_order: 60
has_children: true
has_toc: false
redirect_from:
- /monitoring-plugins/trace/
---
# Trace Analytics

View File

@ -228,7 +228,7 @@ POST logs-redis/_rollover
If you now perform a `GET` operation on the `logs-redis` data stream, you see that the generation ID is incremented from 1 to 2.
You can also set up an [Index State Management (ISM) policy]({{site.url}}{{site.baseurl}}/im-plugin/ism/policies/) to automate the rollover process for the data stream.
The ISM policy is applied to the backing indices at the time of their creation. When you associate a policy to a data stream, it only affects the future backing indices of that data stream.
You also don't need to provide the `rollover_alias` setting, because the ISM policy infers this information from the backing index.
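A sketch of such a policy follows; the policy name, index pattern, one-day rollover threshold, and priority here are illustrative assumptions, not values from this page.
```
PUT _plugins/_ism/policies/logs-redis-rollover
{
  "policy": {
    "description": "Roll over data stream backing indices after one day",
    "default_state": "rollover",
    "states": [
      {
        "name": "rollover",
        "actions": [
          { "rollover": { "min_index_age": "1d" } }
        ],
        "transitions": []
      }
    ],
    "ism_template": {
      "index_patterns": ["logs-redis"],
      "priority": 100
    }
  }
}
```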
@ -262,4 +262,4 @@ You can use wildcards to delete more than one data stream.
We recommend deleting data from a data stream using an ISM policy.
You can also use [asynchronous search]({{site.url}}{{site.baseurl}}/search-plugins/async/index/) and [SQL]({{site.url}}{{site.baseurl}}/search-plugins/sql/index/) and [PPL]({{site.url}}{{site.baseurl}}/search-plugins/ppl/index/) to query your data stream directly. You can also use the security plugin to define granular permissions on the data stream name.
You can also use [asynchronous search]({{site.url}}{{site.baseurl}}/search-plugins/async/index/) and [SQL]({{site.url}}{{site.baseurl}}/search-plugins/sql/index/) and [PPL]({{site.url}}{{site.baseurl}}/observability-plugins/ppl/index/) to query your data stream directly. You can also use the security plugin to define granular permissions on the data stream name.
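For example, a PPL query can read from a data stream by name just as it would from an index. The following is a sketch that reuses the `logs-redis` data stream from the rollover example above.
```
search source=logs-redis | stats count();
```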

BIN images/event-analytics.png (new file, 271 KiB; binary file not shown)

BIN images/notebooks.png (new file, 584 KiB; binary file not shown)

BIN new image file (578 KiB; binary file not shown)

BIN new image file (409 KiB; binary file not shown)

BIN updated image file (165 KiB before, 179 KiB after; binary file not shown)

BIN updated image file (200 KiB before, 216 KiB after; binary file not shown)