incorporated feedback

ashwinkumar12345 2021-07-06 23:51:45 -07:00
parent 70c9b6d9c8
commit 96f08a51d7
3 changed files with 17 additions and 18 deletions


@ -151,7 +151,7 @@ file {
This is the date stored within the `@timestamp` field, which is the time and date of the event.
Send a request to the pipeline and verify that the output filename contains the event's date.
You can embed the date in other outputs as well, for example into the index name in Opensearch.
You can embed the date in other outputs as well, for example into the index name in OpenSearch.
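For example, here is a minimal sketch of embedding the event date in an OpenSearch index name, assuming the `logstash-output-opensearch` plugin is installed; the host address and index prefix are placeholders:

```yml
output {
  opensearch {
    hosts => ["https://localhost:9200"]    # placeholder cluster address
    index => "web-logs-%{+YYYY.MM.dd}"     # the date from @timestamp is interpolated into the index name
  }
}
```

With this pattern, Logstash writes each day's events to a separate index, which makes retention easier to manage.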
## Sending time information
@ -199,7 +199,7 @@ Users might be using a wide range of browsers, devices, and OS's. Doing this man
You can't use `grok` patterns here because a `grok` pattern only matches the user agent string as a whole and doesn't, for instance, figure out which browser the visitor used.
Logstash ships with a file containing regular expressions for this purpose. This makes it really easy to extract user agent information, which you could send to Opensearch and run aggregations on.
Logstash ships with a file containing regular expressions for this purpose. This makes it really easy to extract user agent information, which you could send to OpenSearch and run aggregations on.
To do this, add a `source` option that contains the name of the field. In this case, that's the `agent` field.
By default, the user agent plugin adds a number of fields at the top level of the event.
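A minimal sketch of that filter, assuming the parsed fields should be nested under a `ua` field (mentioned below) rather than at the top level:

```yml
filter {
  useragent {
    source => "agent"   # field that contains the raw user agent string
    target => "ua"      # nest the parsed browser, OS, and device fields under `ua`
  }
}
```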
@ -216,7 +216,7 @@ Start Logstash and send an HTTP request.
You can see a field named `ua` with a number of keys including the browser name and version, the OS, and the device.
You could Opensearch Dashboards to create a pie chart that shows how many visitors are from mobile devices and how many are desktop users. Or, you could get statistics on which browser versions are popular.
You could use OpenSearch Dashboards to create a pie chart that shows how many visitors are from mobile devices and how many are desktop users. Or, you could get statistics on which browser versions are popular.
## Enriching geographical data
@ -241,6 +241,6 @@ If you only need the country name for instance, include an option named `fields`
Some of the fields, such as city name and region, are not always available because translating IP addresses into geographical locations is generally not that accurate. If the `geoip` plugin fails to look up the geographical location, it adds a tag named `geoip_lookup_failure`.
You can use the `geoip` plugin with the Opensearch output because `location` object within the `geoip` object, is a standard format for representing geospatial data in JSON. This is the same format as Opensearch uses for its `geo_point` data type.
You can use the `geoip` plugin with the OpenSearch output because the `location` object within the `geoip` object is a standard format for representing geospatial data in JSON. This is the same format that OpenSearch uses for its `geo_point` data type.
You can use the powerful geospatial queries of Opensearch for working with geographical data.
You can use the powerful geospatial queries of OpenSearch for working with geographical data.
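A minimal sketch of such a `geoip` filter; the `clientip` source field is a hypothetical name for wherever your pipeline stores the visitor's IP address:

```yml
filter {
  geoip {
    source => "clientip"                    # hypothetical field that holds the visitor's IP address
    fields => ["country_name", "location"]  # keep only the fields you need
  }
}
```

Because the resulting `location` object matches the `geo_point` format, you can map it to `geo_point` in your OpenSearch index and run geospatial queries against it.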


@ -49,7 +49,9 @@ Some of the popular codecs are `json` and `multiline`. The `json` codec processe
You can also write conditional statements within pipeline configurations to perform certain actions if certain criteria are met.
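For example, a minimal sketch of a conditional inside a filter block; the `status` field and the `not_found` tag are hypothetical:

```yml
filter {
  if [status] == "404" {          # hypothetical field set by an earlier filter
    mutate {
      add_tag => ["not_found"]    # tag the event so later filters or outputs can act on it
    }
  }
}
```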
## Install Logstash on MAC / Linux
## Install Logstash
The OpenSearch Logstash plugin has two installation options at this time: Linux (ARM64/X64) and Docker (ARM64/X64).
Make sure you have [Java Development Kit (JDK)](https://www.oracle.com/java/technologies/javase-downloads.html) version 8 or 11 installed.
@ -58,7 +60,7 @@ Make sure you have [Java Development Kit (JDK)](https://www.oracle.com/java/tech
2. Navigate to the downloaded folder in the terminal and extract the files:
```bash
tar -zxvf logstash-7.13.2-darwin-x86_64.tar.gz
tar -zxvf logstash-oss-with-opensearch-output-plugin-7.13.2-linux-x64.tar.gz
```
3. Navigate to the `logstash-7.13.2` directory.


@ -1,20 +1,20 @@
---
layout: default
title: Send events to Opensearch
title: Ship events to OpenSearch
parent: Logstash
nav_order: 220
---
# Send events to Opensearch
# Ship events to OpenSearch
You can send Logstash events to an Opensearch cluster and then visualize your events with Kibana.
You can ship Logstash events to an OpenSearch cluster and then visualize your events with OpenSearch Dashboards.
Make sure you have [Logstash]({{site.url}}{{site.baseurl}}/logstash/index/#install-logstash), [OpenSearch]({{site.url}}{{site.baseurl}}/opensearch/install/index/), and [OpenSearch Dashboards]({{site.url}}{{site.baseurl}}/dashboards/install/index/) installed.
{: .note }
## Opensearch output plugin
## OpenSearch output plugin
To run the Opensearch output plugin, add the following configuration in your `pipeline.conf` file:
To run the OpenSearch output plugin, add the following configuration in your `pipeline.conf` file:
```yml
output {
@ -38,11 +38,6 @@ output {
stdin {
codec => json
}
http {
host => "127.0.0.1"
port => 8080
}
}
output {
@ -56,6 +51,8 @@ output {
}
```
The Logstash pipeline accepts JSON input through the terminal and ships the events to an OpenSearch cluster running locally. Logstash writes the events to an index with the `logstash-logs-%{+YYYY.MM.dd}` naming convention.
2. Start Logstash:
```bash
@ -70,7 +67,7 @@ output {
{ "amount": 10, "quantity": 2}
```
4. Open Opensearch and search for the processed event:
4. Start OpenSearch Dashboards and choose **Dev Tools**:
```json
GET _cat/indices?v