Logstash is a real-time event processing engine. It's part of the Opensearch stack, which includes Opensearch, Beats, and Opensearch Dashboards.

You can send events to Logstash from many different sources. Logstash processes the events and sends them to one or more destinations. For example, you can send access logs from a web server to Logstash. Logstash extracts useful information from each log and sends it to a destination like Opensearch.

Sending events to Logstash lets you decouple event processing from your app. Your app only needs to send events to Logstash and doesn’t need to know anything about what happens to the events afterwards.

The open-source community originally built Logstash for processing log data, but now you can process any type of event, including events in XML or JSON format.

## Structure of a pipeline

To use Logstash, you configure a pipeline that has three phases: inputs, filters, and outputs.

Each phase uses one or more plugins. Logstash has over 200 built-in plugins, so chances are you’ll find what you need. Apart from the built-in plugins, you can use plugins from the community or even write your own.
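
For illustration, a minimal pipeline that uses one plugin in each phase might look like the following sketch. The plugin choices and the added field name are examples, not requirements:

```conf
input {
  stdin { }                                        # read events from standard input
}

filter {
  mutate {
    add_field => { "processed_by" => "logstash" }  # example enrichment; the field name is illustrative
  }
}

output {
  stdout { codec => rubydebug }                    # print the processed events to the console
}
```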

where:

* `input` receives events like logs from multiple sources simultaneously. Logstash supports a number of input plugins for TCP/UDP, files, syslog, Microsoft Windows EventLogs, stdin, HTTP, and so on. You can also use an open-source collection of input tools called Beats to gather events. The input plugin sends the events to a filter.
* `filter` parses and enriches the events in one way or another. Logstash has a large collection of filter plugins that modify events and pass them on to an output. For example, a `grok` filter parses unstructured events into fields and a `mutate` filter changes fields. Filters are executed sequentially.
* `output` ships the filtered events to one or more destinations. Logstash supports a wide range of output plugins for destinations like Opensearch, TCP/UDP, emails, files, stdout, HTTP, Nagios, and so on.

Both the input and output phases support codecs to process events as they enter or exit the pipeline.
Some of the popular codecs are `json` and `multiline`. The `json` codec processes data that’s in JSON format, and the `multiline` codec merges events that span multiple lines into a single event.
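
For example, a hypothetical `file` input could use the `multiline` codec to fold a multi-line Java stack trace into a single event. The file path and the pattern below are illustrative only:

```conf
input {
  file {
    path => "/var/log/app/app.log"   # illustrative log file path
    codec => multiline {
      pattern => "^\s"               # lines that start with whitespace...
      what => "previous"             # ...are appended to the previous event
    }
  }
}
```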

2. Navigate to the downloaded folder in the terminal and extract the files:

```bash
tar -zxvf logstash-7.13.2-darwin-x86_64.tar.gz
```

3. Navigate to the `logstash-7.13.2` directory.

- You can add your pipeline configurations to the `config` directory. Logstash saves any data from the plugins in the `data` directory. The `bin` directory contains the binaries for starting Logstash and managing plugins.
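
To confirm that the extracted files work, you can print the Logstash version from inside that directory:

```bash
bin/logstash --version
```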

## Process text from the terminal

---
layout: default
title: Send events to Opensearch
parent: Logstash
nav_order: 220
---

# Send events to Opensearch

You can send Logstash events to an Opensearch cluster and then visualize your log data with Opensearch Dashboards.

Make sure you have Logstash, Opensearch, and Opensearch Dashboards installed.
{: .note }

## Opensearch output plugin

To run the Opensearch output plugin, add the following configuration in your `pipeline.conf` file:

```conf
output {
  opensearch {
    hosts => "https://localhost:9200"
    user => "admin"
    password => "admin"
    index => "logstash-logs-%{+YYYY.MM.dd}"
    ssl_certificate_verification => false
  }
}
```
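
Before starting Logstash, you can check the configuration for syntax errors. The path below assumes the file is saved as `config/pipeline.conf` inside the Logstash directory:

```bash
bin/logstash -f config/pipeline.conf --config.test_and_exit
```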

## Sample walkthrough

1. Open the `config/pipeline.conf` file and add the following configuration:

```conf
input {
  stdin {
    codec => json
  }

  http {
    host => "127.0.0.1"
    port => 8080
  }
}

output {
  opensearch {
    hosts => "https://localhost:9200"
    user => "admin"
    password => "admin"
    index => "logstash-logs-%{+YYYY.MM.dd}"
    ssl_certificate_verification => false
  }
}
```

2. Start Logstash:

```bash
$ bin/logstash -f config/pipeline.conf --config.reload.automatic
```

`config/pipeline.conf` is a relative path to the `pipeline.conf` file. You can use an absolute path as well.

3. Add a JSON object in the terminal:

```json
{ "amount": 10, "quantity": 2 }
```
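
Because the pipeline also defines an `http` input on port 8080, you could send the same event over HTTP instead of typing it into the terminal. This assumes Logstash is running locally with the configuration above:

```bash
curl -X POST "http://127.0.0.1:8080" \
  -H "Content-Type: application/json" \
  -d '{ "amount": 10, "quantity": 2 }'
```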

4. Open Opensearch and confirm that the processed event was indexed:

```json
GET _cat/indices?v

health | status | index | uuid | pri | rep | docs.count | docs.deleted | store.size | pri.store.size
green | open | logstash-logs-2021.07.01 | iuh648LYSnmQrkGf70pplA | 1 | 1 | 1 | 0 | 10.3kb | 5.1kb
```
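
To view the document itself, you can also run a search against the index. The wildcard below matches the date-stamped index name that the pipeline creates:

```json
GET logstash-logs-*/_search
```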