Adds more information about ingest attachment properties extraction

This is coming from this thread on Discuss: https://discuss.elastic.co/t/ingest-attachment-plugin-exception/69167/10
David Pilato 2016-12-21 12:13:16 +01:00
parent ae01a51b44
commit 80843afb19
1 changed file with 20 additions and 1 deletion


@@ -52,7 +52,7 @@ The node must be stopped before removing the plugin.
| `field` | yes | - | The field to get the base64 encoded field from
| `target_field` | no | attachment | The field that will hold the attachment information
| `indexed_chars` | no | 100000 | The number of chars being used for extraction to prevent huge fields. Use `-1` for no limit.
-| `properties` | no | all | Properties to select to be stored. Can be `content`, `title`, `name`, `author`, `keywords`, `date`, `content_type`, `content_length`, `language`
+| `properties` | no | all properties | Properties to select to be stored. Can be `content`, `title`, `name`, `author`, `keywords`, `date`, `content_type`, `content_length`, `language`
| `ignore_missing` | no | `false` | If `true` and `field` does not exist, the processor quietly exits without modifying the document
|======
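Not part of this commit, but as an illustration of the options in the table above, a hedged sketch of a pipeline combining `indexed_chars` and `ignore_missing` (the pipeline name `attachment_limited` and the 10000-character limit are hypothetical choices):

[source,js]
--------------------------------------------------
PUT _ingest/pipeline/attachment_limited
{
  "description" : "Extract at most 10000 chars, skip documents without the field",
  "processors" : [
    {
      "attachment" : {
        "field" : "data",
        "indexed_chars" : 10000,
        "ignore_missing" : true
      }
    }
  ]
}
--------------------------------------------------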
@@ -102,6 +102,25 @@ Returns this:
--------------------------------------------------
// TESTRESPONSE
To specify only some properties to be extracted:
[source,js]
--------------------------------------------------
PUT _ingest/pipeline/attachment
{
  "description" : "Extract attachment information",
  "processors" : [
    {
      "attachment" : {
        "field" : "data",
        "properties": [ "content", "title" ]
      }
    }
  ]
}
--------------------------------------------------
--------------------------------------------------
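A pipeline like the one above can be exercised without indexing anything via the `_simulate` API; a sketch, assuming the pipeline `attachment` exists (the base64 payload decodes to the plain text "this is a test"):

[source,js]
--------------------------------------------------
POST _ingest/pipeline/attachment/_simulate
{
  "docs" : [
    {
      "_source" : {
        "data" : "dGhpcyBpcyBhIHRlc3Q="
      }
    }
  ]
}
--------------------------------------------------

The response shows the document as it would look after the processor runs, including the `attachment` target field with only the selected properties.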
NOTE: Extracting contents from binary data is a resource-intensive operation. It is
highly recommended to run pipelines using this processor on a dedicated ingest node.
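In the Elasticsearch 5.x era this commit targets, a dedicated ingest node was typically created by disabling the master and data roles in `elasticsearch.yml`; a minimal sketch:

[source,yaml]
--------------------------------------------------
node.master: false
node.data: false
node.ingest: true
--------------------------------------------------

Clients can then route indexing requests that use a `pipeline` parameter through this node, keeping attachment extraction load off the data nodes.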