[DOCS] Fix headings for simple analyzer docs (#58910) (#58918)

James Rodewig 2020-07-02 09:52:05 -04:00 committed by GitHub
parent ff04212257
commit 6436792aac
1 changed file with 31 additions and 36 deletions


@@ -4,25 +4,25 @@
 <titleabbrev>Simple</titleabbrev>
 ++++
 
-The `simple` analyzer breaks text into terms whenever it encounters a
-character which is not a letter. All terms are lower cased.
+The `simple` analyzer breaks text into tokens at any non-letter character, such
+as numbers, spaces, hyphens and apostrophes, discards non-letter characters,
+and changes uppercase to lowercase.
 
-[float]
-=== Example output
+[[analysis-simple-analyzer-ex]]
+==== Example
 
 [source,console]
----------------------------
+----
 POST _analyze
 {
   "analyzer": "simple",
   "text": "The 2 QUICK Brown-Foxes jumped over the lazy dog's bone."
 }
----------------------------
+----
 
-/////////////////////
-
+////
 [source,console-result]
-----------------------------
+----
 {
   "tokens": [
     {
@@ -104,52 +104,47 @@ POST _analyze
     }
   ]
 }
-----------------------------
-
-/////////////////////
-
-The above sentence would produce the following terms:
+----
+////
+
+The `simple` analyzer parses the sentence and produces the following
+tokens:
 
 [source,text]
----------------------------
+----
 [ the, quick, brown, foxes, jumped, over, the, lazy, dog, s, bone ]
----------------------------
+----
 
-[float]
-=== Configuration
-
-The `simple` analyzer is not configurable.
-
-[float]
-=== Definition
-
-The `simple` analzyer consists of:
+[[analysis-simple-analyzer-definition]]
+==== Definition
+
+The `simple` analyzer is defined by one tokenizer:
 
 Tokenizer::
-* <<analysis-lowercase-tokenizer,Lower Case Tokenizer>>
+* <<analysis-lowercase-tokenizer, Lowercase Tokenizer>>
 
-If you need to customize the `simple` analyzer then you need to recreate
-it as a `custom` analyzer and modify it, usually by adding token filters.
-This would recreate the built-in `simple` analyzer and you can use it as
-a starting point for further customization:
+[[analysis-simple-analyzer-customize]]
+==== Customize
+
+To customize the `simple` analyzer, duplicate it to create the basis for
+a custom analyzer. This custom analyzer can be modified as required, usually by
+adding token filters.
 
 [source,console]
-----------------------------------------------------
-PUT /simple_example
+----
+PUT /my_index
 {
   "settings": {
     "analysis": {
       "analyzer": {
-        "rebuilt_simple": {
+        "my_custom_simple_analyzer": {
           "tokenizer": "lowercase",
           "filter": [        <1>
           ]
         }
       }
     }
   }
 }
-----------------------------------------------------
-// TEST[s/\n$/\nstartyaml\n  - compare_analyzers: {index: simple_example, first: simple, second: rebuilt_simple}\nendyaml\n/]
-<1> You'd add any token filters here.
+----
+<1> Add token filters here.
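
As a usage sketch that is not part of this commit: the empty `filter` array in the rebuilt analyzer is where token filters would go. Assuming the hypothetical `my_index` and `my_custom_simple_analyzer` names from the example above, the built-in `stop` token filter could be added and the result checked with the `_analyze` API:

[source,console]
----
PUT /my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_custom_simple_analyzer": {
          "tokenizer": "lowercase",
          "filter": [ "stop" ]    <1>
        }
      }
    }
  }
}

POST /my_index/_analyze
{
  "analyzer": "my_custom_simple_analyzer",
  "text": "The 2 QUICK Brown-Foxes jumped over the lazy dog's bone."
}
----
<1> The `stop` filter defaults to the English stop word list, so tokens such as `the` would be dropped from the output while the rest of the `simple`-style tokenization is unchanged.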