Correct the type of the 'analyzer' parameter in the _analyze docs. (#56650)

This optional parameter can only be a string. To test out a transient custom
analysis chain, users are expected to use the 'tokenizer', 'filter', and
'char_filter' parameters.
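As a rough sketch of that approach (the specific tokenizer, filter, and char_filter values below are illustrative, not taken from this commit), a transient analysis chain can be defined directly in the request body:

[source,console]
--------------------------------------------------
GET /_analyze
{
  "tokenizer" : "standard",
  "filter" : ["lowercase"],
  "char_filter" : ["html_strip"],
  "text" : "<p>Some Text to Analyze</p>"
}
--------------------------------------------------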
Julie Tibshirani 2020-05-13 11:01:55 -07:00
parent 2f0663c490
commit a92d138c77
1 changed file with 4 additions and 8 deletions


@@ -56,11 +56,9 @@ the analyze API uses the <<analysis-standard-analyzer,standard analyzer>>.
`analyzer`::
+
--
-(Optional, string or <<analysis-custom-analyzer,custom analyzer object>>)
-Analyzer used to analyze for the provided `text`.
-See <<analysis-analyzers>> for a list of built-in analyzers.
-You can also provide a <<analysis-custom-analyzer,custom analyzer>>.
+(Optional, string)
+The name of the analyzer that should be applied to the provided `text`. This could be a
+<<analysis-analyzers, built-in analyzer>>, or an analyzer that's been configured in the index.
If this parameter is not specified,
the analyze API uses the analyzer defined in the field's mapping.
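For context, a hedged example of what the reworded description covers (the analyzer and sample text below are assumptions for illustration, not part of the commit): a request that names a built-in analyzer passes it as a plain string, for example:

[source,console]
--------------------------------------------------
GET /_analyze
{
  "analyzer" : "standard",
  "text" : "Quick Brown Foxes!"
}
--------------------------------------------------

An analyzer configured in an index's settings can be referenced the same way, by name, against that index's `_analyze` endpoint.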
@@ -187,8 +185,6 @@ GET /_analyze
}
--------------------------------------------------
-deprecated[5.0.0, Use `filter`/`char_filter` instead of `filters`/`char_filters` and `token_filters` has been removed]
Custom tokenizers, token filters, and character filters can be specified in the request body as follows:
[source,console]