OpenSearch/docs/reference/analysis
Latest commit 41feaf137c by Christoph Büscher, 2018-12-18 16:54:06 +01:00
[Docs] Fix error in Common Grams Token Filter (#36774)

The first example given is missing the two single-token cases for "is" and "a".
The later usage example is slightly wrong: custom analyzers should go under
`settings.analysis.analyzer`.
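
For reference, here is a minimal sketch of the corrected placement described in the commit message: the custom analyzer is defined under `settings.analysis.analyzer`, next to the token filter it uses under `settings.analysis.filter`. The index name, analyzer name, and common-words list are illustrative and not taken from the documentation page itself.

    PUT /common_grams_example
    {
      "settings": {
        "analysis": {
          "analyzer": {
            "index_grams": {
              "tokenizer": "whitespace",
              "filter": ["common_grams"]
            }
          },
          "filter": {
            "common_grams": {
              "type": "common_grams",
              "common_words": ["the", "is", "a"]
            }
          }
        }
      }
    }

With this setup, analyzing text such as "the quick fox is a brown fox" should emit the single tokens, including "is" and "a", alongside the common-word bigrams, which is what the first part of the fix refers to.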
Name                    Last commit message                                                  Last commit date
analyzers/              Add a prebuilt ICU Analyzer (#34958)                                 2018-11-21 09:00:48 +00:00
charfilters/            Make hits.total an object in the search response (#35849)            2018-12-05 19:49:06 +01:00
tokenfilters/           [Docs] Fix error in Common Grams Token Filter (#36774)               2018-12-18 16:54:06 +01:00
tokenizers/             Make hits.total an object in the search response (#35849)            2018-12-05 19:49:06 +01:00
analyzers.asciidoc      First pass at improving analyzer docs (#18269)                       2016-05-11 14:17:56 +02:00
anatomy.asciidoc        Correction of the names of numirals (#21531)                         2016-11-25 14:30:49 +01:00
charfilters.asciidoc    Hindu-Arabico-Latino Numerals (#22476)                               2017-01-10 15:24:56 +01:00
normalizers.asciidoc    Make sure to use the type _doc in the REST documentation. (#34662)   2018-10-22 11:54:04 -07:00
testing.asciidoc        Allow `_doc` as a type. (#27816)                                     2017-12-14 17:47:53 +01:00
tokenfilters.asciidoc   Add predicate_token_filter (#33431)                                  2018-09-11 09:16:39 +01:00
tokenizers.asciidoc     [Feature] Adding a char_group tokenizer (#24186)                     2018-05-22 16:26:31 +02:00