Latest commit 5da9e5dcbc

* Docs: Improved tokenizer docs. Added descriptions and runnable examples.
* Addressed Nik's comments.
* Added TESTRESPONSEs for all tokenizer examples.
* Added TESTRESPONSEs for all analyzer examples too.
* Added docs, examples, and TESTRESPONSEs for character filters.
* Skipped two tests: one interprets "$1" as a stack variable (the same problem exists with the REST tests), the other because the "took" value is always different.
* Fixed tests with "took".
* Fixed failing tests and removed preserve_original from the fingerprint analyzer.
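The runnable examples mentioned in the commit follow the pattern used throughout the Elasticsearch analysis docs: a `_analyze` request that the docs test harness executes (marked with a `// CONSOLE` comment in the .asciidoc sources), followed by the expected JSON output (marked with `// TESTRESPONSE`) that the actual response is checked against. Below is a minimal sketch of that pattern; the tokenizer and sample text are illustrative and not taken from the changed docs.

```
POST _analyze
{
  "tokenizer": "standard",
  "text": "The QUICK brown fox."
}
// CONSOLE

{
  "tokens": [
    { "token": "The",   "start_offset": 0,  "end_offset": 3,  "type": "<ALPHANUM>", "position": 0 },
    { "token": "QUICK", "start_offset": 4,  "end_offset": 9,  "type": "<ALPHANUM>", "position": 1 },
    { "token": "brown", "start_offset": 10, "end_offset": 15, "type": "<ALPHANUM>", "position": 2 },
    { "token": "fox",   "start_offset": 16, "end_offset": 19, "type": "<ALPHANUM>", "position": 3 }
  ]
}
// TESTRESPONSE
```

The `_analyze` response contains no timing field, so tokenizer and analyzer snippets can be verified verbatim. Search responses, by contrast, include a `took` value in milliseconds that changes on every run, which is presumably the "took" problem the commit works around.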
Directory contents:

* analyzers/
* charfilters/
* tokenfilters/
* tokenizers/
* analyzers.asciidoc
* anatomy.asciidoc
* charfilters.asciidoc
* testing.asciidoc
* tokenfilters.asciidoc
* tokenizers.asciidoc