5da9e5dcbc
Docs: Improved tokenizer docs

* Added descriptions and runnable examples
* Addressed Nik's comments
* Added TESTRESPONSEs for all tokenizer examples
* Added TESTRESPONSEs for all analyzer examples too
* Added docs, examples, and TESTRESPONSEs for character filters
* Skipping two tests:
  * One interprets "$1" as a stack variable - the same problem exists with the REST tests
  * The other because the "took" value is always different
* Fixed tests with "took"
* Fixed failing tests and removed preserve_original from the fingerprint analyzer
configuring.asciidoc
custom-analyzer.asciidoc
fingerprint-analyzer.asciidoc
keyword-analyzer.asciidoc
lang-analyzer.asciidoc
pattern-analyzer.asciidoc
simple-analyzer.asciidoc
standard-analyzer.asciidoc
stop-analyzer.asciidoc
whitespace-analyzer.asciidoc