OpenSearch/docs/reference/analysis/tokenizers/thai-tokenizer.asciidoc

[[analysis-thai-tokenizer]]
=== Thai Tokenizer

A tokenizer of type `thai` that segments Thai text into words, using the
Thai segmentation algorithm built into Java. Text in other languages is,
in general, treated the same way as by the `standard` tokenizer.
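
As a minimal illustration, the tokenizer can be exercised through the
`_analyze` API; the Thai sample sentence below is only an example:

[source,console]
--------------------------------------------------
POST _analyze
{
  "tokenizer": "thai",
  "text": "การที่ได้ต้องแสดงว่างานดี"
}
--------------------------------------------------

The response lists the individual Thai words produced by the segmentation.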