Mirror of https://github.com/apache/lucene.git, synced 2025-02-06 18:18:38 +00:00
78e45c92a7
LUCENE-2219: Chinese, SmartChinese, Wikipedia tokenizers generate incorrect offsets; test end() in BaseTokenStreamTestCase

git-svn-id: https://svn.apache.org/repos/asf/lucene/java/trunk@900196 13f79535-47bb-0310-9956-ffa450edef68