lucene/contrib/wikipedia
Robert Muir (commit 78e45c92a7) LUCENE-2207: CJKTokenizer generates tokens with incorrect offsets
LUCENE-2219: Chinese, SmartChinese, Wikipedia tokenizers generate incorrect offsets; test end() in BaseTokenStreamTestCase


git-svn-id: https://svn.apache.org/repos/asf/lucene/java/trunk@900196 13f79535-47bb-0310-9956-ffa450edef68
2010-01-17 19:25:57 +00:00
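
The commit message above summarizes the fix, so here is a minimal, hypothetical sketch of the pattern it refers to, assuming the Lucene 3.x-era analysis API of that period. OffsetAwareTokenizer and charsRead are made-up names for illustration; this is not the actual LUCENE-2207/LUCENE-2219 patch. The point is that a Tokenizer must run both its per-token offsets and the final offset it reports from end() through correctOffset(), which is what the end() check added to BaseTokenStreamTestCase exercises.

import java.io.IOException;
import java.io.Reader;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

/**
 * Hypothetical tokenizer illustrating the offset pattern these issues
 * are about: per-token offsets and the final offset reported by end()
 * must both go through correctOffset().
 */
public class OffsetAwareTokenizer extends Tokenizer {
  private final OffsetAttribute offsetAtt = addAttribute(OffsetAttribute.class);
  private int charsRead; // hypothetical running count of characters consumed from the reader

  public OffsetAwareTokenizer(Reader input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    clearAttributes();
    // Real tokenization would go here; each emitted token should set
    // corrected offsets, e.g.:
    //   offsetAtt.setOffset(correctOffset(tokenStart), correctOffset(tokenEnd));
    return false; // placeholder: this sketch emits no tokens
  }

  @Override
  public void end() throws IOException {
    super.end();
    // The core of the fix: end() must expose the corrected final offset,
    // which is what BaseTokenStreamTestCase's end() check verifies.
    final int finalOffset = correctOffset(charsRead);
    offsetAtt.setOffset(finalOffset, finalOffset);
  }
}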
src               LUCENE-2207: CJKTokenizer generates tokens with incorrect offsets   2010-01-17 19:25:57 +00:00
build.xml         LUCENE-1103                                                         2008-01-04 14:29:15 +00:00
pom.xml.template  bulk fix svn:eol-style to native for text files                     2009-06-22 22:18:56 +00:00