lucene/contrib/benchmark
Doron Cohen 25f80c71c9 LUCENE-1209: Fixed DocMaker settings by round. Prior to this fix, DocMaker settings of the first round were used in all rounds (e.g. term vectors).


git-svn-id: https://svn.apache.org/repos/asf/lucene/java/trunk@635280 13f79535-47bb-0310-9956-ffa450edef68
2008-03-09 16:43:32 +00:00

README.enwiki

Support exists for downloading, parsing, and loading the English
version of wikipedia (enwiki).

The build file can automatically try to download the most current
enwiki dataset (pages-articles.xml.bz2) from the "latest" directory,
http://download.wikimedia.org/enwiki/latest/. However, this file
doesn't always exist, depending on where wikipedia is in the dump
process and whether prior dumps have succeeded. If it doesn't exist,
you can sometimes find an older or in-progress dump in the dated
directories under http://download.wikimedia.org/enwiki/. For example,
as of this writing, there is a pages file in
http://download.wikimedia.org/enwiki/20070402/. You can download this
file manually and put it in the temp directory. Note that the file
you download will probably have the dump date in its name, e.g.,
http://download.wikimedia.org/enwiki/20070402/enwiki-20070402-pages-articles.xml.bz2.
After putting it in temp, rename it to enwiki-latest-pages-articles.xml.bz2.
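The manual download-and-rename steps above might look like the following sketch. The dump date 20070402 is only an example (check the dated directories for a current one), and the download itself is stood in for by `touch` here, since the real file is several gigabytes:

```shell
# Hypothetical manual fetch of a dated dump; 20070402 is an example date.
DUMP=enwiki-20070402-pages-articles.xml.bz2
mkdir -p temp
# Real step would be something like:
#   wget http://download.wikimedia.org/enwiki/20070402/$DUMP -O temp/$DUMP
touch "temp/$DUMP"    # stand-in for the multi-GB download in this sketch

# Rename to the name the build file expects.
mv "temp/$DUMP" temp/enwiki-latest-pages-articles.xml.bz2
ls temp
```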

After that, the ant enwiki target should process the dataset and run a
load test. The ant targets get-enwiki, expand-enwiki, and
extract-enwiki can also be used individually to download, decompress,
and extract the dataset (to individual files in work/enwiki),
respectively.
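The decompress step is, in effect, a bunzip2 of the dump file. The sketch below shows that step on a tiny stand-in file rather than the real multi-gigabyte dump (the actual expand-enwiki target is defined in build.xml; this is only an illustration of what it produces):

```shell
# Illustrate the decompress step on a small stand-in dump file.
mkdir -p temp
echo '<mediawiki/>' > temp/enwiki-latest-pages-articles.xml
bzip2 -k temp/enwiki-latest-pages-articles.xml    # make a .bz2 like the real dump
rm temp/enwiki-latest-pages-articles.xml

# Roughly what expand-enwiki does: decompress, keeping the .bz2 around.
bunzip2 -k temp/enwiki-latest-pages-articles.xml.bz2
cat temp/enwiki-latest-pages-articles.xml
```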