README.enwiki

Support exists for downloading, parsing, and loading the English
version of Wikipedia (enwiki).

The build file can automatically try to download the most current
enwiki dataset (pages-articles.xml.bz2) from the "latest" directory,
http://download.wikimedia.org/enwiki/latest/. However, this file does
not always exist, depending on where Wikipedia is in its dump process
and whether prior dumps have succeeded. If the file doesn't exist, you
can sometimes find an older or in-progress version by looking in the
dated directories under http://download.wikimedia.org/enwiki/. For
example, as of this writing, there is a pages file in
http://download.wikimedia.org/enwiki/20070402/. You can download this
file manually and put it in temp. Note that the file you download will
probably have the date in its name, e.g.,
http://download.wikimedia.org/enwiki/20070402/enwiki-20070402-pages-articles.xml.bz2.
After you put it in temp, rename it to enwiki-latest-pages-articles.xml.bz2.
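The manual download-and-rename step might look like the following
sketch. The dump date (20070402) and the temp directory are taken from
the example above; substitute whatever dated directory actually exists
when you try this.

```shell
# Sketch of the manual path: fetch a dated dump and rename it to the
# name the build file expects. The wget line is commented out here;
# uncomment it (and remove the touch stand-in) for a real download.
mkdir -p temp
# wget -P temp http://download.wikimedia.org/enwiki/20070402/enwiki-20070402-pages-articles.xml.bz2
touch temp/enwiki-20070402-pages-articles.xml.bz2   # stand-in for the real download
# Rename the dated file to the "latest" name:
mv temp/enwiki-20070402-pages-articles.xml.bz2 \
   temp/enwiki-latest-pages-articles.xml.bz2
```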

After that, "ant enwiki" should process the dataset and run a load
test. The Ant targets get-enwiki, expand-enwiki, and extract-enwiki
can also be used individually to download, decompress, and extract the
dataset (to individual files in work/enwiki), respectively.