[[Testing Framework Cheatsheet]]
= Testing

[partintro]

Elasticsearch uses JUnit for testing. The tests also use randomness, which can
be reproduced by setting a seed. The following is a cheatsheet of options for
running the tests for ES.

== Creating packages

To create a distribution without running the tests, simply run the following:

-----------------------------
mvn clean package -DskipTests
-----------------------------

== Other test options

To disable and enable network transport, set the `es.node.mode` system property.

Use network transport:

------------------------------------
-Des.node.mode=network
------------------------------------

Use local transport (default since 1.3):

-------------------------------------
-Des.node.mode=local
-------------------------------------

Alternatively, you can set the `ES_TEST_LOCAL` environment variable:

-------------------------------------
export ES_TEST_LOCAL=true && mvn test
-------------------------------------

=== Running Elasticsearch from a checkout

In order to run Elasticsearch from source without building a package, you can
run it using Maven:

-------------------------------------
./run.sh
-------------------------------------

=== Test case filtering.

- `tests.class` is a class-filtering shell-like glob pattern
- `tests.method` is a method-filtering glob pattern

Run a single test case (variants)

----------------------------------------------------------
mvn test -Dtests.class=org.elasticsearch.package.ClassName
mvn test "-Dtests.class=*.ClassName"
----------------------------------------------------------

Run all tests in a package and its sub-packages

----------------------------------------------------
mvn test "-Dtests.class=org.elasticsearch.package.*"
----------------------------------------------------

Run any test methods that contain 'esi' (like: ...r*esi*ze...).

-------------------------------
mvn test "-Dtests.method=*esi*"
-------------------------------

You can also filter tests by certain annotations, for example:

* `@Nightly` - tests that only run in nightly builds (disabled by default)
* `@Backwards` - backwards compatibility tests (disabled by default)
* `@AwaitsFix` - tests that are waiting for a bugfix (disabled by default)
* `@BadApple` - tests that are known to fail randomly (disabled by default)

Those annotation names can be combined into a filter expression like:

------------------------------------------------------
mvn test -Dtests.filter="@nightly and not @backwards"
------------------------------------------------------

to run all nightly tests but not the ones that are backwards compatibility
tests. `tests.filter` supports the boolean operators `and, or, not` and
grouping, for example:

--------------------------------------------------------------------
mvn test -Dtests.filter="@nightly and not(@badapple or @backwards)"
--------------------------------------------------------------------

=== Seed and repetitions.

Run with a given seed (seed is a hex-encoded long).

------------------------------
mvn test -Dtests.seed=DEADBEEF
------------------------------

=== Repeats _all_ tests of ClassName N times.

Every test repetition will have a different method seed
(derived from a single random master seed).

--------------------------------------------------
mvn test -Dtests.iters=N -Dtests.class=*.ClassName
--------------------------------------------------
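Filters and seeds can also be combined. For example, an illustrative way to
re-run a single failing method with the seed reported by a previous run (the
class and method names here are placeholders):

-----------------------------------------------------------------------------------
mvn test "-Dtests.class=*.ClassName" "-Dtests.method=mytest*" -Dtests.seed=DEADBEEF
-----------------------------------------------------------------------------------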
=== Repeats _all_ tests of ClassName N times.

Every test repetition will have exactly the same master (0xdead) and
method-level (0xbeef) seed.

--------------------------------------------------------------------------
mvn test -Dtests.iters=N -Dtests.class=*.ClassName -Dtests.seed=DEAD:BEEF
--------------------------------------------------------------------------

=== Repeats a given test N times.

Note the filters: individual test repetitions are given suffixes,
i.e. testFoo[0], testFoo[1], etc., so using `testmethod` or `tests.method`
ending in a glob is necessary to ensure iterations are run.

-------------------------------------------------------------------------
mvn test -Dtests.iters=N -Dtests.class=*.ClassName -Dtests.method=mytest*
-------------------------------------------------------------------------

Repeats N times but skips any tests after the first failure or M initial
failures.

-------------------------------------------------------------
mvn test -Dtests.iters=N -Dtests.failfast=true -Dtestcase=...
mvn test -Dtests.iters=N -Dtests.maxfailures=M -Dtestcase=...
-------------------------------------------------------------

=== Test groups.

Test groups can be enabled or disabled (true/false). Default value provided
below in [brackets].

------------------------------------------------------------------
mvn test -Dtests.nightly=[false]   - nightly test group (@Nightly)
mvn test -Dtests.weekly=[false]    - weekly tests (@Weekly)
mvn test -Dtests.awaitsfix=[false] - known issue (@AwaitsFix)
------------------------------------------------------------------

=== Load balancing and caches.

By default, the tests run sequentially on a single forked JVM. To run with more
forked JVMs than the default use:

----------------------------
mvn test -Dtests.jvms=8 test
----------------------------

Don't count hyper-threaded cores for CPU-intensive tests and leave some slack
for JVM-internal threads (like the garbage collector). Make sure there is
enough RAM to handle child JVMs.

=== Test compatibility.

It is possible to provide a version that allows the tests to adapt their
behaviour to older features or bugs that have been changed or fixed in the
meantime.

-------------------------------------
mvn test -Dtests.compatibility=1.0.0
-------------------------------------

=== Miscellaneous.

Run all tests without stopping on errors (inspect log files).

-----------------------------------------
mvn test -Dtests.haltonfailure=false test
-----------------------------------------

Run more verbose output (slave JVM parameters, etc.).

----------------------
mvn test -verbose test
----------------------

Change the default suite timeout to 5 seconds for all tests (note the
exclamation mark).

---------------------------------------
mvn test -Dtests.timeoutSuite=5000! ...
---------------------------------------

Change the logging level of ES (not mvn).

--------------------------------
mvn test -Des.logger.level=DEBUG
--------------------------------

Print all the logging output from the test runs to the commandline
even if tests are passing.

------------------------------
mvn test -Dtests.output=always
------------------------------

Configure the heap size.

-------------------------------
mvn test -Dtests.heap.size=512m
-------------------------------
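These options can be combined as needed. For instance, an illustrative run (the
class name is a placeholder) that raises the ES log level, always prints test
output, and uses a bigger heap:

---------------------------------------------------------------------------------------------------
mvn test "-Dtests.class=*.ClassName" -Des.logger.level=DEBUG -Dtests.output=always -Dtests.heap.size=512m
---------------------------------------------------------------------------------------------------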
Pass arbitrary jvm arguments.

------------------------------------------------------------------
# specify heap dump path
mvn test -Dtests.jvm.argline="-XX:HeapDumpPath=/path/to/heapdumps"
# enable gc logging
mvn test -Dtests.jvm.argline="-verbose:gc"
# enable security debugging
mvn test -Dtests.jvm.argline="-Djava.security.debug=access,failure"
------------------------------------------------------------------

== Backwards Compatibility Tests

Running backwards compatibility tests is disabled by default since it requires
a release version of elasticsearch to be present on the test system. To run
backwards compatibility tests, untar or unzip a release and run the tests with
the following command:

---------------------------------------------------------------------------
mvn test -Dtests.filter="@backwards" -Dtests.bwc.version=x.y.z -Dtests.bwc.path=/path/to/elasticsearch -Dtests.security.manager=false
---------------------------------------------------------------------------

Note that backwards tests must be run with the security manager disabled.
If the elasticsearch release is placed under `./backwards/elasticsearch-x.y.z`
the path can be omitted:

---------------------------------------------------------------------------
mvn test -Dtests.filter="@backwards" -Dtests.bwc.version=x.y.z -Dtests.security.manager=false
---------------------------------------------------------------------------

To set up the bwc test environment, execute the following steps (provided you
are already in your elasticsearch clone):

---------------------------------------------------------------------------
$ mkdir backwards && cd backwards
$ curl -O https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.2.1.tar.gz
$ tar -xzf elasticsearch-1.2.1.tar.gz
---------------------------------------------------------------------------

== Running integration tests

To run the integration tests:

---------------------------------------------------------------------------
mvn verify
---------------------------------------------------------------------------

Note that this will also run the unit tests first. If you want to run only the
integration tests (because you are debugging them):

---------------------------------------------------------------------------
mvn verify -Dskip.unit.tests
---------------------------------------------------------------------------

== Testing the REST layer

The available integration tests make use of the Java API to communicate with
the elasticsearch nodes, using the internal binary transport (port 9300 by
default). The REST layer is tested through specific tests that are shared
between all the elasticsearch official clients and consist of YAML files that
describe the operations to be executed and the results that need to be tested.

The REST tests are run automatically when executing the maven test command. To
run only the REST tests use the following command:

---------------------------------------------------------------------------
mvn verify -Dtests.filter="@Rest" -Dskip.unit.tests=true
---------------------------------------------------------------------------

`RestNIT` is the executable test class that runs all the yaml suites available
within the `rest-api-spec` folder.

The REST tests support all the options provided by the randomized runner, plus
the following:

* `tests.rest[true|false]`: determines whether the REST tests need to be run
(default) or not.
* `tests.rest.suite`: comma separated paths of the test suites to be run
(by default loaded from /rest-api-spec/test). It is possible to run only a
subset of the tests by providing a sub-folder or even a single yaml file (the
default /rest-api-spec/test prefix is optional when files are loaded from the
classpath) e.g. -Dtests.rest.suite=index,get,create/10_with_id
* `tests.rest.blacklist`: comma separated globs that identify tests that are
blacklisted and need to be skipped
e.g. -Dtests.rest.blacklist=index/*/Index document,get/10_basic/*
* `tests.rest.spec`: REST spec path (default /rest-api-spec/api)
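For example, an illustrative invocation combining these options (the suite and
blacklist values below are just the placeholder examples from the list above)
that runs only the index and get suites while blacklisting one set of tests:

---------------------------------------------------------------------------------------------------------
mvn verify -Dtests.filter="@Rest" -Dskip.unit.tests=true "-Dtests.rest.suite=index,get" "-Dtests.rest.blacklist=get/10_basic/*"
---------------------------------------------------------------------------------------------------------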
Note that the REST tests, like all the integration tests, can be run against an
external cluster by specifying the `tests.cluster` property, which if present
needs to contain a comma-separated list of nodes to connect to (e.g.
localhost:9300). A transport client will be created based on that and used for
all the before|after test operations, and to extract the http addresses of the
nodes so that REST requests can be sent to them.

== Skip validate

To disable the validation step (forbidden APIs or `// NOCOMMIT`) use:

---------------------------------------------------------------------------
mvn test -Dvalidate.skip=true
---------------------------------------------------------------------------

You can also skip this by using the "dev" profile:

---------------------------------------------------------------------------
mvn test -Pdev
---------------------------------------------------------------------------

== Testing scripts

The simplest way to test scripts and the packaged distributions is to use
Vagrant. You can get started by following these five easy steps:

. Install VirtualBox and Vagrant.

. (Optional) Install vagrant-cachier to squeeze a bit more performance out of
the process:

--------------------------------------
vagrant plugin install vagrant-cachier
--------------------------------------

. Validate your installed dependencies:

-------------------------------------------
mvn -Dtests.vagrant -pl qa/vagrant validate
-------------------------------------------

. Download the VMs. Since the Maven build eats the progress reports from
Vagrant when you run it inside mvn, it's probably best if you run this once to
set up all the VMs one at a time. Run this to download and set up the VMs we
use for testing by default:

-------------------------------------------------------------------------------
vagrant up --provision trusty --provider virtualbox && vagrant halt trusty
vagrant up --provision centos-7 --provider virtualbox && vagrant halt centos-7
-------------------------------------------------------------------------------

or run this to download and set up all the VMs:

-------------------------------------------------------------------------------
vagrant halt
for box in $(vagrant status | grep 'poweroff\|not created' | cut -f1 -d' '); do
  vagrant up --provision $box --provider virtualbox
  vagrant halt $box
done
-------------------------------------------------------------------------------

. Smoke test that the maven/ant dance we use to get vagrant involved in
integration testing is working:

-----------------------------------------------------
mvn -Dtests.vagrant -Psmoke-vms -pl qa/vagrant verify
-----------------------------------------------------

or this to validate all the VMs:

---------------------------------------------------------
mvn -Dtests.vagrant=all -Psmoke-vms -pl qa/vagrant verify
---------------------------------------------------------

That will start up the VMs and then immediately quit.

. Finally run the tests.
The fastest way to get this started is to run:

------------------------------------------
mvn clean install -DskipTests
mvn -Dtests.vagrant -pl qa/vagrant verify
------------------------------------------

You could just run:

--------------------------
mvn -Dtests.vagrant verify
--------------------------

but that will run all the tests, which is probably a good thing, but not always
what you want.

Whichever snippet you run, mvn will build the tar, zip and deb packages. If you
have rpmbuild installed it'll build the rpm package as well. Then mvn will spin
up trusty and verify the tar, zip, and deb packages. If you have rpmbuild
installed it'll spin up centos-7 and verify the tar, zip and rpm packages. We
chose those two distributions as the default because they cover deb and rpm
packaging and SysV init and systemd.

You can control the boxes that are used for testing like so. Run just fedora-22
with:

------------------------------------------------------------------
mvn -Dtests.vagrant -pl qa/vagrant verify -DboxesToTest=fedora-22
------------------------------------------------------------------

or run jessie and trusty:

-------------------------------------------------------------------------
mvn -Dtests.vagrant -pl qa/vagrant verify -DboxesToTest='jessie, trusty'
-------------------------------------------------------------------------

or run all the boxes:

----------------------------------------------
mvn -Dtests.vagrant=all -pl qa/vagrant verify
----------------------------------------------

If you want to run a specific test on several boxes you can do:

-----------------------------------------------------------------------
mvn -Dtests.vagrant=all -pl qa/vagrant verify -DtestScripts=*tar*.bats
-----------------------------------------------------------------------

It's important to know that if you Ctrl-C any of these `mvn` runs you'll
probably leave a VM up. You can terminate it by running:

------------
vagrant halt
------------

This is just regular vagrant, so you can run normal multi-box vagrant commands
to test things manually. Just run:

---------------------------------------------------------------
vagrant up trusty --provider virtualbox && vagrant ssh trusty
---------------------------------------------------------------

to get an Ubuntu box, or

-------------------------------------------------------------------
vagrant up centos-7 --provider virtualbox && vagrant ssh centos-7
-------------------------------------------------------------------

to get a CentOS box. Once you are done with them you should halt them:

-------------------
vagrant halt trusty
-------------------

These are the Linux flavors the Vagrantfile currently supports:

* precise aka Ubuntu 12.04
* trusty aka Ubuntu 14.04
* vivid aka Ubuntu 15.04
* jessie aka Debian 8, the current Debian stable distribution
* centos-6
* centos-7
* fedora-22
* oel-7 aka Oracle Enterprise Linux 7
* sles-12

We're missing the following from the support matrix because there aren't
high-quality boxes available in Vagrant Atlas:

* sles-11
* opensuse-13
* oel-6

We're missing the following because our tests are very Linux/bash-centric:

* Windows Server 2012

It's important to think of VMs like cattle: if they become lame you just shoot
them and let vagrant reprovision them. Say you've hosed your precise VM:

----------------------------------------------------
vagrant ssh precise -c 'sudo rm -rf /bin'; echo oops
----------------------------------------------------

All you've got to do to get another one is:

------------------------------------------------------------------------
vagrant destroy -f precise && vagrant up precise --provider virtualbox
------------------------------------------------------------------------

The whole process takes a minute and a half on a modern laptop, two and a half
without vagrant-cachier.
It's possible that some downloads will fail and it'll be impossible to restart
them. This is a bug in vagrant. See the instructions here for how to work
around it:
https://github.com/mitchellh/vagrant/issues/4479

Some vagrant commands will work on all VMs at once:

------------------
vagrant halt
vagrant destroy -f
------------------

----------
vagrant up
----------

would normally start all the VMs, but we've prevented that because it would
consume a ton of RAM.

== Testing scripts more directly

In general it's best to stick to testing in vagrant because the bats scripts
are destructive. When working with a single package it's generally faster to
run its tests in a tighter loop than maven provides. In one window:

--------------------------------
mvn -pl distribution/rpm package
--------------------------------

and in another window:

-------------------------------------------------------------------
vagrant up centos-7 --provider virtualbox && vagrant ssh centos-7
cd $RPM
sudo bats $BATS/*rpm*.bats
-------------------------------------------------------------------

If you wanted to retest all the release artifacts on a single VM you could:

-------------------------------------------------------------------------
# Build all the distributions fresh but skip recompiling elasticsearch:
mvn -amd -pl distribution install -DskipTests
# Copy them all to the testroot
mvn -Dtests.vagrant -pl qa/vagrant pre-integration-test
vagrant up trusty --provider virtualbox && vagrant ssh trusty
cd $TESTROOT
sudo bats $BATS/*.bats
-------------------------------------------------------------------------

== Coverage analysis

Tests can be run instrumented with jacoco to produce a coverage report in
`target/site/jacoco/`.

Unit test coverage:

---------------------------------------------------------------------------
mvn -Dtests.coverage test jacoco:report
---------------------------------------------------------------------------

Integration test coverage:

---------------------------------------------------------------------------
mvn -Dtests.coverage -Dskip.unit.tests verify jacoco:report
---------------------------------------------------------------------------

Combined (Unit+Integration) coverage:

---------------------------------------------------------------------------
mvn -Dtests.coverage verify jacoco:report
---------------------------------------------------------------------------

== Debugging from an IDE

If you want to run elasticsearch from your IDE, you should execute `./run.sh`.
It opens a remote debugging port that you can connect to with your IDE.
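If you instead want to debug a test run launched by maven, one option (a sketch
rather than an official recipe; the port 8000 below is an arbitrary choice and
the class name is a placeholder) is to pass a standard JDWP agent argument via
`tests.jvm.argline`, described above, and then attach your IDE's remote
debugger to that port:

---------------------------------------------------------------------------
# suspend=y makes the forked test JVM wait until a debugger attaches
mvn test "-Dtests.class=*.ClassName" -Dtests.jvm.argline="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000"
---------------------------------------------------------------------------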