Merge branch 'master' into feature/query-refactoring

Conflicts:
	core/src/test/java/org/elasticsearch/common/io/streams/BytesStreamsTests.java
	core/src/test/java/org/elasticsearch/search/highlight/HighlighterSearchIT.java
	core/src/test/java/org/elasticsearch/search/query/SearchQueryIT.java
	core/src/test/java/org/elasticsearch/test/transport/AssertingLocalTransport.java
Author: Christoph Büscher
Date: 2015-08-04 10:53:19 +02:00
Commit: 4cceb08a0b

901 changed files with 3611 additions and 3073 deletions

View File

@@ -73,9 +73,7 @@ mvn test "-Dtests.method=*esi*"
 You can also filter tests by certain annotations ie:
-* `@Slow` - tests that are know to take a long time to execute
 * `@Nightly` - tests that only run in nightly builds (disabled by default)
-* `@Integration` - integration tests
 * `@Backwards` - backwards compatibility tests (disabled by default)
 * `@AwaitsFix` - tests that are waiting for a bugfix (disabled by default)
 * `@BadApple` - tests that are known to fail randomly (disabled by default)
@@ -83,15 +81,15 @@ You can also filter tests by certain annotations ie:
 Those annotation names can be combined into a filter expression like:
 ------------------------------------------------
-mvn test -Dtests.filter="@nightly and not @slow"
+mvn test -Dtests.filter="@nightly and not @backwards"
 ------------------------------------------------
-to run all nightly test but not the ones that are slow. `tests.filter` supports
+to run all nightly test but not the ones that are backwards tests. `tests.filter` supports
 the boolean operators `and, or, not` and grouping ie:
 ---------------------------------------------------------------
-mvn test -Dtests.filter="@nightly and not(@slow or @backwards)"
+mvn test -Dtests.filter="@nightly and not(@badapple or @backwards)"
 ---------------------------------------------------------------
 === Seed and repetitions.
@@ -147,7 +145,6 @@ Default value provided below in [brackets].
 mvn test -Dtests.nightly=[false] - nightly test group (@Nightly)
 mvn test -Dtests.weekly=[false] - weekly tests (@Weekly)
 mvn test -Dtests.awaitsfix=[false] - known issue (@AwaitsFix)
-mvn test -Dtests.slow=[true] - slow tests (@Slow)
 ------------------------------------------------------------------
 === Load balancing and caches.
@@ -248,6 +245,21 @@ $ curl -O https://download.elasticsearch.org/elasticsearch/elasticsearch/elastic
 $ tar -xzf elasticsearch-1.2.1.tar.gz
 ---------------------------------------------------------------------------
+== Running integration tests
+To run the integration tests:
+---------------------------------------------------------------------------
+mvn verify
+---------------------------------------------------------------------------
+Note that this will also run the unit tests first. If you want to just
+run the integration tests only (because you are debugging them):
+---------------------------------------------------------------------------
+mvn verify -Dskip.unit.tests
+---------------------------------------------------------------------------
 == Testing the REST layer
 The available integration tests make use of the java API to communicate with
@@ -261,10 +273,10 @@ The REST tests are run automatically when executing the maven test command. To r
 REST tests use the following command:
 ---------------------------------------------------------------------------
-mvn test -Dtests.filter="@Rest"
+mvn verify -Dtests.filter="@Rest"
 ---------------------------------------------------------------------------
-`ElasticsearchRestTests` is the executable test class that runs all the
+`RestNIT` are the executable test classes that runs all the
 yaml suites available within the `rest-api-spec` folder.
 The REST tests support all the options provided by the randomized runner, plus the following:
@@ -321,3 +333,12 @@ ES_CLEAN_BEFORE_TEST=true bats 30_deb_package.bats
 The current mode of execution is to copy all the packages that should be tested
 into one directory, then copy the bats files into the same directory and run
 those.
+== Coverage analysis
+To run tests instrumented with jacoco and produce a coverage report in
+`target/site/jacoco/`:
+---------------------------------------------------------------------------
+mvn -Dtests.coverage test jacoco:report
+---------------------------------------------------------------------------

View File

@@ -16,11 +16,6 @@
   <name>Elasticsearch Core</name>
   <description>Elasticsearch - Open Source, Distributed, RESTful Search Engine</description>
-  <properties>
-    <skip.integ.tests>true</skip.integ.tests>
-  </properties>
   <dependencies>
     <dependency>
       <groupId>org.hamcrest</groupId>
@@ -297,119 +292,6 @@
       </execution>
     </executions>
   </plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>shaded</shadedClassifierName>
<shadeTestJar>false</shadeTestJar>
<minimizeJar>true</minimizeJar>
<promoteTransitiveDependencies>true</promoteTransitiveDependencies>
<createDependencyReducedPom>false</createDependencyReducedPom>
<artifactSet>
<includes>
<include>com.google.guava:guava</include>
<include>com.carrotsearch:hppc</include>
<include>com.fasterxml.jackson.core:jackson-core</include>
<include>com.fasterxml.jackson.dataformat:jackson-dataformat-smile</include>
<include>com.fasterxml.jackson.dataformat:jackson-dataformat-yaml</include>
<include>com.fasterxml.jackson.dataformat:jackson-dataformat-cbor</include>
<include>joda-time:joda-time</include>
<include>org.joda:joda-convert</include>
<include>io.netty:netty</include>
<include>com.ning:compress-lzf</include>
<include>com.github.spullara.mustache.java:compiler</include>
<include>com.tdunning:t-digest</include>
<include>org.apache.commons:commons-lang3</include>
<include>commons-cli:commons-cli</include>
<include>com.twitter:jsr166e</include>
</includes>
</artifactSet>
<transformers>
<!-- copy over MANIFEST.MF from unshaded jar, but mark jar as shaded too -->
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<manifestEntries>
<X-Build-Shaded>true</X-Build-Shaded>
</manifestEntries>
</transformer>
</transformers>
<relocations>
<relocation>
<pattern>com.google.common</pattern>
<shadedPattern>org.elasticsearch.common</shadedPattern>
</relocation>
<relocation>
<pattern>com.carrotsearch.hppc</pattern>
<shadedPattern>org.elasticsearch.common.hppc</shadedPattern>
</relocation>
<relocation>
<pattern>com.twitter.jsr166e</pattern>
<shadedPattern>org.elasticsearch.common.util.concurrent.jsr166e</shadedPattern>
</relocation>
<relocation>
<pattern>com.fasterxml.jackson</pattern>
<shadedPattern>org.elasticsearch.common.jackson</shadedPattern>
</relocation>
<relocation>
<pattern>org.joda.time</pattern>
<shadedPattern>org.elasticsearch.common.joda.time</shadedPattern>
</relocation>
<relocation>
<pattern>org.jboss.netty</pattern>
<shadedPattern>org.elasticsearch.common.netty</shadedPattern>
</relocation>
<relocation>
<pattern>com.ning.compress</pattern>
<shadedPattern>org.elasticsearch.common.compress</shadedPattern>
</relocation>
<relocation>
<pattern>com.github.mustachejava</pattern>
<shadedPattern>org.elasticsearch.common.mustache</shadedPattern>
</relocation>
<relocation>
<pattern>com.tdunning.math.stats</pattern>
<shadedPattern>org.elasticsearch.common.stats</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.commons.lang</pattern>
<shadedPattern>org.elasticsearch.common.lang</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.commons.cli</pattern>
<shadedPattern>org.elasticsearch.common.cli.commons</shadedPattern>
</relocation>
</relocations>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/license/**</exclude>
<exclude>META-INF/*</exclude>
<exclude>META-INF/maven/**</exclude>
<exclude>LICENSE</exclude>
<exclude>NOTICE</exclude>
<exclude>/*.txt</exclude>
<exclude>build.properties</exclude>
</excludes>
</filter>
<filter>
<artifact>io.netty:netty</artifact>
<includes>
<include>org/jboss/netty/**</include>
</includes>
</filter>
</filters>
</configuration>
</plugin>
   <plugin>
     <groupId>org.apache.maven.plugins</groupId>

View File

@@ -222,7 +222,7 @@ public class Bootstrap {
         }
     }

-    public static void main(String[] args) {
+    public static void main(String[] args) throws Throwable {
         BootstrapCLIParser bootstrapCLIParser = new BootstrapCLIParser();
         CliTool.ExitStatus status = bootstrapCLIParser.execute(args);

@@ -239,28 +239,16 @@ public class Bootstrap {
             foreground = false;
         }
-        String stage = "Settings";
-        Settings settings = null;
-        Environment environment = null;
-        try {
-            Tuple<Settings, Environment> tuple = initialSettings(foreground);
-            settings = tuple.v1();
-            environment = tuple.v2();
-            if (environment.pidFile() != null) {
-                stage = "Pid";
-                PidFile.create(environment.pidFile(), true);
-            }
-            stage = "Logging";
-            setupLogging(settings, environment);
-        } catch (Exception e) {
-            String errorMessage = buildErrorMessage(stage, e);
-            sysError(errorMessage, true);
-            System.exit(3);
-        }
+        Tuple<Settings, Environment> tuple = initialSettings(foreground);
+        Settings settings = tuple.v1();
+        Environment environment = tuple.v2();
+        if (environment.pidFile() != null) {
+            PidFile.create(environment.pidFile(), true);
+        }
+        setupLogging(settings, environment);
         if (System.getProperty("es.max-open-files", "false").equals("true")) {
             ESLogger logger = Loggers.getLogger(Bootstrap.class);
             logger.info("max_open_files [{}]", ProcessProbe.getInstance().getMaxFileDescriptorCount());

@@ -272,7 +260,6 @@ public class Bootstrap {
             logger.warn("jvm uses the client vm, make sure to run `java` with the server vm for best performance by adding `-server` to the command line");
         }
-        stage = "Initialization";
         try {
             if (!foreground) {
                 Loggers.disableConsoleLogging();

@@ -284,7 +271,6 @@ public class Bootstrap {
             INSTANCE.setup(true, settings, environment);
-            stage = "Startup";
             INSTANCE.start();
             if (!foreground) {

@@ -295,14 +281,9 @@ public class Bootstrap {
             if (INSTANCE.node != null) {
                 logger = Loggers.getLogger(Bootstrap.class, INSTANCE.node.settings().get("name"));
             }
-            String errorMessage = buildErrorMessage(stage, e);
-            if (foreground) {
-                sysError(errorMessage, true);
-                Loggers.disableConsoleLogging();
-            }
             logger.error("Exception", e);
-            System.exit(3);
+            throw e;
         }
     }
@@ -323,38 +304,4 @@ ES_CLEAN_BEFORE_TEST (cont.)
             System.err.flush();
         }
     }
private static String buildErrorMessage(String stage, Throwable e) {
StringBuilder errorMessage = new StringBuilder("{").append(Version.CURRENT).append("}: ");
errorMessage.append(stage).append(" Failed ...\n");
if (e instanceof CreationException) {
CreationException createException = (CreationException) e;
Set<String> seenMessages = newHashSet();
int counter = 1;
for (Message message : createException.getErrorMessages()) {
String detailedMessage;
if (message.getCause() == null) {
detailedMessage = message.getMessage();
} else {
detailedMessage = ExceptionsHelper.detailedMessage(message.getCause(), true, 0);
}
if (detailedMessage == null) {
detailedMessage = message.getMessage();
}
if (seenMessages.contains(detailedMessage)) {
continue;
}
seenMessages.add(detailedMessage);
errorMessage.append("").append(counter++).append(") ").append(detailedMessage);
}
} else {
errorMessage.append("- ").append(ExceptionsHelper.detailedMessage(e, true, 0));
}
if (Loggers.getLogger(Bootstrap.class).isDebugEnabled()) {
errorMessage.append("\n").append(ExceptionsHelper.stackTrace(e));
}
return errorMessage.toString();
}
 }

View File

@@ -24,7 +24,7 @@ package org.elasticsearch.bootstrap;
  */
 public class Elasticsearch extends Bootstrap {
-    public static void main(String[] args) {
+    public static void main(String[] args) throws Throwable {
         Bootstrap.main(args);
     }
 }

View File

@@ -25,7 +25,7 @@ package org.elasticsearch.bootstrap;
  */
 public class ElasticsearchF {
-    public static void main(String[] args) {
+    public static void main(String[] args) throws Throwable {
         System.setProperty("es.foreground", "yes");
         Bootstrap.main(args);
     }

View File

@@ -47,8 +47,11 @@ import java.util.jar.Manifest;
 public class JarHell {
     /** Simple driver class, can be used eg. from builds. Returns non-zero on jar-hell */
+    @SuppressForbidden(reason = "command line tool")
     public static void main(String args[]) throws Exception {
+        System.out.println("checking for jar hell...");
         checkJarHell();
+        System.out.println("no jar hell found");
     }
     /**

View File

@@ -116,14 +116,15 @@ final class Security {
     /** returns dynamic Permissions to configured paths */
     static Permissions createPermissions(Environment environment) throws IOException {
+        // TODO: improve test infra so we can reduce permissions where read/write
+        // is not really needed...
         Permissions policy = new Permissions();
+        // read-only dirs
+        addPath(policy, environment.binFile(), "read,readlink");
+        addPath(policy, environment.libFile(), "read,readlink");
+        addPath(policy, environment.pluginsFile(), "read,readlink");
+        addPath(policy, environment.configFile(), "read,readlink");
+        // read-write dirs
         addPath(policy, environment.tmpFile(), "read,readlink,write,delete");
-        addPath(policy, environment.homeFile(), "read,readlink,write,delete");
-        addPath(policy, environment.configFile(), "read,readlink,write,delete");
         addPath(policy, environment.logsFile(), "read,readlink,write,delete");
-        addPath(policy, environment.pluginsFile(), "read,readlink,write,delete");
         for (Path path : environment.dataFiles()) {
             addPath(policy, path, "read,readlink,write,delete");
         }

@@ -134,7 +135,8 @@
             addPath(policy, path, "read,readlink,write,delete");
         }
         if (environment.pidFile() != null) {
-            addPath(policy, environment.pidFile().getParent(), "read,readlink,write,delete");
+            // we just need permission to remove the file if its elsewhere.
+            policy.add(new FilePermission(environment.pidFile().toString(), "delete"));
         }
         return policy;
     }

View File

@@ -19,6 +19,7 @@
 package org.elasticsearch.common.util.concurrent;
 import com.google.common.collect.Lists;
 import org.elasticsearch.common.Priority;
 import org.elasticsearch.common.unit.TimeValue;

@@ -161,8 +162,10 @@ public class PrioritizedEsThreadPoolExecutor extends EsThreadPoolExecutor {
         private Runnable runnable;
         private final long insertionOrder;
-        private volatile ScheduledFuture<?> timeoutFuture;
-        private volatile boolean started = false;
+        // these two variables are protected by 'this'
+        private ScheduledFuture<?> timeoutFuture;
+        private boolean started = false;
         TieBreakingPrioritizedRunnable(PrioritizedRunnable runnable, long insertionOrder) {
             this(runnable, runnable.priority(), insertionOrder);

@@ -176,10 +179,12 @@
         @Override
         public void run() {
-            // make the task as stared. This is needed for synchronization with the timeout handling
-            // see #scheduleTimeout()
-            started = true;
-            FutureUtils.cancel(timeoutFuture);
+            synchronized (this) {
+                // make the task as stared. This is needed for synchronization with the timeout handling
+                // see #scheduleTimeout()
+                started = true;
+                FutureUtils.cancel(timeoutFuture);
+            }
             runAndClean(runnable);
         }

@@ -193,17 +198,20 @@
         }
         public void scheduleTimeout(ScheduledExecutorService timer, final Runnable timeoutCallback, TimeValue timeValue) {
-            timeoutFuture = timer.schedule(new Runnable() {
-                @Override
-                public void run() {
-                    if (remove(TieBreakingPrioritizedRunnable.this)) {
-                        runAndClean(timeoutCallback);
-                    }
-                }
-            }, timeValue.nanos(), TimeUnit.NANOSECONDS);
-            if (started) {
-                // if the actual action already it might have missed the setting of the future. Clean it ourselves.
-                FutureUtils.cancel(timeoutFuture);
+            synchronized (this) {
+                if (timeoutFuture != null) {
+                    throw new IllegalStateException("scheduleTimeout may only be called once");
+                }
+                if (started == false) {
+                    timeoutFuture = timer.schedule(new Runnable() {
+                        @Override
+                        public void run() {
+                            if (remove(TieBreakingPrioritizedRunnable.this)) {
+                                runAndClean(timeoutCallback);
+                            }
+                        }
+                    }, timeValue.nanos(), TimeUnit.NANOSECONDS);
+                }
             }
         }
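The rewritten scheduleTimeout() above closes a race: previously run() could mark the task as started and cancel the timeout while scheduleTimeout() was still publishing the future, so a timeout could still fire for a task that had already begun. Both members are now guarded by the object monitor, and the timeout is only scheduled if the task has not run yet. A minimal standalone sketch of the same pattern, with illustrative class and method names rather than the Elasticsearch types:

import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Illustrative sketch of the synchronization pattern used above:
// 'started' and 'timeoutFuture' are only read or written while holding 'this'.
class TimeoutGuardedTask implements Runnable {
    private final Runnable delegate;
    private ScheduledFuture<?> timeoutFuture; // guarded by 'this'
    private boolean started = false;          // guarded by 'this'

    TimeoutGuardedTask(Runnable delegate) {
        this.delegate = delegate;
    }

    @Override
    public void run() {
        synchronized (this) {
            // Mark the task as started and cancel any pending timeout
            // before doing the real work.
            started = true;
            if (timeoutFuture != null) {
                timeoutFuture.cancel(false);
            }
        }
        delegate.run();
    }

    void scheduleTimeout(ScheduledExecutorService timer, Runnable onTimeout, long delayNanos) {
        synchronized (this) {
            // Mirror the single-call contract introduced in the diff above.
            if (timeoutFuture != null) {
                throw new IllegalStateException("scheduleTimeout may only be called once");
            }
            // Only schedule the timeout if the task has not run yet; otherwise
            // there is nothing left to time out.
            if (started == false) {
                timeoutFuture = timer.schedule(onTimeout, delayNanos, TimeUnit.NANOSECONDS);
            }
        }
    }
}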

View File

@@ -45,8 +45,6 @@ public class Environment {
     private final Settings settings;
-    private final Path homeFile;
     private final Path[] dataFiles;
     private final Path[] dataWithClusterFiles;

@@ -57,6 +55,12 @@ public class Environment {
     private final Path pluginsFile;
+    /** location of bin/, used by plugin manager */
+    private final Path binFile;
+    /** location of lib/, */
+    private final Path libFile;
     private final Path logsFile;
     /** Path to the PID file (can be null if no PID file is configured) **/

@@ -83,6 +87,7 @@ public class Environment {
     public Environment(Settings settings) {
         this.settings = settings;
+        final Path homeFile;
         if (settings.get("path.home") != null) {
             homeFile = PathUtils.get(cleanPath(settings.get("path.home")));
         } else {

@@ -133,6 +138,9 @@ public class Environment {
         } else {
             pidFile = null;
         }
+        binFile = homeFile.resolve("bin");
+        libFile = homeFile.resolve("lib");
     }
     /**

@@ -142,13 +150,6 @@ public class Environment {
         return this.settings;
     }
-    /**
-     * The home of the installation.
-     */
-    public Path homeFile() {
-        return homeFile;
-    }
     /**
      * The data location.
      */

@@ -236,6 +237,14 @@ public class Environment {
         return pluginsFile;
     }
+    public Path binFile() {
+        return binFile;
+    }
+    public Path libFile() {
+        return libFile;
+    }
     public Path logsFile() {
         return logsFile;
     }

View File

@@ -140,8 +140,8 @@ public class Node implements Releasable {
         if (logger.isDebugEnabled()) {
             Environment env = tuple.v2();
-            logger.debug("using home [{}], config [{}], data [{}], logs [{}], plugins [{}]",
-                    env.homeFile(), env.configFile(), Arrays.toString(env.dataFiles()), env.logsFile(), env.pluginsFile());
+            logger.debug("using config [{}], data [{}], logs [{}], plugins [{}]",
+                    env.configFile(), Arrays.toString(env.dataFiles()), env.logsFile(), env.pluginsFile());
         }
         this.pluginsService = new PluginsService(tuple.v1(), tuple.v2());

View File

@@ -32,6 +32,7 @@ import org.apache.lucene.util.Counter;
 import org.elasticsearch.action.percolate.PercolateShardRequest;
 import org.elasticsearch.action.search.SearchType;
 import org.elasticsearch.cache.recycler.PageCacheRecycler;
+import org.elasticsearch.client.Client;
 import org.elasticsearch.common.*;
 import org.elasticsearch.common.collect.ImmutableOpenMap;
 import org.elasticsearch.common.lease.Releasables;

@@ -58,7 +59,7 @@ import org.elasticsearch.search.aggregations.SearchContextAggregations;
 import org.elasticsearch.search.dfs.DfsSearchResult;
 import org.elasticsearch.search.fetch.FetchSearchResult;
 import org.elasticsearch.search.fetch.FetchSubPhase;
-import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
+import org.elasticsearch.search.fetch.FetchSubPhaseContext;
 import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
 import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
 import org.elasticsearch.search.fetch.source.FetchSourceContext;

@@ -116,6 +117,7 @@ public class PercolateContext extends SearchContext {
     private SearchContextAggregations aggregations;
     private QuerySearchResult querySearchResult;
     private Sort sort;
+    private final Map<String, FetchSubPhaseContext> subPhaseContexts = new HashMap<>();
     public PercolateContext(PercolateShardRequest request, SearchShardTarget searchShardTarget, IndexShard indexShard,
                             IndexService indexService, PageCacheRecycler pageCacheRecycler,

@@ -282,6 +284,15 @@ public class PercolateContext extends SearchContext {
         return this;
     }
+    @Override
+    public <SubPhaseContext extends FetchSubPhaseContext> SubPhaseContext getFetchSubPhaseContext(FetchSubPhase.ContextFactory<SubPhaseContext> contextFactory) {
+        String subPhaseName = contextFactory.getName();
+        if (subPhaseContexts.get(subPhaseName) == null) {
+            subPhaseContexts.put(subPhaseName, contextFactory.newContextInstance());
+        }
+        return (SubPhaseContext) subPhaseContexts.get(subPhaseName);
+    }
     // Unused:
     @Override
     public void preProcess() {

@@ -378,16 +389,6 @@ public class PercolateContext extends SearchContext {
         throw new UnsupportedOperationException();
     }
-    @Override
-    public boolean hasFieldDataFields() {
-        throw new UnsupportedOperationException();
-    }
-    @Override
-    public FieldDataFieldsContext fieldDataFields() {
-        throw new UnsupportedOperationException();
-    }
     @Override
     public boolean hasScriptFields() {
         throw new UnsupportedOperationException();

View File

@@ -462,7 +462,7 @@ public class PluginManager {
         }
         Path binDir(Environment env) {
-            return env.homeFile().resolve("bin").resolve(name);
+            return env.binFile().resolve(name);
         }
         Path configDir(Environment env) {

View File

@@ -42,6 +42,8 @@ import org.elasticsearch.rest.*;
 import org.elasticsearch.rest.action.support.RestActionListener;
 import org.elasticsearch.rest.action.support.RestResponseListener;
 import org.elasticsearch.rest.action.support.RestTable;
+import org.joda.time.DateTime;
+import org.joda.time.DateTimeZone;
 import java.util.Locale;

@@ -112,6 +114,9 @@ public class RestIndicesAction extends AbstractCatAction {
         table.addCell("docs.count", "alias:dc,docsCount;text-align:right;desc:available docs");
         table.addCell("docs.deleted", "alias:dd,docsDeleted;text-align:right;desc:deleted docs");
+        table.addCell("creation.date", "alias:cd;default:false;desc:index creation date (millisecond value)");
+        table.addCell("creation.date.string", "alias:cds;default:false;desc:index creation date (as string)");
         table.addCell("store.size", "sibling:pri;alias:ss,storeSize;text-align:right;desc:store size of primaries & replicas");
         table.addCell("pri.store.size", "text-align:right;desc:store size of primaries");

@@ -320,6 +325,9 @@ public class RestIndicesAction extends AbstractCatAction {
         table.addCell(indexStats == null ? null : indexStats.getPrimaries().getDocs().getCount());
         table.addCell(indexStats == null ? null : indexStats.getPrimaries().getDocs().getDeleted());
+        table.addCell(indexMetaData.creationDate());
+        table.addCell(new DateTime(indexMetaData.creationDate(), DateTimeZone.getDefault()));
         table.addCell(indexStats == null ? null : indexStats.getTotal().getStore().size());
         table.addCell(indexStats == null ? null : indexStats.getPrimaries().getStore().size());

View File

@@ -32,6 +32,7 @@ import org.elasticsearch.search.aggregations.AggregationModule;
 import org.elasticsearch.search.controller.SearchPhaseController;
 import org.elasticsearch.search.dfs.DfsPhase;
 import org.elasticsearch.search.fetch.FetchPhase;
+import org.elasticsearch.search.fetch.FetchSubPhaseModule;
 import org.elasticsearch.search.fetch.explain.ExplainFetchSubPhase;
 import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsFetchSubPhase;
 import org.elasticsearch.search.fetch.innerhits.InnerHitsFetchSubPhase;

@@ -63,7 +64,8 @@ public class SearchModule extends AbstractModule implements SpawnModules {
                 new HighlightModule(),
                 new SuggestModule(),
                 new FunctionScoreModule(),
-                new AggregationModule());
+                new AggregationModule(),
+                new FetchSubPhaseModule());
     }
     @Override

@@ -73,14 +75,6 @@ public class SearchModule extends AbstractModule implements SpawnModules {
         bind(SearchPhaseController.class).asEagerSingleton();
         bind(FetchPhase.class).asEagerSingleton();
-        bind(ExplainFetchSubPhase.class).asEagerSingleton();
-        bind(FieldDataFieldsFetchSubPhase.class).asEagerSingleton();
-        bind(ScriptFieldsFetchSubPhase.class).asEagerSingleton();
-        bind(FetchSourceSubPhase.class).asEagerSingleton();
-        bind(VersionFetchSubPhase.class).asEagerSingleton();
-        bind(MatchedQueriesFetchSubPhase.class).asEagerSingleton();
-        bind(HighlightPhase.class).asEagerSingleton();
-        bind(InnerHitsFetchSubPhase.class).asEagerSingleton();
         bind(SearchServiceTransportAction.class).asEagerSingleton();
         bind(MoreLikeThisFetchService.class).asEagerSingleton();

View File

@@ -22,6 +22,7 @@ import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
 import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorFactory;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.aggregations.support.AggregationPath;
+import org.elasticsearch.search.aggregations.support.AggregationPath.PathElement;
 import java.io.IOException;
 import java.util.ArrayList;

@@ -162,40 +163,79 @@ public class AggregatorFactories {
         for (PipelineAggregatorFactory factory : pipelineAggregatorFactories) {
             pipelineAggregatorFactoriesMap.put(factory.getName(), factory);
         }
-        Set<String> aggFactoryNames = new HashSet<>();
+        Map<String, AggregatorFactory> aggFactoriesMap = new HashMap<>();
         for (AggregatorFactory aggFactory : aggFactories) {
-            aggFactoryNames.add(aggFactory.name);
+            aggFactoriesMap.put(aggFactory.name, aggFactory);
         }
         List<PipelineAggregatorFactory> orderedPipelineAggregatorrs = new LinkedList<>();
         List<PipelineAggregatorFactory> unmarkedFactories = new ArrayList<PipelineAggregatorFactory>(pipelineAggregatorFactories);
         Set<PipelineAggregatorFactory> temporarilyMarked = new HashSet<PipelineAggregatorFactory>();
         while (!unmarkedFactories.isEmpty()) {
             PipelineAggregatorFactory factory = unmarkedFactories.get(0);
-            resolvePipelineAggregatorOrder(aggFactoryNames, pipelineAggregatorFactoriesMap, orderedPipelineAggregatorrs, unmarkedFactories, temporarilyMarked, factory);
+            resolvePipelineAggregatorOrder(aggFactoriesMap, pipelineAggregatorFactoriesMap, orderedPipelineAggregatorrs,
+                    unmarkedFactories, temporarilyMarked, factory);
         }
         return orderedPipelineAggregatorrs;
     }

-    private void resolvePipelineAggregatorOrder(Set<String> aggFactoryNames, Map<String, PipelineAggregatorFactory> pipelineAggregatorFactoriesMap,
+    private void resolvePipelineAggregatorOrder(Map<String, AggregatorFactory> aggFactoriesMap,
+            Map<String, PipelineAggregatorFactory> pipelineAggregatorFactoriesMap,
             List<PipelineAggregatorFactory> orderedPipelineAggregators, List<PipelineAggregatorFactory> unmarkedFactories, Set<PipelineAggregatorFactory> temporarilyMarked,
             PipelineAggregatorFactory factory) {
         if (temporarilyMarked.contains(factory)) {
-            throw new IllegalStateException("Cyclical dependancy found with pipeline aggregator [" + factory.getName() + "]");
+            throw new IllegalArgumentException("Cyclical dependancy found with pipeline aggregator [" + factory.getName() + "]");
         } else if (unmarkedFactories.contains(factory)) {
             temporarilyMarked.add(factory);
             String[] bucketsPaths = factory.getBucketsPaths();
             for (String bucketsPath : bucketsPaths) {
-                List<String> bucketsPathElements = AggregationPath.parse(bucketsPath).getPathElementsAsStringList();
-                String firstAggName = bucketsPathElements.get(0);
-                if (bucketsPath.equals("_count") || bucketsPath.equals("_key") || aggFactoryNames.contains(firstAggName)) {
+                List<AggregationPath.PathElement> bucketsPathElements = AggregationPath.parse(bucketsPath).getPathElements();
+                String firstAggName = bucketsPathElements.get(0).name;
+                if (bucketsPath.equals("_count") || bucketsPath.equals("_key")) {
+                    continue;
} else if (aggFactoriesMap.containsKey(firstAggName)) {
AggregatorFactory aggFactory = aggFactoriesMap.get(firstAggName);
for (int i = 1; i < bucketsPathElements.size(); i++) {
PathElement pathElement = bucketsPathElements.get(i);
String aggName = pathElement.name;
if ((i == bucketsPathElements.size() - 1) && (aggName.equalsIgnoreCase("_key") || aggName.equals("_count"))) {
break;
} else {
// Check the non-pipeline sub-aggregator
// factories
AggregatorFactory[] subFactories = aggFactory.factories.factories;
boolean foundSubFactory = false;
for (AggregatorFactory subFactory : subFactories) {
if (aggName.equals(subFactory.name)) {
aggFactory = subFactory;
foundSubFactory = true;
break;
}
}
// Check the pipeline sub-aggregator factories
if (!foundSubFactory && (i == bucketsPathElements.size() - 1)) {
List<PipelineAggregatorFactory> subPipelineFactories = aggFactory.factories.pipelineAggregatorFactories;
for (PipelineAggregatorFactory subFactory : subPipelineFactories) {
if (aggName.equals(subFactory.name())) {
foundSubFactory = true;
break;
}
}
}
if (!foundSubFactory) {
throw new IllegalArgumentException("No aggregation [" + aggName + "] found for path [" + bucketsPath
+ "]");
}
}
}
                     continue;
                 } else {
                     PipelineAggregatorFactory matchingFactory = pipelineAggregatorFactoriesMap.get(firstAggName);
                     if (matchingFactory != null) {
-                        resolvePipelineAggregatorOrder(aggFactoryNames, pipelineAggregatorFactoriesMap, orderedPipelineAggregators, unmarkedFactories,
+                        resolvePipelineAggregatorOrder(aggFactoriesMap, pipelineAggregatorFactoriesMap, orderedPipelineAggregators,
+                                unmarkedFactories,
                                 temporarilyMarked, matchingFactory);
                     } else {
-                        throw new IllegalStateException("No aggregation found for path [" + bucketsPath + "]");
+                        throw new IllegalArgumentException("No aggregation found for path [" + bucketsPath + "]");
                     }
                 }
             }

View File

@@ -37,7 +37,7 @@ public abstract class PipelineAggregatorFactory {
     /**
      * Constructs a new pipeline aggregator factory.
      *
      * @param name
      *            The aggregation name
      * @param type

@@ -49,10 +49,14 @@ public abstract class PipelineAggregatorFactory {
         this.bucketsPaths = bucketsPaths;
     }
+    public String name() {
+        return name;
+    }
     /**
      * Validates the state of this factory (makes sure the factory is properly
      * configured)
      *
      * @param pipelineAggregatorFactories
      * @param factories
      * @param parent

@@ -66,7 +70,7 @@ public abstract class PipelineAggregatorFactory {
     /**
      * Creates the pipeline aggregator
      *
      * @param context
      *            The aggregation context
      * @param parent

@@ -77,7 +81,7 @@ public abstract class PipelineAggregatorFactory {
      * with <tt>0</tt> as a bucket ordinal. Some factories can take
      * advantage of this in order to return more optimized
      * implementations.
      *
      * @return The created aggregator
      */
     public final PipelineAggregator create() throws IOException {

View File

@@ -45,6 +45,7 @@ import org.elasticsearch.index.mapper.DocumentMapper;
 import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.index.mapper.internal.SourceFieldMapper;
 import org.elasticsearch.index.mapper.object.ObjectMapper;
+import org.elasticsearch.index.query.functionscore.ScoreFunctionParser;
 import org.elasticsearch.search.SearchHit;
 import org.elasticsearch.search.SearchHitField;
 import org.elasticsearch.search.SearchParseElement;

@@ -83,13 +84,10 @@ public class FetchPhase implements SearchPhase {
     private final FetchSubPhase[] fetchSubPhases;
     @Inject
-    public FetchPhase(HighlightPhase highlightPhase, ScriptFieldsFetchSubPhase scriptFieldsPhase,
-                      MatchedQueriesFetchSubPhase matchedQueriesPhase, ExplainFetchSubPhase explainPhase, VersionFetchSubPhase versionPhase,
-                      FetchSourceSubPhase fetchSourceSubPhase, FieldDataFieldsFetchSubPhase fieldDataFieldsFetchSubPhase,
-                      InnerHitsFetchSubPhase innerHitsFetchSubPhase) {
+    public FetchPhase(Set<FetchSubPhase> fetchSubPhases, InnerHitsFetchSubPhase innerHitsFetchSubPhase) {
         innerHitsFetchSubPhase.setFetchPhase(this);
-        this.fetchSubPhases = new FetchSubPhase[]{scriptFieldsPhase, matchedQueriesPhase, explainPhase, highlightPhase,
-                fetchSourceSubPhase, versionPhase, fieldDataFieldsFetchSubPhase, innerHitsFetchSubPhase};
+        this.fetchSubPhases = fetchSubPhases.toArray(new FetchSubPhase[fetchSubPhases.size() + 1]);
+        this.fetchSubPhases[fetchSubPhases.size()] = innerHitsFetchSubPhase;
     }
     @Override

View File

@@ -114,4 +114,23 @@ public interface FetchSubPhase {
     boolean hitsExecutionNeeded(SearchContext context);
     void hitsExecute(SearchContext context, InternalSearchHit[] hits);
+    /**
+     * This interface is in the fetch phase plugin mechanism.
+     * Whenever a new search is executed we create a new {@link SearchContext} that holds individual contexts for each {@link org.elasticsearch.search.fetch.FetchSubPhase}.
+     * Fetch phases that use the plugin mechanism must provide a ContextFactory to the SearchContext that creates the fetch phase context and also associates them with a name.
+     * See {@link SearchContext#getFetchSubPhaseContext(FetchSubPhase.ContextFactory)}
+     */
+    public interface ContextFactory<SubPhaseContext extends FetchSubPhaseContext> {
+        /**
+         * The name of the context.
+         */
+        public String getName();
+        /**
+         * Creates a new instance of a FetchSubPhaseContext that holds all information a FetchSubPhase needs to execute on hits.
+         */
+        public SubPhaseContext newContextInstance();
+    }
 }
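To make the new hook concrete, here is a hypothetical ContextFactory for an imaginary sub phase. TermVectorsFetchContext and the "term_vectors_fetch" name are invented for illustration and are not part of this commit; the real registered example is FieldDataFieldsFetchSubPhase.CONTEXT_FACTORY further down in this diff.

package org.elasticsearch.search.fetch.example;

import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.FetchSubPhaseContext;

// Hypothetical per-request state for an imaginary "term_vectors_fetch" sub phase.
class TermVectorsFetchContext extends FetchSubPhaseContext {
    private String field;

    void setField(String field) {
        this.field = field;
    }

    String getField() {
        return field;
    }
}

// Hypothetical factory: names the context and creates a fresh instance for every
// new SearchContext, mirroring CONTEXT_FACTORY in FieldDataFieldsFetchSubPhase.
class TermVectorsFetchExample {
    static final FetchSubPhase.ContextFactory<TermVectorsFetchContext> TERM_VECTORS_FACTORY =
            new FetchSubPhase.ContextFactory<TermVectorsFetchContext>() {
                @Override
                public String getName() {
                    return "term_vectors_fetch";
                }

                @Override
                public TermVectorsFetchContext newContextInstance() {
                    return new TermVectorsFetchContext();
                }
            };
}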

View File

@@ -0,0 +1,47 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.search.fetch;
/**
* All configuration and context needed by the FetchSubPhase to execute on hits.
* The only required information in this base class is whether or not the sub phase needs to be run at all.
* It can be extended by FetchSubPhases to hold information the phase needs to execute on hits.
* See {@link org.elasticsearch.search.fetch.FetchSubPhase.ContextFactory} and also {@link org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext} for an example.
*/
public class FetchSubPhaseContext {
// This is to store if the FetchSubPhase should be executed at all.
private boolean hitExecutionNeeded = false;
/**
* Set if this phase should be executed at all.
*/
void setHitExecutionNeeded(boolean hitExecutionNeeded) {
this.hitExecutionNeeded = hitExecutionNeeded;
}
/**
* Returns if this phase be executed at all.
*/
public boolean hitExecutionNeeded() {
return hitExecutionNeeded;
}
}

View File

@@ -0,0 +1,74 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.search.fetch;
import com.google.common.collect.Lists;
import org.elasticsearch.common.inject.AbstractModule;
import org.elasticsearch.common.inject.multibindings.Multibinder;
import org.elasticsearch.search.fetch.explain.ExplainFetchSubPhase;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsFetchSubPhase;
import org.elasticsearch.search.fetch.innerhits.InnerHitsFetchSubPhase;
import org.elasticsearch.search.fetch.matchedqueries.MatchedQueriesFetchSubPhase;
import org.elasticsearch.search.fetch.script.ScriptFieldsFetchSubPhase;
import org.elasticsearch.search.fetch.source.FetchSourceSubPhase;
import org.elasticsearch.search.fetch.version.VersionFetchSubPhase;
import org.elasticsearch.search.highlight.HighlightPhase;
import java.util.List;
/**
* Module for registering fetch sub phases. Fetch phases are executed when the document is finally
* retrieved from the shard. To implement a new fetch phase one needs to implement the following classes and interfaces
* <p/>
* <ul>
* <li> {@link FetchSubPhaseParseElement} </li>
* <li> {@link FetchSubPhase} </li>
* <li> {@link FetchSubPhaseContext} </li>
* </ul>
* <p/>
* The FetchSubPhase must then be registered with this module with {@link FetchSubPhaseModule#registerFetchSubPhase(Class<? extends FetchSubPhase>)}.
* See {@link FieldDataFieldsFetchSubPhase} for an example.
*/
public class FetchSubPhaseModule extends AbstractModule {
private List<Class<? extends FetchSubPhase>> fetchSubPhases = Lists.newArrayList();
public FetchSubPhaseModule() {
registerFetchSubPhase(ExplainFetchSubPhase.class);
registerFetchSubPhase(FieldDataFieldsFetchSubPhase.class);
registerFetchSubPhase(ScriptFieldsFetchSubPhase.class);
registerFetchSubPhase(FetchSourceSubPhase.class);
registerFetchSubPhase(VersionFetchSubPhase.class);
registerFetchSubPhase(MatchedQueriesFetchSubPhase.class);
registerFetchSubPhase(HighlightPhase.class);
}
public void registerFetchSubPhase(Class<? extends FetchSubPhase> subPhase) {
fetchSubPhases.add(subPhase);
}
@Override
protected void configure() {
Multibinder<FetchSubPhase> parserMapBinder = Multibinder.newSetBinder(binder(), FetchSubPhase.class);
for (Class<? extends FetchSubPhase> clazz : fetchSubPhases) {
parserMapBinder.addBinding().to(clazz);
}
bind(InnerHitsFetchSubPhase.class).asEagerSingleton();
}
}
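A sub phase outside the list above would be added through the same registerFetchSubPhase call before Guice processes the module. The sketch below only shows that registration; MyFetchSubPhase stands in for a hypothetical FetchSubPhase implementation, and how a plugin actually obtains the module instance is outside this diff.

package org.elasticsearch.search.fetch.example;

import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.FetchSubPhaseModule;

// Minimal sketch: registering an additional fetch sub phase with the module.
public class FetchSubPhaseRegistrationExample {

    public static FetchSubPhaseModule moduleWith(Class<? extends FetchSubPhase> extraSubPhase) {
        // The constructor already registers the core sub phases (see above);
        // registerFetchSubPhase simply appends to the multibinder set.
        FetchSubPhaseModule module = new FetchSubPhaseModule();
        module.registerFetchSubPhase(extraSubPhase);
        return module;
    }
}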

View File

@@ -0,0 +1,48 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.search.fetch;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.search.SearchParseElement;
import org.elasticsearch.search.internal.SearchContext;
/**
* A parse element for a {@link org.elasticsearch.search.fetch.FetchSubPhase} that is used when parsing a search request.
*/
public abstract class FetchSubPhaseParseElement<SubPhaseContext extends FetchSubPhaseContext> implements SearchParseElement {
@Override
final public void parse(XContentParser parser, SearchContext context) throws Exception {
SubPhaseContext fetchSubPhaseContext = context.getFetchSubPhaseContext(getContextFactory());
// this is to make sure that the SubFetchPhase knows it should execute
fetchSubPhaseContext.setHitExecutionNeeded(true);
innerParse(parser, fetchSubPhaseContext, context);
}
/**
* Implement the actual parsing here.
*/
protected abstract void innerParse(XContentParser parser, SubPhaseContext fetchSubPhaseContext, SearchContext searchContext) throws Exception;
/**
* Return the ContextFactory for this FetchSubPhase.
*/
protected abstract FetchSubPhase.ContextFactory<SubPhaseContext> getContextFactory();
}

View File

@@ -19,13 +19,14 @@
 package org.elasticsearch.search.fetch.fielddata;
 import com.google.common.collect.Lists;
+import org.elasticsearch.search.fetch.FetchSubPhaseContext;
 import java.util.List;
 /**
  * All the required context to pull a field from the field data cache.
  */
-public class FieldDataFieldsContext {
+public class FieldDataFieldsContext extends FetchSubPhaseContext {
     public static class FieldDataField {
         private final String name;

View File

@@ -27,6 +27,7 @@ import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.search.SearchHitField;
 import org.elasticsearch.search.SearchParseElement;
 import org.elasticsearch.search.fetch.FetchSubPhase;
+import org.elasticsearch.search.fetch.FetchSubPhaseContext;
 import org.elasticsearch.search.internal.InternalSearchHit;
 import org.elasticsearch.search.internal.InternalSearchHitField;
 import org.elasticsearch.search.internal.SearchContext;

@@ -43,6 +44,20 @@ import java.util.Map;
  */
 public class FieldDataFieldsFetchSubPhase implements FetchSubPhase {
+    public static final String[] NAMES = {"fielddata_fields", "fielddataFields"};
+    public static final ContextFactory<FieldDataFieldsContext> CONTEXT_FACTORY = new ContextFactory<FieldDataFieldsContext>() {
+        @Override
+        public String getName() {
+            return NAMES[0];
+        }
+        @Override
+        public FieldDataFieldsContext newContextInstance() {
+            return new FieldDataFieldsContext();
+        }
+    };
     @Inject
     public FieldDataFieldsFetchSubPhase() {
     }

@@ -66,12 +81,12 @@ public class FieldDataFieldsFetchSubPhase implements FetchSubPhase {
     @Override
     public boolean hitExecutionNeeded(SearchContext context) {
-        return context.hasFieldDataFields();
+        return context.getFetchSubPhaseContext(CONTEXT_FACTORY).hitExecutionNeeded();
     }
     @Override
     public void hitExecute(SearchContext context, HitContext hitContext) {
-        for (FieldDataFieldsContext.FieldDataField field : context.fieldDataFields().fields()) {
+        for (FieldDataFieldsContext.FieldDataField field : context.getFetchSubPhaseContext(CONTEXT_FACTORY).fields()) {
             if (hitContext.hit().fieldsOrNull() == null) {
                 hitContext.hit().fields(new HashMap<String, SearchHitField>(2));
             }

View File

@@ -20,12 +20,15 @@ package org.elasticsearch.search.fetch.fielddata;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.search.SearchParseElement;
+import org.elasticsearch.search.fetch.FetchSubPhase;
+import org.elasticsearch.search.fetch.FetchSubPhaseContext;
+import org.elasticsearch.search.fetch.FetchSubPhaseParseElement;
 import org.elasticsearch.search.internal.SearchContext;
 /**
  * Parses field name values from the {@code fielddata_fields} parameter in a
  * search request.
- *
+ * <p/>
  * <pre>
  * {
  *   "query": {...},

@@ -33,20 +36,26 @@ import org.elasticsearch.search.internal.SearchContext;
  * }
  * </pre>
  */
-public class FieldDataFieldsParseElement implements SearchParseElement {
+public class FieldDataFieldsParseElement extends FetchSubPhaseParseElement<FieldDataFieldsContext> {
     @Override
-    public void parse(XContentParser parser, SearchContext context) throws Exception {
+    protected void innerParse(XContentParser parser, FieldDataFieldsContext fieldDataFieldsContext, SearchContext searchContext) throws Exception {
         XContentParser.Token token = parser.currentToken();
         if (token == XContentParser.Token.START_ARRAY) {
             while (parser.nextToken() != XContentParser.Token.END_ARRAY) {
                 String fieldName = parser.text();
-                context.fieldDataFields().add(new FieldDataFieldsContext.FieldDataField(fieldName));
+                fieldDataFieldsContext.add(new FieldDataFieldsContext.FieldDataField(fieldName));
             }
         } else if (token == XContentParser.Token.VALUE_STRING) {
             String fieldName = parser.text();
-            context.fieldDataFields().add(new FieldDataFieldsContext.FieldDataField(fieldName));
+            fieldDataFieldsContext.add(new FieldDataFieldsContext.FieldDataField(fieldName));
         } else {
             throw new IllegalStateException("Expected either a VALUE_STRING or an START_ARRAY but got " + token);
         }
     }
+    @Override
+    protected FetchSubPhase.ContextFactory getContextFactory() {
+        return FieldDataFieldsFetchSubPhase.CONTEXT_FACTORY;
+    }
 }

View File

@ -53,7 +53,8 @@ import org.elasticsearch.search.SearchShardTarget;
import org.elasticsearch.search.aggregations.SearchContextAggregations; import org.elasticsearch.search.aggregations.SearchContextAggregations;
import org.elasticsearch.search.dfs.DfsSearchResult; import org.elasticsearch.search.dfs.DfsSearchResult;
import org.elasticsearch.search.fetch.FetchSearchResult; import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext; import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.FetchSubPhaseContext;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext; import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
import org.elasticsearch.search.fetch.script.ScriptFieldsContext; import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
import org.elasticsearch.search.fetch.source.FetchSourceContext; import org.elasticsearch.search.fetch.source.FetchSourceContext;
@ -99,7 +100,6 @@ public class DefaultSearchContext extends SearchContext {
private boolean explain; private boolean explain;
private boolean version = false; // by default, we don't return versions private boolean version = false; // by default, we don't return versions
private List<String> fieldNames; private List<String> fieldNames;
private FieldDataFieldsContext fieldDataFields;
private ScriptFieldsContext scriptFields; private ScriptFieldsContext scriptFields;
private FetchSourceContext fetchSourceContext; private FetchSourceContext fetchSourceContext;
private int from = -1; private int from = -1;
@ -126,11 +126,13 @@ public class DefaultSearchContext extends SearchContext {
private volatile long lastAccessTime = -1; private volatile long lastAccessTime = -1;
private InnerHitsContext innerHitsContext; private InnerHitsContext innerHitsContext;
private final Map<String, FetchSubPhaseContext> subPhaseContexts = new HashMap<>();
public DefaultSearchContext(long id, ShardSearchRequest request, SearchShardTarget shardTarget, public DefaultSearchContext(long id, ShardSearchRequest request, SearchShardTarget shardTarget,
Engine.Searcher engineSearcher, IndexService indexService, IndexShard indexShard, Engine.Searcher engineSearcher, IndexService indexService, IndexShard indexShard,
ScriptService scriptService, PageCacheRecycler pageCacheRecycler, ScriptService scriptService, PageCacheRecycler pageCacheRecycler,
BigArrays bigArrays, Counter timeEstimateCounter, ParseFieldMatcher parseFieldMatcher, BigArrays bigArrays, Counter timeEstimateCounter, ParseFieldMatcher parseFieldMatcher,
TimeValue timeout TimeValue timeout
) { ) {
super(parseFieldMatcher); super(parseFieldMatcher);
this.id = id; this.id = id;
@ -302,6 +304,16 @@ public class DefaultSearchContext extends SearchContext {
return this; return this;
} }
@Override
public <SubPhaseContext extends FetchSubPhaseContext> SubPhaseContext getFetchSubPhaseContext(FetchSubPhase.ContextFactory<SubPhaseContext> contextFactory) {
String subPhaseName = contextFactory.getName();
if (subPhaseContexts.get(subPhaseName) == null) {
subPhaseContexts.put(subPhaseName, contextFactory.newContextInstance());
}
return (SubPhaseContext) subPhaseContexts.get(subPhaseName);
}
@Override @Override
public SearchContextHighlight highlight() { public SearchContextHighlight highlight() {
return highlight; return highlight;
@ -338,19 +350,6 @@ public class DefaultSearchContext extends SearchContext {
this.rescore.add(rescore); this.rescore.add(rescore);
} }
@Override
public boolean hasFieldDataFields() {
return fieldDataFields != null;
}
@Override
public FieldDataFieldsContext fieldDataFields() {
if (fieldDataFields == null) {
fieldDataFields = new FieldDataFieldsContext();
}
return this.fieldDataFields;
}
@Override @Override
public boolean hasScriptFields() { public boolean hasScriptFields() {
return scriptFields != null; return scriptFields != null;
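DefaultSearchContext drops the dedicated fieldDataFields field and instead keeps a single map of sub-phase contexts keyed by the factory name, creating each context lazily on first access. The following self-contained sketch models that storage pattern without any Elasticsearch dependency (all names are stand-ins, not the real classes) and shows that repeated lookups with the same factory return the same lazily created instance.

import java.util.HashMap;
import java.util.Map;

// Stand-alone model of the new storage in DefaultSearchContext: one context per
// sub-phase, keyed by the factory's name and created only when first requested.
public class SubPhaseContextStoreSketch {

    interface ContextFactory<T> {
        String getName();
        T newContextInstance();
    }

    static class ContextStore {
        private final Map<String, Object> subPhaseContexts = new HashMap<>();

        @SuppressWarnings("unchecked")
        <T> T getFetchSubPhaseContext(ContextFactory<T> factory) {
            String name = factory.getName();
            if (subPhaseContexts.get(name) == null) {          // lazy creation, as in the diff
                subPhaseContexts.put(name, factory.newContextInstance());
            }
            return (T) subPhaseContexts.get(name);
        }
    }

    public static void main(String[] args) {
        ContextFactory<StringBuilder> fieldDataFactory = new ContextFactory<StringBuilder>() {
            public String getName() { return "fielddata_fields"; }
            public StringBuilder newContextInstance() { return new StringBuilder(); }
        };
        ContextStore store = new ContextStore();
        store.getFetchSubPhaseContext(fieldDataFactory).append("price");
        // Same factory name -> same instance, so the state written above is visible here.
        System.out.println(store.getFetchSubPhaseContext(fieldDataFactory)); // prints "price"
    }
}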

View File

@ -45,7 +45,8 @@ import org.elasticsearch.search.SearchShardTarget;
import org.elasticsearch.search.aggregations.SearchContextAggregations; import org.elasticsearch.search.aggregations.SearchContextAggregations;
import org.elasticsearch.search.dfs.DfsSearchResult; import org.elasticsearch.search.dfs.DfsSearchResult;
import org.elasticsearch.search.fetch.FetchSearchResult; import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext; import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.FetchSubPhaseContext;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext; import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
import org.elasticsearch.search.fetch.script.ScriptFieldsContext; import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
import org.elasticsearch.search.fetch.source.FetchSourceContext; import org.elasticsearch.search.fetch.source.FetchSourceContext;
@ -209,16 +210,6 @@ public abstract class FilteredSearchContext extends SearchContext {
in.addRescore(rescore); in.addRescore(rescore);
} }
@Override
public boolean hasFieldDataFields() {
return in.hasFieldDataFields();
}
@Override
public FieldDataFieldsContext fieldDataFields() {
return in.fieldDataFields();
}
@Override @Override
public boolean hasScriptFields() { public boolean hasScriptFields() {
return in.hasScriptFields(); return in.hasScriptFields();
@ -628,4 +619,9 @@ public abstract class FilteredSearchContext extends SearchContext {
public void copyContextAndHeadersFrom(HasContextAndHeaders other) { public void copyContextAndHeadersFrom(HasContextAndHeaders other) {
in.copyContextAndHeadersFrom(other); in.copyContextAndHeadersFrom(other);
} }
@Override
public <SubPhaseContext extends FetchSubPhaseContext> SubPhaseContext getFetchSubPhaseContext(FetchSubPhase.ContextFactory<SubPhaseContext> contextFactory) {
return in.getFetchSubPhaseContext(contextFactory);
}
} }

View File

@ -50,7 +50,8 @@ import org.elasticsearch.search.SearchShardTarget;
import org.elasticsearch.search.aggregations.SearchContextAggregations; import org.elasticsearch.search.aggregations.SearchContextAggregations;
import org.elasticsearch.search.dfs.DfsSearchResult; import org.elasticsearch.search.dfs.DfsSearchResult;
import org.elasticsearch.search.fetch.FetchSearchResult; import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext; import org.elasticsearch.search.fetch.FetchSubPhase;
import org.elasticsearch.search.fetch.FetchSubPhaseContext;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext; import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
import org.elasticsearch.search.fetch.script.ScriptFieldsContext; import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
import org.elasticsearch.search.fetch.source.FetchSourceContext; import org.elasticsearch.search.fetch.source.FetchSourceContext;
@ -163,6 +164,8 @@ public abstract class SearchContext implements Releasable, HasContextAndHeaders
public abstract SearchContext aggregations(SearchContextAggregations aggregations); public abstract SearchContext aggregations(SearchContextAggregations aggregations);
public abstract <SubPhaseContext extends FetchSubPhaseContext> SubPhaseContext getFetchSubPhaseContext(FetchSubPhase.ContextFactory<SubPhaseContext> contextFactory);
public abstract SearchContextHighlight highlight(); public abstract SearchContextHighlight highlight();
public abstract void highlight(SearchContextHighlight highlight); public abstract void highlight(SearchContextHighlight highlight);
@ -182,10 +185,6 @@ public abstract class SearchContext implements Releasable, HasContextAndHeaders
public abstract void addRescore(RescoreSearchContext rescore); public abstract void addRescore(RescoreSearchContext rescore);
public abstract boolean hasFieldDataFields();
public abstract FieldDataFieldsContext fieldDataFields();
public abstract boolean hasScriptFields(); public abstract boolean hasScriptFields();
public abstract ScriptFieldsContext scriptFields(); public abstract ScriptFieldsContext scriptFields();
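With hasFieldDataFields() and fieldDataFields() removed from the abstract SearchContext, callers reach the same per-request state through the one generic accessor plus the sub-phase's factory. Below is a sketch of the call-site change only, not code from this commit; it assumes FieldDataFieldsFetchSubPhase lives next to FieldDataFieldsContext and that CONTEXT_FACTORY is declared as ContextFactory<FieldDataFieldsContext>, and it needs the Elasticsearch core classes on the classpath to compile.

import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsFetchSubPhase; // package assumed
import org.elasticsearch.search.internal.SearchContext;

// Illustrative call site only.
public class FieldDataFieldsCallSiteSketch {

    void requestFieldData(SearchContext context, String fieldName) {
        // Before: sub-phase specific accessors on SearchContext
        //     context.fieldDataFields().add(new FieldDataFieldsContext.FieldDataField(fieldName));
        //     boolean requested = context.hasFieldDataFields();
        // After: one generic accessor, keyed by the sub-phase's context factory
        context.getFetchSubPhaseContext(FieldDataFieldsFetchSubPhase.CONTEXT_FACTORY)
               .add(new FieldDataFieldsContext.FieldDataField(fieldName));
    }
}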

View File

@ -30,7 +30,6 @@ import org.elasticsearch.index.query.ParsedQuery;
import org.elasticsearch.search.Scroll; import org.elasticsearch.search.Scroll;
import org.elasticsearch.search.aggregations.SearchContextAggregations; import org.elasticsearch.search.aggregations.SearchContextAggregations;
import org.elasticsearch.search.fetch.FetchSearchResult; import org.elasticsearch.search.fetch.FetchSearchResult;
import org.elasticsearch.search.fetch.fielddata.FieldDataFieldsContext;
import org.elasticsearch.search.fetch.innerhits.InnerHitsContext; import org.elasticsearch.search.fetch.innerhits.InnerHitsContext;
import org.elasticsearch.search.fetch.script.ScriptFieldsContext; import org.elasticsearch.search.fetch.script.ScriptFieldsContext;
import org.elasticsearch.search.fetch.source.FetchSourceContext; import org.elasticsearch.search.fetch.source.FetchSourceContext;
@ -62,7 +61,6 @@ public class SubSearchContext extends FilteredSearchContext {
private int docsIdsToLoadSize; private int docsIdsToLoadSize;
private List<String> fieldNames; private List<String> fieldNames;
private FieldDataFieldsContext fieldDataFields;
private ScriptFieldsContext scriptFields; private ScriptFieldsContext scriptFields;
private FetchSourceContext fetchSourceContext; private FetchSourceContext fetchSourceContext;
private SearchContextHighlight highlight; private SearchContextHighlight highlight;
@ -132,19 +130,6 @@ public class SubSearchContext extends FilteredSearchContext {
throw new UnsupportedOperationException("Not supported"); throw new UnsupportedOperationException("Not supported");
} }
@Override
public boolean hasFieldDataFields() {
return fieldDataFields != null;
}
@Override
public FieldDataFieldsContext fieldDataFields() {
if (fieldDataFields == null) {
fieldDataFields = new FieldDataFieldsContext();
}
return this.fieldDataFields;
}
@Override @Override
public boolean hasScriptFields() { public boolean hasScriptFields() {
return scriptFields != null; return scriptFields != null;

View File

@ -24,7 +24,7 @@ import org.apache.lucene.analysis.MockTokenizer;
import org.apache.lucene.analysis.TokenStream; import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.Tokenizer; import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute; import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -33,7 +33,7 @@ import static org.hamcrest.Matchers.equalTo;
/** /**
*/ */
public class TruncateTokenFilterTests extends ElasticsearchTestCase { public class TruncateTokenFilterTests extends ESTestCase {
@Test @Test
public void simpleTest() throws IOException { public void simpleTest() throws IOException {

View File

@ -24,7 +24,7 @@ import org.apache.lucene.analysis.MockTokenizer;
import org.apache.lucene.analysis.TokenStream; import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.Tokenizer; import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute; import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -33,7 +33,7 @@ import static org.hamcrest.Matchers.equalTo;
/** /**
*/ */
public class UniqueTokenFilterTests extends ElasticsearchTestCase { public class UniqueTokenFilterTests extends ESTestCase {
@Test @Test
public void simpleTest() throws IOException { public void simpleTest() throws IOException {

View File

@ -41,7 +41,7 @@ import org.apache.lucene.search.similarities.DefaultSimilarity;
import org.apache.lucene.search.similarities.Similarity; import org.apache.lucene.search.similarities.Similarity;
import org.apache.lucene.store.Directory; import org.apache.lucene.store.Directory;
import org.apache.lucene.util.TestUtil; import org.apache.lucene.util.TestUtil;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -56,7 +56,7 @@ import static org.hamcrest.Matchers.equalTo;
/** /**
*/ */
public class BlendedTermQueryTest extends ElasticsearchTestCase { public class BlendedTermQueryTest extends ESTestCase {
@Test @Test
public void testBooleanQuery() throws IOException { public void testBooleanQuery() throws IOException {

View File

@ -21,7 +21,7 @@ package org.apache.lucene.search.postingshighlight;
import org.apache.lucene.search.highlight.DefaultEncoder; import org.apache.lucene.search.highlight.DefaultEncoder;
import org.apache.lucene.search.highlight.SimpleHTMLEncoder; import org.apache.lucene.search.highlight.SimpleHTMLEncoder;
import org.apache.lucene.util.BytesRef; import org.apache.lucene.util.BytesRef;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
@ -29,7 +29,7 @@ import static org.hamcrest.CoreMatchers.notNullValue;
import static org.hamcrest.MatcherAssert.assertThat; import static org.hamcrest.MatcherAssert.assertThat;
public class CustomPassageFormatterTests extends ElasticsearchTestCase { public class CustomPassageFormatterTests extends ESTestCase {
@Test @Test
public void testSimpleFormat() { public void testSimpleFormat() {

View File

@ -28,12 +28,12 @@ import org.apache.lucene.search.*;
import org.apache.lucene.search.highlight.DefaultEncoder; import org.apache.lucene.search.highlight.DefaultEncoder;
import org.apache.lucene.store.Directory; import org.apache.lucene.store.Directory;
import org.elasticsearch.search.highlight.HighlightUtils; import org.elasticsearch.search.highlight.HighlightUtils;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class CustomPostingsHighlighterTests extends ElasticsearchTestCase { public class CustomPostingsHighlighterTests extends ESTestCase {
@Test @Test
public void testCustomPostingsHighlighter() throws Exception { public void testCustomPostingsHighlighter() throws Exception {

View File

@ -20,7 +20,7 @@ under the License.
package org.apache.lucene.search.postingshighlight; package org.apache.lucene.search.postingshighlight;
import org.elasticsearch.search.highlight.HighlightUtils; import org.elasticsearch.search.highlight.HighlightUtils;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.text.BreakIterator; import java.text.BreakIterator;
@ -30,7 +30,7 @@ import java.util.Locale;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class CustomSeparatorBreakIteratorTests extends ElasticsearchTestCase { public class CustomSeparatorBreakIteratorTests extends ESTestCase {
@Test @Test
public void testBreakOnCustomSeparator() throws Exception { public void testBreakOnCustomSeparator() throws Exception {

View File

@ -21,12 +21,12 @@ package org.apache.lucene.util;
import org.elasticsearch.common.geo.GeoDistance; import org.elasticsearch.common.geo.GeoDistance;
import org.elasticsearch.common.unit.DistanceUnit; import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.number.IsCloseTo.closeTo; import static org.hamcrest.number.IsCloseTo.closeTo;
public class SloppyMathTests extends ElasticsearchTestCase { public class SloppyMathTests extends ESTestCase {
@Test @Test
public void testAccuracy() { public void testAccuracy() {

View File

@ -40,7 +40,7 @@ import org.elasticsearch.index.query.TestQueryParsingException;
import org.elasticsearch.rest.RestStatus; import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.search.SearchParseException; import org.elasticsearch.search.SearchParseException;
import org.elasticsearch.search.SearchShardTarget; import org.elasticsearch.search.SearchShardTarget;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.test.TestSearchContext; import org.elasticsearch.test.TestSearchContext;
import org.elasticsearch.test.VersionUtils; import org.elasticsearch.test.VersionUtils;
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions; import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
@ -55,7 +55,7 @@ import java.nio.file.NoSuchFileException;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class ElasticsearchExceptionTests extends ElasticsearchTestCase { public class ESExceptionTests extends ESTestCase {
@Test @Test
public void testStatus() { public void testStatus() {

View File

@ -62,7 +62,7 @@ import org.elasticsearch.search.SearchShardTarget;
import org.elasticsearch.search.internal.SearchContext; import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.search.warmer.IndexWarmerMissingException; import org.elasticsearch.search.warmer.IndexWarmerMissingException;
import org.elasticsearch.snapshots.SnapshotException; import org.elasticsearch.snapshots.SnapshotException;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.test.TestSearchContext; import org.elasticsearch.test.TestSearchContext;
import org.elasticsearch.test.VersionUtils; import org.elasticsearch.test.VersionUtils;
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions; import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
@ -81,7 +81,7 @@ import java.nio.file.attribute.BasicFileAttributes;
import java.util.HashSet; import java.util.HashSet;
import java.util.Set; import java.util.Set;
public class ExceptionSerializationTests extends ElasticsearchTestCase { public class ExceptionSerializationTests extends ESTestCase {
public void testExceptionRegistration() public void testExceptionRegistration()
throws ClassNotFoundException, IOException, URISyntaxException { throws ClassNotFoundException, IOException, URISyntaxException {

View File

@ -24,10 +24,11 @@ import com.google.common.collect.Sets;
import junit.framework.TestCase; import junit.framework.TestCase;
import org.apache.lucene.util.LuceneTestCase; import org.apache.lucene.util.LuceneTestCase;
import org.elasticsearch.common.SuppressForbidden;
import org.elasticsearch.common.io.PathUtils; import org.elasticsearch.common.io.PathUtils;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.test.ElasticsearchTokenStreamTestCase; import org.elasticsearch.test.ESTokenStreamTestCase;
import org.junit.Ignore; import org.junit.Ignore;
import org.junit.Test; import org.junit.Test;
@ -41,9 +42,9 @@ import java.util.HashSet;
import java.util.Set; import java.util.Set;
/** /**
* Simple class that ensures that all subclasses concrete of ElasticsearchTestCase end with either Test | Tests * Simple class that ensures that all subclasses concrete of ESTestCase end with either Test | Tests
*/ */
public class NamingConventionTests extends ElasticsearchTestCase { public class NamingConventionTests extends ESTestCase {
// see https://github.com/elasticsearch/elasticsearch/issues/9945 // see https://github.com/elasticsearch/elasticsearch/issues/9945
public void testNamingConventions() public void testNamingConventions()
@ -51,6 +52,7 @@ public class NamingConventionTests extends ElasticsearchTestCase {
final Set<Class> notImplementing = new HashSet<>(); final Set<Class> notImplementing = new HashSet<>();
final Set<Class> pureUnitTest = new HashSet<>(); final Set<Class> pureUnitTest = new HashSet<>();
final Set<Class> missingSuffix = new HashSet<>(); final Set<Class> missingSuffix = new HashSet<>();
final Set<Class> integTestsInDisguise = new HashSet<>();
String[] packages = {"org.elasticsearch", "org.apache.lucene"}; String[] packages = {"org.elasticsearch", "org.apache.lucene"};
for (final String packageName : packages) { for (final String packageName : packages) {
final String path = "/" + packageName.replace('.', '/'); final String path = "/" + packageName.replace('.', '/');
@ -76,12 +78,19 @@ public class NamingConventionTests extends ElasticsearchTestCase {
Class<?> clazz = loadClass(filename); Class<?> clazz = loadClass(filename);
if (Modifier.isAbstract(clazz.getModifiers()) == false && Modifier.isInterface(clazz.getModifiers()) == false) { if (Modifier.isAbstract(clazz.getModifiers()) == false && Modifier.isInterface(clazz.getModifiers()) == false) {
if (clazz.getName().endsWith("Tests") || if (clazz.getName().endsWith("Tests") ||
clazz.getName().endsWith("IT") ||
clazz.getName().endsWith("Test")) { // don't worry about the ones that match the pattern clazz.getName().endsWith("Test")) { // don't worry about the ones that match the pattern
if (ESIntegTestCase.class.isAssignableFrom(clazz)) {
integTestsInDisguise.add(clazz);
}
if (isTestCase(clazz) == false) { if (isTestCase(clazz) == false) {
notImplementing.add(clazz); notImplementing.add(clazz);
} }
} else if (clazz.getName().endsWith("IT")) {
if (isTestCase(clazz) == false) {
notImplementing.add(clazz);
}
// otherwise fine
} else if (isTestCase(clazz)) { } else if (isTestCase(clazz)) {
missingSuffix.add(clazz); missingSuffix.add(clazz);
} else if (junit.framework.Test.class.isAssignableFrom(clazz) || hasTestAnnotation(clazz)) { } else if (junit.framework.Test.class.isAssignableFrom(clazz) || hasTestAnnotation(clazz)) {
@ -107,7 +116,7 @@ public class NamingConventionTests extends ElasticsearchTestCase {
} }
private boolean isTestCase(Class<?> clazz) { private boolean isTestCase(Class<?> clazz) {
return ElasticsearchTestCase.class.isAssignableFrom(clazz) || ElasticsearchTestCase.class.isAssignableFrom(clazz) || ElasticsearchTokenStreamTestCase.class.isAssignableFrom(clazz) || LuceneTestCase.class.isAssignableFrom(clazz); return LuceneTestCase.class.isAssignableFrom(clazz);
} }
private Class<?> loadClass(String filename) throws ClassNotFoundException { private Class<?> loadClass(String filename) throws ClassNotFoundException {
@ -140,31 +149,34 @@ public class NamingConventionTests extends ElasticsearchTestCase {
assertTrue(pureUnitTest.remove(PlainUnitTheSecond.class)); assertTrue(pureUnitTest.remove(PlainUnitTheSecond.class));
String classesToSubclass = Joiner.on(',').join( String classesToSubclass = Joiner.on(',').join(
ElasticsearchTestCase.class.getSimpleName(), ESTestCase.class.getSimpleName(),
ElasticsearchTestCase.class.getSimpleName(), ESTestCase.class.getSimpleName(),
ElasticsearchTokenStreamTestCase.class.getSimpleName(), ESTokenStreamTestCase.class.getSimpleName(),
LuceneTestCase.class.getSimpleName()); LuceneTestCase.class.getSimpleName());
assertTrue("Not all subclasses of " + ElasticsearchTestCase.class.getSimpleName() + assertTrue("Not all subclasses of " + ESTestCase.class.getSimpleName() +
" match the naming convention. Concrete classes must end with [Test|Tests]: " + missingSuffix.toString(), " match the naming convention. Concrete classes must end with [Test|Tests]: " + missingSuffix.toString(),
missingSuffix.isEmpty()); missingSuffix.isEmpty());
assertTrue("Pure Unit-Test found must subclass one of [" + classesToSubclass +"] " + pureUnitTest.toString(), assertTrue("Pure Unit-Test found must subclass one of [" + classesToSubclass +"] " + pureUnitTest.toString(),
pureUnitTest.isEmpty()); pureUnitTest.isEmpty());
assertTrue("Classes ending with Test|Tests] must subclass [" + classesToSubclass +"] " + notImplementing.toString(), assertTrue("Classes ending with Test|Tests] must subclass [" + classesToSubclass +"] " + notImplementing.toString(),
notImplementing.isEmpty()); notImplementing.isEmpty());
assertTrue("Subclasses of ESIntegTestCase should end with IT as they are integration tests: " + integTestsInDisguise, integTestsInDisguise.isEmpty());
} }
/* /*
* Some test the test classes * Some test the test classes
*/ */
@SuppressForbidden(reason = "Ignoring test the tester")
@Ignore @Ignore
public static final class NotImplementingTests {} public static final class NotImplementingTests {}
@SuppressForbidden(reason = "Ignoring test the tester")
@Ignore @Ignore
public static final class NotImplementingTest {} public static final class NotImplementingTest {}
public static final class WrongName extends ElasticsearchTestCase {} public static final class WrongName extends ESTestCase {}
public static final class WrongNameTheSecond extends ElasticsearchTestCase {} public static final class WrongNameTheSecond extends ESTestCase {}
public static final class PlainUnit extends TestCase {} public static final class PlainUnit extends TestCase {}
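The convention test now also collects "integ tests in disguise": concrete subclasses of ESIntegTestCase whose names end in Test/Tests instead of IT. The following stand-alone sketch (made-up class names, no Elasticsearch dependency) captures the spirit of that check.

import java.util.ArrayList;
import java.util.List;

// Stand-alone sketch of the new rule in NamingConventionTests: a class that is an
// integration test (modelled here by FakeIntegTestCase) must end in "IT" rather than
// hiding behind a "Test"/"Tests" suffix. All class names are made up.
public class NamingRuleSketch {

    static class FakeIntegTestCase {}                               // stands in for ESIntegTestCase
    static class SearchRelevanceIT extends FakeIntegTestCase {}     // fine
    static class SearchRelevanceTests extends FakeIntegTestCase {}  // "integ test in disguise"
    static class QueryParserTests {}                                // fine: plain unit test

    static List<Class<?>> integTestsInDisguise(Class<?>... classes) {
        List<Class<?>> disguised = new ArrayList<>();
        for (Class<?> clazz : classes) {
            String name = clazz.getSimpleName();
            boolean namedLikeUnitTest = name.endsWith("Test") || name.endsWith("Tests");
            if (namedLikeUnitTest && FakeIntegTestCase.class.isAssignableFrom(clazz)) {
                disguised.add(clazz);
            }
        }
        return disguised;
    }

    public static void main(String[] args) {
        // Prints only SearchRelevanceTests, the integration test with a unit-test name.
        System.out.println(integTestsInDisguise(SearchRelevanceIT.class, SearchRelevanceTests.class, QueryParserTests.class));
    }
}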

View File

@ -22,7 +22,7 @@ package org.elasticsearch;
import org.elasticsearch.cluster.metadata.IndexMetaData; import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.lucene.Lucene; import org.elasticsearch.common.lucene.Lucene;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.test.VersionUtils; import org.elasticsearch.test.VersionUtils;
import org.hamcrest.Matchers; import org.hamcrest.Matchers;
import org.junit.Test; import org.junit.Test;
@ -39,7 +39,7 @@ import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.sameInstance; import static org.hamcrest.Matchers.sameInstance;
public class VersionTests extends ElasticsearchTestCase { public class VersionTests extends ESTestCase {
public void testMavenVersion() { public void testMavenVersion() {
// maven sets this property to ensure that the latest version // maven sets this property to ensure that the latest version

View File

@ -19,7 +19,6 @@
package org.elasticsearch.action; package org.elasticsearch.action;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.action.admin.indices.alias.Alias; import org.elasticsearch.action.admin.indices.alias.Alias;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeAction; import org.elasticsearch.action.admin.indices.analyze.AnalyzeAction;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequest; import org.elasticsearch.action.admin.indices.analyze.AnalyzeRequest;
@ -92,9 +91,9 @@ import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.index.query.QueryBuilders; import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.script.Script; import org.elasticsearch.script.Script;
import org.elasticsearch.search.action.SearchServiceTransportAction; import org.elasticsearch.search.action.SearchServiceTransportAction;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope; import org.elasticsearch.test.ESIntegTestCase.Scope;
import org.elasticsearch.threadpool.ThreadPool; import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.Transport; import org.elasticsearch.transport.Transport;
import org.elasticsearch.transport.TransportChannel; import org.elasticsearch.transport.TransportChannel;
@ -123,8 +122,7 @@ import static org.hamcrest.Matchers.hasItem;
import static org.hamcrest.Matchers.instanceOf; import static org.hamcrest.Matchers.instanceOf;
@ClusterScope(scope = Scope.SUITE, numClientNodes = 1, minNumDataNodes = 2) @ClusterScope(scope = Scope.SUITE, numClientNodes = 1, minNumDataNodes = 2)
@Slow
public class IndicesRequestTests extends ElasticsearchIntegrationTest { public class IndicesRequestIT extends ESIntegTestCase {
private final List<String> indices = new ArrayList<>(); private final List<String> indices = new ArrayList<>();
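Alongside the base-class renames, the integration test classes themselves move to an IT suffix (IndicesRequestTests becomes IndicesRequestIT). A minimal skeleton following the renamed conventions might look like the sketch below; the class name, index name and assertion are invented for illustration, and helpers such as createIndex, ensureGreen and client() are assumed to be the ones inherited from the test framework of that era.

import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.elasticsearch.test.ESIntegTestCase.Scope;
import org.junit.Test;

// Hypothetical example class; the "IT" suffix marks it as an integration test.
@ClusterScope(scope = Scope.TEST)
public class ExampleFeatureIT extends ESIntegTestCase {

    @Test
    public void testIndexBecomesUsable() {
        createIndex("example-index");      // helper inherited from ESIntegTestCase (assumed)
        ensureGreen("example-index");
        assertTrue(client().admin().indices().prepareExists("example-index").get().isExists());
    }
}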

View File

@ -24,7 +24,7 @@ import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.client.Client; import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient; import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.cluster.node.DiscoveryNode; import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.concurrent.CountDownLatch; import java.util.concurrent.CountDownLatch;
@ -32,7 +32,7 @@ import java.util.concurrent.atomic.AtomicReference;
/** /**
*/ */
public class ListenerActionTests extends ElasticsearchIntegrationTest { public class ListenerActionIT extends ESIntegTestCase {
@Test @Test
public void verifyThreadedListeners() throws Throwable { public void verifyThreadedListeners() throws Throwable {

View File

@ -22,7 +22,7 @@ package org.elasticsearch.action;
import org.elasticsearch.action.support.IndicesOptions; import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.common.io.stream.BytesStreamOutput; import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.common.io.stream.StreamInput; import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -30,7 +30,7 @@ import java.io.IOException;
import static org.elasticsearch.test.VersionUtils.randomVersion; import static org.elasticsearch.test.VersionUtils.randomVersion;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class OriginalIndicesTests extends ElasticsearchTestCase { public class OriginalIndicesTests extends ESTestCase {
private static final IndicesOptions[] indicesOptionsValues = new IndicesOptions[]{ private static final IndicesOptions[] indicesOptionsValues = new IndicesOptions[]{
IndicesOptions.lenientExpandOpen() , IndicesOptions.strictExpand(), IndicesOptions.strictExpandOpen(), IndicesOptions.lenientExpandOpen() , IndicesOptions.strictExpand(), IndicesOptions.strictExpandOpen(),

View File

@ -28,8 +28,8 @@ import org.elasticsearch.action.search.ShardSearchFailure;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.util.concurrent.EsRejectedExecutionException; import org.elasticsearch.common.util.concurrent.EsRejectedExecutionException;
import org.elasticsearch.index.query.QueryBuilders; import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Locale; import java.util.Locale;
@ -40,8 +40,8 @@ import static org.hamcrest.Matchers.equalTo;
/** /**
*/ */
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.SUITE, numDataNodes = 2) @ClusterScope(scope = ESIntegTestCase.Scope.SUITE, numDataNodes = 2)
public class RejectionActionTests extends ElasticsearchIntegrationTest { public class RejectionActionIT extends ESIntegTestCase {
@Override @Override
protected Settings nodeSettings(int nodeOrdinal) { protected Settings nodeSettings(int nodeOrdinal) {

View File

@ -18,13 +18,12 @@
*/ */
package org.elasticsearch.action.admin; package org.elasticsearch.action.admin;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.action.ActionListener; import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.admin.cluster.node.hotthreads.NodeHotThreads; import org.elasticsearch.action.admin.cluster.node.hotthreads.NodeHotThreads;
import org.elasticsearch.action.admin.cluster.node.hotthreads.NodesHotThreadsRequestBuilder; import org.elasticsearch.action.admin.cluster.node.hotthreads.NodesHotThreadsRequestBuilder;
import org.elasticsearch.action.admin.cluster.node.hotthreads.NodesHotThreadsResponse; import org.elasticsearch.action.admin.cluster.node.hotthreads.NodesHotThreadsResponse;
import org.elasticsearch.common.unit.TimeValue; import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.Map; import java.util.Map;
@ -32,15 +31,17 @@ import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutionException; import java.util.concurrent.ExecutionException;
import java.util.concurrent.atomic.AtomicBoolean; import java.util.concurrent.atomic.AtomicBoolean;
import static org.elasticsearch.index.query.QueryBuilders.*; import static org.elasticsearch.index.query.QueryBuilders.andQuery;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.index.query.QueryBuilders.notQuery;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.hamcrest.CoreMatchers.*; import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.hamcrest.Matchers.lessThan; import static org.hamcrest.Matchers.lessThan;
/**
*/
@Slow
public class HotThreadsTest extends ElasticsearchIntegrationTest { public class HotThreadsIT extends ESIntegTestCase {
@Test @Test
public void testHotThreadsDontFail() throws ExecutionException, InterruptedException { public void testHotThreadsDontFail() throws ExecutionException, InterruptedException {

View File

@ -23,8 +23,8 @@ import org.elasticsearch.action.admin.cluster.repositories.get.GetRepositoriesRe
import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse; import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse;
import org.elasticsearch.cluster.metadata.MetaData; import org.elasticsearch.cluster.metadata.MetaData;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
@ -37,8 +37,8 @@ import static org.hamcrest.Matchers.hasSize;
* *
* The @ClusterScope TEST is needed because this class updates the cluster setting "cluster.blocks.read_only". * The @ClusterScope TEST is needed because this class updates the cluster setting "cluster.blocks.read_only".
*/ */
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class RepositoryBlocksTests extends ElasticsearchIntegrationTest { public class RepositoryBlocksIT extends ESIntegTestCase {
@Test @Test
public void testPutRepositoryWithBlocks() { public void testPutRepositoryWithBlocks() {

View File

@ -20,7 +20,6 @@ package org.elasticsearch.action.admin.cluster.snapshots;
import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse; import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse;
import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotResponse; import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.delete.DeleteSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsResponse; import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsResponse;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotResponse; import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusResponse; import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusResponse;
@ -28,8 +27,8 @@ import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.cluster.metadata.MetaData; import org.elasticsearch.cluster.metadata.MetaData;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.rest.RestStatus; import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
@ -45,8 +44,8 @@ import static org.hamcrest.Matchers.hasSize;
* *
* The @ClusterScope TEST is needed because this class updates the cluster setting "cluster.blocks.read_only". * The @ClusterScope TEST is needed because this class updates the cluster setting "cluster.blocks.read_only".
*/ */
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class SnapshotBlocksTests extends ElasticsearchIntegrationTest { public class SnapshotBlocksIT extends ESIntegTestCase {
protected static final String INDEX_NAME = "test-blocks-1"; protected static final String INDEX_NAME = "test-blocks-1";
protected static final String OTHER_INDEX_NAME = "test-blocks-2"; protected static final String OTHER_INDEX_NAME = "test-blocks-2";

View File

@ -23,7 +23,7 @@ import org.elasticsearch.Version;
import org.elasticsearch.action.support.IndicesOptions; import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.common.io.stream.StreamInput; import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.BytesStreamOutput; import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.test.VersionUtils; import org.elasticsearch.test.VersionUtils;
import org.junit.Test; import org.junit.Test;
@ -32,7 +32,7 @@ import static org.hamcrest.CoreMatchers.equalTo;
/** /**
* Unit tests for the {@link ClusterStateRequest}. * Unit tests for the {@link ClusterStateRequest}.
*/ */
public class ClusterStateRequestTest extends ElasticsearchTestCase { public class ClusterStateRequestTest extends ESTestCase {
@Test @Test
public void testSerialization() throws Exception { public void testSerialization() throws Exception {

View File

@ -26,20 +26,20 @@ import org.elasticsearch.client.Requests;
import org.elasticsearch.common.Priority; import org.elasticsearch.common.Priority;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.store.Store; import org.elasticsearch.index.store.Store;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.hamcrest.Matchers; import org.hamcrest.Matchers;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
import static org.elasticsearch.common.settings.Settings.settingsBuilder; import static org.elasticsearch.common.settings.Settings.settingsBuilder;
import static org.elasticsearch.test.ElasticsearchIntegrationTest.*; import static org.elasticsearch.test.ESIntegTestCase.*;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.is;
@ClusterScope(scope = Scope.SUITE, numDataNodes = 1, numClientNodes = 0) @ClusterScope(scope = Scope.SUITE, numDataNodes = 1, numClientNodes = 0)
public class ClusterStatsTests extends ElasticsearchIntegrationTest { public class ClusterStatsIT extends ESIntegTestCase {
private void assertCounts(ClusterStatsNodes.Counts counts, int total, int masterOnly, int dataOnly, int masterData, int client) { private void assertCounts(ClusterStatsNodes.Counts counts, int total, int masterOnly, int dataOnly, int masterData, int client) {
assertThat(counts.getTotal(), Matchers.equalTo(total)); assertThat(counts.getTotal(), Matchers.equalTo(total));

View File

@ -19,16 +19,16 @@
package org.elasticsearch.action.admin.cluster.tasks; package org.elasticsearch.action.admin.cluster.tasks;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
import static org.elasticsearch.cluster.metadata.IndexMetaData.*; import static org.elasticsearch.cluster.metadata.IndexMetaData.*;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class PendingTasksBlocksTests extends ElasticsearchIntegrationTest { public class PendingTasksBlocksIT extends ESIntegTestCase {
@Test @Test
public void testPendingTasksWithBlocks() { public void testPendingTasksWithBlocks() {

View File

@ -19,8 +19,8 @@
package org.elasticsearch.action.admin.indices.cache.clear; package org.elasticsearch.action.admin.indices.cache.clear;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -30,8 +30,8 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBloc
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class ClearIndicesCacheBlocksTests extends ElasticsearchIntegrationTest { public class ClearIndicesCacheBlocksIT extends ESIntegTestCase {
@Test @Test
public void testClearIndicesCacheWithBlocks() { public void testClearIndicesCacheWithBlocks() {

View File

@ -25,9 +25,9 @@ import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.cluster.metadata.MetaData; import org.elasticsearch.cluster.metadata.MetaData;
import org.elasticsearch.common.collect.ImmutableOpenMap; import org.elasticsearch.common.collect.ImmutableOpenMap;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope; import org.elasticsearch.test.ESIntegTestCase.Scope;
import org.junit.Test; import org.junit.Test;
import java.util.HashMap; import java.util.HashMap;
@ -38,7 +38,7 @@ import static org.hamcrest.Matchers.*;
import static org.hamcrest.core.IsNull.notNullValue; import static org.hamcrest.core.IsNull.notNullValue;
@ClusterScope(scope = Scope.TEST) @ClusterScope(scope = Scope.TEST)
public class CreateIndexTests extends ElasticsearchIntegrationTest{ public class CreateIndexIT extends ESIntegTestCase {
@Test @Test
public void testCreationDate_Given() { public void testCreationDate_Given() {

View File

@ -19,12 +19,11 @@
package org.elasticsearch.action.admin.indices.create; package org.elasticsearch.action.admin.indices.create;
import org.elasticsearch.action.index.IndexRequestBuilderTest;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory; import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.rest.NoOpClient; import org.elasticsearch.rest.NoOpClient;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.After; import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
@ -34,7 +33,7 @@ import java.io.IOException;
import java.util.HashMap; import java.util.HashMap;
import java.util.Map; import java.util.Map;
public class CreateIndexRequestBuilderTest extends ElasticsearchTestCase { public class CreateIndexRequestBuilderTest extends ESTestCase {
private static final String KEY = "my.settings.key"; private static final String KEY = "my.settings.key";
private static final String VALUE = "my.settings.value"; private static final String VALUE = "my.settings.value";

View File

@ -19,14 +19,14 @@
package org.elasticsearch.action.admin.indices.delete; package org.elasticsearch.action.admin.indices.delete;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class DeleteIndexBlocksTests extends ElasticsearchIntegrationTest{ public class DeleteIndexBlocksIT extends ESIntegTestCase {
@Test @Test
public void testDeleteIndexWithBlocks() { public void testDeleteIndexWithBlocks() {

View File

@ -19,8 +19,8 @@
package org.elasticsearch.action.admin.indices.flush; package org.elasticsearch.action.admin.indices.flush;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -30,8 +30,8 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBloc
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class FlushBlocksTests extends ElasticsearchIntegrationTest { public class FlushBlocksIT extends ESIntegTestCase {
@Test @Test
public void testFlushWithBlocks() { public void testFlushWithBlocks() {

View File

@ -20,7 +20,6 @@
package org.elasticsearch.action.admin.indices.get; package org.elasticsearch.action.admin.indices.get;
import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableList;
import org.elasticsearch.ResourceNotFoundException;
import org.elasticsearch.action.admin.indices.alias.Alias; import org.elasticsearch.action.admin.indices.alias.Alias;
import org.elasticsearch.action.admin.indices.get.GetIndexRequest.Feature; import org.elasticsearch.action.admin.indices.get.GetIndexRequest.Feature;
import org.elasticsearch.cluster.metadata.AliasMetaData; import org.elasticsearch.cluster.metadata.AliasMetaData;
@ -29,7 +28,7 @@ import org.elasticsearch.common.collect.ImmutableOpenMap;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.IndexNotFoundException; import org.elasticsearch.index.IndexNotFoundException;
import org.elasticsearch.search.warmer.IndexWarmersMetaData.Entry; import org.elasticsearch.search.warmer.IndexWarmersMetaData.Entry;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.ArrayList; import java.util.ArrayList;
@ -41,8 +40,8 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcke
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.*;
@ElasticsearchIntegrationTest.SuiteScopeTest @ESIntegTestCase.SuiteScopeTestCase
public class GetIndexTests extends ElasticsearchIntegrationTest { public class GetIndexIT extends ESIntegTestCase {
private static final String[] allFeatures = { "_alias", "_aliases", "_mapping", "_mappings", "_settings", "_warmer", "_warmers" }; private static final String[] allFeatures = { "_alias", "_aliases", "_mapping", "_mappings", "_settings", "_warmer", "_warmers" };

View File

@ -20,9 +20,9 @@
package org.elasticsearch.action.admin.indices.mapping.put; package org.elasticsearch.action.admin.indices.mapping.put;
import org.elasticsearch.action.ActionRequestValidationException; import org.elasticsearch.action.ActionRequestValidationException;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
public class PutMappingRequestTests extends ElasticsearchTestCase { public class PutMappingRequestTests extends ESTestCase {
public void testValidation() { public void testValidation() {
PutMappingRequest r = new PutMappingRequest("myindex"); PutMappingRequest r = new PutMappingRequest("myindex");

View File

@ -19,8 +19,8 @@
package org.elasticsearch.action.admin.indices.optimize; package org.elasticsearch.action.admin.indices.optimize;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -30,8 +30,8 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBloc
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class OptimizeBlocksTests extends ElasticsearchIntegrationTest { public class OptimizeBlocksIT extends ESIntegTestCase {
@Test @Test
public void testOptimizeWithBlocks() { public void testOptimizeWithBlocks() {

View File

@ -20,8 +20,8 @@
package org.elasticsearch.action.admin.indices.refresh; package org.elasticsearch.action.admin.indices.refresh;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -31,8 +31,8 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBloc
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class RefreshBlocksTests extends ElasticsearchIntegrationTest { public class RefreshBlocksIT extends ESIntegTestCase {
@Test @Test
public void testRefreshWithBlocks() { public void testRefreshWithBlocks() {

View File

@ -19,8 +19,8 @@
package org.elasticsearch.action.admin.indices.segments; package org.elasticsearch.action.admin.indices.segments;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -29,8 +29,8 @@ import static org.elasticsearch.cluster.metadata.IndexMetaData.*;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class IndicesSegmentsBlocksTests extends ElasticsearchIntegrationTest { public class IndicesSegmentsBlocksIT extends ESIntegTestCase {
@Test @Test
public void testIndicesSegmentsWithBlocks() { public void testIndicesSegmentsWithBlocks() {

View File

@ -22,13 +22,13 @@ package org.elasticsearch.action.admin.indices.segments;
import org.elasticsearch.action.support.IndicesOptions; import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.engine.Segment; import org.elasticsearch.index.engine.Segment;
import org.elasticsearch.test.ElasticsearchSingleNodeTest; import org.elasticsearch.test.ESSingleNodeTestCase;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
import java.util.List; import java.util.List;
public class IndicesSegmentsRequestTests extends ElasticsearchSingleNodeTest { public class IndicesSegmentsRequestTests extends ESSingleNodeTestCase {
@Before @Before
public void setupIndex() { public void setupIndex() {

View File

@ -24,7 +24,6 @@ import com.carrotsearch.hppc.cursors.IntObjectCursor;
import com.carrotsearch.hppc.cursors.ObjectCursor; import com.carrotsearch.hppc.cursors.ObjectCursor;
import com.google.common.base.Predicate; import com.google.common.base.Predicate;
import org.apache.lucene.index.CorruptIndexException; import org.apache.lucene.index.CorruptIndexException;
import org.elasticsearch.action.admin.indices.shards.IndicesShardStoresResponse;
import org.elasticsearch.action.index.IndexRequestBuilder; import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.client.Requests; import org.elasticsearch.client.Requests;
import org.elasticsearch.cluster.ClusterState; import org.elasticsearch.cluster.ClusterState;
@ -36,7 +35,7 @@ import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.IndexService; import org.elasticsearch.index.IndexService;
import org.elasticsearch.index.shard.IndexShard; import org.elasticsearch.index.shard.IndexShard;
import org.elasticsearch.indices.IndicesService; import org.elasticsearch.indices.IndicesService;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.junit.annotations.TestLogging; import org.elasticsearch.test.junit.annotations.TestLogging;
import org.elasticsearch.test.store.MockFSDirectoryService; import org.elasticsearch.test.store.MockFSDirectoryService;
import org.junit.Test; import org.junit.Test;
@ -48,8 +47,8 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcke
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoTimeout; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoTimeout;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.*;
@ElasticsearchIntegrationTest.ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ESIntegTestCase.ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class IndicesShardStoreRequestTests extends ElasticsearchIntegrationTest { public class IndicesShardStoreRequestIT extends ESIntegTestCase {
@Test @Test
public void testEmpty() { public void testEmpty() {

View File

@ -20,17 +20,15 @@
package org.elasticsearch.action.admin.indices.shards; package org.elasticsearch.action.admin.indices.shards;
import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableList;
import org.apache.lucene.index.CorruptIndexException;
import org.apache.lucene.util.CollectionUtil; import org.apache.lucene.util.CollectionUtil;
import org.elasticsearch.Version; import org.elasticsearch.Version;
import org.elasticsearch.action.admin.indices.shards.IndicesShardStoresResponse;
import org.elasticsearch.cluster.node.DiscoveryNode; import org.elasticsearch.cluster.node.DiscoveryNode;
import org.elasticsearch.common.bytes.BytesReference; import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.collect.ImmutableOpenIntMap; import org.elasticsearch.common.collect.ImmutableOpenIntMap;
import org.elasticsearch.common.collect.ImmutableOpenMap; import org.elasticsearch.common.collect.ImmutableOpenMap;
import org.elasticsearch.common.transport.DummyTransportAddress; import org.elasticsearch.common.transport.DummyTransportAddress;
import org.elasticsearch.common.xcontent.*; import org.elasticsearch.common.xcontent.*;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.transport.NodeDisconnectedException; import org.elasticsearch.transport.NodeDisconnectedException;
import org.junit.Test; import org.junit.Test;
@ -39,7 +37,7 @@ import java.util.*;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class IndicesShardStoreResponseTest extends ElasticsearchTestCase { public class IndicesShardStoreResponseTest extends ESTestCase {
@Test @Test
public void testBasicSerialization() throws Exception { public void testBasicSerialization() throws Exception {

@ -21,16 +21,16 @@ package org.elasticsearch.action.admin.indices.stats;
import org.elasticsearch.cluster.block.ClusterBlockException; import org.elasticsearch.cluster.block.ClusterBlockException;
import org.elasticsearch.cluster.metadata.IndexMetaData; import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
import static org.elasticsearch.cluster.metadata.IndexMetaData.*; import static org.elasticsearch.cluster.metadata.IndexMetaData.*;
@ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class IndicesStatsBlocksTests extends ElasticsearchIntegrationTest { public class IndicesStatsBlocksIT extends ESIntegTestCase {
@Test @Test
public void testIndicesStatsWithBlocks() { public void testIndicesStatsWithBlocks() {

@ -24,12 +24,12 @@ import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.engine.CommitStats; import org.elasticsearch.index.engine.CommitStats;
import org.elasticsearch.index.engine.SegmentsStats; import org.elasticsearch.index.engine.SegmentsStats;
import org.elasticsearch.index.translog.Translog; import org.elasticsearch.index.translog.Translog;
import org.elasticsearch.test.ElasticsearchSingleNodeTest; import org.elasticsearch.test.ESSingleNodeTestCase;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.*;
public class IndicesStatsTests extends ElasticsearchSingleNodeTest { public class IndicesStatsTests extends ESSingleNodeTestCase {
public void testSegmentStatsEmptyIndex() { public void testSegmentStatsEmptyIndex() {
createIndex("test"); createIndex("test");

@ -19,13 +19,13 @@
package org.elasticsearch.action.admin.indices.warmer.put; package org.elasticsearch.action.admin.indices.warmer.put;
import org.elasticsearch.action.ActionRequestValidationException; import org.elasticsearch.action.ActionRequestValidationException;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.CoreMatchers.containsString; import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.Matchers.hasSize; import static org.hamcrest.Matchers.hasSize;
public class PutWarmerRequestTests extends ElasticsearchTestCase { public class PutWarmerRequestTests extends ESTestCase {
@Test // issue 4196 @Test // issue 4196
public void testThatValidationWithoutSpecifyingSearchRequestFails() { public void testThatValidationWithoutSpecifyingSearchRequestFails() {

@ -23,12 +23,12 @@ package org.elasticsearch.action.bulk;
import com.google.common.base.Charsets; import com.google.common.base.Charsets;
import org.elasticsearch.action.admin.indices.mapping.get.GetMappingsResponse; import org.elasticsearch.action.admin.indices.mapping.get.GetMappingsResponse;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import static org.elasticsearch.common.io.Streams.copyToStringFromClasspath; import static org.elasticsearch.common.io.Streams.copyToStringFromClasspath;
public class BulkIntegrationTests extends ElasticsearchIntegrationTest { public class BulkIntegrationIT extends ESIntegTestCase {
@Test @Test
public void testBulkIndexCreatesMapping() throws Exception { public void testBulkIndexCreatesMapping() throws Exception {

@ -20,13 +20,13 @@
package org.elasticsearch.action.bulk; package org.elasticsearch.action.bulk;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope; import org.elasticsearch.test.ESIntegTestCase.ClusterScope;
import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope; import org.elasticsearch.test.ESIntegTestCase.Scope;
import org.junit.Test; import org.junit.Test;
@ClusterScope(scope = Scope.TEST, numDataNodes = 0) @ClusterScope(scope = Scope.TEST, numDataNodes = 0)
public class BulkProcessorClusterSettingsTests extends ElasticsearchIntegrationTest { public class BulkProcessorClusterSettingsIT extends ESIntegTestCase {
@Test @Test
public void testBulkProcessorAutoCreateRestrictions() throws Exception { public void testBulkProcessorAutoCreateRestrictions() throws Exception {

@ -32,7 +32,7 @@ import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.ByteSizeUnit; import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue; import org.elasticsearch.common.unit.ByteSizeValue;
import org.elasticsearch.common.unit.TimeValue; import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -47,7 +47,7 @@ import java.util.concurrent.atomic.AtomicInteger;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.*;
public class BulkProcessorTests extends ElasticsearchIntegrationTest { public class BulkProcessorIT extends ESIntegTestCase {
@Test @Test
public void testThatBulkProcessorCountIsCorrect() throws InterruptedException { public void testThatBulkProcessorCountIsCorrect() throws InterruptedException {

@ -30,7 +30,7 @@ import org.elasticsearch.client.Requests;
import org.elasticsearch.common.Strings; import org.elasticsearch.common.Strings;
import org.elasticsearch.common.bytes.BytesArray; import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.script.Script; import org.elasticsearch.script.Script;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.ArrayList; import java.util.ArrayList;
@ -42,7 +42,7 @@ import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.instanceOf; import static org.hamcrest.Matchers.instanceOf;
import static org.hamcrest.Matchers.notNullValue; import static org.hamcrest.Matchers.notNullValue;
public class BulkRequestTests extends ElasticsearchTestCase { public class BulkRequestTests extends ESTestCase {
@Test @Test
public void testSimpleBulk1() throws Exception { public void testSimpleBulk1() throws Exception {

@ -29,7 +29,7 @@ import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentHelper; import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentType; import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.query.QueryBuilders; import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.AfterClass; import org.junit.AfterClass;
import org.junit.BeforeClass; import org.junit.BeforeClass;
import org.junit.Test; import org.junit.Test;
@ -39,7 +39,7 @@ import java.io.IOException;
import static org.hamcrest.CoreMatchers.containsString; import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class CountRequestBuilderTests extends ElasticsearchTestCase { public class CountRequestBuilderTests extends ESTestCase {
private static Client client; private static Client client;

@ -25,7 +25,7 @@ import org.elasticsearch.action.support.QuerySourceBuilder;
import org.elasticsearch.common.xcontent.XContentHelper; import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.index.query.QueryBuilders; import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.internal.SearchContext; import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.Map; import java.util.Map;
@ -34,7 +34,7 @@ import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.notNullValue; import static org.hamcrest.CoreMatchers.notNullValue;
import static org.hamcrest.CoreMatchers.nullValue; import static org.hamcrest.CoreMatchers.nullValue;
public class CountRequestTests extends ElasticsearchTestCase { public class CountRequestTests extends ESTestCase {
@Test @Test
public void testToSearchRequest() { public void testToSearchRequest() {

@ -24,12 +24,12 @@ import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.ShardSearchFailure; import org.elasticsearch.action.search.ShardSearchFailure;
import org.elasticsearch.search.internal.InternalSearchHits; import org.elasticsearch.search.internal.InternalSearchHits;
import org.elasticsearch.search.internal.InternalSearchResponse; import org.elasticsearch.search.internal.InternalSearchResponse;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class CountResponseTests extends ElasticsearchTestCase { public class CountResponseTests extends ESTestCase {
@Test @Test
public void testFromSearchResponse() { public void testFromSearchResponse() {

@ -21,14 +21,14 @@ package org.elasticsearch.action.fieldstats;
import org.elasticsearch.common.bytes.BytesArray; import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.io.Streams; import org.elasticsearch.common.io.Streams;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import static org.elasticsearch.action.fieldstats.IndexConstraint.Comparison.*; import static org.elasticsearch.action.fieldstats.IndexConstraint.Comparison.*;
import static org.elasticsearch.action.fieldstats.IndexConstraint.Property.MAX; import static org.elasticsearch.action.fieldstats.IndexConstraint.Property.MAX;
import static org.elasticsearch.action.fieldstats.IndexConstraint.Property.MIN; import static org.elasticsearch.action.fieldstats.IndexConstraint.Property.MIN;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class FieldStatsRequestTest extends ElasticsearchTestCase { public class FieldStatsRequestTest extends ESTestCase {
public void testFieldsParsing() throws Exception { public void testFieldsParsing() throws Exception {
byte[] data = Streams.copyToBytesFromClasspath("/org/elasticsearch/action/fieldstats/fieldstats-index-constraints-request.json"); byte[] data = Streams.copyToBytesFromClasspath("/org/elasticsearch/action/fieldstats/fieldstats-index-constraints-request.json");

@ -23,7 +23,7 @@ import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.common.io.stream.StreamInput; import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.index.VersionType; import org.elasticsearch.index.VersionType;
import org.elasticsearch.search.fetch.source.FetchSourceContext; import org.elasticsearch.search.fetch.source.FetchSourceContext;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -31,7 +31,7 @@ import java.io.IOException;
import static org.elasticsearch.test.VersionUtils.randomVersion; import static org.elasticsearch.test.VersionUtils.randomVersion;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class MultiGetShardRequestTests extends ElasticsearchTestCase { public class MultiGetShardRequestTests extends ESTestCase {
@Test @Test
public void testSerialization() throws IOException { public void testSerialization() throws IOException {

@ -24,7 +24,7 @@ import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentHelper; import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentType; import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.rest.NoOpClient; import org.elasticsearch.rest.NoOpClient;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.After; import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
@ -33,7 +33,7 @@ import java.io.ByteArrayOutputStream;
import java.util.HashMap; import java.util.HashMap;
import java.util.Map; import java.util.Map;
public class IndexRequestBuilderTest extends ElasticsearchTestCase { public class IndexRequestBuilderTest extends ESTestCase {
private static final String EXPECTED_SOURCE = "{\"SomeKey\":\"SomeValue\"}"; private static final String EXPECTED_SOURCE = "{\"SomeKey\":\"SomeValue\"}";
private NoOpClient testClient; private NoOpClient testClient;

@ -18,13 +18,13 @@
*/ */
package org.elasticsearch.action.index; package org.elasticsearch.action.index;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
/** /**
*/ */
public class IndexRequestTests extends ElasticsearchTestCase { public class IndexRequestTests extends ESTestCase {
@Test @Test
public void testIndexRequestOpTypeFromString() throws Exception { public void testIndexRequestOpTypeFromString() throws Exception {

@ -22,7 +22,7 @@ package org.elasticsearch.action.indexedscripts.get;
import org.elasticsearch.common.io.stream.BytesStreamOutput; import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.common.io.stream.StreamInput; import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.index.VersionType; import org.elasticsearch.index.VersionType;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -30,7 +30,7 @@ import java.io.IOException;
import static org.elasticsearch.test.VersionUtils.randomVersion; import static org.elasticsearch.test.VersionUtils.randomVersion;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class GetIndexedScriptRequestTests extends ElasticsearchTestCase { public class GetIndexedScriptRequestTests extends ESTestCase {
@Test @Test
public void testGetIndexedScriptRequestSerialization() throws IOException { public void testGetIndexedScriptRequestSerialization() throws IOException {

@ -22,7 +22,7 @@ import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.common.collect.MapBuilder; import org.elasticsearch.common.collect.MapBuilder;
import org.elasticsearch.common.io.Streams; import org.elasticsearch.common.io.Streams;
import org.elasticsearch.common.xcontent.XContentFactory; import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.Map; import java.util.Map;
@ -31,7 +31,7 @@ import static org.hamcrest.Matchers.*;
/** /**
*/ */
public class MultiPercolatorRequestTests extends ElasticsearchTestCase { public class MultiPercolatorRequestTests extends ESTestCase {
@Test @Test
public void testParseBulkRequests() throws Exception { public void testParseBulkRequests() throws Exception {

@ -24,7 +24,7 @@ import org.elasticsearch.common.io.Streams;
import org.elasticsearch.common.xcontent.ToXContent; import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory; import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -34,7 +34,7 @@ import static org.hamcrest.Matchers.nullValue;
/** /**
*/ */
public class MultiSearchRequestTests extends ElasticsearchTestCase { public class MultiSearchRequestTests extends ESTestCase {
@Test @Test
public void simpleAdd() throws Exception { public void simpleAdd() throws Exception {

@ -28,7 +28,7 @@ import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentType; import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.query.QueryBuilders; import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder; import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.AfterClass; import org.junit.AfterClass;
import org.junit.BeforeClass; import org.junit.BeforeClass;
import org.junit.Test; import org.junit.Test;
@ -38,7 +38,7 @@ import java.io.IOException;
import static org.hamcrest.CoreMatchers.containsString; import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class SearchRequestBuilderTests extends ElasticsearchTestCase { public class SearchRequestBuilderTests extends ESTestCase {
private static Client client; private static Client client;

@ -22,13 +22,13 @@ package org.elasticsearch.action.support;
import org.elasticsearch.Version; import org.elasticsearch.Version;
import org.elasticsearch.common.io.stream.BytesStreamOutput; import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.common.io.stream.StreamInput; import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.elasticsearch.test.VersionUtils.randomVersion; import static org.elasticsearch.test.VersionUtils.randomVersion;
import static org.hamcrest.CoreMatchers.equalTo; import static org.hamcrest.CoreMatchers.equalTo;
public class IndicesOptionsTests extends ElasticsearchTestCase { public class IndicesOptionsTests extends ESTestCase {
@Test @Test
public void testSerialization() throws Exception { public void testSerialization() throws Exception {

@ -20,7 +20,7 @@ package org.elasticsearch.action.support;
import org.elasticsearch.action.ActionListener; import org.elasticsearch.action.ActionListener;
import org.elasticsearch.common.util.concurrent.AbstractRunnable; import org.elasticsearch.common.util.concurrent.AbstractRunnable;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.threadpool.ThreadPool; import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.Transports; import org.elasticsearch.transport.Transports;
@ -28,7 +28,7 @@ import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit; import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference; import java.util.concurrent.atomic.AtomicReference;
public class ListenableActionFutureTests extends ElasticsearchTestCase { public class ListenableActionFutureTests extends ESTestCase {
public void testListenerIsCallableFromNetworkThreads() throws Throwable { public void testListenerIsCallableFromNetworkThreads() throws Throwable {
ThreadPool threadPool = new ThreadPool("testListenerIsCallableFromNetworkThreads"); ThreadPool threadPool = new ThreadPool("testListenerIsCallableFromNetworkThreads");

@ -26,7 +26,7 @@ import org.elasticsearch.action.ActionRequest;
import org.elasticsearch.action.ActionRequestValidationException; import org.elasticsearch.action.ActionRequestValidationException;
import org.elasticsearch.action.ActionResponse; import org.elasticsearch.action.ActionResponse;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
@ -39,7 +39,7 @@ import java.util.concurrent.atomic.AtomicInteger;
import static org.hamcrest.CoreMatchers.*; import static org.hamcrest.CoreMatchers.*;
public class TransportActionFilterChainTests extends ElasticsearchTestCase { public class TransportActionFilterChainTests extends ESTestCase {
private AtomicInteger counter; private AtomicInteger counter;

@ -54,7 +54,7 @@ import org.elasticsearch.index.shard.IndexShardNotStartedException;
import org.elasticsearch.index.shard.IndexShardState; import org.elasticsearch.index.shard.IndexShardState;
import org.elasticsearch.index.shard.ShardId; import org.elasticsearch.index.shard.ShardId;
import org.elasticsearch.rest.RestStatus; import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.elasticsearch.test.cluster.TestClusterService; import org.elasticsearch.test.cluster.TestClusterService;
import org.elasticsearch.test.transport.CapturingTransport; import org.elasticsearch.test.transport.CapturingTransport;
import org.elasticsearch.threadpool.ThreadPool; import org.elasticsearch.threadpool.ThreadPool;
@ -80,7 +80,7 @@ import java.util.concurrent.atomic.AtomicInteger;
import static org.elasticsearch.cluster.metadata.IndexMetaData.*; import static org.elasticsearch.cluster.metadata.IndexMetaData.*;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.*;
public class ShardReplicationTests extends ElasticsearchTestCase { public class ShardReplicationTests extends ESTestCase {
private static ThreadPool threadPool; private static ThreadPool threadPool;

@ -40,8 +40,7 @@ import org.elasticsearch.action.admin.indices.alias.Alias;
import org.elasticsearch.common.inject.internal.Join; import org.elasticsearch.common.inject.internal.Join;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.*; import org.elasticsearch.test.ESIntegTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import java.io.IOException; import java.io.IOException;
import java.util.*; import java.util.*;
@ -50,7 +49,7 @@ import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public abstract class AbstractTermVectorsTests extends ElasticsearchIntegrationTest { public abstract class AbstractTermVectorsTestCase extends ESIntegTestCase {
protected static class TestFieldSetting { protected static class TestFieldSetting {
final public String name; final public String name;

@ -29,7 +29,7 @@ import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.ToXContent; import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory; import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.hamcrest.Matchers; import org.hamcrest.Matchers;
import org.junit.Test; import org.junit.Test;
@ -38,7 +38,7 @@ import java.io.IOException;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class GetTermVectorsCheckDocFreqTests extends ElasticsearchIntegrationTest { public class GetTermVectorsCheckDocFreqIT extends ESIntegTestCase {
@Override @Override
protected int numberOfShards() { protected int numberOfShards() {

@ -20,12 +20,14 @@
package org.elasticsearch.action.termvectors; package org.elasticsearch.action.termvectors;
import com.carrotsearch.hppc.ObjectIntHashMap; import com.carrotsearch.hppc.ObjectIntHashMap;
import org.apache.lucene.analysis.payloads.PayloadHelper; import org.apache.lucene.analysis.payloads.PayloadHelper;
import org.apache.lucene.document.FieldType; import org.apache.lucene.document.FieldType;
import org.apache.lucene.index.*; import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.Fields;
import org.apache.lucene.index.PostingsEnum;
import org.apache.lucene.index.Terms;
import org.apache.lucene.index.TermsEnum;
import org.apache.lucene.util.BytesRef; import org.apache.lucene.util.BytesRef;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.ElasticsearchException; import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.ActionFuture; import org.elasticsearch.action.ActionFuture;
import org.elasticsearch.action.admin.indices.alias.Alias; import org.elasticsearch.action.admin.indices.alias.Alias;
@ -42,17 +44,25 @@ import org.hamcrest.Matcher;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
import java.util.*; import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ExecutionException; import java.util.concurrent.ExecutionException;
import static org.elasticsearch.common.settings.Settings.settingsBuilder; import static org.elasticsearch.common.settings.Settings.settingsBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder; import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertThrows; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertThrows;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.lessThan;
import static org.hamcrest.Matchers.notNullValue;
import static org.hamcrest.Matchers.nullValue;
@Slow
public class GetTermVectorsTests extends AbstractTermVectorsTests { public class GetTermVectorsIT extends AbstractTermVectorsTestCase {
@Test @Test
public void testNoSuchDoc() throws Exception { public void testNoSuchDoc() throws Exception {

@ -33,20 +33,20 @@ import java.io.IOException;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*; import static org.hamcrest.Matchers.*;
public class MultiTermVectorsTests extends AbstractTermVectorsTests { public class MultiTermVectorsIT extends AbstractTermVectorsTestCase {
@Test @Test
public void testDuelESLucene() throws Exception { public void testDuelESLucene() throws Exception {
AbstractTermVectorsTests.TestFieldSetting[] testFieldSettings = getFieldSettings(); AbstractTermVectorsTestCase.TestFieldSetting[] testFieldSettings = getFieldSettings();
createIndexBasedOnFieldSettings("test", "alias", testFieldSettings); createIndexBasedOnFieldSettings("test", "alias", testFieldSettings);
//we generate as many docs as many shards we have //we generate as many docs as many shards we have
TestDoc[] testDocs = generateTestDocs("test", testFieldSettings); TestDoc[] testDocs = generateTestDocs("test", testFieldSettings);
DirectoryReader directoryReader = indexDocsWithLucene(testDocs); DirectoryReader directoryReader = indexDocsWithLucene(testDocs);
AbstractTermVectorsTests.TestConfig[] testConfigs = generateTestConfigs(20, testDocs, testFieldSettings); AbstractTermVectorsTestCase.TestConfig[] testConfigs = generateTestConfigs(20, testDocs, testFieldSettings);
MultiTermVectorsRequestBuilder requestBuilder = client().prepareMultiTermVectors(); MultiTermVectorsRequestBuilder requestBuilder = client().prepareMultiTermVectors();
for (AbstractTermVectorsTests.TestConfig test : testConfigs) { for (AbstractTermVectorsTestCase.TestConfig test : testConfigs) {
requestBuilder.add(getRequestForConfig(test).request()); requestBuilder.add(getRequestForConfig(test).request());
} }

@ -42,7 +42,7 @@ import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.core.TypeParsers; import org.elasticsearch.index.mapper.core.TypeParsers;
import org.elasticsearch.index.mapper.internal.AllFieldMapper; import org.elasticsearch.index.mapper.internal.AllFieldMapper;
import org.elasticsearch.rest.action.termvectors.RestTermVectorsAction; import org.elasticsearch.rest.action.termvectors.RestTermVectorsAction;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.hamcrest.Matchers; import org.hamcrest.Matchers;
import org.junit.Test; import org.junit.Test;
@ -56,7 +56,7 @@ import java.util.Set;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class TermVectorsUnitTests extends ElasticsearchTestCase { public class TermVectorsUnitTests extends ESTestCase {
@Test @Test
public void streamResponse() throws Exception { public void streamResponse() throws Exception {

@ -28,7 +28,7 @@ import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.index.get.GetResult; import org.elasticsearch.index.get.GetResult;
import org.elasticsearch.script.Script; import org.elasticsearch.script.Script;
import org.elasticsearch.script.ScriptService.ScriptType; import org.elasticsearch.script.ScriptService.ScriptType;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.Map; import java.util.Map;
@ -38,7 +38,7 @@ import static org.hamcrest.Matchers.*;
import static org.hamcrest.Matchers.instanceOf; import static org.hamcrest.Matchers.instanceOf;
import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.is;
public class UpdateRequestTests extends ElasticsearchTestCase { public class UpdateRequestTests extends ESTestCase {
@Test @Test
public void testUpdateRequest() throws Exception { public void testUpdateRequest() throws Exception {

@ -19,7 +19,6 @@
package org.elasticsearch.aliases; package org.elasticsearch.aliases;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.Version; import org.elasticsearch.Version;
import org.elasticsearch.action.ActionRequestValidationException; import org.elasticsearch.action.ActionRequestValidationException;
import org.elasticsearch.action.admin.indices.alias.Alias; import org.elasticsearch.action.admin.indices.alias.Alias;
@ -46,7 +45,7 @@ import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.bucket.global.Global; import org.elasticsearch.search.aggregations.bucket.global.Global;
import org.elasticsearch.search.aggregations.bucket.terms.Terms; import org.elasticsearch.search.aggregations.bucket.terms.Terms;
import org.elasticsearch.search.sort.SortOrder; import org.elasticsearch.search.sort.SortOrder;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.Arrays; import java.util.Arrays;
@ -58,18 +57,31 @@ import java.util.concurrent.TimeUnit;
import static com.google.common.collect.Sets.newHashSet; import static com.google.common.collect.Sets.newHashSet;
import static org.elasticsearch.client.Requests.createIndexRequest; import static org.elasticsearch.client.Requests.createIndexRequest;
import static org.elasticsearch.client.Requests.indexRequest; import static org.elasticsearch.client.Requests.indexRequest;
import static org.elasticsearch.cluster.metadata.IndexMetaData.*; import static org.elasticsearch.cluster.metadata.IndexMetaData.INDEX_METADATA_BLOCK;
import static org.elasticsearch.cluster.metadata.IndexMetaData.INDEX_READ_ONLY_BLOCK;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_BLOCKS_METADATA;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_BLOCKS_READ;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_BLOCKS_WRITE;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_READ_ONLY;
import static org.elasticsearch.common.settings.Settings.settingsBuilder; import static org.elasticsearch.common.settings.Settings.settingsBuilder;
import static org.elasticsearch.index.query.QueryBuilders.*; import static org.elasticsearch.index.query.QueryBuilders.hasChildQuery;
import static org.elasticsearch.index.query.QueryBuilders.hasParentQuery;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.index.query.QueryBuilders.rangeQuery;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
import static org.elasticsearch.test.hamcrest.CollectionAssertions.hasKey; import static org.elasticsearch.test.hamcrest.CollectionAssertions.hasKey;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertBlocked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertSearchResponse;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.lessThan;
import static org.hamcrest.Matchers.notNullValue;
import static org.hamcrest.Matchers.nullValue;
/**
*
*/
@Slow
public class IndexAliasesTests extends ElasticsearchIntegrationTest { public class IndexAliasesIT extends ESIntegTestCase {
@Test @Test
public void testAliases() throws Exception { public void testAliases() throws Exception {

@ -31,7 +31,7 @@ import org.elasticsearch.common.logging.ESLogger;
import org.elasticsearch.common.logging.Loggers; import org.elasticsearch.common.logging.Loggers;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.TimeValue; import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.test.ElasticsearchAllocationTestCase; import org.elasticsearch.test.ESAllocationTestCase;
import java.util.Random; import java.util.Random;
@ -48,9 +48,9 @@ public class ClusterAllocationRerouteBenchmark {
final int numReplicas = 2; final int numReplicas = 2;
final int numberOfNodes = 30; final int numberOfNodes = 30;
final int numberOfTags = 2; final int numberOfTags = 2;
AllocationService strategy = ElasticsearchAllocationTestCase.createAllocationService(Settings.builder() AllocationService strategy = ESAllocationTestCase.createAllocationService(Settings.builder()
.put("cluster.routing.allocation.awareness.attributes", "tag") .put("cluster.routing.allocation.awareness.attributes", "tag")
.build(), new Random(1)); .build(), new Random(1));
MetaData.Builder mb = MetaData.builder(); MetaData.Builder mb = MetaData.builder();
for (int i = 1; i <= numIndices; i++) { for (int i = 1; i <= numIndices; i++) {
@ -64,7 +64,7 @@ public class ClusterAllocationRerouteBenchmark {
RoutingTable routingTable = rb.build(); RoutingTable routingTable = rb.build();
DiscoveryNodes.Builder nb = DiscoveryNodes.builder(); DiscoveryNodes.Builder nb = DiscoveryNodes.builder();
for (int i = 1; i <= numberOfNodes; i++) { for (int i = 1; i <= numberOfNodes; i++) {
nb.put(ElasticsearchAllocationTestCase.newNode("node" + i, numberOfTags == 0 ? ImmutableMap.<String, String>of() : ImmutableMap.of("tag", "tag_" + (i % numberOfTags)))); nb.put(ESAllocationTestCase.newNode("node" + i, numberOfTags == 0 ? ImmutableMap.<String, String>of() : ImmutableMap.of("tag", "tag_" + (i % numberOfTags))));
} }
ClusterState initialClusterState = ClusterState.builder(ClusterName.DEFAULT).metaData(metaData).routingTable(routingTable).nodes(nb).build(); ClusterState initialClusterState = ClusterState.builder(ClusterName.DEFAULT).metaData(metaData).routingTable(routingTable).nodes(nb).build();

@ -27,7 +27,7 @@ import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.index.IndexResponse; import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.cluster.block.ClusterBlockException; import org.elasticsearch.cluster.block.ClusterBlockException;
import org.elasticsearch.cluster.metadata.IndexMetaData; import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.util.HashMap; import java.util.HashMap;
@ -35,8 +35,8 @@ import java.util.HashMap;
import static org.elasticsearch.common.settings.Settings.settingsBuilder; import static org.elasticsearch.common.settings.Settings.settingsBuilder;
import static org.hamcrest.Matchers.notNullValue; import static org.hamcrest.Matchers.notNullValue;
@ElasticsearchIntegrationTest.ClusterScope(scope = ElasticsearchIntegrationTest.Scope.TEST) @ESIntegTestCase.ClusterScope(scope = ESIntegTestCase.Scope.TEST)
public class SimpleBlocksTests extends ElasticsearchIntegrationTest { public class SimpleBlocksIT extends ESIntegTestCase {
@Test @Test
public void verifyIndexAndClusterReadOnly() throws Exception { public void verifyIndexAndClusterReadOnly() throws Exception {

@ -20,12 +20,12 @@
package org.elasticsearch.bootstrap; package org.elasticsearch.bootstrap;
import org.apache.lucene.util.Constants; import org.apache.lucene.util.Constants;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import org.junit.Test; import org.junit.Test;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class JNANativesTests extends ElasticsearchTestCase { public class JNANativesTests extends ESTestCase {
@Test @Test
public void testMlockall() { public void testMlockall() {

@ -20,7 +20,7 @@
package org.elasticsearch.bootstrap; package org.elasticsearch.bootstrap;
import org.elasticsearch.Version; import org.elasticsearch.Version;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import java.io.IOException; import java.io.IOException;
import java.net.URL; import java.net.URL;
@ -33,7 +33,7 @@ import java.util.jar.Manifest;
import java.util.zip.ZipEntry; import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream; import java.util.zip.ZipOutputStream;
public class JarHellTests extends ElasticsearchTestCase { public class JarHellTests extends ESTestCase {
URL makeJar(Path dir, String name, Manifest manifest, String... files) throws IOException { URL makeJar(Path dir, String name, Manifest manifest, String... files) throws IOException {
Path jarpath = dir.resolve(name); Path jarpath = dir.resolve(name);
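JarHellTests above declares a makeJar(Path dir, String name, Manifest manifest, String... files) helper for building throwaway jars. Purely as a hedged, JDK-only sketch of that idea — the JarBuilder class name is invented and this is not the commit's actual implementation — such a helper can write the manifest as META-INF/MANIFEST.MF and add the listed files as empty zip entries:
--------------------------------------------------------------------
import java.io.IOException;
import java.io.OutputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.JarFile;
import java.util.jar.Manifest;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Hedged sketch: "JarBuilder" is an invented name and this is not the
// commit's implementation, only an illustration of building a throwaway
// jar from a manifest plus a list of (empty) entries.
class JarBuilder {
    static URL makeJar(Path dir, String name, Manifest manifest, String... files) throws IOException {
        Path jarpath = dir.resolve(name);
        try (OutputStream os = Files.newOutputStream(jarpath);
             ZipOutputStream out = new ZipOutputStream(os)) {
            if (manifest != null) {
                out.putNextEntry(new ZipEntry(JarFile.MANIFEST_NAME)); // META-INF/MANIFEST.MF
                manifest.write(out);
                out.closeEntry();
            }
            for (String file : files) {
                out.putNextEntry(new ZipEntry(file)); // an empty entry is enough for classpath checks
                out.closeEntry();
            }
        }
        return jarpath.toUri().toURL();
    }
}
--------------------------------------------------------------------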

@ -20,17 +20,20 @@
package org.elasticsearch.bootstrap; package org.elasticsearch.bootstrap;
import org.apache.lucene.util.Constants; import org.apache.lucene.util.Constants;
import org.elasticsearch.common.io.PathUtils;
import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.env.Environment; import org.elasticsearch.env.Environment;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ESTestCase;
import java.io.FilePermission; import java.io.FilePermission;
import java.io.IOException; import java.io.IOException;
import java.nio.file.Files; import java.nio.file.Files;
import java.nio.file.Path; import java.nio.file.Path;
import java.security.PermissionCollection;
import java.security.Permissions; import java.security.Permissions;
import java.util.Set;
public class SecurityTests extends ElasticsearchTestCase { public class SecurityTests extends ESTestCase {
/** test generated permissions */ /** test generated permissions */
public void testGeneratedPermissions() throws Exception { public void testGeneratedPermissions() throws Exception {
@ -53,26 +56,28 @@ public class SecurityTests extends ElasticsearchTestCase {
} }
// the fake es home // the fake es home
assertTrue(permissions.implies(new FilePermission(esHome.toString(), "read"))); assertNoPermissions(esHome, permissions);
// its parent // its parent
assertFalse(permissions.implies(new FilePermission(path.toString(), "read"))); assertNoPermissions(esHome.getParent(), permissions);
// some other sibling // some other sibling
assertFalse(permissions.implies(new FilePermission(path.resolve("other").toString(), "read"))); assertNoPermissions(esHome.getParent().resolve("other"), permissions);
// double check we overwrote java.io.tmpdir correctly for the test // double check we overwrote java.io.tmpdir correctly for the test
assertFalse(permissions.implies(new FilePermission(realTmpDir.toString(), "read"))); assertNoPermissions(PathUtils.get(realTmpDir), permissions);
} }
/** test generated permissions for all configured paths */ /** test generated permissions for all configured paths */
public void testEnvironmentPaths() throws Exception { public void testEnvironmentPaths() throws Exception {
Path path = createTempDir(); Path path = createTempDir();
// make a fake ES home and ensure we only grant permissions to that.
Path esHome = path.resolve("esHome");
Settings.Builder settingsBuilder = Settings.builder(); Settings.Builder settingsBuilder = Settings.builder();
settingsBuilder.put("path.home", path.resolve("home").toString()); settingsBuilder.put("path.home", esHome.resolve("home").toString());
settingsBuilder.put("path.conf", path.resolve("conf").toString()); settingsBuilder.put("path.conf", esHome.resolve("conf").toString());
settingsBuilder.put("path.plugins", path.resolve("plugins").toString()); settingsBuilder.put("path.plugins", esHome.resolve("plugins").toString());
settingsBuilder.putArray("path.data", path.resolve("data1").toString(), path.resolve("data2").toString()); settingsBuilder.putArray("path.data", esHome.resolve("data1").toString(), esHome.resolve("data2").toString());
settingsBuilder.put("path.logs", path.resolve("logs").toString()); settingsBuilder.put("path.logs", esHome.resolve("logs").toString());
settingsBuilder.put("pidfile", path.resolve("test.pid").toString()); settingsBuilder.put("pidfile", esHome.resolve("test.pid").toString());
Settings settings = settingsBuilder.build(); Settings settings = settingsBuilder.build();
Path fakeTmpDir = createTempDir(); Path fakeTmpDir = createTempDir();
@ -87,30 +92,39 @@ public class SecurityTests extends ElasticsearchTestCase {
System.setProperty("java.io.tmpdir", realTmpDir); System.setProperty("java.io.tmpdir", realTmpDir);
} }
// the fake es home
assertNoPermissions(esHome, permissions);
// its parent
assertNoPermissions(esHome.getParent(), permissions);
// some other sibling
assertNoPermissions(esHome.getParent().resolve("other"), permissions);
// double check we overwrote java.io.tmpdir correctly for the test
assertNoPermissions(PathUtils.get(realTmpDir), permissions);
// check that all directories got permissions: // check that all directories got permissions:
// homefile: this is needed unless we break out rules for "lib" dir.
// TODO: make read-only // bin file: ro
assertTrue(permissions.implies(new FilePermission(environment.homeFile().toString(), "read,readlink,write,delete"))); assertExactPermissions(new FilePermission(environment.binFile().toString(), "read,readlink"), permissions);
// config file // lib file: ro
// TODO: make read-only assertExactPermissions(new FilePermission(environment.libFile().toString(), "read,readlink"), permissions);
assertTrue(permissions.implies(new FilePermission(environment.configFile().toString(), "read,readlink,write,delete"))); // config file: ro
// plugins: r/w, TODO: can this be minimized? assertExactPermissions(new FilePermission(environment.configFile().toString(), "read,readlink"), permissions);
assertTrue(permissions.implies(new FilePermission(environment.pluginsFile().toString(), "read,readlink,write,delete"))); // plugins: ro
assertExactPermissions(new FilePermission(environment.pluginsFile().toString(), "read,readlink"), permissions);
// data paths: r/w // data paths: r/w
for (Path dataPath : environment.dataFiles()) { for (Path dataPath : environment.dataFiles()) {
assertTrue(permissions.implies(new FilePermission(dataPath.toString(), "read,readlink,write,delete"))); assertExactPermissions(new FilePermission(dataPath.toString(), "read,readlink,write,delete"), permissions);
} }
for (Path dataPath : environment.dataWithClusterFiles()) { for (Path dataPath : environment.dataWithClusterFiles()) {
assertTrue(permissions.implies(new FilePermission(dataPath.toString(), "read,readlink,write,delete"))); assertExactPermissions(new FilePermission(dataPath.toString(), "read,readlink,write,delete"), permissions);
} }
// logs: r/w // logs: r/w
assertTrue(permissions.implies(new FilePermission(environment.logsFile().toString(), "read,readlink,write,delete"))); assertExactPermissions(new FilePermission(environment.logsFile().toString(), "read,readlink,write,delete"), permissions);
// temp dir: r/w // temp dir: r/w
assertTrue(permissions.implies(new FilePermission(fakeTmpDir.toString(), "read,readlink,write,delete"))); assertExactPermissions(new FilePermission(fakeTmpDir.toString(), "read,readlink,write,delete"), permissions);
// double check we overwrote java.io.tmpdir correctly for the test // PID file: delete only (for the shutdown hook)
assertFalse(permissions.implies(new FilePermission(realTmpDir.toString(), "read"))); assertExactPermissions(new FilePermission(environment.pidFile().toString(), "delete"), permissions);
// PID file: r/w
assertTrue(permissions.implies(new FilePermission(environment.pidFile().toString(), "read,readlink,write,delete")));
} }
public void testEnsureExists() throws IOException { public void testEnsureExists() throws IOException {
@ -226,9 +240,40 @@ public class SecurityTests extends ElasticsearchTestCase {
} }
Permissions permissions = new Permissions(); Permissions permissions = new Permissions();
Security.addPath(permissions, link, "read"); Security.addPath(permissions, link, "read");
assertTrue(permissions.implies(new FilePermission(link.toString(), "read"))); assertExactPermissions(new FilePermission(link.toString(), "read"), permissions);
assertTrue(permissions.implies(new FilePermission(link.resolve("foo").toString(), "read"))); assertExactPermissions(new FilePermission(link.resolve("foo").toString(), "read"), permissions);
assertTrue(permissions.implies(new FilePermission(target.toString(), "read"))); assertExactPermissions(new FilePermission(target.toString(), "read"), permissions);
assertTrue(permissions.implies(new FilePermission(target.resolve("foo").toString(), "read"))); assertExactPermissions(new FilePermission(target.resolve("foo").toString(), "read"), permissions);
}
/**
* checks exact file permissions, meaning those and only those for that path.
*/
static void assertExactPermissions(FilePermission expected, PermissionCollection actual) {
String target = expected.getName(); // see javadocs
Set<String> permissionSet = asSet(expected.getActions().split(","));
boolean read = permissionSet.remove("read");
boolean readlink = permissionSet.remove("readlink");
boolean write = permissionSet.remove("write");
boolean delete = permissionSet.remove("delete");
boolean execute = permissionSet.remove("execute");
assertTrue("unrecognized permission: " + permissionSet, permissionSet.isEmpty());
assertEquals(read, actual.implies(new FilePermission(target, "read")));
assertEquals(readlink, actual.implies(new FilePermission(target, "readlink")));
assertEquals(write, actual.implies(new FilePermission(target, "write")));
assertEquals(delete, actual.implies(new FilePermission(target, "delete")));
assertEquals(execute, actual.implies(new FilePermission(target, "execute")));
}
/**
* checks that this path has no permissions
*/
static void assertNoPermissions(Path path, PermissionCollection actual) {
String target = path.toString();
assertFalse(actual.implies(new FilePermission(target, "read")));
assertFalse(actual.implies(new FilePermission(target, "readlink")));
assertFalse(actual.implies(new FilePermission(target, "write")));
assertFalse(actual.implies(new FilePermission(target, "delete")));
assertFalse(actual.implies(new FilePermission(target, "execute")));
} }
} }
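The SecurityTests changes above replace loose assertTrue/assertFalse checks with assertExactPermissions and assertNoPermissions helpers, which verify that a path is granted exactly the expected FilePermission actions and nothing more. A standalone, hedged sketch of that idea using only JDK security classes (the ExactPermissionCheck class and the paths below are invented for illustration):
--------------------------------------------------------------------
import java.io.FilePermission;
import java.security.PermissionCollection;
import java.security.Permissions;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hedged sketch: class name and paths are invented. It shows the
// "exact permissions" idea behind the new helpers: every expected
// FilePermission action must be implied, every other action must not be.
class ExactPermissionCheck {
    static final String[] ALL_ACTIONS = {"read", "readlink", "write", "delete", "execute"};

    static boolean hasExactly(PermissionCollection granted, String path, String expectedActions) {
        Set<String> expected = new HashSet<>(Arrays.asList(expectedActions.split(",")));
        for (String action : ALL_ACTIONS) {
            boolean implied = granted.implies(new FilePermission(path, action));
            if (implied != expected.contains(action)) {
                return false; // missing an expected action, or granted an extra one
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Permissions perms = new Permissions();
        perms.add(new FilePermission("/tmp/data/-", "read,readlink,write,delete"));
        System.out.println(hasExactly(perms, "/tmp/data/file", "read,readlink,write,delete")); // true
        System.out.println(hasExactly(perms, "/tmp/data/file", "read"));                       // false: extra actions granted
    }
}
--------------------------------------------------------------------
The helpers in the hunk apply the same per-action implies() check, but assert each action individually rather than returning a boolean.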

@ -24,7 +24,7 @@ import org.elasticsearch.action.count.CountResponse;
import org.elasticsearch.action.search.SearchPhaseExecutionException; import org.elasticsearch.action.search.SearchPhaseExecutionException;
import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory; import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@ -35,7 +35,7 @@ import static org.elasticsearch.index.query.QueryBuilders.termQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked; import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
public class BroadcastActionsTests extends ElasticsearchIntegrationTest { public class BroadcastActionsIT extends ESIntegTestCase {
@Override @Override
protected int maximumNumberOfReplicas() { protected int maximumNumberOfReplicas() {

@ -24,8 +24,8 @@ import org.apache.lucene.util.TestUtil;
import org.elasticsearch.Version; import org.elasticsearch.Version;
import org.elasticsearch.action.admin.indices.analyze.AnalyzeResponse; import org.elasticsearch.action.admin.indices.analyze.AnalyzeResponse;
import org.elasticsearch.indices.analysis.PreBuiltAnalyzers; import org.elasticsearch.indices.analysis.PreBuiltAnalyzers;
import org.elasticsearch.test.ElasticsearchBackwardsCompatIntegrationTest; import org.elasticsearch.test.ESBackcompatTestCase;
import org.elasticsearch.test.ElasticsearchIntegrationTest; import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test; import org.junit.Test;
import java.io.IOException; import java.io.IOException;
@@ -38,8 +38,8 @@ import java.util.regex.Pattern;
 import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
 import static org.hamcrest.Matchers.equalTo;
-@ElasticsearchIntegrationTest.ClusterScope(numDataNodes = 0, scope = ElasticsearchIntegrationTest.Scope.SUITE, numClientNodes = 0, transportClientRatio = 0.0)
-public class BasicAnalysisBackwardCompatibilityTests extends ElasticsearchBackwardsCompatIntegrationTest {
+@ESIntegTestCase.ClusterScope(numDataNodes = 0, scope = ESIntegTestCase.Scope.SUITE, numClientNodes = 0, transportClientRatio = 0.0)
+public class BasicAnalysisBackwardCompatibilityIT extends ESBackcompatTestCase {
     // This pattern match characters with Line_Break = Complex_Content.
     final static Pattern complexUnicodeChars = Pattern.compile("[\u17B4\u17B5\u17D3\u17CB-\u17D1\u17DD\u1036\u17C6\u1A74\u1038\u17C7\u0E4E\u0E47-\u0E4D\u0EC8-\u0ECD\uAABF\uAAC1\u1037\u17C8-\u17CA\u1A75-\u1A7C\u1AA8-\u1AAB\uAADE\uAADF\u1AA0-\u1AA6\u1AAC\u1AAD\u109E\u109F\uAA77-\uAA79\u0E46\u0EC6\u17D7\u1AA7\uA9E6\uAA70\uAADD\u19DA\u0E01-\u0E3A\u0E40-\u0E45\u0EDE\u0E81\u0E82\u0E84\u0E87\u0E88\u0EAA\u0E8A\u0EDF\u0E8D\u0E94-\u0E97\u0E99-\u0E9F\u0EA1-\u0EA3\u0EA5\u0EA7\u0EAB\u0EDC\u0EDD\u0EAD-\u0EB9\u0EBB-\u0EBD\u0EC0-\u0EC4\uAA80-\uAABE\uAAC0\uAAC2\uAADB\uAADC\u1000\u1075\u1001\u1076\u1002\u1077\uAA60\uA9E9\u1003\uA9E0\uA9EA\u1004\u105A\u1005\u1078\uAA61\u1006\uA9E1\uAA62\uAA7E\u1007\uAA63\uA9EB\u1079\uAA72\u1008\u105B\uA9E2\uAA64\uA9EC\u1061\uAA7F\u1009\u107A\uAA65\uA9E7\u100A\u100B\uAA66\u100C\uAA67\u100D\uAA68\uA9ED\u100E\uAA69\uA9EE\u100F\u106E\uA9E3\uA9EF\u1010-\u1012\u107B\uA9FB\u1013\uAA6A\uA9FC\u1014\u107C\uAA6B\u105E\u1015\u1016\u107D\u107E\uAA6F\u108E\uA9E8\u1017\u107F\uA9FD\u1018\uA9E4\uA9FE\u1019\u105F\u101A\u103B\u101B\uAA73\uAA7A\u103C\u101C\u1060\u101D\u103D\u1082\u1080\u1050\u1051\u1065\u101E\u103F\uAA6C\u101F\u1081\uAA6D\u103E\uAA6E\uAA71\u1020\uA9FA\u105C\u105D\u106F\u1070\u1066\u1021-\u1026\u1052-\u1055\u1027-\u102A\u102C\u102B\u1083\u1072\u109C\u102D\u1071\u102E\u1033\u102F\u1073\u1074\u1030\u1056-\u1059\u1031\u1084\u1035\u1085\u1032\u109D\u1034\u1062\u1067\u1068\uA9E5\u1086\u1039\u103A\u1063\u1064\u1069-\u106D\u1087\u108B\u1088\u108C\u108D\u1089\u108A\u108F\u109A\u109B\uAA7B-\uAA7D\uAA74-\uAA76\u1780-\u17A2\u17DC\u17A3-\u17B3\u17B6-\u17C5\u17D2\u1950-\u196D\u1970-\u1974\u1980-\u199C\u19DE\u19DF\u199D-\u19AB\u19B0-\u19C9\u1A20-\u1A26\u1A58\u1A59\u1A27-\u1A3B\u1A5A\u1A5B\u1A3C-\u1A46\u1A54\u1A47-\u1A4C\u1A53\u1A6B\u1A55-\u1A57\u1A5C-\u1A5E\u1A4D-\u1A52\u1A61\u1A6C\u1A62-\u1A6A\u1A6E\u1A6F\u1A73\u1A70-\u1A72\u1A6D\u1A60]");
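
The hunks above show the recurring change in this commit: the test base classes are renamed (ElasticsearchIntegrationTest becomes ESIntegTestCase, ElasticsearchBackwardsCompatIntegrationTest becomes ESBackcompatTestCase) and the test classes themselves move from a *Tests suffix to *IT. As a rough illustration of the resulting pattern only — the class and test names below are invented and are not part of this diff — a minimal backwards-compat test might look like this:

---------------------------------------------------------------------------
import org.elasticsearch.test.ESBackcompatTestCase;
import org.elasticsearch.test.ESIntegTestCase;
import org.junit.Test;

// Illustrative sketch only: ExampleBackwardsCompatIT is not part of this change set.
@ESIntegTestCase.ClusterScope(numDataNodes = 0, scope = ESIntegTestCase.Scope.SUITE,
        numClientNodes = 0, transportClientRatio = 0.0)
public class ExampleBackwardsCompatIT extends ESBackcompatTestCase {

    @Test
    public void testIndexAndGetWork() throws Exception {
        // The backcompat base class wires up a cluster that mixes the current
        // version with an older release, so plain client() calls already run
        // against both versions.
        createIndex("test");
        ensureYellow("test");
        client().prepareIndex("test", "doc", "1").setSource("field", "value").get();
        assertTrue(client().prepareGet("test", "doc", "1").get().isExists());
    }
}
---------------------------------------------------------------------------

The *IT suffix presumably follows the usual maven-failsafe integration-test naming convention, keeping these slower cluster tests out of the plain unit-test phase.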


@@ -46,7 +46,6 @@ import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.cluster.routing.IndexRoutingTable;
 import org.elasticsearch.cluster.routing.IndexShardRoutingTable;
 import org.elasticsearch.cluster.routing.ShardRouting;
-import org.elasticsearch.cluster.routing.allocation.decider.EnableAllocationDecider;
 import org.elasticsearch.common.collect.ImmutableOpenMap;
 import org.elasticsearch.common.regex.Regex;
 import org.elasticsearch.common.settings.Settings;
@@ -59,7 +58,7 @@ import org.elasticsearch.index.mapper.internal.FieldNamesFieldMapper;
 import org.elasticsearch.index.query.QueryBuilders;
 import org.elasticsearch.search.SearchHit;
 import org.elasticsearch.search.sort.SortOrder;
-import org.elasticsearch.test.ElasticsearchBackwardsCompatIntegrationTest;
+import org.elasticsearch.test.ESBackcompatTestCase;
 import org.junit.Test;
 import java.io.IOException;
@@ -76,7 +75,7 @@ import static org.hamcrest.Matchers.*;
 /**
  */
-public class BasicBackwardsCompatibilityTest extends ElasticsearchBackwardsCompatIntegrationTest {
+public class BasicBackwardsCompatibilityIT extends ESBackcompatTestCase {
     /**
      * Basic test using Index & Realtime Get with external versioning. This test ensures routing works correctly across versions.
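
The javadoc line above summarises what BasicBackwardsCompatibilityIT exercises: indexing with external versioning and reading the document back with a realtime get on a mixed-version cluster. A hedged sketch of that flow, using invented index, type and field names rather than anything taken from the actual test, could look like:

---------------------------------------------------------------------------
import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.index.VersionType;
import org.elasticsearch.test.ESBackcompatTestCase;
import org.junit.Test;

// Illustrative sketch only: this is not the real BasicBackwardsCompatibilityIT.
public class ExternalVersioningSketchIT extends ESBackcompatTestCase {

    @Test
    public void testExternalVersioningAndRealtimeGet() throws Exception {
        createIndex("test");
        ensureYellow("test");

        // Index with a caller-supplied (external) version.
        client().prepareIndex("test", "doc", "1")
                .setSource("field", "value")
                .setVersionType(VersionType.EXTERNAL)
                .setVersion(42)
                .get();

        // A realtime get should see the document and its version without an explicit refresh.
        GetResponse response = client().prepareGet("test", "doc", "1")
                .setRealtime(true)
                .get();
        assertTrue(response.isExists());
        assertEquals(42, response.getVersion());
    }
}
---------------------------------------------------------------------------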


@@ -29,7 +29,7 @@ import org.elasticsearch.cluster.block.ClusterBlockLevel;
 import org.elasticsearch.cluster.block.ClusterBlocks;
 import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.common.settings.Settings;
-import org.elasticsearch.test.ElasticsearchBackwardsCompatIntegrationTest;
+import org.elasticsearch.test.ESBackcompatTestCase;
 import org.junit.Test;
 import java.util.HashMap;
@@ -38,7 +38,7 @@ import java.util.Map;
 import static org.elasticsearch.cluster.metadata.IndexMetaData.*;
 import static org.hamcrest.Matchers.equalTo;
-public class ClusterStateBackwardsCompatTests extends ElasticsearchBackwardsCompatIntegrationTest {
+public class ClusterStateBackwardsCompatIT extends ESBackcompatTestCase {
     @Test
     public void testClusterState() throws Exception {

Some files were not shown because too many files have changed in this diff.