mirror of https://github.com/honeymoose/OpenSearch.git (synced 2025-03-24 17:09:48 +00:00)
[TEST] Randomized number of shards used for indices created during tests
Introduced two levels of randomization for the number of shards (between 1 and 10) when running tests:

1) through the existing random index template, which now sets a random number of shards that is shared across all the indices created in the same test method, unless overwritten
2) through the `createIndex` and `prepareCreate` methods, similar to what happens using the `indexSettings` method; this value changes for every `createIndex` or `prepareCreate` unless overwritten (and overrides the index template as far as the number of shards is concerned)

Added the following facilities to deal with the random number of shards:

- `getNumShards` retrieves the number of shards of a given existing index, useful for comparisons based on the number of shards so that a static number can be avoided. The method returns an object containing the number of primaries, the number of replicas and the total number of shards of the existing index
- added `assertFailures`, which checks that a shard failure happened during a search request, either partial (the search returns partial results and failures) or total (all shards failed and the search returns an error). It also checks the error code and the error message related to the failure. This is needed because, without knowing the number of shards upfront, simulated errors can lead to either partial or total failures
- added common methods, similar to `indexSettings`, to be used in combination with `createIndex` and `prepareCreate` to explicitly control the second level of randomization: `numberOfShards`, `minimumNumberOfShards` and `maximumNumberOfShards`. Also added `numberOfReplicas`, even though the number of replicas is not randomized (no default is specified, but it can be overwritten by tests)

Tests that specified the number of shards have been reviewed, with the following results:

- removed number_of_shards from node settings; it was ignored anyway, as it would be overwritten by both mechanisms above
- removed the specific number of shards when not needed
- removed manual shards randomization where present, replaced with the ordinary one that's now available
- adapted tests that didn't need a specific number of shards to the new random behaviour
- fixed a couple of test bugs (e.g. the three-level parent/child test could only work on a single shard, as the routing key used for grand-children wasn't correct)
- also did some cleanup: shared code between the shard size facets and aggs tests, and used common methods like `assertAcked`, `ensureGreen`, `refresh`, `flush` and `refreshAndFlush` where possible
- made sure that `indexSettings()` is always used as a basis when using `prepareCreate` to inject specific settings
- converted indexRandom(false, ...) + refresh to indexRandom(true, ...)
This commit is contained in: parent bb63b3fa61, commit d5aaa90f34
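Taken together, the facilities described in the message above are meant to be used roughly as follows. This is a minimal sketch, not code from this commit; the class name, index name, mapping and the exact `assertFailures` signature are illustrative assumptions.

import org.elasticsearch.action.count.CountResponse;
import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import static org.elasticsearch.index.query.QueryBuilders.queryString;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertFailures;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;

public class RandomShardsExampleTests extends ElasticsearchIntegrationTest {

    // Second level of randomization: constrain the random shard count for this test class.
    @Override
    protected int minimumNumberOfShards() {
        return 2;
    }

    @Override
    protected int numberOfReplicas() {
        return 0;
    }

    @Test
    public void testShardCountAwareAssertions() throws Exception {
        // The shard count is picked at random within the configured range.
        assertAcked(prepareCreate("test").addMapping("type1", "future", "type=date"));
        ensureGreen();

        // Compare against the actual (random) shard count instead of a hard-coded number.
        NumShards numShards = getNumShards("test");
        CountResponse countResponse = client().prepareCount("test").get();
        assertThat(countResponse.getTotalShards(), equalTo(numShards.numPrimaries));

        // Without knowing the shard count upfront, a simulated error can surface as a partial
        // or a total failure; assertFailures accepts both (signature assumed for illustration).
        assertFailures(client().prepareSearch("test").setQuery(queryString("future:[now/D TO now+2M/d]")),
                RestStatus.BAD_REQUEST, containsString("unit [D] not supported for date math"));
    }
}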
@ -61,9 +61,15 @@ In case you only need to execute a unit test, because your implementation can be

[[integration-tests]]
=== integration tests

These kind of tests require firing up a whole cluster of nodes, before the tests can actually be run. Compared to unit tests they are obviously way more time consuming, but the test infrastructure tries to minimize the time cost by only restarting the whole cluster, if this is configured explicitely.
These kind of tests require firing up a whole cluster of nodes, before the tests can actually be run. Compared to unit tests they are obviously way more time consuming, but the test infrastructure tries to minimize the time cost by only restarting the whole cluster, if this is configured explicitly.

The class your tests have to inherit from is `ElasticsearchIntegrationTest`. As soon as you inherit, there is no need for you to start any elasticsearch nodes manually in your test anymore, though you might need to ensure that at least a certain amount of nodes is up running.
The class your tests have to inherit from is `ElasticsearchIntegrationTest`. As soon as you inherit, there is no need for you to start any elasticsearch nodes manually in your test anymore, though you might need to ensure that at least a certain amount of nodes is up and running.

[[number-of-shards]]
==== number of shards

The number of shards used for indices created during integration tests is randomized between `1` and `10` unless overwritten upon index creation via index settings.
Rule of thumb is not to specify the number of shards unless needed, so that each test will use a different one all the time.
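For the rare test that genuinely depends on a concrete shard count, the value can still be pinned at index creation time on top of the shared index settings — an illustrative sketch (index name and values are assumptions, not part of this commit):

[source,java]
----
// Most tests should simply call createIndex("test") and accept the randomized value.
// When a fixed shard count is unavoidable, set it explicitly on top of indexSettings():
assertAcked(prepareCreate("test")
        .setSettings(ImmutableSettings.settingsBuilder()
                .put(indexSettings())
                .put(SETTING_NUMBER_OF_SHARDS, 1)
                .put(SETTING_NUMBER_OF_REPLICAS, 0)));
ensureGreen();
----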
[[helper-methods]]
==== generic helper methods
@ -197,7 +203,7 @@ As many elasticsearch tests are checking for a similar output, like the amount o

[horizontal]
`assertHitCount()`:: Checks hit count of a search or count request
`assertAcked()`:: Ensure the a request has been ackknowledged by the master
`assertAcked()`:: Ensure the a request has been acknowledged by the master
`assertSearchHits()`:: Asserts a search response contains specific ids
`assertMatchCount()`:: Asserts a matching count from a percolation response
`assertFirstHit()`:: Asserts the first hit hits the specified matcher
@ -205,6 +211,7 @@ As many elasticsearch tests are checking for a similar output, like the amount o
`assertThirdHit()`:: Asserts the third hits hits the specified matcher
`assertSearchHit()`:: Assert a certain element in a search response hits the specified matcher
`assertNoFailures()`:: Asserts that no shard failures have occured in the response
`assertFailures()`:: Asserts that shard failures have happened during a search request
`assertHighlight()`:: Assert specific highlights matched
`assertSuggestion()`:: Assert for specific suggestions
`assertSuggestionSize()`:: Assert for specific suggestion count
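An illustrative combination of a few of these helpers in a test body (the index, mapping and query are assumptions, not part of this commit):

[source,java]
----
assertAcked(prepareCreate("test").addMapping("type1", "field1", "type=string"));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource("field1", "value1").setRefresh(true).get();

SearchResponse searchResponse = client().prepareSearch("test")
        .setQuery(QueryBuilders.matchQuery("field1", "value1")).get();
assertNoFailures(searchResponse);
assertHitCount(searchResponse, 1l);
assertSearchHits(searchResponse, "1");
----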
@ -48,6 +48,7 @@ import java.io.Reader;
import java.util.*;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo;
public abstract class AbstractTermVectorTests extends ElasticsearchIntegrationTest {
@ -180,21 +181,18 @@ public abstract class AbstractTermVectorTests extends ElasticsearchIntegrationTe
}
}
protected void createIndexBasedOnFieldSettings(TestFieldSetting[] fieldSettings, int number_of_shards) throws IOException {
wipeIndices("test");
protected void createIndexBasedOnFieldSettings(String index, TestFieldSetting[] fieldSettings) throws IOException {
XContentBuilder mappingBuilder = jsonBuilder();
mappingBuilder.startObject().startObject("type1").startObject("properties");
for (TestFieldSetting field : fieldSettings) {
field.addToMappings(mappingBuilder);
}
mappingBuilder.endObject().endObject().endObject();
ImmutableSettings.Builder settings = ImmutableSettings.settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.tv_test.tokenizer", "standard")
.putArray("index.analysis.analyzer.tv_test.filter", "type_as_payload", "lowercase");
if (number_of_shards > 0) {
settings.put("number_of_shards", number_of_shards);
}
mappingBuilder.endObject().endObject().endObject();
prepareCreate("test").addMapping("type1", mappingBuilder).setSettings(settings).get();
assertAcked(prepareCreate(index).addMapping("type1", mappingBuilder).setSettings(settings));
ensureYellow();
}
@ -27,20 +27,39 @@ import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.common.io.BytesStream;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
import org.hamcrest.Matchers;
import org.junit.Test;
import java.io.IOException;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo;
public class GetTermVectorCheckDocFreqTests extends ElasticsearchIntegrationTest {
@Override
protected int numberOfShards() {
return 1;
}
@Override
protected int numberOfReplicas() {
return 0;
}
@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put(super.indexSettings())
.put("index.analysis.analyzer.tv_test.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.tv_test.filter", "type_as_payload", "lowercase")
.build();
}
@Test
public void testSimpleTermVectors() throws ElasticsearchException, IOException {
@ -53,12 +72,7 @@ public class GetTermVectorCheckDocFreqTests extends ElasticsearchIntegrationTest
.endObject()
.endObject()
.endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1)
.put("index.analysis.analyzer.tv_test.tokenizer", "whitespace")
.put("index.number_of_replicas", 0)
.putArray("index.analysis.analyzer.tv_test.filter", "type_as_payload", "lowercase")));
assertAcked(prepareCreate("test").addMapping("type1", mapping));
ensureGreen();
int numDocs = 15;
for (int i = 0; i < numDocs; i++) {
@ -242,9 +256,8 @@ public class GetTermVectorCheckDocFreqTests extends ElasticsearchIntegrationTest
}
assertThat(iterator.next(), Matchers.nullValue());
XContentBuilder xBuilder = new XContentFactory().jsonBuilder();
response.toXContent(xBuilder, null);
XContentBuilder xBuilder = XContentFactory.jsonBuilder();
response.toXContent(xBuilder, ToXContent.EMPTY_PARAMS);
BytesStream bytesStream = xBuilder.bytesStream();
String utf8 = bytesStream.bytes().toUtf8();
String expectedString = "{\"_index\":\"test\",\"_type\":\"type1\",\"_id\":\""
@ -26,11 +26,9 @@ import org.apache.lucene.index.*;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.ActionFuture;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.mapper.core.AbstractFieldMapper;
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
import org.hamcrest.Matchers;
import org.junit.Test;
@ -40,13 +38,13 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertThrows;
import static org.hamcrest.Matchers.equalTo;
public class GetTermVectorTests extends AbstractTermVectorTests {
@Test
public void testNoSuchDoc() throws Exception {
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
@ -57,7 +55,7 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
.endObject()
.endObject()
.endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping));
assertAcked(prepareCreate("test").addMapping("type1", mapping));
ensureYellow();
@ -83,7 +81,7 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
.endObject()
.endObject()
.endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping));
assertAcked(prepareCreate("test").addMapping("type1", mapping));
ensureYellow();
// when indexing a field that simply has a question mark, the term
@ -111,7 +109,7 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
.endObject()
.endObject()
.endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping));
assertAcked(prepareCreate("test").addMapping("type1", mapping));
ensureYellow();
// when indexing a field that simply has a question mark, the term
// vectors will be null
@ -139,8 +137,9 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
.endObject()
.endObject()
.endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping)
.setSettings(ImmutableSettings.settingsBuilder()
assertAcked(prepareCreate("test").addMapping("type1", mapping)
.setSettings(settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.tv_test.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.tv_test.filter", "type_as_payload", "lowercase")));
ensureYellow();
@ -251,8 +250,8 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
.endObject()
.endObject()
.endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping)
.setSettings(ImmutableSettings.settingsBuilder()
assertAcked(prepareCreate("test").addMapping("type1", mapping)
.setSettings(settingsBuilder()
.put("index.analysis.analyzer.tv_test.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.tv_test.filter", "type_as_payload", "lowercase")));
ensureYellow();
@ -355,8 +354,9 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
@Test
public void testDuelESLucene() throws Exception {
TestFieldSetting[] testFieldSettings = getFieldSettings();
createIndexBasedOnFieldSettings(testFieldSettings, -1);
TestDoc[] testDocs = generateTestDocs(5, testFieldSettings);
createIndexBasedOnFieldSettings("test", testFieldSettings);
//we generate as many docs as many shards we have
TestDoc[] testDocs = generateTestDocs(getNumShards("test").numPrimaries, testFieldSettings);
DirectoryReader directoryReader = indexDocsWithLucene(testDocs);
TestConfig[] testConfigs = generateTestConfigs(20, testDocs, testFieldSettings);
@ -401,8 +401,10 @@ public class GetTermVectorTests extends AbstractTermVectorTests {
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("field").field("type", "string").field("term_vector", "with_positions_offsets_payloads")
.field("analyzer", "payload_test").endObject().endObject().endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
ImmutableSettings.settingsBuilder().put("index.analysis.analyzer.payload_test.tokenizer", "whitespace")
assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.payload_test.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.payload_test.filter", "my_delimited_payload_filter")
.put("index.analysis.filter.my_delimited_payload_filter.delimiter", delimiter)
.put("index.analysis.filter.my_delimited_payload_filter.encoding", encodingString)
@ -30,8 +30,9 @@ public class MultiTermVectorsTests extends AbstractTermVectorTests {
@Test
public void testDuelESLucene() throws Exception {
AbstractTermVectorTests.TestFieldSetting[] testFieldSettings = getFieldSettings();
createIndexBasedOnFieldSettings(testFieldSettings, -1);
AbstractTermVectorTests.TestDoc[] testDocs = generateTestDocs(5, testFieldSettings);
createIndexBasedOnFieldSettings("test", testFieldSettings);
//we generate as many docs as many shards we have
TestDoc[] testDocs = generateTestDocs(getNumShards("test").numPrimaries, testFieldSettings);
DirectoryReader directoryReader = indexDocsWithLucene(testDocs);
AbstractTermVectorTests.TestConfig[] testConfigs = generateTestConfigs(20, testDocs, testFieldSettings);
@ -62,6 +63,7 @@ public class MultiTermVectorsTests extends AbstractTermVectorTests {
}
@Test
public void testMissingIndexThrowsMissingIndex() throws Exception {
TermVectorRequestBuilder requestBuilder = client().prepareTermVector("testX", "typeX", Integer.toString(1));
MultiTermVectorsRequestBuilder mtvBuilder = new MultiTermVectorsRequestBuilder(client());
@ -33,8 +33,6 @@ import org.elasticsearch.cluster.metadata.AliasAction;
import org.elasticsearch.cluster.metadata.AliasMetaData;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.StopWatch;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.index.query.FilterBuilder;
import org.elasticsearch.index.query.FilterBuilders;
@ -550,16 +548,13 @@ public class IndexAliasesTests extends ElasticsearchIntegrationTest {
@Test
public void testIndicesGetAliases() throws Exception {
Settings indexSettings = ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1)
.put("index.number_of_replicas", 0)
.build();
logger.info("--> creating indices [foobar, test, test123, foobarbaz, bazbar]");
assertAcked(prepareCreate("foobar").setSettings(indexSettings));
assertAcked(prepareCreate("test").setSettings(indexSettings));
assertAcked(prepareCreate("test123").setSettings(indexSettings));
assertAcked(prepareCreate("foobarbaz").setSettings(indexSettings));
assertAcked(prepareCreate("bazbar").setSettings(indexSettings));
createIndex("foobar");
createIndex("test");
createIndex("test123");
createIndex("foobarbaz");
createIndex("bazbar");
ensureGreen();
@ -46,6 +46,8 @@ public class BroadcastActionsTests extends ElasticsearchIntegrationTest {
public void testBroadcastOperations() throws IOException {
prepareCreate("test", 1).execute().actionGet(5000);
NumShards numShards = getNumShards("test");
logger.info("Running Cluster Health");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus()).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
@ -65,8 +67,8 @@ public class BroadcastActionsTests extends ElasticsearchIntegrationTest {
.setQuery(termQuery("_type", "type1"))
.setOperationThreading(BroadcastOperationThreading.NO_THREADS).get();
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getTotalShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getTotalShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
}
@ -75,8 +77,8 @@ public class BroadcastActionsTests extends ElasticsearchIntegrationTest {
.setQuery(termQuery("_type", "type1"))
.setOperationThreading(BroadcastOperationThreading.SINGLE_THREAD).get();
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getTotalShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getTotalShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
}
@ -85,8 +87,8 @@ public class BroadcastActionsTests extends ElasticsearchIntegrationTest {
.setQuery(termQuery("_type", "type1"))
.setOperationThreading(BroadcastOperationThreading.THREAD_PER_SHARD).get();
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getTotalShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getTotalShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
}
@ -95,9 +97,9 @@ public class BroadcastActionsTests extends ElasticsearchIntegrationTest {
CountResponse countResponse = client().count(countRequest("test").source("{ term : { _type : \"type1 } }".getBytes(Charsets.UTF_8))).actionGet();
assertThat(countResponse.getCount(), equalTo(0l));
assertThat(countResponse.getTotalShards(), equalTo(5));
assertThat(countResponse.getTotalShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getSuccessfulShards(), equalTo(0));
assertThat(countResponse.getFailedShards(), equalTo(5));
assertThat(countResponse.getFailedShards(), equalTo(numShards.numPrimaries));
for (ShardOperationFailedException exp : countResponse.getShardFailures()) {
assertThat(exp.reason(), containsString("QueryParsingException"));
}
@ -50,7 +50,6 @@ public class MinimumMasterNodesTests extends ElasticsearchIntegrationTest {
.put("discovery.zen.ping_timeout", "200ms")
.put("discovery.initial_state_timeout", "500ms")
.put("gateway.type", "local")
.put("index.number_of_shards", 1)
.build();
logger.info("--> start first node");
@ -76,16 +75,16 @@ public class MinimumMasterNodesTests extends ElasticsearchIntegrationTest {
assertThat(state.metaData().indices().containsKey("test"), equalTo(false));
createIndex("test");
NumShards numShards = getNumShards("test");
logger.info("--> indexing some data");
for (int i = 0; i < 100; i++) {
client().prepareIndex("test", "type1", Integer.toString(i)).setSource("field", "value").execute().actionGet();
}
// make sure that all shards recovered before trying to flush
assertThat(client().admin().cluster().prepareHealth("test").setWaitForActiveShards(2).execute().actionGet().getActiveShards(), equalTo(2));
assertThat(client().admin().cluster().prepareHealth("test").setWaitForActiveShards(numShards.totalNumShards).execute().actionGet().getActiveShards(), equalTo(numShards.totalNumShards));
// flush for simpler debugging
client().admin().indices().prepareFlush().execute().actionGet();
flushAndRefresh();
client().admin().indices().prepareRefresh().execute().actionGet();
logger.info("--> verify we the data back");
for (int i = 0; i < 10; i++) {
assertThat(client().prepareCount().setQuery(QueryBuilders.matchAllQuery()).execute().actionGet().getCount(), equalTo(100l));
@ -151,10 +150,7 @@ public class MinimumMasterNodesTests extends ElasticsearchIntegrationTest {
assertThat(state.metaData().indices().containsKey("test"), equalTo(true));
logger.info("Running Cluster Health");
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus()).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
ensureGreen();
logger.info("--> verify we the data back");
for (int i = 0; i < 10; i++) {
@ -208,12 +204,13 @@ public class MinimumMasterNodesTests extends ElasticsearchIntegrationTest {
assertThat(state.nodes().size(), equalTo(4));
createIndex("test");
NumShards numShards = getNumShards("test");
logger.info("--> indexing some data");
for (int i = 0; i < 100; i++) {
client().prepareIndex("test", "type1", Integer.toString(i)).setSource("field", "value").execute().actionGet();
}
// make sure that all shards recovered before trying to flush
assertThat(client().admin().cluster().prepareHealth("test").setWaitForActiveShards(10).execute().actionGet().isTimedOut(), equalTo(false));
assertThat(client().admin().cluster().prepareHealth("test").setWaitForActiveShards(numShards.totalNumShards).execute().actionGet().isTimedOut(), equalTo(false));
// flush for simpler debugging
client().admin().indices().prepareFlush().execute().actionGet();
@ -235,7 +232,7 @@ public class MinimumMasterNodesTests extends ElasticsearchIntegrationTest {
ClusterState state = client.admin().cluster().prepareState().setLocal(true).execute().actionGet().getState();
success &= state.blocks().hasGlobalBlock(Discovery.NO_MASTER_BLOCK);
if (logger.isDebugEnabled()) {
logger.debug("Checking for NO_MASTER_BLOCL on client: {} NO_MASTER_BLOCK: [{}]", client, state.blocks().hasGlobalBlock(Discovery.NO_MASTER_BLOCK));
logger.debug("Checking for NO_MASTER_BLOCK on client: {} NO_MASTER_BLOCK: [{}]", client, state.blocks().hasGlobalBlock(Discovery.NO_MASTER_BLOCK));
}
}
return success;
@ -51,7 +51,6 @@ public class NoMasterNodeTests extends ElasticsearchIntegrationTest {
.put("discovery.zen.minimum_master_nodes", 2)
.put("discovery.zen.ping_timeout", "200ms")
.put("discovery.initial_state_timeout", "500ms")
.put("index.number_of_shards", 1)
.build();
TimeValue timeout = TimeValue.timeValueMillis(200);
@ -67,9 +67,9 @@ public class SpecificMasterNodesTests extends ElasticsearchIntegrationTest {
}
logger.info("--> start master node");
final String nextMasterEligableNodeName = cluster().startNode(settingsBuilder().put("node.data", false).put("node.master", true));
assertThat(cluster().nonMasterClient().admin().cluster().prepareState().execute().actionGet().getState().nodes().masterNode().name(), equalTo(nextMasterEligableNodeName));
assertThat(cluster().masterClient().admin().cluster().prepareState().execute().actionGet().getState().nodes().masterNode().name(), equalTo(nextMasterEligableNodeName));
final String nextMasterEligibleNodeName = cluster().startNode(settingsBuilder().put("node.data", false).put("node.master", true));
assertThat(cluster().nonMasterClient().admin().cluster().prepareState().execute().actionGet().getState().nodes().masterNode().name(), equalTo(nextMasterEligibleNodeName));
assertThat(cluster().masterClient().admin().cluster().prepareState().execute().actionGet().getState().nodes().masterNode().name(), equalTo(nextMasterEligibleNodeName));
}
@Test
@ -111,7 +111,7 @@ public class SpecificMasterNodesTests extends ElasticsearchIntegrationTest {
logger.info("--> start data node / non master node");
cluster().startNode(settingsBuilder().put("node.data", true).put("node.master", false));
assertAcked(client().admin().indices().prepareCreate("test").setSettings("number_of_shards", 1).get());
createIndex("test");
assertAcked(client().admin().indices().preparePutMapping("test").setType("_default_").setSource("_timestamp", "enabled=true"));
MappingMetaData defaultMapping = client().admin().cluster().prepareState().get().getState().getMetaData().getIndices().get("test").getMappings().get("_default_");
@ -41,16 +41,17 @@ public class UpdateSettingsValidationTests extends ElasticsearchIntegrationTest
String node_1 = cluster().startNode(settingsBuilder().put("node.master", false).build());
String node_2 = cluster().startNode(settingsBuilder().put("node.master", false).build());
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder().put("index.number_of_shards", 5).put("index.number_of_replicas", 1)).execute().actionGet();
createIndex("test");
NumShards test = getNumShards("test");
ClusterHealthResponse healthResponse = client().admin().cluster().prepareHealth("test").setWaitForEvents(Priority.LANGUID).setWaitForNodes("3").setWaitForGreenStatus().execute().actionGet();
assertThat(healthResponse.isTimedOut(), equalTo(false));
assertThat(healthResponse.getIndices().get("test").getActiveShards(), equalTo(10));
assertThat(healthResponse.getIndices().get("test").getActiveShards(), equalTo(test.totalNumShards));
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("index.number_of_replicas", 0)).execute().actionGet();
healthResponse = client().admin().cluster().prepareHealth("test").setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
assertThat(healthResponse.isTimedOut(), equalTo(false));
assertThat(healthResponse.getIndices().get("test").getActiveShards(), equalTo(5));
assertThat(healthResponse.getIndices().get("test").getActiveShards(), equalTo(test.numPrimaries));
try {
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("index.refresh_interval", "")).execute().actionGet();
@ -27,6 +27,7 @@ import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.routing.IndexRoutingTable;
import org.elasticsearch.cluster.routing.IndexShardRoutingTable;
import org.elasticsearch.cluster.routing.ShardRouting;
import org.elasticsearch.cluster.routing.allocation.decider.ThrottlingAllocationDecider;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.discovery.DiscoverySettings;
@ -46,22 +47,34 @@ public class AckClusterUpdateSettingsTests extends ElasticsearchIntegrationTest
protected Settings nodeSettings(int nodeOrdinal) {
//to test that the acknowledgement mechanism is working we better disable the wait for publish
//otherwise the operation is most likely acknowledged even if it doesn't support ack
return ImmutableSettings.builder().put(DiscoverySettings.PUBLISH_TIMEOUT, 0).build();
return ImmutableSettings.builder()
.put(DiscoverySettings.PUBLISH_TIMEOUT, 0)
//make sure that enough concurrent reroutes can happen at the same time
//we have a minimum of 2 nodes, and a maximum of 10 shards, thus 5 should be enough
.put(ThrottlingAllocationDecider.CLUSTER_ROUTING_ALLOCATION_NODE_CONCURRENT_RECOVERIES, 5)
.build();
}
@Override
protected int minimumNumberOfShards() {
return cluster().size();
}
@Override
protected int numberOfReplicas() {
return 0;
}
@Test
public void testClusterUpdateSettingsAcknowledgement() {
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder()
.put("number_of_shards", atLeast(cluster().size()))
.put("number_of_replicas", 0)).get();
createIndex("test");
ensureGreen();
NodesInfoResponse nodesInfo = client().admin().cluster().prepareNodesInfo().get();
String excludedNodeId = null;
for (NodeInfo nodeInfo : nodesInfo) {
if (nodeInfo.getNode().isDataNode()) {
excludedNodeId = nodesInfo.getAt(0).getNode().id();
excludedNodeId = nodeInfo.getNode().id();
break;
}
}
@ -80,25 +93,20 @@ public class AckClusterUpdateSettingsTests extends ElasticsearchIntegrationTest
for (ShardRouting shardRouting : indexShardRoutingTable) {
if (clusterState.nodes().get(shardRouting.currentNodeId()).id().equals(excludedNodeId)) {
//if the shard is still there it must be relocating and all nodes need to know, since the request was acknowledged
//reroute happens as part of the update settings and we made sure no throttling comes into the picture via settings
assertThat(shardRouting.relocating(), equalTo(true));
}
}
}
}
}
//let's wait for the relocation to be completed, otherwise there can be issues with after test checks (mock directory wrapper etc.)
waitForRelocation();
//removes the allocation exclude settings
client().admin().cluster().prepareUpdateSettings().setTransientSettings(settingsBuilder().put("cluster.routing.allocation.exclude._id", "")).get();
}
@Test
public void testClusterUpdateSettingsNoAcknowledgement() {
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder()
.put("number_of_shards", atLeast(cluster().size()))
.put("number_of_shards", between(cluster().size(), DEFAULT_MAX_NUM_SHARDS))
.put("number_of_replicas", 0)).get();
ensureGreen();
@ -106,7 +114,7 @@ public class AckClusterUpdateSettingsTests extends ElasticsearchIntegrationTest
String excludedNodeId = null;
for (NodeInfo nodeInfo : nodesInfo) {
if (nodeInfo.getNode().isDataNode()) {
excludedNodeId = nodesInfo.getAt(0).getNode().id();
excludedNodeId = nodeInfo.getNode().id();
break;
}
}
@ -116,12 +124,6 @@ public class AckClusterUpdateSettingsTests extends ElasticsearchIntegrationTest
.setTransientSettings(settingsBuilder().put("cluster.routing.allocation.exclude._id", excludedNodeId)).get();
assertThat(clusterUpdateSettingsResponse.isAcknowledged(), equalTo(false));
assertThat(clusterUpdateSettingsResponse.getTransientSettings().get("cluster.routing.allocation.exclude._id"), equalTo(excludedNodeId));
//let's wait for the relocation to be completed, otherwise there can be issues with after test checks (mock directory wrapper etc.)
waitForRelocation();
//removes the allocation exclude settings
client().admin().cluster().prepareUpdateSettings().setTransientSettings(settingsBuilder().put("cluster.routing.allocation.exclude._id", "")).get();
}
private static ClusterState getLocalClusterState(Client client) {
@ -49,6 +49,7 @@ import org.elasticsearch.search.warmer.IndexWarmersMetaData;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;
import static org.elasticsearch.cluster.metadata.IndexMetaData.*;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope.SUITE;
@ -176,13 +177,13 @@ public class AckTests extends ElasticsearchIntegrationTest {
@Test
public void testClusterRerouteAcknowledgement() throws InterruptedException {
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder()
.put("number_of_shards", atLeast(cluster().size()))
.put("number_of_replicas", 0)).get();
assertAcked(prepareCreate("test").setSettings(ImmutableSettings.builder()
.put(indexSettings())
.put(SETTING_NUMBER_OF_SHARDS, between(cluster().size(), DEFAULT_MAX_NUM_SHARDS))
.put(SETTING_NUMBER_OF_REPLICAS, 0)
));
ensureGreen();
MoveAllocationCommand moveAllocationCommand = getAllocationCommand();
assertAcked(client().admin().cluster().prepareReroute().add(moveAllocationCommand));
@ -207,16 +208,14 @@ public class AckTests extends ElasticsearchIntegrationTest {
}
assertThat(found, equalTo(true));
}
//let's wait for the relocation to be completed, otherwise there can be issues with after test checks (mock directory wrapper etc.)
waitForRelocation();
}
@Test
public void testClusterRerouteNoAcknowledgement() throws InterruptedException {
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder()
.put("number_of_shards", atLeast(cluster().size()))
.put("number_of_replicas", 0)).get();
.put(SETTING_NUMBER_OF_SHARDS, between(cluster().size(), DEFAULT_MAX_NUM_SHARDS))
.put(SETTING_NUMBER_OF_REPLICAS, 0)).get();
ensureGreen();
MoveAllocationCommand moveAllocationCommand = getAllocationCommand();
@ -229,8 +228,8 @@ public class AckTests extends ElasticsearchIntegrationTest {
public void testClusterRerouteAcknowledgementDryRun() throws InterruptedException {
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder()
.put("number_of_shards", atLeast(cluster().size()))
.put("number_of_replicas", 0)).get();
.put(SETTING_NUMBER_OF_SHARDS, between(cluster().size(), DEFAULT_MAX_NUM_SHARDS))
.put(SETTING_NUMBER_OF_REPLICAS, 0)).get();
ensureGreen();
MoveAllocationCommand moveAllocationCommand = getAllocationCommand();
@ -262,8 +261,8 @@ public class AckTests extends ElasticsearchIntegrationTest {
public void testClusterRerouteNoAcknowledgementDryRun() throws InterruptedException {
client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder()
.put("number_of_shards", atLeast(cluster().size()))
.put("number_of_replicas", 0)).get();
.put(SETTING_NUMBER_OF_SHARDS, between(cluster().size(), DEFAULT_MAX_NUM_SHARDS))
.put(SETTING_NUMBER_OF_REPLICAS, 0)).get();
ensureGreen();
MoveAllocationCommand moveAllocationCommand = getAllocationCommand();
@ -334,7 +333,7 @@ public class AckTests extends ElasticsearchIntegrationTest {
for (Client client : clients()) {
IndexMetaData indexMetaData = getLocalClusterState(client).metaData().indices().get("test");
assertThat(indexMetaData.getState(), equalTo(IndexMetaData.State.CLOSE));
assertThat(indexMetaData.getState(), equalTo(State.CLOSE));
}
}
@ -358,7 +357,7 @@ public class AckTests extends ElasticsearchIntegrationTest {
for (Client client : clients()) {
IndexMetaData indexMetaData = getLocalClusterState(client).metaData().indices().get("test");
assertThat(indexMetaData.getState(), equalTo(IndexMetaData.State.OPEN));
assertThat(indexMetaData.getState(), equalTo(State.OPEN));
}
}
@ -20,6 +20,7 @@
package org.elasticsearch.cluster.allocation;
import com.carrotsearch.hppc.ObjectIntOpenHashMap;
import com.google.common.base.Predicate;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
import org.elasticsearch.cluster.ClusterState;
@ -36,6 +37,8 @@ import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
import org.junit.Test;
import java.util.concurrent.TimeUnit;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.hamcrest.Matchers.anyOf;
import static org.hamcrest.Matchers.equalTo;
@ -62,35 +65,42 @@ public class AwarenessAllocationTests extends ElasticsearchIntegrationTest {
createIndex("test1");
createIndex("test2");
NumShards test1 = getNumShards("test1");
NumShards test2 = getNumShards("test2");
//no replicas will be allocated as both indices end up on a single node
final int totalPrimaries = test1.numPrimaries + test2.numPrimaries;
ClusterHealthResponse health = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
assertThat(health.isTimedOut(), equalTo(false));
logger.info("--> starting 1 node on a different rack");
String node3 = cluster().startNode(ImmutableSettings.settingsBuilder().put(commonSettings).put("node.rack_id", "rack_2").build());
final String node3 = cluster().startNode(ImmutableSettings.settingsBuilder().put(commonSettings).put("node.rack_id", "rack_2").build());
long start = System.currentTimeMillis();
ObjectIntOpenHashMap<String> counts;
// On slow machines the initial relocation might be delayed
do {
Thread.sleep(100);
logger.info("--> waiting for no relocation");
health = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes("3").setWaitForRelocatingShards(0).execute().actionGet();
assertThat(health.isTimedOut(), equalTo(false));
assertThat(awaitBusy(new Predicate<Object>() {
@Override
public boolean apply(Object input) {
logger.info("--> checking current state");
ClusterState clusterState = client().admin().cluster().prepareState().execute().actionGet().getState();
//System.out.println(clusterState.routingTable().prettyPrint());
// verify that we have 10 shards on node3
counts = new ObjectIntOpenHashMap<String>();
for (IndexRoutingTable indexRoutingTable : clusterState.routingTable()) {
for (IndexShardRoutingTable indexShardRoutingTable : indexRoutingTable) {
for (ShardRouting shardRouting : indexShardRoutingTable) {
counts.addTo(clusterState.nodes().get(shardRouting.currentNodeId()).name(), 1);
logger.info("--> waiting for no relocation");
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes("3").setWaitForRelocatingShards(0).get();
if (clusterHealth.isTimedOut()) {
return false;
}
logger.info("--> checking current state");
ClusterState clusterState = client().admin().cluster().prepareState().execute().actionGet().getState();
// verify that we have all the primaries on node3
ObjectIntOpenHashMap<String> counts = new ObjectIntOpenHashMap<String>();
for (IndexRoutingTable indexRoutingTable : clusterState.routingTable()) {
for (IndexShardRoutingTable indexShardRoutingTable : indexRoutingTable) {
for (ShardRouting shardRouting : indexShardRoutingTable) {
counts.addTo(clusterState.nodes().get(shardRouting.currentNodeId()).name(), 1);
}
}
}
return counts.get(node3) == totalPrimaries;
}
} while (counts.get(node3) != 10 && (System.currentTimeMillis() - start) < 10000);
assertThat(counts.get(node3), equalTo(10));
}, 10, TimeUnit.SECONDS), equalTo(true));
}
@Test
@ -20,35 +20,44 @@ package org.elasticsearch.cluster.allocation;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.routing.RoutingNode;
import org.elasticsearch.common.Priority;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_REPLICAS;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo;
public class SimpleAllocationTests extends ElasticsearchIntegrationTest {
@Override
protected int numberOfShards() {
return 3;
}
@Override
protected int numberOfReplicas() {
return 1;
}
/**
* Test for
* https://groups.google.com/d/msg/elasticsearch/y-SY_HyoB-8/EZdfNt9VO44J
*/
@Test
public void testSaneAllocation() {
prepareCreate("test", 3,
settingsBuilder().put("index.number_of_shards", 3)
.put("index.number_of_replicas", 1))
.execute().actionGet();
ensureGreen();
ClusterState state = client().admin().cluster().prepareState().execute().actionGet().getState();
assertAcked(prepareCreate("test", 3));
ensureGreen();
ClusterState state = client().admin().cluster().prepareState().execute().actionGet().getState();
assertThat(state.routingNodes().unassigned().size(), equalTo(0));
for (RoutingNode node : state.routingNodes()) {
if (!node.isEmpty()) {
assertThat(node.size(), equalTo(2));
}
}
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("index.number_of_replicas", 0)).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put(SETTING_NUMBER_OF_REPLICAS, 0)).execute().actionGet();
ensureGreen();
state = client().admin().cluster().prepareState().execute().actionGet().getState();
assertThat(state.routingNodes().unassigned().size(), equalTo(0));
@ -57,17 +66,13 @@ public class SimpleAllocationTests extends ElasticsearchIntegrationTest {
assertThat(node.size(), equalTo(1));
}
}
// create another index
prepareCreate("test2", 3,
settingsBuilder()
.put("index.number_of_shards", 3)
.put("index.number_of_replicas", 1))
.execute()
.actionGet();
ensureGreen();
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("index.number_of_replicas", 1)).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
assertAcked(prepareCreate("test2", 3));
ensureGreen();
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put(SETTING_NUMBER_OF_REPLICAS, 1)).execute().actionGet();
ensureGreen();
state = client().admin().cluster().prepareState().execute().actionGet().getState();
assertThat(state.routingNodes().unassigned().size(), equalTo(0));
@ -31,13 +31,24 @@ import org.junit.Test;
import java.io.IOException;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.endsWith;
import static org.hamcrest.Matchers.equalTo;
/**
*/
public class CodecTests extends ElasticsearchIntegrationTest {
@Override
protected int numberOfShards() {
return 1;
}
@Override
protected int numberOfReplicas() {
return 0;
}
@Test
public void testFieldsWithCustomPostingsFormat() throws Exception {
try {
@ -46,14 +57,12 @@ public class CodecTests extends ElasticsearchIntegrationTest {
// ignore
}
client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties").startObject("field1")
.field("postings_format", "test1").field("index_options", "docs").field("type", "string").endObject().endObject().endObject().endObject())
.setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1)
.put("index.number_of_replicas", 0)
.put("index.codec.postings_format.test1.type", "pulsing")
).execute().actionGet();
.put(indexSettings())
.put("index.codec.postings_format.test1.type", "pulsing")));
client().prepareIndex("test", "type1", "1").setSource("field1", "quick brown fox", "field2", "quick brown fox").execute().actionGet();
client().prepareIndex("test", "type1", "2").setSource("field1", "quick lazy huge brown fox", "field2", "quick lazy huge brown fox").setRefresh(true).execute().actionGet();
@ -75,12 +84,10 @@ public class CodecTests extends ElasticsearchIntegrationTest {
// ignore
}
client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1)
.put("index.number_of_replicas", 0)
.put("index.codec", "SimpleText")
).execute().actionGet();
.put(indexSettings())
.put("index.codec", "SimpleText")));
client().prepareIndex("test", "type1", "1").setSource("field1", "quick brown fox", "field2", "quick brown fox").execute().actionGet();
client().prepareIndex("test", "type1", "2").setSource("field1", "quick lazy huge brown fox", "field2", "quick lazy huge brown fox").setRefresh(true).execute().actionGet();
@ -102,14 +109,14 @@ public class CodecTests extends ElasticsearchIntegrationTest {
// ignore
}
client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("test", jsonBuilder().startObject().startObject("test")
.startObject("_version").field("doc_values_format", "disk").endObject()
.startObject("properties").startObject("field").field("type", "long").field("doc_values_format", "dvf").endObject().endObject()
.endObject().endObject())
.setSettings(ImmutableSettings.settingsBuilder()
.put("index.codec.doc_values_format.dvf.type", "disk")
.build());
.put(indexSettings())
.put("index.codec.doc_values_format.dvf.type", "disk")));
for (int i = 10; i >= 0; --i) {
client().prepareIndex("test", "test", Integer.toString(i)).setSource("field", randomLong()).setRefresh(i == 0 || rarely()).execute().actionGet();
@ -20,13 +20,14 @@
|
||||
package org.elasticsearch.count.query;
|
||||
|
||||
import org.elasticsearch.ElasticsearchException;
|
||||
import org.elasticsearch.action.ShardOperationFailedException;
|
||||
import org.elasticsearch.action.count.CountResponse;
|
||||
import org.elasticsearch.action.search.SearchPhaseExecutionException;
|
||||
import org.elasticsearch.common.bytes.BytesArray;
|
||||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
import org.elasticsearch.index.query.*;
|
||||
import org.elasticsearch.index.query.CommonTermsQueryBuilder.Operator;
|
||||
import org.elasticsearch.index.query.MatchQueryBuilder.Type;
|
||||
import org.elasticsearch.rest.RestStatus;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.joda.time.DateTime;
|
||||
import org.joda.time.DateTimeZone;
|
||||
@ -35,7 +36,6 @@ import org.junit.Test;
|
||||
|
||||
import java.io.IOException;
|
||||
|
||||
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_REPLICAS;
|
||||
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_SHARDS;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.FilterBuilders.*;
|
||||
@ -47,7 +47,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void passQueryAsStringTest() throws Exception {
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1_1", "field2", "value2_1").setRefresh(true).get();
|
||||
|
||||
@ -58,18 +58,22 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testIndexOptions() throws Exception {
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,index_options=docs")
|
||||
.setSettings("index.number_of_shards", 1));
|
||||
.addMapping("type1", "field1", "type=string,index_options=docs"));
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "quick brown fox", "field2", "quick brown fox").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "quick lazy huge brown fox", "field2", "quick lazy huge brown fox").setRefresh(true).get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "quick lazy huge brown fox", "field2", "quick lazy huge brown fox").get();
|
||||
refresh();
|
||||
|
||||
CountResponse countResponse = client().prepareCount().setQuery(QueryBuilders.matchQuery("field2", "quick brown").type(Type.PHRASE).slop(0)).get();
|
||||
assertHitCount(countResponse, 1l);
|
||||
try {
|
||||
client().prepareCount().setQuery(QueryBuilders.matchQuery("field1", "quick brown").type(Type.PHRASE).slop(0)).get();
|
||||
} catch (SearchPhaseExecutionException e) {
|
||||
assertTrue("wrong exception message " + e.getMessage(), e.getMessage().endsWith("IllegalStateException[field \"field1\" was indexed without position data; cannot run PhraseQuery (term=quick)]; }"));
|
||||
|
||||
countResponse = client().prepareCount().setQuery(QueryBuilders.matchQuery("field1", "quick brown").type(Type.PHRASE).slop(0)).get();
|
||||
assertHitCount(countResponse, 0l);
|
||||
assertThat(countResponse.getFailedShards(), anyOf(equalTo(1), equalTo(2)));
|
||||
assertThat(countResponse.getFailedShards(), equalTo(countResponse.getShardFailures().length));
|
||||
for (ShardOperationFailedException shardFailure : countResponse.getShardFailures()) {
|
||||
assertThat(shardFailure.status(), equalTo(RestStatus.INTERNAL_SERVER_ERROR));
|
||||
assertThat(shardFailure.reason(), containsString("[field \"field1\" was indexed without position data; cannot run PhraseQuery (term=quick)]"));
|
||||
}
|
||||
}
|
||||
|
||||
@ -132,7 +136,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void queryStringAnalyzedWildcard() throws Exception {
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value_1", "field2", "value_2").get();
|
||||
refresh();
|
||||
@ -155,7 +159,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testLowercaseExpandedTerms() {
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value_1", "field2", "value_2").get();
|
||||
refresh();
|
||||
@ -176,7 +180,9 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDateRangeInQueryString() {
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
NumShards test = getNumShards("test");
|
||||
|
||||
String aMonthAgo = ISODateTimeFormat.yearMonthDay().print(new DateTime(DateTimeZone.UTC).minusMonths(1));
|
||||
String aMonthFromNow = ISODateTimeFormat.yearMonthDay().print(new DateTime(DateTimeZone.UTC).plusMonths(1));
|
||||
@ -190,12 +196,15 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
countResponse = client().prepareCount().setQuery(queryString("future:[now/d TO now+2M/d]").lowercaseExpandedTerms(false)).get();
|
||||
assertHitCount(countResponse, 1l);
|
||||
|
||||
countResponse = client().prepareCount().setQuery(queryString("future:[now/D TO now+2M/d]").lowercaseExpandedTerms(false)).get();
|
||||
countResponse = client().prepareCount("test").setQuery(queryString("future:[now/D TO now+2M/d]").lowercaseExpandedTerms(false)).get();
|
||||
//D is an unsupported unit in date math
|
||||
assertThat(countResponse.getSuccessfulShards(), equalTo(0));
|
||||
assertThat(countResponse.getFailedShards(), equalTo(1));
|
||||
assertThat(countResponse.getShardFailures().length, equalTo(1));
|
||||
assertThat(countResponse.getShardFailures()[0].reason(), allOf(containsString("Failed to parse"), containsString("unit [D] not supported for date math")));
|
||||
assertThat(countResponse.getFailedShards(), equalTo(test.numPrimaries));
|
||||
assertThat(countResponse.getShardFailures().length, equalTo(test.numPrimaries));
|
||||
for (ShardOperationFailedException shardFailure : countResponse.getShardFailures()) {
|
||||
assertThat(shardFailure.status(), equalTo(RestStatus.BAD_REQUEST));
|
||||
assertThat(shardFailure.reason(), allOf(containsString("Failed to parse"), containsString("unit [D] not supported for date math")));
|
||||
}
|
||||
}
|
||||
|
||||
@Test
|
||||
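The hunk above swaps the hard-coded failure counts for the primary count reported by `getNumShards`, so the same assertions hold however many shards the test infrastructure picked. A minimal sketch of that shape, assuming it runs inside an `ElasticsearchIntegrationTest` subclass against an existing index named "test" (illustrative only, not part of the commit):

// Sketch: assert that a request failed on every primary without assuming a fixed shard count.
NumShards test = getNumShards("test");
CountResponse countResponse = client().prepareCount("test")
        .setQuery(queryString("future:[now/D TO now+2M/d]").lowercaseExpandedTerms(false))
        .get();
assertThat(countResponse.getSuccessfulShards(), equalTo(0));
assertThat(countResponse.getFailedShards(), equalTo(test.numPrimaries));
assertThat(countResponse.getShardFailures().length, equalTo(test.numPrimaries));
for (ShardOperationFailedException shardFailure : countResponse.getShardFailures()) {
    assertThat(shardFailure.status(), equalTo(RestStatus.BAD_REQUEST));
    assertThat(shardFailure.reason(), containsString("Failed to parse"));
}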
@ -209,7 +218,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
}
private void typeFilterTests(String index) throws Exception {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
.startObject("_type").field("index", index).endObject()
.endObject().endObject())
@ -242,7 +251,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
}
private void idsFilterTests(String index) throws Exception {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
.startObject("_id").field("index", index).endObject()
.endObject().endObject()));
@ -288,7 +297,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void filterExistsMissingTests() throws Exception {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
indexRandom(true,
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().startObject("obj1").field("obj1_val", "1").endObject().field("x1", "x_1").field("field1", "value1_1").field("field2", "value2_1").endObject()),
@ -342,7 +351,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void passQueryAsJSONStringTest() throws Exception {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("field1", "value1_1", "field2", "value2_1").setRefresh(true).get();
@ -375,7 +384,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testMatchQueryNumeric() throws Exception {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("long", 1l, "double", 1.0d).get();
client().prepareIndex("test", "type1", "2").setSource("long", 2l, "double", 2.0d).get();
@ -390,8 +399,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testMultiMatchQuery() throws Exception {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("field1", "value1", "field2", "value4", "field3", "value3").get();
client().prepareIndex("test", "type1", "2").setSource("field1", "value2", "field2", "value5", "field3", "value2").get();
@ -432,7 +440,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testMatchQueryZeroTermsQuery() {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
assertAcked(prepareCreate("test")
.addMapping("type1", "field1", "type=string,analyzer=classic", "field2", "type=string,analyzer=classic"));
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").get();
client().prepareIndex("test", "type1", "2").setSource("field1", "value2").get();
@ -457,7 +465,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testMultiMatchQueryZeroTermsQuery() {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
assertAcked(prepareCreate("test")
.addMapping("type1", "field1", "type=string,analyzer=classic", "field2", "type=string,analyzer=classic"));
client().prepareIndex("test", "type1", "1").setSource("field1", "value1", "field2", "value2").get();
client().prepareIndex("test", "type1", "2").setSource("field1", "value3", "field2", "value4").get();
@ -482,7 +490,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testMultiMatchQueryMinShouldMatch() {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("field1", new String[]{"value1", "value2", "value3"}).get();
client().prepareIndex("test", "type1", "2").setSource("field2", "value1").get();
refresh();
@ -519,7 +527,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testFuzzyQueryString() {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("str", "kimchy", "date", "2012-02-01", "num", 12).get();
client().prepareIndex("test", "type1", "2").setSource("str", "shay", "date", "2012-02-05", "num", 20).get();
refresh();
@ -536,7 +544,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testSpecialRangeSyntaxInQueryString() {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("str", "kimchy", "date", "2012-02-01", "num", 12).get();
client().prepareIndex("test", "type1", "2").setSource("str", "shay", "date", "2012-02-05", "num", 20).get();
refresh();
@ -723,8 +731,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test
public void testNumericTermsAndRanges() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test")
.setSettings(SETTING_NUMBER_OF_SHARDS, 1)
assertAcked(prepareCreate("test")
.addMapping("type1",
"num_byte", "type=byte", "num_short", "type=short",
"num_integer", "type=integer", "num_long", "type=long",
@ -801,7 +808,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
@Test // see #2994
public void testSimpleSpan() throws ElasticsearchException, IOException {
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1, SETTING_NUMBER_OF_REPLICAS, 0));
createIndex("test");
ensureGreen();
client().prepareIndex("test", "test", "1").setSource("description", "foo other anything bar").get();

@ -20,7 +20,6 @@
package org.elasticsearch.count.simple;
import org.elasticsearch.action.count.CountResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
@ -31,14 +30,14 @@ import java.util.concurrent.ExecutionException;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.boolQuery;
import static org.elasticsearch.index.query.QueryBuilders.rangeQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
public class SimpleCountTests extends ElasticsearchIntegrationTest {
@Test
public void testCountRandomPreference() throws InterruptedException, ExecutionException {
client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", between(1, 3))).get();
createIndex("test");
indexRandom(true, client().prepareIndex("test", "type", "1").setSource("field", "value"),
client().prepareIndex("test", "type", "2").setSource("field", "value"),
client().prepareIndex("test", "type", "3").setSource("field", "value"),
@ -56,7 +55,7 @@ public class SimpleCountTests extends ElasticsearchIntegrationTest {
@Test
public void simpleIpTests() throws Exception {
client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
createIndex("test");
client().admin().indices().preparePutMapping("test").setType("type1")
.setSource(XContentFactory.jsonBuilder().startObject().startObject("type1").startObject("properties")
@ -76,7 +75,7 @@ public class SimpleCountTests extends ElasticsearchIntegrationTest {
@Test
public void simpleIdTests() {
client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
createIndex("test");
client().prepareIndex("test", "type", "XXX1").setSource("field", "value").setRefresh(true).execute().actionGet();
// id is not indexed, but lets see that we automatically convert to
@ -96,13 +95,12 @@ public class SimpleCountTests extends ElasticsearchIntegrationTest {
@Test
public void simpleDateMathTests() throws Exception {
prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder()).execute().actionGet();
createIndex("test");
client().prepareIndex("test", "type1", "1").setSource("field", "2010-01-05T02:00").execute().actionGet();
client().prepareIndex("test", "type1", "2").setSource("field", "2010-01-06T02:00").execute().actionGet();
ensureGreen();
refresh();
CountResponse countResponse = client().prepareCount("test").setQuery(QueryBuilders.rangeQuery("field").gte("2010-01-03||+2d").lte("2010-01-04||+2d")).execute().actionGet();
assertNoFailures(countResponse);
assertHitCount(countResponse, 2l);
countResponse = client().prepareCount("test").setQuery(QueryBuilders.queryString("field:[2010-01-03||+2d TO 2010-01-04||+2d]")).execute().actionGet();
@ -111,7 +109,7 @@ public class SimpleCountTests extends ElasticsearchIntegrationTest {
@Test
public void localDependentDateTests() throws Exception {
prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("type1",
jsonBuilder().startObject()
.startObject("type1")
@ -123,27 +121,24 @@ public class SimpleCountTests extends ElasticsearchIntegrationTest {
.endObject()
.endObject()
.endObject()
.endObject())
.execute().actionGet();
.endObject()));
ensureGreen();
for (int i = 0; i < 10; i++) {
client().prepareIndex("test", "type1", "" + i).setSource("date_field", "Mi, 06 Dez 2000 02:55:00 -0800").execute().actionGet();
client().prepareIndex("test", "type1", "" + (10 + i)).setSource("date_field", "Do, 07 Dez 2000 02:55:00 -0800").execute().actionGet();
}
client().admin().indices().prepareRefresh().execute().actionGet();
refresh();
for (int i = 0; i < 10; i++) {
CountResponse countResponse = client().prepareCount("test")
.setQuery(QueryBuilders.rangeQuery("date_field").gte("Di, 05 Dez 2000 02:55:00 -0800").lte("Do, 07 Dez 2000 00:00:00 -0800"))
.execute().actionGet();
assertHitCount(countResponse, 10l);
countResponse = client().prepareCount("test")
.setQuery(QueryBuilders.rangeQuery("date_field").gte("Di, 05 Dez 2000 02:55:00 -0800").lte("Fr, 08 Dez 2000 00:00:00 -0800"))
.execute().actionGet();
assertHitCount(countResponse, 20l);
}
}
}

@ -107,12 +107,14 @@ public class DeleteByQueryTests extends ElasticsearchIntegrationTest {
.setQuery(QueryBuilders.hasChildQuery("type", QueryBuilders.matchAllQuery()))
.execute().actionGet();
NumShards twitter = getNumShards("twitter");
assertThat(response.status(), equalTo(RestStatus.BAD_REQUEST));
assertThat(response.getIndex("twitter").getSuccessfulShards(), equalTo(0));
assertThat(response.getIndex("twitter").getFailedShards(), equalTo(5));
assertThat(response.getIndex("twitter").getFailedShards(), equalTo(twitter.numPrimaries));
assertThat(response.getIndices().size(), equalTo(1));
assertThat(response.getIndices().get("twitter").getFailedShards(), equalTo(5));
assertThat(response.getIndices().get("twitter").getFailures().length, equalTo(5));
assertThat(response.getIndices().get("twitter").getFailedShards(), equalTo(twitter.numPrimaries));
assertThat(response.getIndices().get("twitter").getFailures().length, equalTo(twitter.numPrimaries));
for (ShardOperationFailedException failure : response.getIndices().get("twitter").getFailures()) {
assertThat(failure.reason(), containsString("[twitter] [has_child] No mapping for for type [type]"));
assertThat(failure.status(), equalTo(RestStatus.BAD_REQUEST));

@ -20,7 +20,6 @@
package org.elasticsearch.document;
import com.google.common.base.Charsets;
import org.elasticsearch.action.admin.indices.create.CreateIndexResponse;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.count.CountResponse;
@ -29,7 +28,6 @@ import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.update.UpdateRequestBuilder;
import org.elasticsearch.action.update.UpdateResponse;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.xcontent.XContentBuilder;
@ -44,20 +42,12 @@ import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
import static org.hamcrest.Matchers.*;
/**
*/
public class BulkTests extends ElasticsearchIntegrationTest {
@Test
public void testBulkUpdate_simple() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.put("index.number_of_replicas", 0)
).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
createIndex("test");
ensureGreen();
BulkResponse bulkResponse = client().prepareBulk()
.add(client().prepareIndex().setIndex("test").setType("type1").setId("1").setSource("field", 1))
@ -181,13 +171,8 @@ public class BulkTests extends ElasticsearchIntegrationTest {
@Test
public void testBulkUpdate_malformedScripts() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.put("index.number_of_replicas", 0)
).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
createIndex("test");
ensureGreen();
BulkResponse bulkResponse = client().prepareBulk()
.add(client().prepareIndex().setIndex("test").setType("type1").setId("1").setSource("field", 1))
@ -222,13 +207,8 @@ public class BulkTests extends ElasticsearchIntegrationTest {
@Test
public void testBulkUpdate_largerVolume() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.put("index.number_of_replicas", 1)
).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
createIndex("test");
ensureGreen();
int numDocs = 2000;
BulkRequestBuilder builder = client().prepareBulk();
@ -361,18 +341,14 @@ public class BulkTests extends ElasticsearchIntegrationTest {
@Test
public void testBulkIndexingWhileInitializing() throws Exception {
int shards = 1 + randomInt(10);
int replica = randomInt(2);
cluster().ensureAtLeastNumNodes(1 + replica);
client().admin().indices().prepareCreate("test")
.setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", shards)
.put("index.number_of_replicas", replica)
).execute().actionGet();
assertAcked(prepareCreate("test").setSettings(
ImmutableSettings.builder()
.put(indexSettings())
.put("index.number_of_replicas", replica)));
int numDocs = 5000;
int bulk = 50;
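The replacement above keeps the randomized shard count by layering the test's own settings on top of `indexSettings()` instead of hard-coding `index.number_of_shards`, and only overrides the replica count the test actually depends on. A minimal sketch of that pattern, assuming it runs inside an `ElasticsearchIntegrationTest` subclass (illustrative only, not part of the commit):

// Sketch: start from indexSettings() so the randomized number of shards is preserved,
// then override only the setting this test cares about.
int replica = randomInt(2);
cluster().ensureAtLeastNumNodes(1 + replica);
assertAcked(prepareCreate("test").setSettings(
        ImmutableSettings.builder()
                .put(indexSettings())
                .put("index.number_of_replicas", replica)));
ensureGreen();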
@ -403,7 +379,7 @@ public class BulkTests extends ElasticsearchIntegrationTest {
.addMapping("parent", "{\"parent\":{}}")
.addMapping("child", "{\"child\": {\"_parent\": {\"type\": \"parent\"}}}")
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();
BulkRequestBuilder builder = client().prepareBulk();
@ -437,11 +413,10 @@ public class BulkTests extends ElasticsearchIntegrationTest {
*/
@Test
public void testBulkUpdateUpsertWithParent() throws Exception {
client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("parent", "{\"parent\":{}}")
.addMapping("child", "{\"child\": {\"_parent\": {\"type\": \"parent\"}}}")
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.addMapping("child", "{\"child\": {\"_parent\": {\"type\": \"parent\"}}}"));
ensureGreen();
BulkRequestBuilder builder = client().prepareBulk();
@ -465,7 +440,6 @@ public class BulkTests extends ElasticsearchIntegrationTest {
.setQuery(QueryBuilders.hasParentQuery("parent", QueryBuilders.matchAllQuery()))
.get();
assertNoFailures(searchResponse);
assertSearchHits(searchResponse, "child1");
}
@ -520,8 +494,7 @@ public class BulkTests extends ElasticsearchIntegrationTest {
.endObject()
.endObject()
.endObject();
CreateIndexResponse createIndexResponse = prepareCreate("test").addMapping("type", builder).get();
assertAcked(createIndexResponse);
assertAcked(prepareCreate("test").addMapping("type", builder));
String brokenBuildRequestData = "{\"index\": {\"_id\": \"1\"}}\n" +
"{\"name\": \"Malformed}\n" +
@ -546,8 +519,7 @@ public class BulkTests extends ElasticsearchIntegrationTest {
.endObject()
.endObject()
.endObject();
CreateIndexResponse createIndexResponse = prepareCreate("test").addMapping("type", builder).get();
assertAcked(createIndexResponse);
assertAcked(prepareCreate("test").addMapping("type", builder));
ensureYellow("test");
String brokenBuildRequestData = "{\"index\": {} }\n" +
@ -573,8 +545,7 @@ public class BulkTests extends ElasticsearchIntegrationTest {
.endObject()
.endObject()
.endObject();
CreateIndexResponse createIndexResponse = prepareCreate("test").addMapping("type", builder).get();
assertAcked(createIndexResponse);
assertAcked(prepareCreate("test").addMapping("type", builder));
ensureYellow("test");
String brokenBuildRequestData = "{\"index\": {} }\n" +

@ -20,7 +20,6 @@
package org.elasticsearch.document;
import com.google.common.base.Charsets;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthStatus;
import org.elasticsearch.action.admin.indices.cache.clear.ClearIndicesCacheResponse;
import org.elasticsearch.action.admin.indices.flush.FlushResponse;
@ -66,6 +65,7 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
@Test
public void testIndexActions() throws Exception {
createIndex();
NumShards numShards = getNumShards(getConcreteIndexName());
logger.info("Running Cluster Health");
ensureGreen();
logger.info("Indexing [type1/1]");
@ -75,7 +75,7 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
assertThat(indexResponse.getType(), equalTo("type1"));
logger.info("Refreshing");
RefreshResponse refreshResponse = refresh();
assertThat(refreshResponse.getSuccessfulShards(), equalTo(10));
assertThat(refreshResponse.getSuccessfulShards(), equalTo(numShards.totalNumShards));
logger.info("--> index exists?");
assertThat(indexExists(getConcreteIndexName()), equalTo(true));
@ -85,12 +85,12 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
logger.info("Clearing cache");
ClearIndicesCacheResponse clearIndicesCacheResponse = client().admin().indices().clearCache(clearIndicesCacheRequest("test").recycler(true).fieldDataCache(true).filterCache(true).idCache(true)).actionGet();
assertNoFailures(clearIndicesCacheResponse);
assertThat(clearIndicesCacheResponse.getSuccessfulShards(), equalTo(10));
assertThat(clearIndicesCacheResponse.getSuccessfulShards(), equalTo(numShards.totalNumShards));
logger.info("Optimizing");
waitForRelocation(ClusterHealthStatus.GREEN);
OptimizeResponse optimizeResponse = optimize();
assertThat(optimizeResponse.getSuccessfulShards(), equalTo(10));
assertThat(optimizeResponse.getSuccessfulShards(), equalTo(numShards.totalNumShards));
GetResponse getResult;
@ -141,7 +141,7 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
logger.info("Flushing");
FlushResponse flushResult = client().admin().indices().prepareFlush("test").execute().actionGet();
assertThat(flushResult.getSuccessfulShards(), equalTo(10));
assertThat(flushResult.getSuccessfulShards(), equalTo(numShards.totalNumShards));
assertThat(flushResult.getFailedShards(), equalTo(0));
logger.info("Refreshing");
client().admin().indices().refresh(refreshRequest("test")).actionGet();
@ -165,7 +165,7 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
CountResponse countResponse = client().prepareCount("test").setQuery(termQuery("_type", "type1")).setOperationThreading(BroadcastOperationThreading.NO_THREADS).execute().actionGet();
assertNoFailures(countResponse);
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
countResponse = client().prepareCount("test")
@ -173,14 +173,14 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
.setOperationThreading(BroadcastOperationThreading.SINGLE_THREAD)
.get();
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
countResponse = client().prepareCount("test")
.setQuery(termQuery("_type", "type1"))
.setOperationThreading(BroadcastOperationThreading.THREAD_PER_SHARD).get();
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
// test failed (simply query that can't be parsed)
@ -188,19 +188,19 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
assertThat(countResponse.getCount(), equalTo(0l));
assertThat(countResponse.getSuccessfulShards(), equalTo(0));
assertThat(countResponse.getFailedShards(), equalTo(5));
assertThat(countResponse.getFailedShards(), equalTo(numShards.numPrimaries));
// count with no query is a match all one
countResponse = client().prepareCount("test").execute().actionGet();
assertThat("Failures " + countResponse.getShardFailures(), countResponse.getShardFailures() == null ? 0 : countResponse.getShardFailures().length, equalTo(0));
assertThat(countResponse.getCount(), equalTo(2l));
assertThat(countResponse.getSuccessfulShards(), equalTo(5));
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));
}
logger.info("Delete by query");
DeleteByQueryResponse queryResponse = client().prepareDeleteByQuery().setIndices("test").setQuery(termQuery("name", "test2")).execute().actionGet();
assertThat(queryResponse.getIndex(getConcreteIndexName()).getSuccessfulShards(), equalTo(5));
assertThat(queryResponse.getIndex(getConcreteIndexName()).getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(queryResponse.getIndex(getConcreteIndexName()).getFailedShards(), equalTo(0));
client().admin().indices().refresh(refreshRequest("test")).actionGet();
@ -218,11 +218,9 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
@Test
public void testBulk() throws Exception {
createIndex();
NumShards numShards = getNumShards(getConcreteIndexName());
logger.info("-> running Cluster Health");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus()).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
ensureGreen();
BulkResponse bulkResponse = client().prepareBulk()
.add(client().prepareIndex().setIndex("test").setType("type1").setId("1").setSource(source("1", "test")))
@ -267,7 +265,7 @@ public class DocumentActionsTests extends ElasticsearchIntegrationTest {
waitForRelocation(ClusterHealthStatus.GREEN);
RefreshResponse refreshResponse = client().admin().indices().prepareRefresh("test").execute().actionGet();
assertNoFailures(refreshResponse);
assertThat(refreshResponse.getSuccessfulShards(), equalTo(10));
assertThat(refreshResponse.getSuccessfulShards(), equalTo(numShards.totalNumShards));
for (int i = 0; i < 5; i++) {
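Throughout the hunks above, `getNumShards` supplies both the total shard count (compared against broadcast operations such as refresh, flush and optimize) and the primary count (compared against count and delete-by-query). A minimal sketch of the two kinds of assertion, assuming an existing index named "test" inside an `ElasticsearchIntegrationTest` subclass (illustrative only, not part of the commit):

// Sketch: broadcast responses report one result per shard copy (totalNumShards),
// while count-style requests touch a single copy of each shard (numPrimaries).
NumShards numShards = getNumShards("test");
RefreshResponse refreshResponse = refresh();
assertThat(refreshResponse.getSuccessfulShards(), equalTo(numShards.totalNumShards));
CountResponse countResponse = client().prepareCount("test").execute().actionGet();
assertThat(countResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
assertThat(countResponse.getFailedShards(), equalTo(0));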

@ -201,7 +201,6 @@ public class ExplainActionTests extends ElasticsearchIntegrationTest {
assertThat(((Map<String, Object>) response.getGetResult().getSource().get("obj1")).get("field1").toString(), equalTo("value1"));
}
@Test
public void testExplainWithAlias() throws Exception {
client().admin().indices().prepareCreate("test")
@ -223,7 +222,7 @@ public class ExplainActionTests extends ElasticsearchIntegrationTest {
@Test
public void explainDateRangeInQueryString() {
client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1)).get();
createIndex("test");
String aMonthAgo = ISODateTimeFormat.yearMonthDay().print(new DateTime(DateTimeZone.UTC).minusMonths(1));
String aMonthFromNow = ISODateTimeFormat.yearMonthDay().print(new DateTime(DateTimeZone.UTC).plusMonths(1));

@ -24,9 +24,6 @@ import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_REPLICAS;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_SHARDS;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.fuzzyLikeThisFieldQuery;
import static org.elasticsearch.index.query.QueryBuilders.fuzzyLikeThisQuery;
@ -39,13 +36,15 @@ import static org.hamcrest.Matchers.equalTo;
*/
public class FuzzyLikeThisActionTests extends ElasticsearchIntegrationTest {
@Override
protected int numberOfReplicas() {
return between(0, 1);
}
@Test
// See issue https://github.com/elasticsearch/elasticsearch/issues/3252
public void testNumericField() throws Exception {
assertAcked(prepareCreate("test")
.setSettings(settingsBuilder()
.put(SETTING_NUMBER_OF_SHARDS, between(1, 5))
.put(SETTING_NUMBER_OF_REPLICAS, between(0, 1)))
.addMapping("type", "int_value", "type=integer"));
ensureGreen();
client().prepareIndex("test", "type", "1")
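`FuzzyLikeThisActionTests` above pins its replica range for the whole class by overriding `numberOfReplicas()`, while the shard count is left to the existing randomization (or set explicitly per create call). A minimal sketch of such a class-level override, using a hypothetical class name (illustrative only, not part of the commit):

// Sketch: a hypothetical test class that fixes the replica range for all of its
// indices while leaving the number of shards to the test infrastructure.
public class ExampleReplicaOverrideTests extends ElasticsearchIntegrationTest {

    @Override
    protected int numberOfReplicas() {
        return between(0, 1);
    }
}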

@ -70,8 +70,6 @@ public class IndexGatewayTests extends ElasticsearchIntegrationTest {
if (between(0, 5) == 0) {
builder.put("gateway.fs.chunk_size", between(1, 100) + "kb");
}
builder.put("index.number_of_replicas", "1");
builder.put("index.number_of_shards", rarely() ? Integer.toString(between(2, 6)) : "1");
storeType = rarely() ? "ram" : "fs";
builder.put("index.store.type", storeType);
settings.set(builder.build());
@ -79,7 +77,6 @@ public class IndexGatewayTests extends ElasticsearchIntegrationTest {
return settings.get();
}
protected boolean isPersistentStorage() {
assertNotNull(storeType);
return "fs".equals(settings.get().get("index.store.type"));

@ -96,11 +96,13 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
public void testSimpleOpenClose() throws Exception {
logger.info("--> starting 2 nodes");
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
logger.info("--> creating test index");
client().admin().indices().prepareCreate("test").execute().actionGet();
createIndex("test");
NumShards test = getNumShards("test");
logger.info("--> waiting for green status");
ClusterHealthResponse health = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes("2").execute().actionGet();
@ -108,8 +110,8 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
ClusterStateResponse stateResponse = client().admin().cluster().prepareState().execute().actionGet();
assertThat(stateResponse.getState().metaData().index("test").state(), equalTo(IndexMetaData.State.OPEN));
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(2));
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(4));
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(test.numPrimaries));
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(test.totalNumShards));
logger.info("--> indexing a simple document");
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").execute().actionGet();
@ -151,8 +153,8 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
stateResponse = client().admin().cluster().prepareState().execute().actionGet();
assertThat(stateResponse.getState().metaData().index("test").state(), equalTo(IndexMetaData.State.OPEN));
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(2));
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(4));
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(test.numPrimaries));
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(test.totalNumShards));
logger.info("--> trying to get the indexed document on the first index");
GetResponse getResponse = client().prepareGet("test", "type1", "1").execute().actionGet();
@ -191,8 +193,8 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
stateResponse = client().admin().cluster().prepareState().execute().actionGet();
assertThat(stateResponse.getState().metaData().index("test").state(), equalTo(IndexMetaData.State.OPEN));
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(2));
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(4));
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(test.numPrimaries));
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(test.totalNumShards));
logger.info("--> trying to get the indexed document on the first round (before close and shutdown)");
getResponse = client().prepareGet("test", "type1", "1").execute().actionGet();
@ -207,7 +209,7 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
logger.info("--> cleaning nodes");
logger.info("--> starting 1 master node non data");
cluster().startNode(settingsBuilder().put("node.data", false).put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("node.data", false).put("gateway.type", "local").build());
logger.info("--> create an index");
client().admin().indices().prepareCreate("test").execute().actionGet();
@ -216,7 +218,7 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
cluster().closeNonSharedNodes(false);
logger.info("--> starting 1 master node non data again");
cluster().startNode(settingsBuilder().put("node.data", false).put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("node.data", false).put("gateway.type", "local").build());
logger.info("--> waiting for test index to be created");
ClusterHealthResponse health = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setIndices("test").execute().actionGet();
@ -232,8 +234,8 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
logger.info("--> cleaning nodes");
logger.info("--> starting 1 master node non data");
cluster().startNode(settingsBuilder().put("node.data", false).put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("node.master", false).put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("node.data", false).put("gateway.type", "local").build());
cluster().startNode(settingsBuilder().put("node.master", false).put("gateway.type", "local").build());
logger.info("--> create an index");
client().admin().indices().prepareCreate("test").execute().actionGet();
@ -250,8 +252,8 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
logger.info("--> cleaning nodes");
logger.info("--> starting 2 nodes");
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 5).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 5).put("index.number_of_replicas", 1).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
logger.info("--> indexing a simple document");
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").setRefresh(true).execute().actionGet();
@ -291,7 +293,6 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
public void testDanglingIndicesAutoImportYes() throws Exception {
Settings settings = settingsBuilder()
.put("gateway.type", "local").put("gateway.local.auto_import_dangled", "yes")
.put("index.number_of_shards", 1).put("index.number_of_replicas", 1)
.build();
logger.info("--> starting two nodes");
final String node_1 = cluster().startNode(settings);
@ -349,7 +350,6 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
public void testDanglingIndicesAutoImportClose() throws Exception {
Settings settings = settingsBuilder()
.put("gateway.type", "local").put("gateway.local.auto_import_dangled", "closed")
.put("index.number_of_shards", 1).put("index.number_of_replicas", 1)
.build();
@ -417,7 +417,6 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
public void testDanglingIndicesNoAutoImport() throws Exception {
Settings settings = settingsBuilder()
.put("gateway.type", "local").put("gateway.local.auto_import_dangled", "no")
.put("index.number_of_shards", 1).put("index.number_of_replicas", 1)
.build();
logger.info("--> starting two nodes");
final String node_1 = cluster().startNode(settings);
@ -483,7 +482,6 @@ public class LocalGatewayIndexStateTests extends ElasticsearchIntegrationTest {
public void testDanglingIndicesNoAutoImportStillDanglingAndCreatingSameIndex() throws Exception {
Settings settings = settingsBuilder()
.put("gateway.type", "local").put("gateway.local.auto_import_dangled", "no")
.put("index.number_of_shards", 1).put("index.number_of_replicas", 1)
.build();
logger.info("--> starting two nodes");

@ -40,9 +40,7 @@ import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.notNullValue;
import static org.hamcrest.Matchers.*;
/**
*
@ -50,28 +48,31 @@ import static org.hamcrest.Matchers.notNullValue;
@ClusterScope(numNodes=0, scope=Scope.TEST)
public class QuorumLocalGatewayTests extends ElasticsearchIntegrationTest {
@Override
protected int numberOfReplicas() {
return 2;
}
@Test
@Slow
public void testChangeInitialShardsRecovery() throws Exception {
logger.info("--> starting 3 nodes");
final String[] nodes = new String[3];
nodes[0] = cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 2).build());
nodes[1] = cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 2).build());
nodes[2] = cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 2).build());
nodes[0] = cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
nodes[1] = cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
nodes[2] = cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
createIndex("test");
ensureGreen();
NumShards test = getNumShards("test");
logger.info("--> indexing...");
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().field("field", "value1").endObject()).get();
//We don't check for failures in the flush response: if we do we might get the following:
// FlushNotAllowedEngineException[[test][1] recovery is in progress, flush [COMMIT_TRANSLOG] is not allowed]
client().admin().indices().prepareFlush().get();
flush();
client().prepareIndex("test", "type1", "2").setSource(jsonBuilder().startObject().field("field", "value2").endObject()).get();
assertNoFailures(client().admin().indices().prepareRefresh().execute().get());
logger.info("--> running cluster_health (wait for the shards to startup)");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(6)).actionGet();
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
refresh();
for (int i = 0; i < 10; i++) {
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).get(), 2l);
@ -93,7 +94,7 @@ public class QuorumLocalGatewayTests extends ElasticsearchIntegrationTest {
if (randomBoolean()) {
Thread.sleep(between(1, 400)); // wait a bit and give is a chance to try to allocate
}
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForNodes("1")).actionGet();
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForNodes("1")).actionGet();
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.RED)); // nothing allocated yet
assertThat(awaitBusy(new Predicate<Object>() {
@ -109,8 +110,8 @@ public class QuorumLocalGatewayTests extends ElasticsearchIntegrationTest {
logger.info("--> change the recovery.initial_shards setting, and make sure its recovered");
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("recovery.initial_shards", 1)).get();
logger.info("--> running cluster_health (wait for the shards to startup), 2 shards since we only have 1 node");
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(2)).actionGet();
logger.info("--> running cluster_health (wait for the shards to startup), primaries only since we only have 1 node");
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(test.numPrimaries)).actionGet();
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
@ -125,23 +126,21 @@ public class QuorumLocalGatewayTests extends ElasticsearchIntegrationTest {
public void testQuorumRecovery() throws Exception {
logger.info("--> starting 3 nodes");
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 2).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 2).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").put("index.number_of_shards", 2).put("index.number_of_replicas", 2).build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
cluster().startNode(settingsBuilder().put("gateway.type", "local").build());
createIndex("test");
ensureGreen();
final NumShards test = getNumShards("test");
logger.info("--> indexing...");
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().field("field", "value1").endObject()).get();
//We don't check for failures in the flush response: if we do we might get the following:
// FlushNotAllowedEngineException[[test][1] recovery is in progress, flush [COMMIT_TRANSLOG] is not allowed]
client().admin().indices().prepareFlush().get();
flush();
client().prepareIndex("test", "type1", "2").setSource(jsonBuilder().startObject().field("field", "value2").endObject()).get();
assertNoFailures(client().admin().indices().prepareRefresh().get());
logger.info("--> running cluster_health (wait for the shards to startup)");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(6)).actionGet();
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
refresh();
for (int i = 0; i < 10; i++) {
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).get(), 2l);
@ -163,7 +162,7 @@ public class QuorumLocalGatewayTests extends ElasticsearchIntegrationTest {
@Override
public boolean apply(Object input) {
logger.info("--> running cluster_health (wait for the shards to startup)");
ClusterHealthResponse clusterHealth = activeClient.admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForNodes("2").waitForActiveShards(4)).actionGet();
ClusterHealthResponse clusterHealth = activeClient.admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForNodes("2").waitForActiveShards(test.numPrimaries * 2)).actionGet();
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
return (!clusterHealth.isTimedOut()) && clusterHealth.getStatus() == ClusterHealthStatus.YELLOW;
}
@ -180,10 +179,7 @@ public class QuorumLocalGatewayTests extends ElasticsearchIntegrationTest {
});
logger.info("--> all nodes are started back, verifying we got the latest version");
logger.info("--> running cluster_health (wait for the shards to startup)");
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(6)).actionGet();
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
ensureGreen();
for (int i = 0; i < 10; i++) {
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).get(), 3l);

@ -20,8 +20,6 @@
package org.elasticsearch.gateway.local;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
import org.elasticsearch.action.admin.cluster.health.ClusterHealthStatus;
import org.elasticsearch.action.admin.indices.status.IndexShardStatus;
import org.elasticsearch.action.admin.indices.status.IndicesStatusResponse;
import org.elasticsearch.action.admin.indices.status.ShardStatus;
@ -29,7 +27,6 @@ import org.elasticsearch.client.Client;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.cluster.routing.allocation.allocator.BalancedShardsAllocator;
import org.elasticsearch.cluster.routing.allocation.decider.DisableAllocationDecider;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentFactory;
@ -41,10 +38,10 @@ import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
import org.elasticsearch.test.TestCluster.RestartCallback;
import org.junit.Test;
import static org.elasticsearch.client.Requests.clusterHealthRequest;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.hamcrest.Matchers.*;
@ -54,7 +51,6 @@ import static org.hamcrest.Matchers.*;
@ClusterScope(numNodes = 0, scope = Scope.TEST)
public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTest {
private ImmutableSettings.Builder settingsBuilder() {
return ImmutableSettings.settingsBuilder().put("gateway.type", "local");
}
@ -63,12 +59,12 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
@Slow
public void testX() throws Exception {
cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).build());
cluster().startNode(settingsBuilder().build());
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("appAccountIds").field("type", "string").endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
assertAcked(prepareCreate("test").addMapping("type1", mapping));
client().prepareIndex("test", "type1", "10990239").setSource(jsonBuilder().startObject()
.field("_id", "10990239")
@ -86,15 +82,12 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
.field("_id", "11026351")
.startArray("appAccountIds").value(14).endArray().endObject()).execute().actionGet();
client().admin().indices().prepareRefresh().execute().actionGet();
refresh();
assertHitCount(client().prepareCount().setQuery(termQuery("appAccountIds", 179)).execute().actionGet(), 2);
cluster().fullRestart();
logger.info("Running Cluster Health (wait for the shards to startup)");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(1)).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
ensureYellow();
client().admin().indices().prepareRefresh().execute().actionGet();
assertHitCount(client().prepareCount().setQuery(termQuery("appAccountIds", 179)).execute().actionGet(), 2);
@ -102,10 +95,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
cluster().fullRestart();
logger.info("Running Cluster Health (wait for the shards to startup)");
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(1)).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
ensureYellow();
client().admin().indices().prepareRefresh().execute().actionGet();
assertHitCount(client().prepareCount().setQuery(termQuery("appAccountIds", 179)).execute().actionGet(), 2);
@ -115,19 +105,20 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
@Slow
public void testSingleNodeNoFlush() throws Exception {
cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).build());
cluster().startNode(settingsBuilder().build());
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("field").field("type", "string").endObject().startObject("num").field("type", "integer").endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
assertAcked(prepareCreate("test").addMapping("type1", mapping));
for (int i = 0; i < 100; i++) {
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().field("_id", "1").field("field", "value1").startArray("num").value(14).value(179).endArray().endObject()).execute().actionGet();
client().prepareIndex("test", "type1", "2").setSource(jsonBuilder().startObject().field("_id", "2").field("field", "value2").startArray("num").value(14).endArray().endObject()).execute().actionGet();
}
client().admin().indices().prepareRefresh().execute().actionGet();
refresh();
for (int i = 0; i < 10; i++) {
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
assertHitCount(client().prepareCount().setQuery(termQuery("field", "value1")).execute().actionGet(), 1);
@ -138,10 +129,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
cluster().fullRestart();
logger.info("Running Cluster Health (wait for the shards to startup)");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(1)).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
ensureYellow();
for (int i = 0; i < 10; i++) {
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
@ -154,10 +142,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
logger.info("Running Cluster Health (wait for the shards to startup)");
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(1)).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
ensureYellow();
for (int i = 0; i < 10; i++) {
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
@ -172,21 +157,18 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
@Slow
public void testSingleNodeWithFlush() throws Exception {
cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).build());
cluster().startNode(settingsBuilder().build());
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().field("field", "value1").endObject()).execute().actionGet();
client().admin().indices().prepareFlush().execute().actionGet();
flush();
client().prepareIndex("test", "type1", "2").setSource(jsonBuilder().startObject().field("field", "value2").endObject()).execute().actionGet();
client().admin().indices().prepareRefresh().execute().actionGet();
refresh();
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
cluster().fullRestart();
logger.info("Running Cluster Health (wait for the shards to startup)
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(1)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
|
||||
ensureYellow();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
|
||||
@ -195,10 +177,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
cluster().fullRestart();
|
||||
|
||||
logger.info("Running Cluster Health (wait for the shards to startup)");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus().waitForActiveShards(1)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
|
||||
ensureYellow();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
|
||||
@ -209,19 +188,16 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
@Slow
|
||||
public void testTwoNodeFirstNodeCleared() throws Exception {
|
||||
|
||||
final String firstNode = cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).build());
|
||||
cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).build());
|
||||
final String firstNode = cluster().startNode(settingsBuilder().build());
|
||||
cluster().startNode(settingsBuilder().build());
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().field("field", "value1").endObject()).execute().actionGet();
|
||||
client().admin().indices().prepareFlush().execute().actionGet();
|
||||
flush();
|
||||
client().prepareIndex("test", "type1", "2").setSource(jsonBuilder().startObject().field("field", "value2").endObject()).execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
logger.info("Running Cluster Health (wait for the shards to startup)");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(2)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
|
||||
@ -241,10 +217,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
});
|
||||
|
||||
logger.info("Running Cluster Health (wait for the shards to startup)");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(2)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
|
||||
@ -256,8 +229,8 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
public void testLatestVersionLoaded() throws Exception {
|
||||
// clean two nodes
|
||||
|
||||
cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).put("gateway.recover_after_nodes", 2).build());
|
||||
cluster().startNode(settingsBuilder().put("index.number_of_shards", 1).put("gateway.recover_after_nodes", 2).build());
|
||||
cluster().startNode(settingsBuilder().put("gateway.recover_after_nodes", 2).build());
|
||||
cluster().startNode(settingsBuilder().put("gateway.recover_after_nodes", 2).build());
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().field("field", "value1").endObject()).execute().actionGet();
|
||||
client().admin().indices().prepareFlush().execute().actionGet();
|
||||
@ -265,10 +238,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
|
||||
logger.info("--> running cluster_health (wait for the shards to startup)");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(2)).actionGet();
|
||||
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
assertHitCount(client().prepareCount().setQuery(matchAllQuery()).execute().actionGet(), 2);
|
||||
@ -312,10 +282,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
});
|
||||
|
||||
logger.info("--> running cluster_health (wait for the shards to startup)");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(2)).actionGet();
|
||||
logger.info("--> done cluster_health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
assertThat(client().admin().cluster().prepareState().execute().get().getState().getMetaData().uuid(), equalTo(metaDataUuid));
|
||||
|
||||
@ -333,13 +300,11 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
@Test
|
||||
@Slow
|
||||
public void testReusePeerRecovery() throws Exception {
|
||||
|
||||
|
||||
ImmutableSettings.Builder settings = settingsBuilder()
|
||||
.put("action.admin.cluster.node.shutdown.delay", "10ms")
|
||||
.put("gateway.recover_after_nodes", 4)
|
||||
|
||||
.put(BalancedShardsAllocator.SETTING_THRESHOLD, 1.1f); // use less agressive settings
|
||||
.put(BalancedShardsAllocator.SETTING_THRESHOLD, 1.1f); // use less aggressive settings
|
||||
|
||||
cluster().startNode(settings);
|
||||
cluster().startNode(settings);
|
||||
@ -354,10 +319,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
}
|
||||
}
|
||||
logger.info("Running Cluster Health");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForRelocatingShards(0)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> shutting down the nodes");
|
||||
// Disable allocations while we are closing nodes
|
||||
@ -365,10 +327,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
cluster().fullRestart();
|
||||
|
||||
logger.info("Running Cluster Health");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(10)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> shutting down the nodes");
|
||||
// Disable allocations while we are closing nodes
|
||||
@ -377,10 +336,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
|
||||
|
||||
logger.info("Running Cluster Health");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForActiveShards(10)).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
IndicesStatusResponse statusResponse = client().admin().indices().prepareStatus("test").setRecovery(true).execute().actionGet();
|
||||
for (IndexShardStatus indexShardStatus : statusResponse.getIndex("test")) {
|
||||
@ -406,8 +362,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
|
||||
cluster().startNode(settingsBuilder().put("path.data", "data/data2").build());
|
||||
|
||||
ClusterHealthResponse health = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertThat(health.isTimedOut(), equalTo(false));
|
||||
ensureGreen();
|
||||
|
||||
cluster().fullRestart(new RestartCallback() {
|
||||
|
||||
@ -417,9 +372,7 @@ public class SimpleRecoveryLocalGatewayTests extends ElasticsearchIntegrationTes
|
||||
}
|
||||
});
|
||||
|
||||
|
||||
health = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().execute().actionGet();
|
||||
assertThat(health.isTimedOut(), equalTo(false));
|
||||
ensureYellow();
|
||||
|
||||
assertThat(client().admin().indices().prepareExists("test").execute().actionGet().isExists(), equalTo(true));
|
||||
assertHitCount(client().prepareCount("test").setQuery(QueryBuilders.matchAllQuery()).execute().actionGet(), 1);
|
||||
|
@ -32,18 +32,20 @@ import org.junit.Test;

import java.io.IOException;

import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.search.facet.FacetBuilders.termsFacet;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;

public class FieldDataFilterIntegrationTests extends ElasticsearchIntegrationTest {

@Override
protected int numberOfReplicas() {
return 0;
}

@Test
public void testRegexpFilter() throws IOException {
CreateIndexRequestBuilder builder = prepareCreate("test").setSettings(settingsBuilder()
.put("index.number_of_shards", between(1,5))
.put("index.number_of_replicas", 0));
CreateIndexRequestBuilder builder = prepareCreate("test");
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
.startObject("properties")
.startObject("name")
@ -82,7 +84,6 @@ public class FieldDataFilterIntegrationTes
assertThat(notFilteredFacet.getEntries().size(), Matchers.equalTo(2));
assertThat(notFilteredFacet.getEntries().get(0).getTerm().string(), Matchers.isOneOf("bacon", "bastards"));
assertThat(notFilteredFacet.getEntries().get(1).getTerm().string(), Matchers.isOneOf("bacon", "bastards"));

}

}
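The `numberOfReplicas()` (and, further below, `numberOfShards()`) hooks overridden in these hunks are how a test class controls the randomized values for its own indices. A minimal sketch of the pattern, assuming the ElasticsearchIntegrationTest base class used throughout these diffs (the class name below is hypothetical and only illustrative):

    public class ExampleIntegrationTests extends ElasticsearchIntegrationTest {
        @Override
        protected int numberOfReplicas() {
            // fixed replica count for this suite; other suites in this commit use between(0, 1)
            return 0;
        }

        @Override
        protected int numberOfShards() {
            // -1: don't set the number of shards via index settings, let the index template decide
            return -1;
        }
    }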
|
||||
|
@ -38,7 +38,6 @@ import org.elasticsearch.search.sort.SortOrder;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.elasticsearch.test.engine.MockInternalEngine;
|
||||
import org.elasticsearch.test.engine.ThrowingAtomicReaderWrapper;
|
||||
import org.elasticsearch.test.junit.annotations.TestLogging;
|
||||
import org.junit.Test;
|
||||
|
||||
import java.io.IOException;
|
||||
@ -61,7 +60,7 @@ public class RandomExceptionCircuitBreakerTests extends ElasticsearchIntegration
|
||||
.clear().setBreaker(true).execute().actionGet().getNodes()) {
|
||||
assertThat("Breaker is not set to 0", node.getBreaker().getEstimated(), equalTo(0L));
|
||||
}
|
||||
final int numShards = between(1, 5);
|
||||
|
||||
final int numReplicas = randomIntBetween(0, 1);
|
||||
String mapping = XContentFactory.jsonBuilder()
|
||||
.startObject()
|
||||
@ -107,7 +106,7 @@ public class RandomExceptionCircuitBreakerTests extends ElasticsearchIntegration
|
||||
}
|
||||
|
||||
ImmutableSettings.Builder settings = settingsBuilder()
|
||||
.put("index.number_of_shards", numShards)
|
||||
.put(indexSettings())
|
||||
.put("index.number_of_replicas", numReplicas)
|
||||
.put(MockInternalEngine.READER_WRAPPER_TYPE, RandomExceptionDirectoryReaderWrapper.class.getName())
|
||||
.put(EXCEPTION_TOP_LEVEL_RATIO_KEY, topLevelRate)
|
||||
|
@ -22,7 +22,6 @@ package org.elasticsearch.indices.mapping;
|
||||
|
||||
import org.elasticsearch.action.ActionListener;
|
||||
import org.elasticsearch.action.index.IndexResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.index.query.QueryBuilders;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
@ -33,6 +32,7 @@ import java.util.Map;
|
||||
import java.util.concurrent.CopyOnWriteArrayList;
|
||||
import java.util.concurrent.CountDownLatch;
|
||||
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
|
||||
import static org.hamcrest.Matchers.emptyIterable;
|
||||
|
||||
@ -40,6 +40,11 @@ public class ConcurrentDynamicTemplateTests extends ElasticsearchIntegrationTest

private final String mappingType = "test-mapping";

@Override
protected int numberOfReplicas() {
return between(0, 1);
}

@Test // see #3544
|
||||
public void testConcurrentDynamicMapping() throws Exception {
|
||||
final String fieldName = "field";
|
||||
@ -53,12 +58,8 @@ public class ConcurrentDynamicTemplateTests extends ElasticsearchIntegrationTest
|
||||
int iters = atLeast(5);
|
||||
for (int i = 0; i < iters; i++) {
|
||||
wipeIndices("test");
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("number_of_shards", between(1, 5))
|
||||
.put("number_of_replicas", between(0, 1)).build())
|
||||
.addMapping(mappingType, mapping).execute().actionGet();
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping(mappingType, mapping));
|
||||
ensureYellow();
|
||||
int numDocs = atLeast(10);
|
||||
final CountDownLatch latch = new CountDownLatch(numDocs);
|
||||
|
@ -20,8 +20,6 @@
|
||||
package org.elasticsearch.indices.mapping;
|
||||
|
||||
import org.elasticsearch.action.admin.indices.mapping.get.GetFieldMappingsResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.hamcrest.Matchers;
|
||||
@ -31,6 +29,7 @@ import java.io.IOException;
|
||||
import java.util.Map;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
|
||||
/**
|
||||
@ -38,6 +37,11 @@ import static org.hamcrest.Matchers.*;
|
||||
*/
|
||||
public class SimpleGetFieldMappingsTests extends ElasticsearchIntegrationTest {

@Override
protected int numberOfReplicas() {
return between(0, 1);
}

@Test
|
||||
public void getMappingsWhereThereAreNone() {
|
||||
createIndex("index");
|
||||
@ -59,19 +63,12 @@ public class SimpleGetFieldMappingsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void simpleGetFieldMappings() throws Exception {
|
||||
|
||||
Settings.Builder settings = ImmutableSettings.settingsBuilder()
|
||||
.put("number_of_shards", randomIntBetween(1, 3), "number_of_replicas", randomIntBetween(0, 1));
|
||||
|
||||
assertTrue(client().admin().indices().prepareCreate("indexa")
|
||||
assertAcked(prepareCreate("indexa")
|
||||
.addMapping("typeA", getMappingForType("typeA"))
|
||||
.addMapping("typeB", getMappingForType("typeB"))
|
||||
.setSettings(settings)
|
||||
.get().isAcknowledged());
|
||||
assertTrue(client().admin().indices().prepareCreate("indexb")
|
||||
.addMapping("typeB", getMappingForType("typeB")));
|
||||
assertAcked(client().admin().indices().prepareCreate("indexb")
|
||||
.addMapping("typeA", getMappingForType("typeA"))
|
||||
.addMapping("typeB", getMappingForType("typeB"))
|
||||
.setSettings(settings)
|
||||
.get().isAcknowledged());
|
||||
.addMapping("typeB", getMappingForType("typeB")));
|
||||
|
||||
ensureYellow();
|
||||
|
||||
@ -137,8 +134,7 @@ public class SimpleGetFieldMappingsTests extends ElasticsearchIntegrationTest {
|
||||
@SuppressWarnings("unchecked")
|
||||
@Test
|
||||
public void simpleGetFieldMappingsWithDefaults() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.addMapping("type", getMappingForType("type")).get();
|
||||
assertAcked(prepareCreate("test").addMapping("type", getMappingForType("type")));
|
||||
|
||||
client().prepareIndex("test", "type", "1").setSource("num", 1).get();
|
||||
ensureYellow();
|
||||
|
@ -35,7 +35,6 @@ import org.elasticsearch.cluster.ClusterState;
|
||||
import org.elasticsearch.cluster.metadata.MappingMetaData;
|
||||
import org.elasticsearch.common.Priority;
|
||||
import org.elasticsearch.common.collect.ImmutableOpenMap;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.xcontent.json.JsonXContent;
|
||||
import org.elasticsearch.index.mapper.MapperParsingException;
|
||||
import org.elasticsearch.index.mapper.MapperService;
|
||||
@ -50,6 +49,7 @@ import java.util.Map;
|
||||
import java.util.concurrent.CyclicBarrier;
|
||||
import java.util.concurrent.atomic.AtomicBoolean;
|
||||
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertThrows;
|
||||
@ -61,7 +61,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
public void dynamicUpdates() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).execute().actionGet();
|
||||
@ -124,7 +124,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
public void updateMappingWithoutType() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).addMapping("doc", "{\"doc\":{\"properties\":{\"body\":{\"type\":\"string\"}}}}")
|
||||
@ -146,7 +146,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
public void updateMappingWithoutTypeMultiObjects() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).execute().actionGet();
|
||||
@ -168,7 +168,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
settingsBuilder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).addMapping("type", "{\"type\":{\"properties\":{\"body\":{\"type\":\"string\"}}}}")
|
||||
@ -200,7 +200,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
settingsBuilder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).addMapping("type", "{\"type\":{\"properties\":{\"body\":{\"type\":\"string\"}}}}")
|
||||
@ -224,7 +224,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
settingsBuilder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).addMapping("type", "{\"type\":{\"properties\":{\"body\":{\"type\":\"string\"}}}}")
|
||||
@ -406,11 +406,7 @@ public class UpdateMappingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void updateMappingConcurrently() throws Throwable {
|
||||
// Test that we can concurrently update different indexes and types.
|
||||
int shardNo = Math.max(5, cluster().size());
|
||||
|
||||
prepareCreate("test1").setSettings("index.number_of_shards", shardNo).execute().actionGet();
|
||||
prepareCreate("test2").setSettings("index.number_of_shards", shardNo).execute().actionGet();
|
||||
createIndex("test1", "test2");
|
||||
|
||||
// This is important. The test assumes all nodes are aware of all indices. Due to initializing shard throttling
|
||||
// not all shards are allocated with the initial create index. Wait for it..
|
||||
|
@ -29,6 +29,7 @@ import org.junit.Test;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
|
||||
@ -37,19 +38,21 @@ import static org.hamcrest.Matchers.equalTo;
|
||||
*/
public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {

@Test
public void simpleUpdateNumberOfReplicasTests() throws Exception {
logger.info("Creating index test");
prepareCreate("test", 2).execute().actionGet();
assertAcked(prepareCreate("test", 2));
logger.info("Running Cluster Health");
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());

NumShards numShards = getNumShards("test");

assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(5));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(1));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(10));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.totalNumShards));
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("test", "type1", Integer.toString(i)).setSource(jsonBuilder().startObject()
|
||||
@ -57,7 +60,7 @@ public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {
|
||||
.endObject()).get();
|
||||
}
|
||||
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
CountResponse countResponse = client().prepareCount().setQuery(matchAllQuery()).get();
|
||||
@ -69,13 +72,14 @@ public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {
|
||||
Thread.sleep(200);
|
||||
|
||||
logger.info("Running Cluster Health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().setWaitForActiveShards(10).execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().setWaitForActiveShards(numShards.numPrimaries * 2).execute().actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(5));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(10));
|
||||
//only 2 copies allocated (1 replica) across 2 nodes
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 2));
|
||||
|
||||
logger.info("starting another node to new replicas will be allocated to it");
|
||||
allowNodes("test", 3);
|
||||
@ -86,9 +90,10 @@ public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(5));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(15));
|
||||
//all 3 copies allocated across 3 nodes
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 3));
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
CountResponse countResponse = client().prepareCount().setQuery(matchAllQuery()).get();
|
||||
@ -104,9 +109,10 @@ public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(5));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(0));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(5));
|
||||
//a single copy is allocated (replica set to 0)
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries));
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
assertHitCount(client().prepareSearch().setQuery(matchAllQuery()).get(), 10);
|
||||
@ -117,123 +123,129 @@ public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {
|
||||
public void testAutoExpandNumberOfReplicas0ToData() {
|
||||
cluster().ensureAtMostNumNodes(2);
|
||||
logger.info("--> creating index test with auto expand replicas");
|
||||
prepareCreate("test", 2, settingsBuilder().put("index.number_of_shards", 2).put("auto_expand_replicas", "0-all")).execute().actionGet();
|
||||
assertAcked(prepareCreate("test", 2, settingsBuilder().put("auto_expand_replicas", "0-all")));
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(4).execute().actionGet();
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 2).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(1));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(4));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 2));
|
||||
|
||||
logger.info("--> add another node, should increase the number of replicas");
|
||||
allowNodes("test", 3);
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(6).setWaitForNodes(">=3").execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 3).setWaitForNodes(">=3").execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(6));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 3));
|
||||
|
||||
logger.info("--> closing one node");
|
||||
cluster().ensureAtMostNumNodes(2);
|
||||
allowNodes("test", 2);
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(4).setWaitForNodes(">=2").execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 2).setWaitForNodes(">=2").execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(1));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(4));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 2));
|
||||
|
||||
logger.info("--> closing another node");
|
||||
cluster().ensureAtMostNumNodes(1);
|
||||
allowNodes("test", 1);
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes(">=1").setWaitForActiveShards(2).execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes(">=1").setWaitForActiveShards(numShards.numPrimaries).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(0));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testAutoExpandNumberReplicas1ToData() {
|
||||
logger.info("--> creating index test with auto expand replicas");
|
||||
cluster().ensureAtMostNumNodes(2);
|
||||
prepareCreate("test", 2, settingsBuilder().put("index.number_of_shards", 2).put("auto_expand_replicas", "1-all")).execute().actionGet();
|
||||
assertAcked(prepareCreate("test", 2, settingsBuilder().put("auto_expand_replicas", "1-all")));
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(4).execute().actionGet();
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 2).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(1));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(4));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 2));
|
||||
|
||||
logger.info("--> add another node, should increase the number of replicas");
|
||||
allowNodes("test", 3);
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(6).execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 3).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(6));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 3));
|
||||
|
||||
logger.info("--> closing one node");
|
||||
cluster().ensureAtMostNumNodes(2);
|
||||
allowNodes("test", 2);
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes(">=2").setWaitForActiveShards(4).execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForNodes(">=2").setWaitForActiveShards(numShards.numPrimaries * 2).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(1));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(4));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 2));
|
||||
|
||||
logger.info("--> closing another node");
|
||||
cluster().ensureAtMostNumNodes(1);
|
||||
allowNodes("test", 1);
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().setWaitForNodes(">=1").setWaitForActiveShards(2).execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().setWaitForNodes(">=1").setWaitForActiveShards(numShards.numPrimaries).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(1));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testAutoExpandNumberReplicas2() {
|
||||
logger.info("--> creating index test with auto expand replicas set to 0-2");
|
||||
prepareCreate("test", 3, settingsBuilder().put("index.number_of_shards", 2).put("auto_expand_replicas", "0-2")).execute().actionGet();
|
||||
assertAcked(prepareCreate("test", 3, settingsBuilder().put("auto_expand_replicas", "0-2")));
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(6).execute().actionGet();
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 3).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(6));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 3));
|
||||
|
||||
logger.info("--> add two more nodes");
|
||||
allowNodes("test", 4);
|
||||
@ -243,12 +255,12 @@ public class UpdateNumberOfReplicasTests extends ElasticsearchIntegrationTest {
|
||||
client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("auto_expand_replicas", "0-3")).execute().actionGet();
|
||||
|
||||
logger.info("--> running cluster health");
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(8).execute().actionGet();
|
||||
clusterHealth = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().setWaitForActiveShards(numShards.numPrimaries * 4).execute().actionGet();
|
||||
logger.info("--> done cluster health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(2));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(clusterHealth.getIndices().get("test").getNumberOfReplicas(), equalTo(3));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(8));
|
||||
assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.numPrimaries * 4));
}
}
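Throughout this file, hard-coded shard counts in the cluster health assertions are replaced by values derived from `getNumShards`. A minimal sketch of that pattern, assuming a green cluster where all copies can be allocated (index name hypothetical; the `NumShards` fields are the ones referenced in the hunks above):

    NumShards numShards = getNumShards("test");
    ClusterHealthResponse clusterHealth = client().admin().cluster().prepareHealth()
            .setWaitForGreenStatus()
            // total copies = primaries * (1 + replicas), precomputed by getNumShards
            .setWaitForActiveShards(numShards.totalNumShards)
            .execute().actionGet();
    assertThat(clusterHealth.getIndices().get("test").getActivePrimaryShards(), equalTo(numShards.numPrimaries));
    assertThat(clusterHealth.getIndices().get("test").getActiveShards(), equalTo(numShards.totalNumShards));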
|
||||
|
@ -57,10 +57,12 @@ public class SimpleIndexStateTests extends ElasticsearchIntegrationTest {
|
||||
logger.info("--> waiting for green status");
|
||||
ensureGreen();
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
ClusterStateResponse stateResponse = client().admin().cluster().prepareState().get();
|
||||
assertThat(stateResponse.getState().metaData().index("test").state(), equalTo(IndexMetaData.State.OPEN));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(5));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(10));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(numShards.numPrimaries));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(numShards.totalNumShards));
|
||||
|
||||
logger.info("--> indexing a simple document");
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").get();
|
||||
@ -94,8 +96,9 @@ public class SimpleIndexStateTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
stateResponse = client().admin().cluster().prepareState().get();
|
||||
assertThat(stateResponse.getState().metaData().index("test").state(), equalTo(IndexMetaData.State.OPEN));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(5));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(10));
|
||||
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(numShards.numPrimaries));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(numShards.totalNumShards));
|
||||
|
||||
logger.info("--> indexing a simple document");
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").get();
|
||||
@ -105,8 +108,7 @@ public class SimpleIndexStateTests extends ElasticsearchIntegrationTest {
|
||||
public void testFastCloseAfterCreateDoesNotClose() {
|
||||
logger.info("--> creating test index that cannot be allocated");
|
||||
client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder()
|
||||
.put("index.routing.allocation.include.tag", "no_such_node")
|
||||
.put("index.number_of_replicas", 1).build()).get();
|
||||
.put("index.routing.allocation.include.tag", "no_such_node").build()).get();
|
||||
|
||||
ClusterHealthResponse health = client().admin().cluster().prepareHealth("test").setWaitForNodes(">=2").get();
|
||||
assertThat(health.isTimedOut(), equalTo(false));
|
||||
@ -126,10 +128,12 @@ public class SimpleIndexStateTests extends ElasticsearchIntegrationTest {
|
||||
logger.info("--> waiting for green status");
|
||||
ensureGreen();
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
ClusterStateResponse stateResponse = client().admin().cluster().prepareState().get();
|
||||
assertThat(stateResponse.getState().metaData().index("test").state(), equalTo(IndexMetaData.State.OPEN));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(5));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(10));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shards().size(), equalTo(numShards.numPrimaries));
|
||||
assertThat(stateResponse.getState().routingTable().index("test").shardsWithState(ShardRoutingState.STARTED).size(), equalTo(numShards.totalNumShards));
|
||||
|
||||
logger.info("--> indexing a simple document");
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").get();
|
||||
|
@ -40,6 +40,7 @@ import java.io.IOException;
|
||||
import java.util.EnumSet;
|
||||
import java.util.Random;
|
||||
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
|
||||
/**
|
||||
@ -189,7 +190,9 @@ public class SimpleIndexStatsTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
public void testSegmentsStats() {
prepareCreate("test1", 2).setSettings("index.number_of_shards", 5, "index.number_of_replicas", 1).get();
assertAcked(prepareCreate("test1", 2));

NumShards test1 = getNumShards("test1");

ClusterHealthResponse clusterHealthResponse = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().get();
|
||||
assertThat(clusterHealthResponse.isTimedOut(), equalTo(false));
|
||||
@ -203,7 +206,7 @@ public class SimpleIndexStatsTests extends ElasticsearchIntegrationTest {
|
||||
IndicesStatsResponse stats = client().admin().indices().prepareStats().setSegments(true).get();
|
||||
|
||||
assertThat(stats.getTotal().getSegments(), notNullValue());
|
||||
assertThat(stats.getTotal().getSegments().getCount(), equalTo(10l));
|
||||
assertThat(stats.getTotal().getSegments().getCount(), equalTo((long)test1.totalNumShards));
|
||||
assumeTrue(org.elasticsearch.Version.CURRENT.luceneVersion != Version.LUCENE_46);
|
||||
assertThat(stats.getTotal().getSegments().getMemoryInBytes(), greaterThan(0l));
|
||||
}
|
||||
|
@ -37,13 +37,9 @@ import java.util.Set;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
@ClusterScope(scope=Scope.TEST, numNodes=1)
|
||||
public class IndexTemplateFileLoadingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
ImmutableSettings.Builder settingsBuilder = ImmutableSettings.settingsBuilder();
|
||||
@ -69,6 +65,12 @@ public class IndexTemplateFileLoadingTests extends ElasticsearchIntegrationTest
|
||||
return settingsBuilder.build();
}

@Override
protected int numberOfShards() {
//number of shards won't be set through index settings, the one from the index templates needs to be used
return -1;
}

@Test
|
||||
public void testThatLoadingTemplateFromFileWorks() throws Exception {
|
||||
final int iters = atLeast(5);
|
||||
|
@ -1,5 +1,6 @@
|
||||
{
|
||||
"template" : "foo*",
|
||||
"order" : 10,
|
||||
"settings" : {
|
||||
"index.number_of_shards": 10,
|
||||
"index.number_of_replicas": 0
|
||||
|
@ -1,5 +1,6 @@
|
||||
{
|
||||
"template" : "foo*",
|
||||
"order" : 10,
|
||||
"settings" : {
|
||||
"number_of_shards": 10,
|
||||
"number_of_replicas": 0
|
||||
|
@ -1,5 +1,6 @@
|
||||
{
|
||||
"template" : "foo*",
|
||||
"order" : 10,
|
||||
"settings" : {
|
||||
"index" : {
|
||||
"number_of_shards": 10,
|
||||
|
@ -1,6 +1,7 @@
|
||||
{
|
||||
"mytemplate" : {
|
||||
"template" : "foo*",
|
||||
"order" : 10,
|
||||
"settings" : {
|
||||
"index.number_of_shards": 10,
|
||||
"index.number_of_replicas": 0
|
||||
|
@ -1,6 +1,7 @@
|
||||
{
|
||||
"mytemplate" : {
|
||||
"template" : "foo*",
|
||||
"order" : 10,
|
||||
"settings" : {
|
||||
"number_of_shards": 10,
|
||||
"number_of_replicas": 0
|
||||
|
@ -1,6 +1,7 @@
|
||||
{
|
||||
"mytemplate" : {
|
||||
"template" : "foo*",
|
||||
"order" : 10,
|
||||
"settings" : {
|
||||
"index" : {
|
||||
"number_of_shards": 10,
|
||||
|
@@ -19,14 +19,10 @@

package org.elasticsearch.indices.warmer;

import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
import org.elasticsearch.action.admin.indices.warmer.delete.DeleteWarmerResponse;
import org.elasticsearch.action.admin.indices.warmer.put.PutWarmerResponse;
import org.elasticsearch.cluster.ClusterState;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.logging.ESLogger;
import org.elasticsearch.common.logging.Loggers;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.warmer.IndexWarmersMetaData;
@@ -38,6 +34,7 @@ import org.hamcrest.Matchers;
import org.junit.Test;

import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo;

/**
@@ -54,20 +51,14 @@ public class LocalGatewayIndicesWarmerTests extends ElasticsearchIntegrationTest
cluster().startNode(settingsBuilder().put("gateway.type", "local"));

logger.info("--> putting two templates");
client().admin().indices().prepareCreate("test")
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1))
.execute().actionGet();
createIndex("test");

client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().execute().actionGet();
ensureYellow();

PutWarmerResponse putWarmerResponse = client().admin().indices().preparePutWarmer("warmer_1")
.setSearchRequest(client().prepareSearch("test").setQuery(QueryBuilders.termQuery("field", "value1")))
.execute().actionGet();
assertThat(putWarmerResponse.isAcknowledged(), equalTo(true));
putWarmerResponse = client().admin().indices().preparePutWarmer("warmer_2")
.setSearchRequest(client().prepareSearch("test").setQuery(QueryBuilders.termQuery("field", "value2")))
.execute().actionGet();
assertThat(putWarmerResponse.isAcknowledged(), equalTo(true));
assertAcked(client().admin().indices().preparePutWarmer("warmer_1")
.setSearchRequest(client().prepareSearch("test").setQuery(QueryBuilders.termQuery("field", "value1"))));
assertAcked(client().admin().indices().preparePutWarmer("warmer_2")
.setSearchRequest(client().prepareSearch("test").setQuery(QueryBuilders.termQuery("field", "value2"))));

logger.info("--> put template with warmer");
client().admin().indices().preparePutTemplate("template_1")
@@ -105,8 +96,7 @@ public class LocalGatewayIndicesWarmerTests extends ElasticsearchIntegrationTest
}
});

ClusterHealthResponse healthResponse = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().execute().actionGet();
assertThat(healthResponse.isTimedOut(), equalTo(false));
ensureYellow();

logger.info("--> verify warmers are recovered");
clusterState = client().admin().cluster().prepareState().execute().actionGet().getState();
@@ -144,8 +134,7 @@ public class LocalGatewayIndicesWarmerTests extends ElasticsearchIntegrationTest
}
});

healthResponse = client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().execute().actionGet();
assertThat(healthResponse.isTimedOut(), equalTo(false));
ensureYellow();

logger.info("--> verify warmers are recovered");
clusterState = client().admin().cluster().prepareState().execute().actionGet().getState();

@@ -46,18 +46,14 @@ import org.junit.Test;

import java.util.Locale;

import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*;

/**
*/
public class SimpleIndicesWarmerTests extends ElasticsearchIntegrationTest {


@Test
public void simpleWarmerTests() {
client().admin().indices().prepareCreate("test")
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1))
.execute().actionGet();
createIndex("test");
ensureGreen();

PutWarmerResponse putWarmerResponse = client().admin().indices().preparePutWarmer("warmer_1")
@@ -127,9 +123,7 @@ public class SimpleIndicesWarmerTests extends ElasticsearchIntegrationTest {
"}")
.execute().actionGet();

client().admin().indices().prepareCreate("test")
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1))
.execute().actionGet();
createIndex("test");
ensureGreen();

ClusterState clusterState = client().admin().cluster().prepareState().execute().actionGet().getState();
@@ -143,11 +137,8 @@ public class SimpleIndicesWarmerTests extends ElasticsearchIntegrationTest {

@Test
public void createIndexWarmer() {
client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.setSource("{\n" +
" \"settings\" : {\n" +
" \"index.number_of_shards\" : 1\n" +
" },\n" +
" \"warmers\" : {\n" +
" \"warmer_1\" : {\n" +
" \"types\" : [],\n" +
@@ -158,8 +149,7 @@ public class SimpleIndicesWarmerTests extends ElasticsearchIntegrationTest {
" }\n" +
" }\n" +
" }\n" +
"}")
.execute().actionGet();
"}"));

ClusterState clusterState = client().admin().cluster().prepareState().execute().actionGet().getState();
IndexWarmersMetaData warmersMetaData = clusterState.metaData().index("test").custom(IndexWarmersMetaData.TYPE);
@@ -208,9 +198,7 @@ public class SimpleIndicesWarmerTests extends ElasticsearchIntegrationTest {

@Test // issue 3246
public void ensureThatIndexWarmersCanBeChangedOnRuntime() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1, "index.number_of_replicas", 0))
.execute().actionGet();
createIndex("test");
ensureGreen();

PutWarmerResponse putWarmerResponse = client().admin().indices().preparePutWarmer("custom_warmer")
@@ -233,9 +221,7 @@ public class SimpleIndicesWarmerTests extends ElasticsearchIntegrationTest {

@Test
public void gettingAllWarmersUsingAllAndWildcardsShouldWork() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1, "index.number_of_replicas", 0))
.execute().actionGet();
createIndex("test");
ensureGreen();

PutWarmerResponse putWarmerResponse = client().admin().indices().preparePutWarmer("custom_warmer")

@@ -22,7 +22,9 @@ import org.elasticsearch.action.get.MultiGetItemResponse;
import org.elasticsearch.action.get.MultiGetRequest;
import org.elasticsearch.action.get.MultiGetRequestBuilder;
import org.elasticsearch.action.get.MultiGetResponse;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.search.fetch.source.FetchSourceContext;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;
@@ -31,6 +33,7 @@ import java.io.IOException;
import java.util.Map;

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*;

public class SimpleMgetTests extends ElasticsearchIntegrationTest {
@@ -142,15 +145,17 @@ public class SimpleMgetTests extends ElasticsearchIntegrationTest {

@Test
public void testThatRoutingPerDocumentIsSupported() throws Exception {
createIndex("test");
assertAcked(prepareCreate("test").setSettings(ImmutableSettings.builder()
.put(indexSettings())
.put(IndexMetaData.SETTING_NUMBER_OF_SHARDS, between(2, DEFAULT_MAX_NUM_SHARDS))));
ensureYellow();

client().prepareIndex("test", "test", "1").setRefresh(true).setRouting("bar")
client().prepareIndex("test", "test", "1").setRefresh(true).setRouting("2")
.setSource(jsonBuilder().startObject().field("foo", "bar").endObject())
.execute().actionGet();

MultiGetResponse mgetResponse = client().prepareMultiGet()
.add(new MultiGetRequest.Item("test", "test", "1").routing("bar"))
.add(new MultiGetRequest.Item("test", "test", "1").routing("2"))
.add(new MultiGetRequest.Item("test", "test", "1"))
.execute().actionGet();
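The mget change above is the pattern used whenever a test genuinely needs more than one shard: keep the shared `indexSettings()` as the base and only raise the bound that matters. A minimal sketch of that pattern follows (the class and test names are invented for illustration; it assumes only the `prepareCreate`, `indexSettings()`, `between(...)` and `DEFAULT_MAX_NUM_SHARDS` helpers that appear in the diff above):

import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;

public class MultiShardRoutingExampleTests extends ElasticsearchIntegrationTest {

    @Test
    public void routingNeedsAtLeastTwoShards() throws Exception {
        // Start from the shared randomized settings, then raise only the lower bound on shards
        // so that different routing values can actually land on different shards.
        assertAcked(prepareCreate("test").setSettings(ImmutableSettings.builder()
                .put(indexSettings())
                .put(IndexMetaData.SETTING_NUMBER_OF_SHARDS, between(2, DEFAULT_MAX_NUM_SHARDS))));
        ensureYellow();
    }
}

A test class that always needs such a bound can instead override `minimumNumberOfShards()`, as `AliasRoutingTests` does later in this diff.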
@ -27,8 +27,6 @@ import org.elasticsearch.action.get.GetResponse;
|
||||
import org.elasticsearch.action.search.SearchPhaseExecutionException;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.action.search.SearchType;
|
||||
import org.elasticsearch.common.Priority;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
import org.elasticsearch.index.query.FilterBuilders;
|
||||
@ -40,7 +38,6 @@ import org.elasticsearch.search.facet.termsstats.TermsStatsFacet;
|
||||
import org.elasticsearch.search.sort.SortBuilders;
|
||||
import org.elasticsearch.search.sort.SortOrder;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
|
||||
import org.junit.Assert;
|
||||
import org.junit.Test;
|
||||
|
||||
@ -48,8 +45,7 @@ import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilde
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.FilterBuilders.*;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.*;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
|
||||
public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
@ -70,7 +66,7 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
endObject().
|
||||
endObject().
|
||||
endObject();
|
||||
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", builder));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", builder));
|
||||
ensureGreen();
|
||||
|
||||
// check on no data, see it works
|
||||
@ -195,14 +191,13 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
private void simpleNestedDeleteByQuery(int total, int docToDelete) throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.referesh_interval", -1).build())
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put(indexSettings()).put("index.referesh_interval", -1).build())
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
|
||||
ensureGreen();
|
||||
|
||||
@ -255,16 +250,15 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
private void noChildrenNestedDeleteByQuery(long total, int docToDelete) throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.referesh_interval", -1).build())
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put(indexSettings()).put("index.referesh_interval", -1).build())
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
|
||||
for (int i = 0; i < total; i++) {
|
||||
@ -293,14 +287,13 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void multiNested() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested").startObject("properties")
|
||||
.startObject("nested2").field("type", "nested").endObject()
|
||||
.endObject().endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
|
||||
ensureGreen();
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder()
|
||||
@ -370,22 +363,23 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testFacetsMultiShards() throws Exception {
|
||||
testFacets(3);
|
||||
testFacets(between(2, DEFAULT_MAX_NUM_SHARDS));
|
||||
}
|
||||
|
||||
private void testFacets(int numberOfShards) throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", numberOfShards))
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.number_of_shards", numberOfShards))
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested").startObject("properties")
|
||||
.startObject("nested2").field("type", "nested").endObject()
|
||||
.endObject().endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder()
|
||||
.startObject()
|
||||
@ -405,7 +399,7 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
.endArray()
|
||||
.endObject()).execute().actionGet();
|
||||
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch("test").setQuery(matchAllQuery())
|
||||
.addFacet(FacetBuilders.termsStatsFacet("facet1").keyField("nested1.nested2.field2_1").valueField("nested1.nested2.field2_2").nested("nested1.nested2"))
|
||||
@ -497,19 +491,18 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
// This IncludeNestedDocsQuery also needs to be aware of the filter from alias
|
||||
public void testDeleteNestedDocsWithAlias() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.referesh_interval", -1).build())
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put(indexSettings()).put("index.referesh_interval", -1).build())
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
|
||||
client().admin().indices().prepareAliases()
|
||||
.addAlias("test", "alias1", FilterBuilders.termFilter("field1", "value1")).execute().actionGet();
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
|
||||
@ -560,15 +553,14 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testExplain() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
|
||||
.field("field1", "value1")
|
||||
@ -603,26 +595,22 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleNestedSorting() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
.put("index.refresh_interval", -1)
|
||||
.build()
|
||||
)
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.startObject("properties")
|
||||
.startObject("field1")
|
||||
.field("type", "long")
|
||||
.field("store", "yes")
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.refresh_interval", -1))
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.startObject("properties")
|
||||
.startObject("field1")
|
||||
.field("type", "long")
|
||||
.field("store", "yes")
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
|
||||
.field("field1", 1)
|
||||
@ -657,7 +645,7 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
.endObject()
|
||||
.endArray()
|
||||
.endObject()).execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch("test")
|
||||
.setTypes("type1")
|
||||
@ -777,20 +765,16 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleNestedSorting_withNestedFilterMissing() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
.put("index.referesh_interval", -1)
|
||||
.build()
|
||||
)
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.referesh_interval", -1))
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("nested1")
|
||||
.field("type", "nested")
|
||||
.endObject()
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
|
||||
.field("field1", 1)
|
||||
@ -819,7 +803,7 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
.endArray()
|
||||
.endObject()).execute().actionGet();
|
||||
// Doc with missing nested docs if nested filter is used
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
client().prepareIndex("test", "type1", "3").setSource(jsonBuilder().startObject()
|
||||
.field("field1", 3)
|
||||
.startArray("nested1")
|
||||
@ -833,7 +817,7 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
.endObject()
|
||||
.endArray()
|
||||
.endObject()).execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch("test")
|
||||
.setTypes("type1")
|
||||
@ -866,27 +850,25 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSortNestedWithNestedFilter() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
.addMapping("type1", XContentFactory.jsonBuilder().startObject()
|
||||
.startObject("type1")
|
||||
.startObject("properties")
|
||||
.startObject("grand_parent_values").field("type", "long").endObject()
|
||||
.startObject("parent").field("type", "nested")
|
||||
.startObject("properties")
|
||||
.startObject("parent_values").field("type", "long").endObject()
|
||||
.startObject("child").field("type", "nested")
|
||||
.startObject("properties")
|
||||
.startObject("child_values").field("type", "long").endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", XContentFactory.jsonBuilder().startObject()
|
||||
.startObject("type1")
|
||||
.startObject("properties")
|
||||
.startObject("grand_parent_values").field("type", "long").endObject()
|
||||
.startObject("parent").field("type", "nested")
|
||||
.startObject("properties")
|
||||
.startObject("parent_values").field("type", "long").endObject()
|
||||
.startObject("child").field("type", "nested")
|
||||
.startObject("properties")
|
||||
.startObject("child_values").field("type", "long").endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()));
|
||||
ensureGreen();
|
||||
|
||||
// sum: 11
|
||||
client().prepareIndex("test", "type1", Integer.toString(1)).setSource(jsonBuilder().startObject()
|
||||
@ -983,7 +965,7 @@ public class SimpleNestedTests extends ElasticsearchIntegrationTest {
|
||||
.endObject()
|
||||
.endObject()
|
||||
.endObject()).execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
// Without nested filter
|
||||
SearchResponse searchResponse = client().prepareSearch()
|
||||
|
@ -22,7 +22,6 @@ import org.elasticsearch.action.delete.DeleteResponse;
|
||||
import org.elasticsearch.action.index.IndexResponse;
|
||||
import org.elasticsearch.action.percolate.PercolateResponse;
|
||||
import org.elasticsearch.common.bytes.BytesReference;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.util.concurrent.ConcurrentCollections;
|
||||
import org.elasticsearch.common.xcontent.XContentBuilder;
|
||||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
@ -51,12 +50,7 @@ public class ConcurrentPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleConcurrentPercolator() throws Exception {
|
||||
client().admin().indices().prepareCreate("index").setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
.build()
|
||||
).execute().actionGet();
|
||||
createIndex("index");
|
||||
ensureGreen();
|
||||
|
||||
final BytesReference onlyField1 = XContentFactory.jsonBuilder().startObject().startObject("doc")
|
||||
@ -147,12 +141,7 @@ public class ConcurrentPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testConcurrentAddingAndPercolating() throws Exception {
|
||||
client().admin().indices().prepareCreate("index").setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 1)
|
||||
.build()
|
||||
).execute().actionGet();
|
||||
createIndex("index");
|
||||
ensureGreen();
|
||||
final int numIndexThreads = 3;
|
||||
final int numPercolateThreads = 6;
|
||||
@ -296,12 +285,7 @@ public class ConcurrentPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testConcurrentAddingAndRemovingWhilePercolating() throws Exception {
|
||||
client().admin().indices().prepareCreate("index").setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 1)
|
||||
.build()
|
||||
).execute().actionGet();
|
||||
createIndex("index");
|
||||
ensureGreen();
|
||||
final int numIndexThreads = 3;
|
||||
final int numberPercolateOperation = 100;
|
||||
|

@@ -19,9 +19,10 @@
package org.elasticsearch.percolator;

import org.elasticsearch.action.ShardOperationFailedException;
import org.elasticsearch.action.percolate.*;
import org.elasticsearch.action.percolate.MultiPercolateRequestBuilder;
import org.elasticsearch.action.percolate.MultiPercolateResponse;
import org.elasticsearch.action.percolate.PercolateSourceBuilder;
import org.elasticsearch.client.Requests;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.query.MatchQueryBuilder;
@@ -115,14 +116,7 @@ public class MultiPercolatorTests extends ElasticsearchIntegrationTest {

@Test
public void testExistingDocsOnly() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.put("index.number_of_replicas", 1)
.build())
.execute().actionGet();
ensureGreen();
createIndex("test");

int numQueries = randomIntBetween(50, 100);
logger.info("--> register a queries");
@@ -192,15 +186,11 @@ public class MultiPercolatorTests extends ElasticsearchIntegrationTest {

@Test
public void testWithDocsOnly() throws Exception {
client().admin().indices().prepareCreate("test")
.setSettings(
ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.put("index.number_of_replicas", 1)
.build())
.execute().actionGet();
createIndex("test");
ensureGreen();

NumShards test = getNumShards("test");

int numQueries = randomIntBetween(50, 100);
logger.info("--> register a queries");
for (int i = 0; i < numQueries; i++) {
@@ -240,7 +230,7 @@ public class MultiPercolatorTests extends ElasticsearchIntegrationTest {
for (MultiPercolateResponse.Item item : response) {
assertThat(item.isFailure(), equalTo(false));
assertThat(item.getResponse().getSuccessfulShards(), equalTo(0));
assertThat(item.getResponse().getShardFailures().length, equalTo(2));
assertThat(item.getResponse().getShardFailures().length, equalTo(test.numPrimaries));
for (ShardOperationFailedException shardFailure : item.getResponse().getShardFailures()) {
assertThat(shardFailure.reason(), containsString("Failed to derive xcontent from"));
assertThat(shardFailure.status().getStatus(), equalTo(500));
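The `test.numPrimaries` assertion above is what replaces the old hard-coded shard counts. A minimal sketch of the same idiom against a broadcast response follows (hypothetical class and test names; it assumes only the `getNumShards`/`NumShards` helper used throughout this diff):

import org.elasticsearch.action.admin.indices.flush.FlushResponse;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import static org.hamcrest.Matchers.equalTo;

public class NumShardsUsageExampleTests extends ElasticsearchIntegrationTest {

    @Test
    public void flushTouchesEveryShard() throws Exception {
        createIndex("test"); // picks up the randomized number of shards
        ensureGreen();

        // Derive expectations from the actual topology instead of assuming a fixed shard count.
        NumShards numShards = getNumShards("test");
        FlushResponse flushResponse = client().admin().indices().prepareFlush("test").execute().actionGet();
        assertThat(flushResponse.getTotalShards(), equalTo(numShards.totalNumShards));
        assertThat(flushResponse.getFailedShards(), equalTo(0));
    }
}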
@ -19,6 +19,7 @@
|
||||
package org.elasticsearch.percolator;
|
||||
|
||||
import com.google.common.base.Predicate;
|
||||
import org.elasticsearch.action.ShardOperationFailedException;
|
||||
import org.elasticsearch.action.admin.cluster.node.stats.NodeStats;
|
||||
import org.elasticsearch.action.admin.cluster.node.stats.NodesStatsResponse;
|
||||
import org.elasticsearch.action.admin.indices.alias.IndicesAliasesResponse;
|
||||
@ -54,6 +55,7 @@ import java.io.IOException;
|
||||
import java.util.*;
|
||||
|
||||
import static org.elasticsearch.action.percolate.PercolateSourceBuilder.docBuilder;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.builder;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.*;
|
||||
import static org.elasticsearch.index.query.FilterBuilders.termFilter;
|
||||
@ -148,6 +150,8 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimple2() throws Exception {
|
||||
|
||||
//TODO this test seems to have problems with more shards and/or 1 replica instead of 0
|
||||
client().admin().indices().prepareCreate("index").setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
@ -270,7 +274,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void percolateOnRecreatedIndex() throws Exception {
|
||||
prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "test", "1").setSource("field1", "value1").execute().actionGet();
|
||||
@ -284,7 +288,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
.execute().actionGet();
|
||||
|
||||
wipeIndices("test");
|
||||
prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "test", "1").setSource("field1", "value1").execute().actionGet();
|
||||
@ -301,7 +305,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
// see #2814
|
||||
public void percolateCustomAnalyzer() throws Exception {
|
||||
Builder builder = ImmutableSettings.builder();
|
||||
Builder builder = builder();
|
||||
builder.put("index.analysis.analyzer.lwhitespacecomma.tokenizer", "whitespacecomma");
|
||||
builder.putArray("index.analysis.analyzer.lwhitespacecomma.filter", "lowercase");
|
||||
builder.put("index.analysis.tokenizer.whitespacecomma.type", "pattern");
|
||||
@ -313,10 +317,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
.endObject()
|
||||
.endObject().endObject();
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.addMapping("doc", mapping)
|
||||
.setSettings(builder.put("index.number_of_shards", 1))
|
||||
.execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> register a query");
|
||||
@ -342,7 +343,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void createIndexAndThenRegisterPercolator() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", 1)));
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> register a query");
|
||||
@ -391,7 +392,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void multiplePercolators() throws Exception {
|
||||
client().admin().indices().prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> register a query 1");
|
||||
@ -432,7 +433,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void dynamicAddingRemovingQueries() throws Exception {
|
||||
client().admin().indices().prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> register a query 1");
|
||||
@ -508,10 +509,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("_size").field("enabled", true).field("stored", "yes").endObject()
|
||||
.endObject().endObject().string();
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 2))
|
||||
.addMapping("type1", mapping)
|
||||
.execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> register a query");
|
||||
@ -554,8 +552,10 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
assertMatchCount(response, 1l);
|
||||
assertThat(convertFromTextArray(response.getMatches(), "test"), arrayContaining("1"));
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
IndicesStatsResponse indicesResponse = client().admin().indices().prepareStats("test").execute().actionGet();
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getCount(), equalTo(5l)); // We have 5 partitions
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getCount(), equalTo((long) numShards.numPrimaries));
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getCurrent(), equalTo(0l));
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getNumQueries(), equalTo(2l)); // One primary and replica
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getMemorySizeInBytes(), greaterThan(0l));
|
||||
@ -565,7 +565,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
for (NodeStats nodeStats : nodesResponse) {
|
||||
percolateCount += nodeStats.getIndices().getPercolate().getCount();
|
||||
}
|
||||
assertThat(percolateCount, equalTo(5l)); // We have 5 partitions
|
||||
assertThat(percolateCount, equalTo((long) numShards.numPrimaries));
|
||||
|
||||
logger.info("--> Second percolate request");
|
||||
response = client().preparePercolate()
|
||||
@ -577,7 +577,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(convertFromTextArray(response.getMatches(), "test"), arrayContaining("1"));
|
||||
|
||||
indicesResponse = client().admin().indices().prepareStats().setPercolate(true).execute().actionGet();
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getCount(), equalTo(10l));
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getCount(), equalTo((long) numShards.numPrimaries *2));
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getCurrent(), equalTo(0l));
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getNumQueries(), equalTo(2l));
|
||||
assertThat(indicesResponse.getTotal().getPercolate().getMemorySizeInBytes(), greaterThan(0l));
|
||||
@ -587,7 +587,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
for (NodeStats nodeStats : nodesResponse) {
|
||||
percolateCount += nodeStats.getIndices().getPercolate().getCount();
|
||||
}
|
||||
assertThat(percolateCount, equalTo(10l));
|
||||
assertThat(percolateCount, equalTo((long) numShards.numPrimaries *2));
|
||||
|
||||
// We might be faster than 1 ms, so run upto 1000 times until have spend 1ms or more on percolating
|
||||
boolean moreThanOneMs = false;
|
||||
@ -821,12 +821,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPercolateMultipleIndicesAndAliases() throws Exception {
|
||||
client().admin().indices().prepareCreate("test1")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 2))
|
||||
.execute().actionGet();
|
||||
client().admin().indices().prepareCreate("test2")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 2))
|
||||
.execute().actionGet();
|
||||
createIndex("test1", "test2");
|
||||
ensureGreen();
|
||||
|
||||
logger.info("--> registering queries");
|
||||
@ -1125,12 +1120,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPercolateScoreAndSorting() throws Exception {
|
||||
client().admin().indices().prepareCreate("my-index")
|
||||
.setSettings(ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 3)
|
||||
.put("index.number_of_replicas", 1)
|
||||
.build())
|
||||
.execute().actionGet();
|
||||
createIndex("my-index");
|
||||
ensureGreen();
|
||||
|
||||
// Add a dummy doc, that shouldn't never interfere with percolate operations.
|
||||
@ -1217,7 +1207,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPercolateSortingWithNoSize() throws Exception {
|
||||
client().admin().indices().prepareCreate("my-index").execute().actionGet();
|
||||
createIndex("my-index");
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("my-index", PercolatorService.TYPE_NAME, "1")
|
||||
@ -1246,12 +1236,11 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
.setPercolateQuery(QueryBuilders.functionScoreQuery(matchAllQuery(), scriptFunction("doc['level'].value")))
|
||||
.execute().actionGet();
|
||||
assertThat(response.getCount(), equalTo(0l));
|
||||
assertThat(response.getSuccessfulShards(), equalTo(3));
|
||||
assertThat(response.getShardFailures().length, equalTo(2));
|
||||
assertThat(response.getShardFailures()[0].status().getStatus(), equalTo(400));
|
||||
assertThat(response.getShardFailures()[0].reason(), containsString("Can't sort if size isn't specified"));
|
||||
assertThat(response.getShardFailures()[1].status().getStatus(), equalTo(400));
|
||||
assertThat(response.getShardFailures()[1].reason(), containsString("Can't sort if size isn't specified"));
|
||||
assertThat(response.getShardFailures().length, greaterThan(0));
|
||||
for (ShardOperationFailedException failure : response.getShardFailures()) {
|
||||
assertThat(failure.status(), equalTo(RestStatus.BAD_REQUEST));
|
||||
assertThat(failure.reason(), containsString("Can't sort if size isn't specified"));
|
||||
}
|
||||
}
|
||||
|
||||
@Test
|
||||
@ -1276,7 +1265,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
.addSort(SortBuilders.fieldSort("level"))
|
||||
.get();
|
||||
|
||||
assertThat(response.getShardFailures().length, equalTo(5));
|
||||
assertThat(response.getShardFailures().length, equalTo(getNumShards("my-index").numPrimaries));
|
||||
assertThat(response.getShardFailures()[0].status(), equalTo(RestStatus.BAD_REQUEST));
|
||||
assertThat(response.getShardFailures()[0].reason(), containsString("Only _score desc is supported"));
|
||||
}
|
||||
@ -1298,7 +1287,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testPercolateNotEmptyIndexButNoRefresh() throws Exception {
|
||||
client().admin().indices().prepareCreate("my-index")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.refresh_interval", -1))
|
||||
.setSettings(settingsBuilder().put("index.refresh_interval", -1))
|
||||
.execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
@ -1318,9 +1307,7 @@ public class PercolatorTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testPercolatorWithHighlighting() throws Exception {
|
||||
Client client = client();
|
||||
client.admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
|
||||
.execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
if (randomBoolean()) {
|
||||
|
@ -45,6 +45,7 @@ import java.util.concurrent.atomic.AtomicReference;
|
||||
|
||||
import static org.elasticsearch.action.percolate.PercolateSourceBuilder.docBuilder;
|
||||
import static org.elasticsearch.client.Requests.clusterHealthRequest;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.builder;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.*;
|
||||
@ -54,20 +55,24 @@ import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
|
||||
@ClusterScope(scope = Scope.TEST, numNodes = 0)
|
||||
@ClusterScope(scope = Scope.TEST, numNodes = 1)
|
||||
public class RecoveryPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
protected int numberOfShards() {
|
||||
return 1;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
return builder().put("gateway.type", "local").build();
|
||||
}
|
||||
|
||||
@Test
|
||||
@Slow
|
||||
public void testRestartNodePercolator1() throws Exception {
|
||||
Settings settings = settingsBuilder()
|
||||
.put(super.indexSettings())
|
||||
.put("gateway.type", "local")
|
||||
.build();
|
||||
cluster().startNode(settings);
|
||||
client().admin().indices().prepareCreate("test").setSettings(
|
||||
settingsBuilder().put("index.number_of_shards", 1).put()
|
||||
).execute().actionGet();
|
||||
|
||||
createIndex("test");
|
||||
|
||||
logger.info("--> register a query");
|
||||
client().prepareIndex("test", PercolatorService.TYPE_NAME, "kuku")
|
||||
@ -107,13 +112,8 @@ public class RecoveryPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
@Slow
|
||||
public void testRestartNodePercolator2() throws Exception {
|
||||
Settings settings = settingsBuilder()
|
||||
.put(super.indexSettings())
|
||||
.put("gateway.type", "local")
|
||||
.build();
|
||||
cluster().startNode(settings);
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 1)).execute().actionGet();
|
||||
|
||||
createIndex("test");
|
||||
|
||||
logger.info("--> register a query");
|
||||
client().prepareIndex("test", PercolatorService.TYPE_NAME, "kuku")
|
||||
@ -188,16 +188,8 @@ public class RecoveryPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
@Slow
|
||||
@TestLogging("index.percolator:TRACE,percolator:TRACE")
|
||||
public void testLoadingPercolateQueriesDuringCloseAndOpen() throws Exception {
|
||||
Settings settings = settingsBuilder()
|
||||
.put(super.indexSettings())
|
||||
.put("gateway.type", "local")
|
||||
.build();
|
||||
logger.info("--> Starting 2 nodes");
|
||||
cluster().startNode(settings);
|
||||
cluster().startNode(settings);
|
||||
|
||||
client().admin().indices().prepareDelete("_all").execute().actionGet();
|
||||
ensureGreen();
|
||||
cluster().ensureAtLeastNumNodes(2);
|
||||
cluster().ensureAtMostNumNodes(2);
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", 2))
|
||||
@ -264,10 +256,7 @@ public class RecoveryPercolatorTests extends ElasticsearchIntegrationTest {
|
||||
cluster().ensureAtMostNumNodes(2);
|
||||
logger.info("--> Adding 3th node");
|
||||
cluster().startNode(settingsBuilder().put("node.stay", true));
|
||||
|
||||
client().admin().indices().prepareDelete("_all").execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put("index.number_of_shards", 2)
|
||||
|
@ -31,7 +31,6 @@ import org.elasticsearch.common.Priority;
|
||||
import org.elasticsearch.common.collect.MapBuilder;
|
||||
import org.elasticsearch.common.logging.ESLogger;
|
||||
import org.elasticsearch.common.logging.Loggers;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.index.shard.DocsStats;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.elasticsearch.test.junit.annotations.TestLogging;
|
||||
@ -44,10 +43,9 @@ import java.util.concurrent.TimeUnit;
|
||||
import java.util.concurrent.atomic.AtomicBoolean;
|
||||
import java.util.concurrent.atomic.AtomicLong;
|
||||
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
|
||||
import static org.hamcrest.Matchers.emptyIterable;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
|
||||
@ -310,7 +308,7 @@ public class RecoveryWhileUnderLoadTests extends ElasticsearchIntegrationTest {
|
||||
cluster().ensureAtLeastNumNodes(3);
|
||||
logger.info("--> creating test index ...");
|
||||
int allowNodes = 2;
|
||||
assertAcked(prepareCreate("test").setSettings(ImmutableSettings.builder().put("number_of_shards", numShards).put("number_of_replicas", numReplicas).build()));
|
||||
assertAcked(prepareCreate("test").setSettings(settingsBuilder().put(indexSettings()).put("number_of_shards", numShards).put("number_of_replicas", numReplicas).build()));
|
||||
final AtomicLong idGenerator = new AtomicLong();
|
||||
final AtomicLong indexCounter = new AtomicLong();
|
||||
final AtomicBoolean stop = new AtomicBoolean(false);
|
||||
@ -364,7 +362,7 @@ public class RecoveryWhileUnderLoadTests extends ElasticsearchIntegrationTest {
|
||||
logger.info("--> indexing threads stopped");
|
||||
logger.info("--> bump up number of replicas to 1 and allow all nodes to hold the index");
|
||||
allowNodes("test", 3);
|
||||
assertAcked(client().admin().indices().prepareUpdateSettings("test").setSettings(ImmutableSettings.settingsBuilder().put("number_of_replicas", 1)).get());
|
||||
assertAcked(client().admin().indices().prepareUpdateSettings("test").setSettings(settingsBuilder().put("number_of_replicas", 1)).get());
|
||||
assertThat(client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setTimeout("1m").setWaitForGreenStatus().execute().actionGet().isTimedOut(), equalTo(false));
|
||||
|
||||
logger.info("--> refreshing the index");
|
||||
|
@ -19,8 +19,6 @@
|
||||
|
||||
package org.elasticsearch.recovery;
|
||||
|
||||
import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
|
||||
import org.elasticsearch.action.admin.cluster.health.ClusterHealthStatus;
|
||||
import org.elasticsearch.action.admin.indices.flush.FlushResponse;
|
||||
import org.elasticsearch.action.admin.indices.refresh.RefreshResponse;
|
||||
import org.elasticsearch.action.get.GetResponse;
|
||||
@ -50,30 +48,26 @@ public class SimpleRecoveryTests extends ElasticsearchIntegrationTest {
|
||||
public void testSimpleRecovery() throws Exception {
|
||||
prepareCreate("test", 1).execute().actionGet(5000);
|
||||
|
||||
NumShards numShards = getNumShards("test");
|
||||
|
||||
logger.info("Running Cluster Health");
|
||||
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForYellowStatus()).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
|
||||
ensureYellow();
|
||||
|
||||
client().index(indexRequest("test").type("type1").id("1").source(source("1", "test"))).actionGet();
|
||||
FlushResponse flushResponse = client().admin().indices().flush(flushRequest("test")).actionGet();
|
||||
assertThat(flushResponse.getTotalShards(), equalTo(10));
|
||||
assertThat(flushResponse.getSuccessfulShards(), equalTo(5));
|
||||
assertThat(flushResponse.getTotalShards(), equalTo(numShards.totalNumShards));
|
||||
assertThat(flushResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(flushResponse.getFailedShards(), equalTo(0));
|
||||
client().index(indexRequest("test").type("type1").id("2").source(source("2", "test"))).actionGet();
|
||||
RefreshResponse refreshResponse = client().admin().indices().refresh(refreshRequest("test")).actionGet();
|
||||
assertThat(refreshResponse.getTotalShards(), equalTo(10));
|
||||
assertThat(refreshResponse.getSuccessfulShards(), equalTo(5));
|
||||
assertThat(refreshResponse.getTotalShards(), equalTo(numShards.totalNumShards));
|
||||
assertThat(refreshResponse.getSuccessfulShards(), equalTo(numShards.numPrimaries));
|
||||
assertThat(refreshResponse.getFailedShards(), equalTo(0));
|
||||
|
||||
allowNodes("test", 2);
|
||||
|
||||
logger.info("Running Cluster Health");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().local(true).waitForNodes(">=2")).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
GetResponse getResult;
|
||||
|
||||
@ -92,10 +86,7 @@ public class SimpleRecoveryTests extends ElasticsearchIntegrationTest {
|
||||
allowNodes("test", 3);
|
||||
Thread.sleep(200);
|
||||
logger.info("Running Cluster Health");
|
||||
clusterHealth = client().admin().cluster().health(clusterHealthRequest().waitForGreenStatus().waitForRelocatingShards(0).waitForNodes(">=3")).actionGet();
|
||||
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
|
||||
assertThat(clusterHealth.isTimedOut(), equalTo(false));
|
||||
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.GREEN));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 5; i++) {
|
||||
getResult = client().get(getRequest("test").type("type1").id("1")).actionGet(1000);
|
||||
|
@ -143,6 +143,7 @@ public class RiverTests extends ElasticsearchIntegrationTest {
|
||||
IndexResponse indexResponse = client().prepareIndex(RiverIndexName.Conf.DEFAULT_INDEX_NAME, riverName, "_meta")
|
||||
.setSource("type", DummyRiverModule.class.getCanonicalName()).get();
|
||||
assertTrue(indexResponse.isCreated());
|
||||
ensureGreen();
|
||||
}
|
||||
|
||||
private void checkRiverIsStarted(final String riverName) throws InterruptedException {
|
||||
|
@ -21,7 +21,6 @@ package org.elasticsearch.routing;
|
||||
|
||||
import org.elasticsearch.ElasticsearchException;
|
||||
import org.elasticsearch.action.RoutingMissingException;
|
||||
import org.elasticsearch.action.admin.indices.alias.IndicesAliasesResponse;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.action.search.SearchType;
|
||||
import org.elasticsearch.client.Requests;
|
||||
@ -32,6 +31,7 @@ import org.junit.Test;
|
||||
|
||||
import static org.elasticsearch.cluster.metadata.AliasAction.newAddAliasAction;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.instanceOf;
|
||||
|
||||
@ -40,12 +40,16 @@ import static org.hamcrest.Matchers.instanceOf;
|
||||
*/
|
||||
public class AliasRoutingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
protected int minimumNumberOfShards() {
|
||||
return 2;
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testAliasCrudRouting() throws Exception {
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
IndicesAliasesResponse res = admin().indices().prepareAliases().addAliasAction(newAddAliasAction("test", "alias0").routing("0")).get();
|
||||
assertThat(res.isAcknowledged(), equalTo(true));
|
||||
assertAcked(admin().indices().prepareAliases().addAliasAction(newAddAliasAction("test", "alias0").routing("0")));
|
||||
|
||||
logger.info("--> indexing with id [1], and routing [0] using alias");
|
||||
client().prepareIndex("alias0", "type1", "1").setSource("field", "value1").setRefresh(true).execute().actionGet();
|
||||
@ -104,7 +108,7 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
logger.info("--> deleting_by_query with 1 as routing, should not delete anything");
|
||||
client().prepareDeleteByQuery().setQuery(matchAllQuery()).setRouting("1").execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
for (int i = 0; i < 5; i++) {
|
||||
assertThat(client().prepareGet("test", "type1", "1").execute().actionGet().isExists(), equalTo(false));
|
||||
assertThat(client().prepareGet("test", "type1", "1").setRouting("0").execute().actionGet().isExists(), equalTo(true));
|
||||
@ -113,7 +117,7 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
logger.info("--> deleting_by_query with alias0, should delete");
|
||||
client().prepareDeleteByQuery("alias0").setQuery(matchAllQuery()).execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
for (int i = 0; i < 5; i++) {
|
||||
assertThat(client().prepareGet("test", "type1", "1").execute().actionGet().isExists(), equalTo(false));
|
||||
assertThat(client().prepareGet("test", "type1", "1").setRouting("0").execute().actionGet().isExists(), equalTo(false));
|
||||
@ -125,12 +129,11 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
|
||||
public void testAliasSearchRouting() throws Exception {
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
IndicesAliasesResponse res = admin().indices().prepareAliases()
|
||||
assertAcked(admin().indices().prepareAliases()
|
||||
.addAliasAction(newAddAliasAction("test", "alias"))
|
||||
.addAliasAction(newAddAliasAction("test", "alias0").routing("0"))
|
||||
.addAliasAction(newAddAliasAction("test", "alias1").routing("1"))
|
||||
.addAliasAction(newAddAliasAction("test", "alias01").searchRouting("0,1")).get();
|
||||
assertThat(res.isAcknowledged(), equalTo(true));
|
||||
.addAliasAction(newAddAliasAction("test", "alias01").searchRouting("0,1")));
|
||||
|
||||
logger.info("--> indexing with id [1], and routing [0] using alias");
|
||||
client().prepareIndex("alias0", "type1", "1").setSource("field", "value1").setRefresh(true).execute().actionGet();
|
||||
@ -222,14 +225,13 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
|
||||
createIndex("test-a");
|
||||
createIndex("test-b");
|
||||
ensureGreen();
|
||||
IndicesAliasesResponse res = admin().indices().prepareAliases()
|
||||
assertAcked(admin().indices().prepareAliases()
|
||||
.addAliasAction(newAddAliasAction("test-a", "alias-a0").routing("0"))
|
||||
.addAliasAction(newAddAliasAction("test-a", "alias-a1").routing("1"))
|
||||
.addAliasAction(newAddAliasAction("test-b", "alias-b0").routing("0"))
|
||||
.addAliasAction(newAddAliasAction("test-b", "alias-b1").routing("1"))
|
||||
.addAliasAction(newAddAliasAction("test-a", "alias-ab").searchRouting("0"))
|
||||
.addAliasAction(newAddAliasAction("test-b", "alias-ab").searchRouting("1")).get();
|
||||
assertThat(res.isAcknowledged(), equalTo(true));
|
||||
.addAliasAction(newAddAliasAction("test-b", "alias-ab").searchRouting("1")));
|
||||
ensureGreen(); // wait for events again to make sure we got the aliases on all nodes
|
||||
logger.info("--> indexing with id [1], and routing [0] using alias to test-a");
|
||||
client().prepareIndex("alias-a0", "type1", "1").setSource("field", "value1").setRefresh(true).execute().actionGet();
|
||||
@ -283,9 +285,8 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
|
||||
public void testAliasSearchRoutingWithConcreteAndAliasedIndices_issue2682() throws Exception {
|
||||
createIndex("index", "index_2");
|
||||
ensureGreen();
|
||||
IndicesAliasesResponse res = admin().indices().prepareAliases()
|
||||
.addAliasAction(newAddAliasAction("index", "index_1").routing("1")).get();
|
||||
assertThat(res.isAcknowledged(), equalTo(true));
|
||||
assertAcked(admin().indices().prepareAliases()
|
||||
.addAliasAction(newAddAliasAction("index", "index_1").routing("1")));
|
||||
|
||||
logger.info("--> indexing on index_1 which is an alias for index with routing [1]");
|
||||
client().prepareIndex("index_1", "type1", "1").setSource("field", "value1").setRefresh(true).execute().actionGet();
|
||||
@ -310,9 +311,8 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
public void testAliasSearchRoutingWithConcreteAndAliasedIndices_issue3268() throws Exception {
createIndex("index", "index_2");
ensureGreen();
IndicesAliasesResponse res = admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("index", "index_1").routing("1")).get();
assertThat(res.isAcknowledged(), equalTo(true));
assertAcked(admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("index", "index_1").routing("1")));

logger.info("--> indexing on index_1 which is an alias for index with routing [1]");
client().prepareIndex("index_1", "type1", "1").setSource("field", "value1").setRefresh(true).execute().actionGet();
@ -370,7 +370,7 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {

logger.info("--> bulk deleting with no routing, should broadcast the delete since _routing is required");
client().prepareBulk().add(Requests.deleteRequest("test").type("type1").id("1")).execute().actionGet();
client().admin().indices().prepareRefresh().execute().actionGet();
refresh();
for (int i = 0; i < 5; i++) {
try {
assertThat(client().prepareGet("test", "type1", "1").execute().actionGet().isExists(), equalTo(false));
@ -387,9 +387,8 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
createIndex("test");
ensureGreen();
logger.info("--> creating alias with routing [3]");
IndicesAliasesResponse res = admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("test", "alias").routing("3")).get();
assertThat(res.isAcknowledged(), equalTo(true));
assertAcked(admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("test", "alias").routing("3")));

logger.info("--> indexing with id [0], and routing [3]");
client().prepareIndex("alias", "type1", "0").setSource("field", "value1").setRefresh(true).execute().actionGet();
@ -403,9 +402,8 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
}

logger.info("--> creating alias with routing [4]");
res = admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("test", "alias").routing("4")).get();
assertThat(res.isAcknowledged(), equalTo(true));
assertAcked(admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("test", "alias").routing("4")));

logger.info("--> verifying search with wrong routing should not find");
for (int i = 0; i < 5; i++) {
@ -414,9 +412,8 @@ public class AliasRoutingTests extends ElasticsearchIntegrationTest {
}

logger.info("--> creating alias with search routing [3,4] and index routing 4");
client().admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("test", "alias").searchRouting("3,4").indexRouting("4"))
.execute().actionGet();
assertAcked(client().admin().indices().prepareAliases()
.addAliasAction(newAddAliasAction("test", "alias").searchRouting("3,4").indexRouting("4")));

logger.info("--> indexing with id [1], and routing [4]");
client().prepareIndex("alias", "type1", "1").setSource("field", "value2").setRefresh(true).execute().actionGet();

@ -29,7 +29,6 @@ import org.elasticsearch.action.termvector.TermVectorRequest;
import org.elasticsearch.action.termvector.TermVectorResponse;
import org.elasticsearch.action.update.UpdateResponse;
import org.elasticsearch.client.Requests;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.query.QueryBuilders;
@ -38,20 +37,22 @@ import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.instanceOf;
import static org.hamcrest.Matchers.nullValue;
import static org.hamcrest.Matchers.*;

/**
*
*/
public class SimpleRoutingTests extends ElasticsearchIntegrationTest {



@Override
protected int minimumNumberOfShards() {
return 2;
}

@Test
public void testSimpleCrudRouting() throws Exception {
createIndex("test");
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareIndex("test", "type1", "1").setRouting("0").setSource("field", "value1").setRefresh(true).execute().actionGet();
@ -109,7 +110,7 @@ public class SimpleRoutingTests extends ElasticsearchIntegrationTest {
@Test
public void testSimpleSearchRouting() {
createIndex("test");
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareIndex("test", "type1", "1").setRouting("0").setSource("field", "value1").setRefresh(true).execute().actionGet();
@ -178,7 +179,7 @@ public class SimpleRoutingTests extends ElasticsearchIntegrationTest {
client().admin().indices().prepareCreate("test")
.addMapping("type1", XContentFactory.jsonBuilder().startObject().startObject("type1").startObject("_routing").field("required", true).endObject().endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareIndex("test", "type1", "1").setRouting("0").setSource("field", "value1").setRefresh(true).execute().actionGet();
@ -236,7 +237,7 @@ public class SimpleRoutingTests extends ElasticsearchIntegrationTest {
.startObject("_routing").field("required", true).field("path", "routing_field").endObject()
.endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareIndex("test", "type1", "1").setSource("field", "value1", "routing_field", "0").setRefresh(true).execute().actionGet();
@ -273,7 +274,7 @@ public class SimpleRoutingTests extends ElasticsearchIntegrationTest {
.startObject("_routing").field("required", true).field("path", "routing_field").endObject()
.endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareBulk().add(
@ -304,7 +305,7 @@ public class SimpleRoutingTests extends ElasticsearchIntegrationTest {
.startObject("_routing").field("required", true).field("path", "routing_field").endObject()
.endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareIndex("test", "type1", "1").setSource("field", "value1", "routing_field", 0).execute().actionGet();
@ -331,7 +332,7 @@ public class SimpleRoutingTests extends ElasticsearchIntegrationTest {
client().admin().indices().prepareCreate("test")
.addMapping("type1", XContentFactory.jsonBuilder().startObject().startObject("type1").startObject("_routing").field("required", true).endObject().endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

logger.info("--> indexing with id [1], and routing [0]");
client().prepareIndex("test", "type1", "1").setRouting("0").setSource("field", "value1").get();

@ -29,7 +29,6 @@ import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.functionscore.ScoreFunctionBuilders;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
import org.hamcrest.Matchers;
import org.junit.Test;

@ -40,6 +39,8 @@ import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutionException;

import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.hamcrest.Matchers.equalTo;

public class IndexLookupTests extends ElasticsearchIntegrationTest {
@ -129,18 +130,18 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("int_payload_field").field("type", "string").field("index_options", "offsets")
.field("analyzer", "payload_int").endObject().endObject().endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
ImmutableSettings.settingsBuilder().put("index.analysis.analyzer.payload_int.tokenizer", "whitespace")
assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
ImmutableSettings.settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.payload_int.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.payload_int.filter", "delimited_int")
.put("index.analysis.filter.delimited_int.delimiter", "|")
.put("index.analysis.filter.delimited_int.encoding", "int")
.put("index.analysis.filter.delimited_int.type", "delimited_payload_filter")
.put("index.number_of_replicas", 0).put("index.number_of_shards", randomIntBetween(1, 6))));
.put("index.analysis.filter.delimited_int.type", "delimited_payload_filter")));
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("int_payload_field", "a|1 b|2 b|3 c|4 d "), client()
.prepareIndex("test", "type1", "2").setSource("int_payload_field", "b|1 b|2 c|3 d|4 a "),
client().prepareIndex("test", "type1", "3").setSource("int_payload_field", "b|1 c|2 d|3 a|4 b "));
ensureGreen();

}

@Test
@ -191,7 +192,7 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
private void checkOnlyFunctionScore(String scoreScript, Map<String, Object> expectedScore, int numExpectedDocs) {
SearchResponse sr = client().prepareSearch("test")
.setQuery(QueryBuilders.functionScoreQuery(ScoreFunctionBuilders.scriptFunction(scoreScript))).execute().actionGet();
ElasticsearchAssertions.assertHitCount(sr, numExpectedDocs);
assertHitCount(sr, numExpectedDocs);
for (SearchHit hit : sr.getHits().getHits()) {
assertThat("for doc " + hit.getId(), ((Float) expectedScore.get(hit.getId())).doubleValue(),
Matchers.closeTo(hit.score(), 1.e-4));
@ -380,7 +381,7 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
private void checkArrayValsInEachDoc(String script, HashMap<String, List<Object>> expectedArray, int expectedHitSize) {
SearchResponse sr = client().prepareSearch("test").setQuery(QueryBuilders.matchAllQuery()).addScriptField("tvtest", script)
.execute().actionGet();
ElasticsearchAssertions.assertHitCount(sr, expectedHitSize);
assertHitCount(sr, expectedHitSize);
int nullCounter = 0;
for (SearchHit hit : sr.getHits().getHits()) {
Object result = hit.getFields().get("tvtest").getValues().get(0);
@ -401,8 +402,10 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
.field("index_options", "offsets").field("term_vector", "no").field("analyzer", "payload_string").endObject()
.startObject("int_payload_field").field("type", "string").field("index_options", "offsets")
.field("analyzer", "payload_int").endObject().endObject().endObject().endObject();
ElasticsearchAssertions.assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
ImmutableSettings.settingsBuilder().put("index.analysis.analyzer.payload_float.tokenizer", "whitespace")
assertAcked(prepareCreate("test").addMapping("type1", mapping).setSettings(
ImmutableSettings.settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.payload_float.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.payload_float.filter", "delimited_float")
.put("index.analysis.filter.delimited_float.delimiter", "|")
.put("index.analysis.filter.delimited_float.encoding", "float")
@ -416,7 +419,7 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
.putArray("index.analysis.analyzer.payload_int.filter", "delimited_int")
.put("index.analysis.filter.delimited_int.delimiter", "|")
.put("index.analysis.filter.delimited_int.encoding", "int")
.put("index.analysis.filter.delimited_int.type", "delimited_payload_filter").put("index.number_of_replicas", 0)
.put("index.analysis.filter.delimited_int.type", "delimited_payload_filter")
.put("index.number_of_shards", 1)));
ensureYellow();
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("float_payload_field", "a|1 b|2 a|3 b "), client()
@ -586,7 +589,7 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
SearchResponse sr = client().prepareSearch("test")
.setQuery(QueryBuilders.functionScoreQuery(ScoreFunctionBuilders.scriptFunction(scoreScript)))
.addScriptField("tvtest", fieldScript).execute().actionGet();
ElasticsearchAssertions.assertHitCount(sr, numExpectedDocs);
assertHitCount(sr, numExpectedDocs);
for (SearchHit hit : sr.getHits().getHits()) {
Object result = hit.getFields().get("tvtest").getValues().get(0);
Object expectedResult = expectedFieldVals.get(hit.getId());
@ -599,7 +602,7 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
private void checkValueInEachDoc(String script, Map<String, Object> expectedResults, int numExpectedDocs) {
SearchResponse sr = client().prepareSearch("test").setQuery(QueryBuilders.matchAllQuery()).addScriptField("tvtest", script)
.execute().actionGet();
ElasticsearchAssertions.assertHitCount(sr, numExpectedDocs);
assertHitCount(sr, numExpectedDocs);
for (SearchHit hit : sr.getHits().getHits()) {
Object result = hit.getFields().get("tvtest").getValues().get(0);
Object expectedResult = expectedResults.get(hit.getId());
@ -610,11 +613,11 @@ public class IndexLookupTests extends ElasticsearchIntegrationTest {
private void checkValueInEachDoc(int value, String script, int numExpectedDocs) {
SearchResponse sr = client().prepareSearch("test").setQuery(QueryBuilders.matchAllQuery()).addScriptField("tvtest", script)
.execute().actionGet();
ElasticsearchAssertions.assertHitCount(sr, numExpectedDocs);
assertHitCount(sr, numExpectedDocs);
for (SearchHit hit : sr.getHits().getHits()) {
Object result = hit.getFields().get("tvtest").getValues().get(0);
if (result instanceof Integer) {
assertThat(((Integer) result).intValue(), equalTo(value));
assertThat((Integer)result, equalTo(value));
} else if (result instanceof Long) {
assertThat(((Long) result).intValue(), equalTo(value));
} else {

@ -31,21 +31,25 @@ import org.junit.Test;
import java.util.concurrent.ExecutionException;

import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.test.ElasticsearchIntegrationTest.*;
import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope.*;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;

/**
*/
@ElasticsearchIntegrationTest.ClusterScope(scope = ElasticsearchIntegrationTest.Scope.SUITE)
@ClusterScope(scope = SUITE)
public class StressSearchServiceReaperTest extends ElasticsearchIntegrationTest {


@Override
protected Settings nodeSettings(int nodeOrdinal) {
// very frequent checks
return ImmutableSettings.builder().put(SearchService.KEEPALIVE_INTERVAL_KEY, TimeValue.timeValueMillis(1)).build();
}

@Override
protected int numberOfReplicas() {
return between(0, 1);
}

@Slow
@Test // see issue #5165 - this test fails each time without the fix in pull #5170
public void testStressReaper() throws ExecutionException, InterruptedException {
@ -54,7 +58,7 @@ public class StressSearchServiceReaperTest extends ElasticsearchIntegrationTest
for (int i = 0; i < builders.length; i++) {
builders[i] = client().prepareIndex("test", "type", "" + i).setSource("f", English.intToEnglish(i));
}
prepareCreate("test").setSettings("number_of_shards", randomIntBetween(1,5), "number_of_replicas", randomIntBetween(0,1)).setSettings();
createIndex("test");
indexRandom(true, builders);
ensureYellow();
final int iterations = atLeast(500);

@ -23,8 +23,6 @@ import com.carrotsearch.hppc.IntIntMap;
import com.carrotsearch.hppc.IntIntOpenHashMap;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.bucket.missing.Missing;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
@ -46,13 +44,9 @@ import static org.hamcrest.core.IsNull.notNullValue;
*/
public class CombiTests extends ElasticsearchIntegrationTest {


@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

/**

@ -25,8 +25,6 @@ import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.query.FilterBuilders;
import org.elasticsearch.index.query.RangeFilterBuilder;
@ -51,15 +49,10 @@ import static org.hamcrest.core.IsNull.notNullValue;
public class RandomTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}



// Make sure that unordered, reversed, disjoint and/or overlapping ranges are supported
// Duel with filters
public void testRandomRanges() throws Exception {

@ -20,8 +20,6 @@ package org.elasticsearch.search.aggregations.bucket;

import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.aggregations.AbstractAggregationBuilder;
import org.elasticsearch.search.aggregations.bucket.histogram.DateHistogram;
@ -53,11 +51,8 @@ import static org.hamcrest.core.IsNull.notNullValue;
public class DateHistogramTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

private DateTime date(int month, int day) {

@ -54,11 +54,8 @@ import static org.hamcrest.core.IsNull.nullValue;
public class DateRangeTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

private static IndexRequestBuilder indexDoc(int month, int day, int value) throws Exception {

@ -62,11 +62,8 @@ public class DoubleTermsTests extends ElasticsearchIntegrationTest {
private static final String MULTI_VALUED_FIELD_NAME = "d_values";

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

@Before

@ -21,8 +21,6 @@ package org.elasticsearch.search.aggregations.bucket;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.search.aggregations.bucket.filter.Filter;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.metrics.avg.Avg;
@ -50,11 +48,8 @@ import static org.hamcrest.core.IsNull.notNullValue;
public class FilterTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

int numDocs, numTag1Docs;

@ -21,8 +21,6 @@ package org.elasticsearch.search.aggregations.bucket;
import com.google.common.collect.Sets;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
@ -52,11 +50,8 @@ import static org.hamcrest.core.IsNull.notNullValue;
public class GeoDistanceTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

private IndexRequestBuilder indexCity(String idx, String name, String... latLons) throws Exception {

@ -24,8 +24,6 @@ import com.carrotsearch.hppc.cursors.ObjectIntCursor;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.geo.GeoHashUtils;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.query.GeoBoundingBoxFilterBuilder;
import org.elasticsearch.search.aggregations.AggregationBuilders;
@ -41,20 +39,15 @@ import java.util.Random;

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.search.aggregations.AggregationBuilders.geohashGrid;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThanOrEqualTo;

/**
*
*/
public class GeoHashGridTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

private IndexRequestBuilder indexCity(String name, String latLon) throws Exception {
@ -75,9 +68,8 @@ public class GeoHashGridTests extends ElasticsearchIntegrationTest {

@Before
public void init() throws Exception {
prepareCreate("idx")
.addMapping("type", "location", "type=geo_point", "city", "type=string,index=not_analyzed")
.execute().actionGet();
assertAcked(prepareCreate("idx")
.addMapping("type", "location", "type=geo_point", "city", "type=string,index=not_analyzed"));

createIndex("idx_unmapped");

@ -49,11 +49,8 @@ public class GlobalTests extends ElasticsearchIntegrationTest {
int numDocs;

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

@Before

@ -21,8 +21,6 @@ package org.elasticsearch.search.aggregations.bucket;
import com.carrotsearch.hppc.LongOpenHashSet;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.search.aggregations.bucket.filter.Filter;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
@ -55,11 +53,8 @@ public class HistogramTests extends ElasticsearchIntegrationTest {
private static final String MULTI_VALUED_FIELD_NAME = "l_values";

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

int numDocs;

@ -20,8 +20,6 @@ package org.elasticsearch.search.aggregations.bucket;

import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.mapper.ip.IpFieldMapper;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.bucket.range.ipv4.IPv4Range;
@ -50,11 +48,8 @@ import static org.hamcrest.core.IsNull.nullValue;
public class IPv4RangeTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

@Before

@ -60,11 +60,8 @@ public class LongTermsTests extends ElasticsearchIntegrationTest {
private static final String MULTI_VALUED_FIELD_NAME = "l_values";

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

@Before

@ -50,11 +50,8 @@ public class MinDocCountTests extends ElasticsearchIntegrationTest {
private static final QueryBuilder QUERY = QueryBuilders.termQuery("match", true);

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", between(0, 1))
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

private int cardinality;

@ -20,8 +20,6 @@ package org.elasticsearch.search.aggregations.bucket;
|
||||
|
||||
import org.elasticsearch.action.index.IndexRequestBuilder;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
|
||||
import org.elasticsearch.search.aggregations.bucket.missing.Missing;
|
||||
import org.elasticsearch.search.aggregations.metrics.avg.Avg;
|
||||
@ -46,13 +44,9 @@ import static org.hamcrest.core.IsNull.notNullValue;
|
||||
*/
|
||||
public class MissingTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", between(1, 5))
|
||||
.put("index.number_of_replicas", between(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
int numDocs, numDocsMissing, numDocsUnmapped;
|
||||
|
@ -42,6 +42,7 @@ import java.util.List;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.search.aggregations.AggregationBuilders.*;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertSearchResponse;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
@ -53,11 +54,8 @@ import static org.hamcrest.core.IsNull.notNullValue;
|
||||
public class NestedTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", between(1, 5))
|
||||
.put("index.number_of_replicas", between(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
int numParents;
|
||||
@ -66,10 +64,9 @@ public class NestedTests extends ElasticsearchIntegrationTest {
|
||||
@Before
|
||||
public void init() throws Exception {
|
||||
|
||||
prepareCreate("idx")
|
||||
.addMapping("type", "nested", "type=nested")
|
||||
.setSettings(indexSettings())
|
||||
.execute().actionGet();
|
||||
assertAcked(prepareCreate("idx")
|
||||
.addMapping("type", "nested", "type=nested"));
|
||||
|
||||
List<IndexRequestBuilder> builders = new ArrayList<IndexRequestBuilder>();
|
||||
|
||||
numParents = randomIntBetween(3, 10);
|
||||
|
@ -20,8 +20,6 @@ package org.elasticsearch.search.aggregations.bucket;
|
||||
|
||||
import org.elasticsearch.action.index.IndexRequestBuilder;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
|
||||
import org.elasticsearch.search.aggregations.bucket.range.Range;
|
||||
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
|
||||
@ -52,11 +50,8 @@ public class RangeTests extends ElasticsearchIntegrationTest {
|
||||
private static final String MULTI_VALUED_FIELD_NAME = "l_values";
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", between(1, 5))
|
||||
.put("index.number_of_replicas", between(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
int numDocs;
|
||||
|
@ -21,8 +21,6 @@ package org.elasticsearch.search.aggregations.bucket;
|
||||
import org.elasticsearch.action.index.IndexRequestBuilder;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.geo.GeoHashUtils;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.index.query.FilterBuilders;
|
||||
import org.elasticsearch.index.query.QueryBuilders;
|
||||
import org.elasticsearch.search.aggregations.bucket.filter.Filter;
|
||||
@ -42,6 +40,7 @@ import org.junit.Test;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.search.aggregations.AggregationBuilders.*;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertSearchResponse;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
|
||||
@ -54,11 +53,8 @@ import static org.hamcrest.Matchers.equalTo;
|
||||
public class ShardReduceTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", randomBoolean() ? 1 : randomIntBetween(2, 10))
|
||||
.put("index.number_of_replicas", randomIntBetween(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
private IndexRequestBuilder indexDoc(String date, int value) throws Exception {
|
||||
@ -79,10 +75,8 @@ public class ShardReduceTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Before
|
||||
public void init() throws Exception {
|
||||
prepareCreate("idx")
|
||||
.addMapping("type", "nested", "type=nested", "ip", "type=ip", "location", "type=geo_point")
|
||||
.setSettings(indexSettings())
|
||||
.execute().actionGet();
|
||||
assertAcked(prepareCreate("idx")
|
||||
.addMapping("type", "nested", "type=nested", "ip", "type=ip", "location", "type=geo_point"));
|
||||
|
||||
indexRandom(true,
|
||||
indexDoc("2014-01-01", 1),
|
||||
|
@ -19,54 +19,22 @@
|
||||
package org.elasticsearch.search.aggregations.bucket;
|
||||
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
import org.elasticsearch.action.index.IndexRequestBuilder;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Collection;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.search.aggregations.AggregationBuilders.terms;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
@ClusterScope(scope = Scope.TEST)
|
||||
public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
/**
|
||||
* to properly test the effect/functionality of shard_size, we need to force having 2 shards and also
|
||||
* control the routing such that certain documents will end on each shard. Using "djb" routing hash + ignoring the
|
||||
* doc type when hashing will ensure that docs with routing value "1" will end up in a different shard than docs with
|
||||
* routing value "2".
|
||||
*/
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 0)
|
||||
.put("cluster.routing.operation.hash.type", "djb")
|
||||
.put("cluster.routing.operation.use_type", "false")
|
||||
.build();
|
||||
}
|
||||
public class ShardSizeTermsTests extends ShardSizeTests {
|
||||
|
||||
@Test
|
||||
public void noShardSize_string() throws Exception {
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -91,9 +59,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_string() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -118,9 +84,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_string_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -145,9 +109,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void noShardSize_long() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -172,9 +134,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_long() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -199,9 +159,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_long_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -226,9 +184,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void noShardSize_double() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -253,9 +209,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_double() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -280,9 +234,7 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_double_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -303,59 +255,4 @@ public class ShardSizeTermsTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(bucket.getDocCount(), equalTo(expected.get(bucket.getKeyAsNumber().intValue())));
|
||||
}
|
||||
}
|
||||
|
||||
private void indexData() throws Exception {
|
||||
|
||||
/*
|
||||
|
||||
|
||||
|| || size = 3, shard_size = 5 || shard_size = size = 3 ||
|
||||
||==========||==================================================||===============================================||
|
||||
|| shard 1: || "1" - 5 | "2" - 4 | "3" - 3 | "4" - 2 | "5" - 1 || "1" - 5 | "3" - 3 | "2" - 4 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| shard 2: || "1" - 3 | "2" - 1 | "3" - 5 | "4" - 2 | "5" - 1 || "1" - 3 | "3" - 5 | "4" - 2 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| reduced: || "1" - 8 | "2" - 5 | "3" - 8 | "4" - 4 | "5" - 2 || ||
|
||||
|| || || "1" - 8, "3" - 8, "2" - 4 <= WRONG ||
|
||||
|| || "1" - 8 | "3" - 8 | "2" - 5 <= CORRECT || ||
|
||||
|
||||
|
||||
*/
|
||||
|
||||
List<IndexRequestBuilder> indexOps = new ArrayList<IndexRequestBuilder>();
|
||||
|
||||
indexDoc("1", "1", 5, indexOps);
|
||||
indexDoc("1", "2", 4, indexOps);
|
||||
indexDoc("1", "3", 3, indexOps);
|
||||
indexDoc("1", "4", 2, indexOps);
|
||||
indexDoc("1", "5", 1, indexOps);
|
||||
|
||||
// total docs in shard "1" = 15
|
||||
|
||||
indexDoc("2", "1", 3, indexOps);
|
||||
indexDoc("2", "2", 1, indexOps);
|
||||
indexDoc("2", "3", 5, indexOps);
|
||||
indexDoc("2", "4", 2, indexOps);
|
||||
indexDoc("2", "5", 1, indexOps);
|
||||
|
||||
// total docs in shard "2" = 12
|
||||
|
||||
indexRandom(true, indexOps);
|
||||
|
||||
long totalOnOne = client().prepareSearch("idx").setTypes("type").setRouting("1").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnOne, is(15l));
|
||||
long totalOnTwo = client().prepareSearch("idx").setTypes("type").setRouting("2").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnTwo, is(12l));
|
||||
ensureSearchable();
|
||||
}
|
||||
|
||||
private void indexDoc(String shard, String key, int times, List<IndexRequestBuilder> indexOps) throws Exception {
|
||||
for (int i = 0; i < times; i++) {
|
||||
indexOps.add(client().prepareIndex("idx", "type").setRouting(shard).setCreate(true).setSource(jsonBuilder()
|
||||
.startObject()
|
||||
.field("key", key)
|
||||
.endObject()));
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -0,0 +1,120 @@
|
||||
/*
|
||||
* Licensed to Elasticsearch under one or more contributor
|
||||
* license agreements. See the NOTICE file distributed with
|
||||
* this work for additional information regarding copyright
|
||||
* ownership. Elasticsearch licenses this file to you under
|
||||
* the Apache License, Version 2.0 (the "License"); you may
|
||||
* not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing,
|
||||
* software distributed under the License is distributed on an
|
||||
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
* KIND, either express or implied. See the License for the
|
||||
* specific language governing permissions and limitations
|
||||
* under the License.
|
||||
*/
|
||||
|
||||
package org.elasticsearch.search.aggregations.bucket;
|
||||
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Ignore;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope.SUITE;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
|
||||
@Ignore
|
||||
@ClusterScope(scope = SUITE)
|
||||
public abstract class ShardSizeTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
/**
|
||||
* to properly test the effect/functionality of shard_size, we need to force having 2 shards and also
|
||||
* control the routing such that certain documents will end on each shard. Using "djb" routing hash + ignoring the
|
||||
* doc type when hashing will ensure that docs with routing value "1" will end up in a different shard than docs with
|
||||
* routing value "2".
|
||||
*/
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
return ImmutableSettings.builder()
|
||||
.put("cluster.routing.operation.hash.type", "djb")
|
||||
.put("cluster.routing.operation.use_type", "false")
|
||||
.build();
|
||||
}
|
||||
|
||||
@Override
|
||||
protected int numberOfShards() {
|
||||
return 2;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
protected void createIdx(String keyFieldMapping) {
|
||||
assertAcked(prepareCreate("idx")
|
||||
.addMapping("type", "key", keyFieldMapping));
|
||||
}
|
||||
|
||||
protected void indexData() throws Exception {
|
||||
|
||||
/*
|
||||
|
||||
|
||||
|| || size = 3, shard_size = 5 || shard_size = size = 3 ||
|
||||
||==========||==================================================||===============================================||
|
||||
|| shard 1: || "1" - 5 | "2" - 4 | "3" - 3 | "4" - 2 | "5" - 1 || "1" - 5 | "3" - 3 | "2" - 4 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| shard 2: || "1" - 3 | "2" - 1 | "3" - 5 | "4" - 2 | "5" - 1 || "1" - 3 | "3" - 5 | "4" - 2 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| reduced: || "1" - 8 | "2" - 5 | "3" - 8 | "4" - 4 | "5" - 2 || ||
|
||||
|| || || "1" - 8, "3" - 8, "2" - 4 <= WRONG ||
|
||||
|| || "1" - 8 | "3" - 8 | "2" - 5 <= CORRECT || ||
|
||||
|
||||
|
||||
*/
|
||||
|
||||
|
||||
indexDoc("1", "1", 5);
|
||||
indexDoc("1", "2", 4);
|
||||
indexDoc("1", "3", 3);
|
||||
indexDoc("1", "4", 2);
|
||||
indexDoc("1", "5", 1);
|
||||
|
||||
// total docs in shard "1" = 15
|
||||
|
||||
indexDoc("2", "1", 3);
|
||||
indexDoc("2", "2", 1);
|
||||
indexDoc("2", "3", 5);
|
||||
indexDoc("2", "4", 2);
|
||||
indexDoc("2", "5", 1);
|
||||
|
||||
// total docs in shard "2" = 12
|
||||
|
||||
client().admin().indices().prepareFlush("idx").execute().actionGet();
|
||||
client().admin().indices().prepareRefresh("idx").execute().actionGet();
|
||||
|
||||
long totalOnOne = client().prepareSearch("idx").setTypes("type").setRouting("1").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnOne, is(15l));
|
||||
long totalOnTwo = client().prepareSearch("idx").setTypes("type").setRouting("2").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnTwo, is(12l));
|
||||
}
|
||||
|
||||
protected void indexDoc(String shard, String key, int times) throws Exception {
|
||||
for (int i = 0; i < times; i++) {
|
||||
client().prepareIndex("idx", "type").setRouting(shard).setCreate(true).setSource(jsonBuilder()
|
||||
.startObject()
|
||||
.field("key", key)
|
||||
.field("value", 1)
|
||||
.endObject()).execute().actionGet();
|
||||
}
|
||||
}
|
||||
}
|
@ -63,11 +63,8 @@ public class StringTermsTests extends ElasticsearchIntegrationTest {
|
||||
private static final String MULTI_VALUED_FIELD_NAME = "s_values";
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", between(1, 5))
|
||||
.put("index.number_of_replicas", between(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
public static String randomExecutionHint() {
|
||||
|
@ -19,8 +19,6 @@
|
||||
package org.elasticsearch.search.aggregations.metrics;
|
||||
|
||||
import org.elasticsearch.action.index.IndexRequestBuilder;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Before;
|
||||
|
||||
@ -33,13 +31,10 @@ import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
*
|
||||
*/
|
||||
public abstract class AbstractNumericTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", between(1, 5))
|
||||
.put("index.number_of_replicas", between(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
protected long minValue, maxValue, minValues, maxValues;
|
||||
|
@ -19,8 +19,6 @@
|
||||
package org.elasticsearch.search.aggregations.metrics;
|
||||
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.search.aggregations.metrics.valuecount.ValueCount;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Before;
|
||||
@ -38,11 +36,8 @@ import static org.hamcrest.Matchers.notNullValue;
|
||||
public class ValueCountTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", between(1, 5))
|
||||
.put("index.number_of_replicas", between(0, 1))
|
||||
.build();
|
||||
protected int numberOfReplicas() {
|
||||
return between(0, 1);
|
||||
}
|
||||
|
||||
@Before
|
||||
|
@ -21,7 +21,6 @@ package org.elasticsearch.search.basic;
|
||||
|
||||
import org.apache.lucene.util.LuceneTestCase.Slow;
|
||||
import org.elasticsearch.action.admin.cluster.health.ClusterHealthStatus;
|
||||
import org.elasticsearch.action.admin.indices.create.CreateIndexResponse;
|
||||
import org.elasticsearch.action.admin.indices.refresh.RefreshResponse;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.client.Client;
|
||||
@ -29,9 +28,7 @@ import org.elasticsearch.index.query.QueryBuilders;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.greaterThanOrEqualTo;
|
||||
|
||||
|
||||
@ -44,28 +41,28 @@ public class SearchWhileCreatingIndexTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
@Slow
|
||||
public void testIndexCausesIndexCreation() throws Exception {
|
||||
searchWhileCreatingIndex(-1, 1); // 1 replica in our default...
|
||||
searchWhileCreatingIndex(false, 1); // 1 replica in our default...
|
||||
}
|
||||
|
||||
@Test
|
||||
@Slow
|
||||
public void testNoReplicas() throws Exception {
|
||||
searchWhileCreatingIndex(10, 0);
|
||||
searchWhileCreatingIndex(true, 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@Slow
|
||||
public void testOneReplica() throws Exception {
|
||||
searchWhileCreatingIndex(10, 1);
|
||||
searchWhileCreatingIndex(true, 1);
|
||||
}
|
||||
|
||||
@Test
|
||||
@Slow
|
||||
public void testTwoReplicas() throws Exception {
|
||||
searchWhileCreatingIndex(10, 2);
|
||||
searchWhileCreatingIndex(true, 2);
|
||||
}
|
||||
|
||||
private void searchWhileCreatingIndex(int numberOfShards, int numberOfReplicas) throws Exception {
|
||||
private void searchWhileCreatingIndex(boolean createIndex, int numberOfReplicas) throws Exception {
|
||||
|
||||
// make sure we have enough nodes to guaranty default QUORUM consistency.
|
||||
// TODO: add a smarter choice based on actual consistency (when that is randomized)
|
||||
@ -74,10 +71,8 @@ public class SearchWhileCreatingIndexTests extends ElasticsearchIntegrationTest
|
||||
cluster().ensureAtLeastNumNodes(randomIntBetween(neededNodes, shardsNo));
|
||||
for (int i = 0; i < 20; i++) {
|
||||
logger.info("running iteration {}", i);
|
||||
if (numberOfShards > 0) {
|
||||
CreateIndexResponse createIndexResponse = prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put("index.number_of_shards", numberOfShards).put("index.number_of_replicas", numberOfReplicas)).get();
|
||||
assertThat(createIndexResponse.isAcknowledged(), equalTo(true));
|
||||
if (createIndex) {
|
||||
createIndex("test");
|
||||
}
|
||||
client().prepareIndex("test", "type1", randomAsciiOfLength(5)).setSource("field", "test").execute().actionGet();
|
||||
RefreshResponse refreshResponse = client().admin().indices().prepareRefresh("test").execute().actionGet();
|
||||
|
@ -51,7 +51,6 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes

@Test
public void testRandomDirectoryIOExceptions() throws IOException, InterruptedException, ExecutionException {
final int numShards = between(1, 5);
String mapping = XContentFactory.jsonBuilder().
startObject().
startObject("type").
@ -85,7 +84,7 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes
}

Builder settings = settingsBuilder()
.put("index.number_of_shards", numShards)
.put(indexSettings())
.put("index.number_of_replicas", randomIntBetween(0, 1))
.put(MockDirectoryHelper.RANDOM_IO_EXCEPTION_RATE, exceptionRate)
.put(MockDirectoryHelper.RANDOM_IO_EXCEPTION_RATE_ON_OPEN, exceptionOnOpenRate)
@ -122,6 +121,7 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes
} catch (ElasticsearchException ex) {
}
}
NumShards test = getNumShards("test");
logger.info("Start Refresh");
RefreshResponse refreshResponse = client().admin().indices().prepareRefresh("test").execute().get(); // don't assert on failures here
final boolean refreshFailed = refreshResponse.getShardFailures().length != 0 || refreshResponse.getFailedShards() != 0;
@ -134,7 +134,7 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes
long expectedResults = added[docToQuery] ? 1 : 0;
logger.info("Searching for [test:{}]", English.intToEnglish(docToQuery));
SearchResponse searchResponse = client().prepareSearch().setQuery(QueryBuilders.matchQuery("test", English.intToEnglish(docToQuery))).get();
logger.info("Successful shards: [{}] numShards: [{}]", searchResponse.getSuccessfulShards(), numShards);
logger.info("Successful shards: [{}] numShards: [{}]", searchResponse.getSuccessfulShards(), test.numPrimaries);
// check match all
searchResponse = client().prepareSearch().setQuery(QueryBuilders.matchAllQuery()).get();
} catch (SearchPhaseExecutionException ex) {
@ -150,7 +150,6 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes

@Test
public void testRandomExceptions() throws IOException, InterruptedException, ExecutionException {
final int numShards = between(1, 5);
String mapping = XContentFactory.jsonBuilder().
startObject().
startObject("type").
@ -184,7 +183,7 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes
}

Builder settings = settingsBuilder()
.put("index.number_of_shards", numShards)
.put(indexSettings())
.put("index.number_of_replicas", randomIntBetween(0, 1))
.put(MockInternalEngine.READER_WRAPPER_TYPE, RandomExceptionDirectoryReaderWrapper.class.getName())
.put(EXCEPTION_TOP_LEVEL_RATIO_KEY, topLevelRate)
@ -213,6 +212,7 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes
final boolean refreshFailed = refreshResponse.getShardFailures().length != 0 || refreshResponse.getFailedShards() != 0;
logger.info("Refresh failed [{}] numShardsFailed: [{}], shardFailuresLength: [{}], successfulShards: [{}], totalShards: [{}] ", refreshFailed, refreshResponse.getFailedShards(), refreshResponse.getShardFailures().length, refreshResponse.getSuccessfulShards(), refreshResponse.getTotalShards());

NumShards test = getNumShards("test");
final int numSearches = atLeast(100);
// we don't check anything here really just making sure we don't leave any open files or a broken index behind.
for (int i = 0; i < numSearches; i++) {
@ -221,14 +221,14 @@ public class SearchWithRandomExceptionsTests extends ElasticsearchIntegrationTes
long expectedResults = added[docToQuery] ? 1 : 0;
logger.info("Searching for [test:{}]", English.intToEnglish(docToQuery));
SearchResponse searchResponse = client().prepareSearch().setQuery(QueryBuilders.matchQuery("test", English.intToEnglish(docToQuery))).get();
logger.info("Successful shards: [{}] numShards: [{}]", searchResponse.getSuccessfulShards(), numShards);
if (searchResponse.getSuccessfulShards() == numShards && !refreshFailed) {
logger.info("Successful shards: [{}] numShards: [{}]", searchResponse.getSuccessfulShards(), test.numPrimaries);
if (searchResponse.getSuccessfulShards() == test.numPrimaries && !refreshFailed) {
assertThat(searchResponse.getHits().getTotalHits(), Matchers.equalTo(expectedResults));
}
// check match all
searchResponse = client().prepareSearch().setQuery(QueryBuilders.matchAllQuery()).get();
logger.info("Match all Successful shards: [{}] numShards: [{}]", searchResponse.getSuccessfulShards(), numShards);
if (searchResponse.getSuccessfulShards() == numShards && !refreshFailed) {
logger.info("Match all Successful shards: [{}] numShards: [{}]", searchResponse.getSuccessfulShards(), test.numPrimaries);
if (searchResponse.getSuccessfulShards() == test.numPrimaries && !refreshFailed) {
assertThat(searchResponse.getHits().getTotalHits(), Matchers.equalTo(numCreated));
}


@ -50,25 +50,27 @@ public class TransportSearchFailuresTests extends ElasticsearchIntegrationTest {
@Test
public void testFailedSearchWithWrongQuery() throws Exception {
logger.info("Start Testing failed search with wrong query");
prepareCreate("test", 1, settingsBuilder().put("index.number_of_shards", 3)
prepareCreate("test", 1, settingsBuilder().put(indexSettings())
.put("index.number_of_replicas", 2)
.put("routing.hash.type", "simple")).execute().actionGet();

client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForYellowStatus().execute().actionGet();
ensureYellow();

NumShards test = getNumShards("test");

for (int i = 0; i < 100; i++) {
index(client(), Integer.toString(i), "test", i);
}
RefreshResponse refreshResponse = client().admin().indices().refresh(refreshRequest("test")).actionGet();
assertThat(refreshResponse.getTotalShards(), equalTo(9));
assertThat(refreshResponse.getSuccessfulShards(), equalTo(3));
assertThat(refreshResponse.getTotalShards(), equalTo(test.totalNumShards));
assertThat(refreshResponse.getSuccessfulShards(), equalTo(test.numPrimaries));
assertThat(refreshResponse.getFailedShards(), equalTo(0));
for (int i = 0; i < 5; i++) {
try {
SearchResponse searchResponse = client().search(searchRequest("test").source("{ xxx }".getBytes(Charsets.UTF_8))).actionGet();
assertThat(searchResponse.getTotalShards(), equalTo(3));
assertThat(searchResponse.getTotalShards(), equalTo(test.numPrimaries));
assertThat(searchResponse.getSuccessfulShards(), equalTo(0));
assertThat(searchResponse.getFailedShards(), equalTo(3));
assertThat(searchResponse.getFailedShards(), equalTo(test.numPrimaries));
fail("search should fail");
} catch (ElasticsearchException e) {
assertThat(e.unwrapCause(), instanceOf(SearchPhaseExecutionException.class));
@ -81,23 +83,23 @@ public class TransportSearchFailuresTests extends ElasticsearchIntegrationTest {

logger.info("Running Cluster Health");
ClusterHealthResponse clusterHealth = client().admin().cluster().health(clusterHealthRequest("test")
.waitForYellowStatus().waitForRelocatingShards(0).waitForActiveShards(6)).actionGet();
.waitForYellowStatus().waitForRelocatingShards(0).waitForActiveShards(test.numPrimaries * 2)).actionGet();
logger.info("Done Cluster Health, status " + clusterHealth.getStatus());
assertThat(clusterHealth.isTimedOut(), equalTo(false));
assertThat(clusterHealth.getStatus(), equalTo(ClusterHealthStatus.YELLOW));
assertThat(clusterHealth.getActiveShards(), equalTo(6));
assertThat(clusterHealth.getActiveShards(), equalTo(test.numPrimaries * 2));

refreshResponse = client().admin().indices().refresh(refreshRequest("test")).actionGet();
assertThat(refreshResponse.getTotalShards(), equalTo(9));
assertThat(refreshResponse.getSuccessfulShards(), equalTo(6));
assertThat(refreshResponse.getTotalShards(), equalTo(test.totalNumShards));
assertThat(refreshResponse.getSuccessfulShards(), equalTo(test.numPrimaries * 2));
assertThat(refreshResponse.getFailedShards(), equalTo(0));

for (int i = 0; i < 5; i++) {
try {
SearchResponse searchResponse = client().search(searchRequest("test").source("{ xxx }".getBytes(Charsets.UTF_8))).actionGet();
assertThat(searchResponse.getTotalShards(), equalTo(3));
assertThat(searchResponse.getTotalShards(), equalTo(test.numPrimaries));
assertThat(searchResponse.getSuccessfulShards(), equalTo(0));
assertThat(searchResponse.getFailedShards(), equalTo(3));
assertThat(searchResponse.getFailedShards(), equalTo(test.numPrimaries));
fail("search should fail");
} catch (ElasticsearchException e) {
assertThat(e.unwrapCause(), instanceOf(SearchPhaseExecutionException.class));

@ -27,8 +27,8 @@ import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.search.MultiSearchResponse;
import org.elasticsearch.action.search.SearchPhaseExecutionException;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.Requests;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.functionscore.script.ScriptScoreFunctionBuilder;
@ -46,6 +46,7 @@ import java.util.Set;

import static org.elasticsearch.action.search.SearchType.*;
import static org.elasticsearch.client.Requests.*;
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_SHARDS;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.common.unit.TimeValue.timeValueMinutes;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
@ -60,24 +61,40 @@ import static org.hamcrest.Matchers.*;
*/
public class TransportTwoNodesSearchTests extends ElasticsearchIntegrationTest {

@Override
protected int numberOfReplicas() {
return 0;
}

private Set<String> prepareData() throws Exception {
return prepareData(-1);
}

private Set<String> prepareData(int numShards) throws Exception {
Set<String> fullExpectedIds = Sets.newHashSet();

ImmutableSettings.Builder settingsBuilder = settingsBuilder()
.put(indexSettings())
.put("routing.hash.type", "simple");

if (numShards > 0) {
settingsBuilder.put(SETTING_NUMBER_OF_SHARDS, numShards);
}

client().admin().indices().create(createIndexRequest("test")
.settings(settingsBuilder().put("index.number_of_shards", 3)
.put("index.number_of_replicas", 0)
.put("routing.hash.type", "simple")))
.settings(settingsBuilder))
.actionGet();

ensureGreen();
for (int i = 0; i < 100; i++) {
index(client(), Integer.toString(i), "test", i);
index(Integer.toString(i), "test", i);
fullExpectedIds.add(Integer.toString(i));
}
client().admin().indices().refresh(refreshRequest("test")).actionGet();
refresh();
return fullExpectedIds;
}

private void index(Client client, String id, String nameValue, int age) throws IOException {
private void index(String id, String nameValue, int age) throws IOException {
client().index(Requests.indexRequest("test").type("type1").id(id).source(source(id, nameValue, age))).actionGet();
}

@ -242,7 +259,7 @@ public class TransportTwoNodesSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testQueryAndFetch() throws Exception {
prepareData();
prepareData(3);

SearchSourceBuilder source = searchSource()
.query(termQuery("multi", "test"))
@ -281,7 +298,7 @@ public class TransportTwoNodesSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testDfsQueryAndFetch() throws Exception {
prepareData();
prepareData(3);

SearchSourceBuilder source = searchSource()
.query(termQuery("multi", "test"))
@ -341,12 +358,14 @@ public class TransportTwoNodesSearchTests extends ElasticsearchIntegrationTest {
public void testFailedSearchWithWrongQuery() throws Exception {
prepareData();

NumShards test = getNumShards("test");

logger.info("Start Testing failed search with wrong query");
try {
SearchResponse searchResponse = client().search(searchRequest("test").source("{ xxx }".getBytes(Charsets.UTF_8))).actionGet();
assertThat(searchResponse.getTotalShards(), equalTo(3));
assertThat(searchResponse.getTotalShards(), equalTo(test.numPrimaries));
assertThat(searchResponse.getSuccessfulShards(), equalTo(0));
assertThat(searchResponse.getFailedShards(), equalTo(3));
assertThat(searchResponse.getFailedShards(), equalTo(test.numPrimaries));
fail("search should fail");
} catch (ElasticsearchException e) {
assertThat(e.unwrapCause(), instanceOf(SearchPhaseExecutionException.class));
@ -359,14 +378,16 @@ public class TransportTwoNodesSearchTests extends ElasticsearchIntegrationTest {
public void testFailedSearchWithWrongFrom() throws Exception {
prepareData();

NumShards test = getNumShards("test");

logger.info("Start Testing failed search with wrong from");
SearchSourceBuilder source = searchSource()
.query(termQuery("multi", "test"))
.from(1000).size(20).explain(true);
SearchResponse response = client().search(searchRequest("test").searchType(DFS_QUERY_AND_FETCH).source(source)).actionGet();
assertThat(response.getHits().hits().length, equalTo(0));
assertThat(response.getTotalShards(), equalTo(3));
assertThat(response.getSuccessfulShards(), equalTo(3));
assertThat(response.getTotalShards(), equalTo(test.numPrimaries));
assertThat(response.getSuccessfulShards(), equalTo(test.numPrimaries));
assertThat(response.getFailedShards(), equalTo(0));

response = client().search(searchRequest("test").searchType(QUERY_THEN_FETCH).source(source)).actionGet();

@ -30,9 +30,8 @@ import org.elasticsearch.action.search.SearchPhaseExecutionException;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.action.search.SearchType;
|
||||
import org.elasticsearch.action.search.ShardSearchFailure;
|
||||
import org.elasticsearch.common.Priority;
|
||||
import org.elasticsearch.cluster.metadata.IndexMetaData;
|
||||
import org.elasticsearch.common.lucene.search.function.CombineFunction;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.unit.TimeValue;
|
||||
import org.elasticsearch.index.mapper.MergeMappingException;
|
||||
import org.elasticsearch.index.query.*;
|
||||
@ -52,7 +51,8 @@ import java.util.concurrent.atomic.AtomicReference;
|
||||
|
||||
import static com.google.common.collect.Maps.newHashMap;
|
||||
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_REPLICAS;
|
||||
import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_SHARDS;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.builder;
|
||||
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.FilterBuilders.*;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.*;
|
||||
@ -68,18 +68,16 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void multiLevelChild() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.addMapping("grandchild", "_parent", "type=child")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
.get();
|
||||
.addMapping("grandchild", "_parent", "type=child"));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "parent", "p1").setSource("p_field", "p_value1").get();
|
||||
client().prepareIndex("test", "child", "c1").setSource("c_field", "c_value1").setParent("p1").get();
|
||||
client().prepareIndex("test", "grandchild", "gc1").setSource("gc_field", "gc_value1")
|
||||
.setParent("c1").setRouting("gc1").get();
|
||||
.setParent("c1").setRouting("p1").get();
|
||||
refresh();
|
||||
|
||||
SearchResponse searchResponse = client()
|
||||
@ -125,17 +123,15 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
// see #2744
|
||||
public void test2744() throws ElasticsearchException, IOException {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("test", "_parent", "type=foo")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
.get();
|
||||
.addMapping("test", "_parent", "type=foo"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
client().prepareIndex("test", "foo", "1").setSource("foo", 1).get();
|
||||
client().prepareIndex("test", "test").setSource("foo", 1).setParent("1").get();
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
refresh();
|
||||
SearchResponse searchResponse = client().prepareSearch("test").setQuery(hasChildQuery("test", matchQuery("foo", 1))).execute()
|
||||
.actionGet();
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
@ -146,11 +142,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void simpleChildQuery() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -251,13 +245,8 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testClearIdCacheBug() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).get();
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent"));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "parent", "p0").setSource("p_field", "p_value0").get();
|
||||
@ -281,7 +270,7 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
client().prepareIndex("test", "child", "c3").setSource("c_field", "blue").setParent("p2").get();
|
||||
client().prepareIndex("test", "child", "c4").setSource("c_field", "red").setParent("p2").get();
|
||||
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
refresh();
|
||||
|
||||
indicesStatsResponse = client().admin().indices()
|
||||
.prepareStats("test").setFieldData(true).get();
|
||||
@ -312,11 +301,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
// See: https://github.com/elasticsearch/elasticsearch/issues/3290
|
||||
public void testCachingBug_withFqueryFilter() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -330,8 +317,7 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
client().prepareIndex("test", "child", Integer.toString(i + 10)).setSource("c_field", i + 10).setParent(Integer.toString(i))
|
||||
.get();
|
||||
}
|
||||
client().admin().indices().prepareFlush().get();
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
flushAndRefresh();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
SearchResponse searchResponse = client().prepareSearch("test")
|
||||
@ -351,11 +337,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testHasParentFilter() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
Map<String, Set<String>> parentToChildren = newHashMap();
|
||||
// Childless parent
|
||||
@ -382,7 +366,7 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
}
|
||||
assertThat(parentToChildren.get(previousParentId).add(childId), is(true));
|
||||
}
|
||||
indexRandom(true, builders.toArray(new IndexRequestBuilder[0]));
|
||||
indexRandom(true, builders.toArray(new IndexRequestBuilder[builders.size()]));
|
||||
|
||||
assertThat(parentToChildren.isEmpty(), equalTo(false));
|
||||
for (Map.Entry<String, Set<String>> parentToChildrenEntry : parentToChildren.entrySet()) {
|
||||
@ -404,11 +388,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void simpleChildQueryWithFlush() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data with flushes, so we have many segments
|
||||
@ -503,115 +485,11 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(searchResponse.getHits().getAt(1).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void simpleChildQueryWithFlushAnd3Shards() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 3).put("index.number_of_replicas", 0))
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
ensureGreen();
|
||||
|
||||
// index simple data with flushes, so we have many segments
|
||||
client().prepareIndex("test", "parent", "p1").setSource("p_field", "p_value1").get();
|
||||
client().admin().indices().prepareFlush().get();
|
||||
client().prepareIndex("test", "child", "c1").setSource("c_field", "red").setParent("p1").get();
|
||||
client().admin().indices().prepareFlush().get();
|
||||
client().prepareIndex("test", "child", "c2").setSource("c_field", "yellow").setParent("p1").get();
|
||||
client().admin().indices().prepareFlush().get();
|
||||
client().prepareIndex("test", "parent", "p2").setSource("p_field", "p_value2").get();
|
||||
client().admin().indices().prepareFlush().get();
|
||||
client().prepareIndex("test", "child", "c3").setSource("c_field", "blue").setParent("p2").get();
|
||||
client().admin().indices().prepareFlush().get();
|
||||
client().prepareIndex("test", "child", "c4").setSource("c_field", "red").setParent("p2").get();
|
||||
client().admin().indices().prepareFlush().get();
|
||||
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
|
||||
// TOP CHILDREN QUERY
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch("test").setQuery(topChildrenQuery("child", termQuery("c_field", "yellow")))
|
||||
.get();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p1"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(topChildrenQuery("child", termQuery("c_field", "blue"))).execute()
|
||||
.actionGet();
|
||||
if (searchResponse.getFailedShards() > 0) {
|
||||
logger.warn("Failed shards:");
|
||||
for (ShardSearchFailure shardSearchFailure : searchResponse.getShardFailures()) {
|
||||
logger.warn("-> {}", shardSearchFailure);
|
||||
}
|
||||
}
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p2"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(topChildrenQuery("child", termQuery("c_field", "red"))).execute()
|
||||
.actionGet();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(2l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
assertThat(searchResponse.getHits().getAt(1).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
|
||||
// HAS CHILD QUERY
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(hasChildQuery("child", termQuery("c_field", "yellow"))).execute()
|
||||
.actionGet();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p1"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(hasChildQuery("child", termQuery("c_field", "blue"))).execute()
|
||||
.actionGet();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p2"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(hasChildQuery("child", termQuery("c_field", "red"))).get();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(2l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
assertThat(searchResponse.getHits().getAt(1).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
|
||||
// HAS CHILD FILTER
|
||||
|
||||
searchResponse = client().prepareSearch("test")
|
||||
.setQuery(constantScoreQuery(hasChildFilter("child", termQuery("c_field", "yellow")))).get();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p1"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(constantScoreQuery(hasChildFilter("child", termQuery("c_field", "blue"))))
|
||||
.get();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p2"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setQuery(constantScoreQuery(hasChildFilter("child", termQuery("c_field", "red"))))
|
||||
.get();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(2l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
assertThat(searchResponse.getHits().getAt(1).id(), anyOf(equalTo("p2"), equalTo("p1")));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testScopedFacet() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -648,11 +526,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDeletedParent() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
// index simple data
|
||||
client().prepareIndex("test", "parent", "p1").setSource("p_field", "p_value1").get();
|
||||
@ -711,11 +587,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDfsSearchType() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -745,11 +619,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testFixAOBEIfTopChildrenIsWrappedInMusNotClause() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -770,11 +642,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testTopChildrenReSearchBug() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
int numberOfParents = 4;
|
||||
int numberOfChildrenPerParent = 123;
|
||||
@ -803,16 +673,15 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getFailedShards(), equalTo(0));
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(2l));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), equalTo("p2"));
|
||||
assertThat(searchResponse.getHits().getAt(0).id(), anyOf(equalTo("p2"), equalTo("p4")));
|
||||
assertThat(searchResponse.getHits().getAt(1).id(), anyOf(equalTo("p2"), equalTo("p4")));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testHasChildAndHasParentFailWhenSomeSegmentsDontContainAnyParentOrChildDocs() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "parent", "1").setSource("p_field", 1).get();
|
||||
@ -837,11 +706,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testCountApiUsage() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
String parentId = "p1";
|
||||
@ -872,11 +739,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testExplainUsage() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
String parentId = "p1";
|
||||
@ -970,16 +835,13 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testScoreForParentChildQueries_withFunctionScore() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.addMapping("child1", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child1", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
indexRandom(false, createDocBuilders().toArray(new IndexRequestBuilder[0]));
|
||||
refresh();
|
||||
indexRandom(true, createDocBuilders().toArray(new IndexRequestBuilder[0]));
|
||||
SearchResponse response = client()
|
||||
.prepareSearch("test")
|
||||
.setQuery(
|
||||
@ -1057,11 +919,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
// https://github.com/elasticsearch/elasticsearch/issues/2536
|
||||
public void testParentChildQueriesCanHandleNoRelevantTypesInIndex() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
SearchResponse response = client().prepareSearch("test")
|
||||
@ -1093,11 +953,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testHasChildAndHasParentFilter_withFilter() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "parent", "1").setSource("p_field", 1).get();
|
||||
@ -1124,11 +982,13 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleQueryRewrite() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
//top_children query needs at least 2 shards for the totalHits to be accurate
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put(IndexMetaData.SETTING_NUMBER_OF_SHARDS, between(2, DEFAULT_MAX_NUM_SHARDS)))
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1170,7 +1030,7 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(searchResponse.getHits().hits()[4].id(), equalTo("c004"));
|
||||
|
||||
searchResponse = client().prepareSearch("test").setSearchType(searchType)
|
||||
.setQuery(topChildrenQuery("child", prefixQuery("c_field", "c"))).addSort("p_field", SortOrder.ASC).setSize(5)
|
||||
.setQuery(topChildrenQuery("child", prefixQuery("c_field", "c")).factor(10)).addSort("p_field", SortOrder.ASC).setSize(5)
|
||||
.get();
|
||||
assertNoFailures(searchResponse);
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(10L));
|
||||
@ -1186,11 +1046,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
// See also issue:
|
||||
// https://github.com/elasticsearch/elasticsearch/issues/3144
|
||||
public void testReIndexingParentAndChildDocuments() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1255,11 +1113,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
// See also issue:
|
||||
// https://github.com/elasticsearch/elasticsearch/issues/3203
|
||||
public void testHasChildQueryWithMinimumScore() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1283,13 +1139,12 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testParentFieldFilter() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 1)
|
||||
.put("index.refresh_interval", -1))
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder().put(indexSettings())
|
||||
.put("index.refresh_interval", -1))
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.addMapping("child2", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child2", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// test term filter
|
||||
@ -1351,15 +1206,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testHasChildNotBeingCached() throws ElasticsearchException, IOException {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1395,16 +1244,13 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDeleteByQuery_has_child() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 1)
|
||||
settingsBuilder().put(indexSettings())
|
||||
.put("index.refresh_interval", "-1")
|
||||
)
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1441,16 +1287,14 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDeleteByQuery_has_child_SingleRefresh() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 1)
|
||||
settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.refresh_interval", "-1")
|
||||
)
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1497,16 +1341,14 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDeleteByQuery_has_parent() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 1)
|
||||
settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.refresh_interval", "-1")
|
||||
)
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1552,17 +1394,11 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
// Relates to bug: https://github.com/elasticsearch/elasticsearch/issues/3818
|
||||
public void testHasChildQueryOnlyReturnsSingleChildType() {
|
||||
client().admin().indices().prepareCreate("grandissue")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
)
|
||||
assertAcked(prepareCreate("grandissue")
|
||||
.addMapping("grandparent", "name", "type=string")
|
||||
.addMapping("parent", "_parent", "type=grandparent")
|
||||
.addMapping("child_type_one", "_parent", "type=parent")
|
||||
.addMapping("child_type_two", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child_type_two", "_parent", "type=parent"));
|
||||
|
||||
client().prepareIndex("grandissue", "grandparent", "1").setSource("name", "Grandpa").get();
|
||||
client().prepareIndex("grandissue", "parent", "2").setParent("1").setSource("name", "Dana").get();
|
||||
@ -1611,15 +1447,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void indexChildDocWithNoParentMapping() throws ElasticsearchException, IOException {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child1")
|
||||
.get();
|
||||
.addMapping("child1"));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "parent", "p1").setSource("p_field", "p_value1", "_parent", "bla").get();
|
||||
@ -1641,12 +1471,7 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testAddingParentToExistingMapping() throws ElasticsearchException, IOException {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
).get();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
PutMappingResponse putMappingResponse = client().admin().indices().preparePutMapping("test").setType("child").setSource("number", "type=integer")
|
||||
@ -1672,15 +1497,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
// The SimpleIdReaderTypeCache#docById method used lget, which can't be used if a map is shared.
|
||||
public void testTopChildrenBug_concurrencyIssue() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -1730,15 +1549,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testHasChildQueryWithNestedInnerObjects() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent", "objects", "type=nested")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "parent", "p1")
|
||||
@ -1778,11 +1591,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testNamedFilters() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
String parentId = "p1";
|
||||
@ -1823,17 +1634,15 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testParentChildQueriesNoParentType() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.refresh_interval", -1)
|
||||
.put("index.number_of_replicas", 0))
|
||||
.get();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().get();
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.refresh_interval", -1)));
|
||||
ensureGreen();
|
||||
|
||||
String parentId = "p1";
|
||||
client().prepareIndex("test", "parent", parentId).setSource("p_field", "1").get();
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
refresh();
|
||||
|
||||
try {
|
||||
client().prepareSearch("test")
|
||||
@ -1892,17 +1701,15 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testAdd_ParentFieldAfterIndexingParentDocButBeforeIndexingChildDoc() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.refresh_interval", -1)
|
||||
.put("index.number_of_replicas", 0))
|
||||
.get();
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.refresh_interval", -1)));
|
||||
ensureGreen();
|
||||
|
||||
String parentId = "p1";
|
||||
client().prepareIndex("test", "parent", parentId).setSource("p_field", "1").get();
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
refresh();
|
||||
assertAcked(client().admin()
|
||||
.indices()
|
||||
.preparePutMapping("test")
|
||||
@ -1951,16 +1758,14 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testParentChildCaching() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(
|
||||
ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1)
|
||||
.put("index.number_of_replicas", 0)
|
||||
settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("index.refresh_interval", -1)
|
||||
)
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
// index simple data
|
||||
@ -2004,18 +1809,16 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testParentChildQueriesViaScrollApi() throws Exception {
|
||||
client().admin().indices().prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("parent")
|
||||
.addMapping("child", "_parent", "type=parent")
|
||||
.get();
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("test", "parent", "p" + i).setSource("{}").get();
|
||||
client().prepareIndex("test", "child", "c" + i).setSource("{}").setParent("p" + i).get();
|
||||
}
|
||||
|
||||
client().admin().indices().prepareRefresh().get();
|
||||
refresh();
|
||||
|
||||
QueryBuilder[] queries = new QueryBuilder[]{
|
||||
hasChildQuery("child", matchAllQuery()),
|
||||
@ -2052,8 +1855,10 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testValidateThatHasChildAndHasParentFilterAreNeverCached() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 1, SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(builder().put(indexSettings())
|
||||
//we need 0 replicas here to make sure we always hit the very same shards
|
||||
.put(SETTING_NUMBER_OF_REPLICAS, 0))
|
||||
.addMapping("child", "_parent", "type=parent"));
|
||||
ensureGreen();
|
||||
|
||||
@ -2084,6 +1889,9 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
|
||||
.get();
|
||||
assertHitCount(searchResponse, 1l);
|
||||
|
||||
statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
|
||||
assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), equalTo(initialCacheSize));
|
||||
|
||||
searchResponse = client().prepareSearch("test")
|
||||
.setQuery(QueryBuilders.filteredQuery(matchAllQuery(), FilterBuilders.hasParentFilter("parent", matchAllQuery()).cache(true)))
|
||||
.get();
|
||||
|
@ -23,8 +23,6 @@ import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.collect.Tuple;
import org.elasticsearch.common.regex.Regex;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.text.StringText;
import org.elasticsearch.common.text.Text;
import org.elasticsearch.index.query.QueryBuilders;
@ -44,22 +42,19 @@ import static org.hamcrest.Matchers.equalTo;
public class ExtendedFacetsTests extends ElasticsearchIntegrationTest {

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", numberOfShards())
.put("index.number_of_replicas", 0)
.build();
}

protected int numberOfShards() {
return 1;
}

@Override
protected int numberOfReplicas() {
return 0;
}

protected int numDocs() {
return 2500;
}


@Test
@Slow
public void testTermFacet_stringFields() throws Throwable {

@ -26,11 +26,8 @@ import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchType;
import org.elasticsearch.action.search.ShardSearchFailure;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.joda.Joda;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.search.facet.datehistogram.DateHistogramFacet;
@ -60,6 +57,7 @@ import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.FilterBuilders.termFilter;
import static org.elasticsearch.index.query.QueryBuilders.*;
import static org.elasticsearch.search.facet.FacetBuilders.*;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.hamcrest.Matchers.*;

@ -69,12 +67,10 @@ import static org.hamcrest.Matchers.*;
public class SimpleFacetsTests extends ElasticsearchIntegrationTest {

private int numRuns = -1;

@Override
public Settings indexSettings() {
return ImmutableSettings.builder()
.put("index.number_of_shards", between(1, 5))
.put("index.number_of_replicas", 0)
.build();
protected int numberOfReplicas() {
return between(0, 1);
}

protected int numberOfRuns() {
@ -133,9 +129,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
@Test
public void testBinaryFacet() throws Exception {
createIndex("test");
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();

client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
ensureGreen();

client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
.field("tag", "green")
@ -175,16 +169,15 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {

@Test
public void testFacetNumeric() throws ElasticsearchException, IOException {
prepareCreate("test").addMapping("type", jsonBuilder().startObject().startObject("type").startObject("properties")
assertAcked(prepareCreate("test").addMapping("type", jsonBuilder().startObject().startObject("type").startObject("properties")
.startObject("byte").field("type", "byte").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("short").field("type", "short").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("integer").field("type", "integer").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("long").field("type", "long").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("float").field("type", "float").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("double").field("type", "double").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.endObject().endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject().endObject()));
ensureGreen();

for (int i = 0; i < 100; i++) {
client().prepareIndex("test", "type", "" + i).setSource(jsonBuilder().startObject()
@ -306,14 +299,12 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
assertThat(facet.getOtherCount(), equalTo(90l));
assertThat(facet.getMissingCount(), equalTo(10l));
}

}


@Test
@Slow
public void testConcurrentFacets() throws ElasticsearchException, IOException, InterruptedException, ExecutionException {
prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("type", jsonBuilder().startObject().startObject("type").startObject("properties")
.startObject("byte").field("type", "byte").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("short").field("type", "short").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
@ -321,9 +312,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
.startObject("long").field("type", "long").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("float").field("type", "float").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.startObject("double").field("type", "double").startObject("fielddata").field("format", maybeDocValues() ? "doc_values" : null).endObject().endObject()
.endObject().endObject().endObject())
.execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject().endObject()));
ensureGreen();

for (int i = 0; i < 100; i++) {
client().prepareIndex("test", "type", "" + i).setSource(jsonBuilder().startObject()
@ -481,49 +471,48 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
@Test
@Slow
public void testDuelByteFieldDataImpl() throws ElasticsearchException, IOException, InterruptedException, ExecutionException {
prepareCreate("test")
.addMapping("type", jsonBuilder().startObject().startObject("type").startObject("properties")
.startObject("name_paged")
.field("type", "string")
.startObject("fielddata").field("format", "paged_bytes").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
.endObject()
.startObject("name_fst")
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_dv")
|
||||
.field("type", "string")
|
||||
.field("index", "no")
|
||||
.startObject("fielddata").field("format", "doc_values").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_paged_mv")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "paged_bytes").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_fst_mv")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_dv_mv")
|
||||
.field("type", "string")
|
||||
.field("index", "no")
|
||||
.startObject("fielddata").field("format", "doc_values").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("filtered")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").startObject("filter")
|
||||
.startObject("regex").field("pattern", "\\d{1,2}").endObject().endObject()
|
||||
.endObject()
|
||||
// only 1 or 2 digits
|
||||
.endObject()
|
||||
.startObject("filtered_mv")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").startObject("filter")
|
||||
.startObject("regex").field("pattern", "\\d{1,2}").endObject().endObject()
|
||||
.endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type", jsonBuilder().startObject().startObject("type").startObject("properties")
|
||||
.startObject("name_paged")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "paged_bytes").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_fst")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_dv")
|
||||
.field("type", "string")
|
||||
.field("index", "no")
|
||||
.startObject("fielddata").field("format", "doc_values").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_paged_mv")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "paged_bytes").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_fst_mv")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("name_dv_mv")
|
||||
.field("type", "string")
|
||||
.field("index", "no")
|
||||
.startObject("fielddata").field("format", "doc_values").field("loading", randomBoolean() ? "eager" : "lazy").endObject()
|
||||
.endObject()
|
||||
.startObject("filtered")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").startObject("filter")
|
||||
.startObject("regex").field("pattern", "\\d{1,2}").endObject().endObject()
|
||||
.endObject()
|
||||
// only 1 or 2 digits
|
||||
.endObject()
|
||||
.startObject("filtered_mv")
|
||||
.field("type", "string")
|
||||
.startObject("fielddata").field("format", "fst").field("loading", randomBoolean() ? "eager" : "lazy").startObject("filter")
|
||||
.startObject("regex").field("pattern", "\\d{1,2}").endObject().endObject()
|
||||
.endObject()
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 100; i++) {
|
||||
client().prepareIndex("test", "type", "" + i).setSource(jsonBuilder().startObject()
|
||||
@ -639,9 +628,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testSearchFilter() throws Exception {
|
||||
createIndex("test");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("tag", "green")
|
||||
@ -689,9 +676,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testFacetsWithSize0() throws Exception {
|
||||
createIndex("test");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("stag", "111")
|
||||
@ -747,7 +732,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
public void testTermsIndexFacet() throws Exception {
|
||||
createIndex("test1");
|
||||
createIndex("test2");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test1", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("stag", "111")
|
||||
@ -791,7 +776,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testFilterFacets() throws Exception {
|
||||
createIndex("test");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("stag", "111")
|
||||
@ -804,7 +789,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startArray("tag").value("zzz").value("yyy").endArray()
|
||||
.endObject()).execute().actionGet();
|
||||
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
for (int i = 0; i < numberOfRuns(); i++) {
|
||||
SearchResponse searchResponse = client().prepareSearch()
|
||||
@ -835,7 +820,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testTermsFacetsMissing() throws Exception {
|
||||
prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("bstag").field("type", "byte").endObject()
|
||||
.startObject("shstag").field("type", "short").endObject()
|
||||
@ -843,9 +828,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("lstag").field("type", "long").endObject()
|
||||
.startObject("fstag").field("type", "float").endObject()
|
||||
.startObject("dstag").field("type", "double").endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("stag", "111")
|
||||
@ -883,7 +867,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
}
|
||||
|
||||
private void testTermsFacets(String executionHint) throws Exception {
|
||||
prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("bstag").field("type", "byte").endObject()
|
||||
.startObject("shstag").field("type", "short").endObject()
|
||||
@ -891,9 +875,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("lstag").field("type", "long").endObject()
|
||||
.startObject("fstag").field("type", "float").endObject()
|
||||
.startObject("dstag").field("type", "double").endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("stag", "111")
|
||||
@ -922,7 +905,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startArray("dtag").value(3000.1).value(2000.1).endArray()
|
||||
.endObject()).execute().actionGet();
|
||||
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
for (int i = 0; i < numberOfRuns(); i++) {
|
||||
SearchResponse searchResponse = client().prepareSearch()
|
||||
@ -1293,9 +1276,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testTermFacetWithEqualTermDistribution() throws Exception {
|
||||
createIndex("test");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
// at the end of the index, we should have 10 of each `bar`, `foo`, and `baz`
|
||||
for (int i = 0; i < 5; i++) {
|
||||
@ -1314,7 +1295,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.field("text", "baz foo")
|
||||
.endObject()).execute().actionGet();
|
||||
}
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
for (int i = 0; i < numberOfRuns(); i++) {
|
||||
SearchResponse searchResponse = client().prepareSearch()
|
||||
@ -1341,8 +1322,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("num").field("type", "integer").endObject()
|
||||
.startObject("multi_num").field("type", "float").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("num", 1)
|
||||
@ -1439,8 +1420,9 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
String mapping = jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("num").field("type", "integer").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("num", 100)
|
||||
.endObject()).execute().actionGet();
|
||||
@ -1486,8 +1468,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("multi_num").field("type", "float").endObject()
|
||||
.startObject("date").field("type", "date").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("num", 1055)
|
||||
@ -1655,8 +1637,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("multi_value").field("type", "float").endObject()
|
||||
.startObject("date").field("type", "date").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("num", 1055)
|
||||
@ -1825,8 +1807,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("date").field("type", "date").endObject()
|
||||
.startObject("date_in_seconds").field("type", "long").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
DateTimeFormatter parser = Joda.forPattern("dateOptionalTime").parser();
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("date", "2009-03-05T01:01:01")
|
||||
@ -1952,8 +1934,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("num").field("type", "integer").endObject()
|
||||
.startObject("date").field("type", "date").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("date", "2009-03-05T23:31:01")
|
||||
@ -2019,8 +2001,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("num").field("type", "integer").endObject()
|
||||
.startObject("multi_num").field("type", "float").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("field", "xxx")
|
||||
@ -2205,8 +2187,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("num").field("type", "float").endObject()
|
||||
.startObject("multi_num").field("type", "integer").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource(jsonBuilder().startObject()
|
||||
.field("lField", 100l)
|
||||
@ -2277,8 +2259,8 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
String mapping = jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("num").field("type", "float").endObject()
|
||||
.endObject().endObject().endObject().string();
|
||||
prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 20; i++) {
|
||||
client().prepareIndex("test", "type1", Integer.toString(i)).setSource("num", i % 10).execute().actionGet();
|
||||
@ -2310,7 +2292,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testQueryFacet() throws Exception {
|
||||
createIndex("test");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 20; i++) {
|
||||
client().prepareIndex("test", "type1", Integer.toString(i)).setSource("num", i % 10).execute().actionGet();
|
||||
@ -2351,7 +2333,7 @@ public class SimpleFacetsTests extends ElasticsearchIntegrationTest {
|
||||
.field("field", "xxx")
|
||||
.endObject()).execute().actionGet();
|
||||
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
for (int i = 0; i < numberOfRuns(); i++) {
|
||||
SearchResponse searchResponse = client().prepareSearch()
|
||||
|
@ -20,51 +20,23 @@ package org.elasticsearch.search.facet.terms;
|
||||
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.search.aggregations.bucket.ShardSizeTests;
|
||||
import org.elasticsearch.search.facet.Facets;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.search.facet.FacetBuilders.termsFacet;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
@ClusterScope(scope = Scope.SUITE)
|
||||
public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
/**
|
||||
* to properly test the effect/functionality of shard_size, we need to force having 2 shards and also
|
||||
* control the routing such that certain documents will end on each shard. Using "djb" routing hash + ignoring the
|
||||
* doc type when hashing will ensure that docs with routing value "1" will end up in a different shard than docs with
|
||||
* routing value "2".
|
||||
*/
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 0)
|
||||
.put("cluster.routing.operation.hash.type", "djb")
|
||||
.put("cluster.routing.operation.use_type", "false")
|
||||
.build();
|
||||
}
|
||||
public class ShardSizeTermsFacetTests extends ShardSizeTests {
|
||||
|
||||
@Test
|
||||
public void noShardSize_string() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -90,9 +62,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_string() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -118,9 +88,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_string_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -146,9 +114,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_string_withExecutionHintMap() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -174,9 +140,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_string_withExecutionHintMap_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -202,9 +166,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void noShardSize_long() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -230,9 +192,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_long() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -258,9 +218,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_long_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -286,9 +244,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void noShardSize_double() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -314,9 +270,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_double() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -342,9 +296,7 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void withShardSize_double_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -366,58 +318,4 @@ public class ShardSizeTermsFacetTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(entry.getCount(), equalTo(expected.get(entry.getTermAsNumber().intValue())));
|
||||
}
|
||||
}
|
||||
|
||||
private void indexData() throws Exception {
|
||||
|
||||
/*
|
||||
|
||||
|
||||
|| || size = 3, shard_size = 5 || shard_size = size = 3 ||
|
||||
||==========||==================================================||===============================================||
|
||||
|| shard 1: || "1" - 5 | "2" - 4 | "3" - 3 | "4" - 2 | "5" - 1 || "1" - 5 | "3" - 3 | "2" - 4 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| shard 2: || "1" - 3 | "2" - 1 | "3" - 5 | "4" - 2 | "5" - 1 || "1" - 3 | "3" - 5 | "4" - 2 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| reduced: || "1" - 8 | "2" - 5 | "3" - 8 | "4" - 4 | "5" - 2 || ||
|
||||
|| || || "1" - 8, "3" - 8, "2" - 4 <= WRONG ||
|
||||
|| || "1" - 8 | "3" - 8 | "2" - 5 <= CORRECT || ||
|
||||
|
||||
|
||||
*/
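// A worked reading of the table above:
// with size = 3 and no shard_size, each shard only returns its local top 3 terms:
//   shard 1 -> "1":5, "2":4, "3":3        shard 2 -> "3":5, "1":3, "4":2
// the reduce phase can only sum what it received, so it reports
//   "1":8, "3":8, "2":4   ("2" is undercounted because shard 2 never sent its "2":1)
// with shard_size = 5 both shards return all five terms, the reduce phase sees
//   "2":4 + "2":1 = 5, and the correct top three "1":8, "3":8, "2":5 comes out.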
|
||||
|
||||
|
||||
indexDoc("1", "1", 5);
|
||||
indexDoc("1", "2", 4);
|
||||
indexDoc("1", "3", 3);
|
||||
indexDoc("1", "4", 2);
|
||||
indexDoc("1", "5", 1);
|
||||
|
||||
// total docs in shard "1" = 15
|
||||
|
||||
indexDoc("2", "1", 3);
|
||||
indexDoc("2", "2", 1);
|
||||
indexDoc("2", "3", 5);
|
||||
indexDoc("2", "4", 2);
|
||||
indexDoc("2", "5", 1);
|
||||
|
||||
// total docs in shard "2" = 12
|
||||
|
||||
client().admin().indices().prepareFlush("idx").execute().actionGet();
|
||||
client().admin().indices().prepareRefresh("idx").execute().actionGet();
|
||||
|
||||
long totalOnOne = client().prepareSearch("idx").setTypes("type").setRouting("1").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnOne, is(15l));
|
||||
long totalOnTwo = client().prepareSearch("idx").setTypes("type").setRouting("2").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnTwo, is(12l));
|
||||
}
|
||||
|
||||
private void indexDoc(String shard, String key, int times) throws Exception {
|
||||
for (int i = 0; i < times; i++) {
|
||||
client().prepareIndex("idx", "type").setRouting(shard).setCreate(true).setSource(jsonBuilder()
|
||||
.startObject()
|
||||
.field("key", key)
|
||||
.endObject()).execute().actionGet();
|
||||
}
|
||||
}
|
||||
|
||||
}
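// The createIdx(...) and indexData() calls above now come from the shared ShardSizeTests
// base class (see the import of org.elasticsearch.search.aggregations.bucket.ShardSizeTests).
// Its exact body is outside this diff; a minimal sketch of what such a helper is assumed
// to look like, based on the inline setup it replaces:
//
//     protected void createIdx(String keyFieldMapping) {
//         assertAcked(prepareCreate("idx")
//                 .addMapping("type", "key", keyFieldMapping));
//     }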
|
||||
|
@ -20,9 +20,6 @@ package org.elasticsearch.search.facet.terms;
|
||||
|
||||
import org.elasticsearch.ElasticsearchException;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.Priority;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
|
||||
@ -32,6 +29,7 @@ import java.util.ArrayList;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.search.facet.FacetBuilders.termsFacet;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
|
||||
@ -40,25 +38,13 @@ import static org.hamcrest.Matchers.is;
|
||||
*/
|
||||
public class UnmappedFieldsTermsFacetsTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", numberOfShards())
|
||||
.put("index.number_of_replicas", 0)
|
||||
.build();
|
||||
}
|
||||
|
||||
protected int numberOfShards() {
|
||||
return 5;
|
||||
}
|
||||
|
||||
/**
|
||||
* Tests the terms facet when faceting on unmapped field
|
||||
*/
|
||||
@Test
|
||||
public void testUnmappedField() throws Exception {
|
||||
createIndex("idx");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("idx", "type", "" + i).setSource(jsonBuilder().startObject()
|
||||
@ -156,8 +142,7 @@ public class UnmappedFieldsTermsFacetsTests extends ElasticsearchIntegrationTest
|
||||
*/
|
||||
@Test
|
||||
public void testPartiallyUnmappedField() throws ElasticsearchException, IOException {
|
||||
client().admin().indices().prepareCreate("mapped_idx")
|
||||
.setSettings(indexSettings())
|
||||
assertAcked(prepareCreate("mapped_idx")
|
||||
.addMapping("type", jsonBuilder().startObject().startObject("type").startObject("properties")
|
||||
.startObject("partially_mapped_byte").field("type", "byte").endObject()
|
||||
.startObject("partially_mapped_short").field("type", "short").endObject()
|
||||
@ -165,12 +150,11 @@ public class UnmappedFieldsTermsFacetsTests extends ElasticsearchIntegrationTest
|
||||
.startObject("partially_mapped_long").field("type", "long").endObject()
|
||||
.startObject("partially_mapped_float").field("type", "float").endObject()
|
||||
.startObject("partially_mapped_double").field("type", "double").endObject()
|
||||
.endObject().endObject().endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
createIndex("unmapped_idx");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("mapped_idx", "type", "" + i).setSource(jsonBuilder().startObject()
|
||||
@ -284,8 +268,7 @@ public class UnmappedFieldsTermsFacetsTests extends ElasticsearchIntegrationTest
|
||||
|
||||
@Test
|
||||
public void testMappedYetMissingField() throws IOException {
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.setSettings(indexSettings())
|
||||
assertAcked(prepareCreate("idx")
|
||||
.addMapping("type", jsonBuilder().startObject()
|
||||
.field("type").startObject()
|
||||
.field("properties").startObject()
|
||||
@ -293,9 +276,8 @@ public class UnmappedFieldsTermsFacetsTests extends ElasticsearchIntegrationTest
|
||||
.field("long").startObject().field("type", "long").endObject()
|
||||
.field("double").startObject().field("type", "double").endObject()
|
||||
.endObject()
|
||||
.endObject())
|
||||
.execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
.endObject()));
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("idx", "type", "" + i).setSource(jsonBuilder().startObject()
|
||||
@ -338,7 +320,7 @@ public class UnmappedFieldsTermsFacetsTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void testMultiFields() throws Exception {
|
||||
createIndex("idx");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("idx", "type", "" + i).setSource(jsonBuilder().startObject()
|
||||
|
@ -20,51 +20,23 @@ package org.elasticsearch.search.facet.termsstats;
|
||||
|
||||
import com.google.common.collect.ImmutableMap;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.search.aggregations.bucket.ShardSizeTests;
|
||||
import org.elasticsearch.search.facet.Facets;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.search.facet.FacetBuilders.termsStatsFacet;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
|
||||
import static org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
@ClusterScope(scope = Scope.SUITE)
|
||||
public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
/**
|
||||
* to properly test the effect/functionality of shard_size, we need to force having 2 shards and also
|
||||
* control the routing such that certain documents will end on each shard. Using "djb" routing hash + ignoring the
|
||||
* doc type when hashing will ensure that docs with routing value "1" will end up in a different shard than docs with
|
||||
* routing value "2".
|
||||
*/
|
||||
@Override
|
||||
protected Settings nodeSettings(int nodeOrdinal) {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", 2)
|
||||
.put("index.number_of_replicas", 0)
|
||||
.put("cluster.routing.operation.hash.type", "djb")
|
||||
.put("cluster.routing.operation.use_type", "false")
|
||||
.build();
|
||||
}
|
||||
public class ShardSizeTermsStatsFacetTests extends ShardSizeTests {
|
||||
|
||||
@Test
|
||||
public void noShardSize_string() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -90,9 +62,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void noShardSize_string_allTerms() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -120,9 +90,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_string_allTerms() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -150,9 +118,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_string() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -178,9 +144,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_string_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=string,index=not_analyzed")
|
||||
.execute().actionGet();
|
||||
createIdx("type=string,index=not_analyzed");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -206,9 +170,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void noShardSize_long() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -234,9 +196,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void noShardSize_long_allTerms() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -264,9 +224,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_long_allTerms() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -294,9 +252,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_long() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -322,9 +278,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_long_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=long")
|
||||
.execute().actionGet();
|
||||
createIdx("type=long");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -350,9 +304,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void noShardSize_double() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -378,9 +330,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void noShardSize_double_allTerms() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -408,9 +358,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_double_allTerms() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -438,9 +386,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_double() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -466,9 +412,7 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
@Test
|
||||
public void withShardSize_double_singleShard() throws Exception {
|
||||
|
||||
client().admin().indices().prepareCreate("idx")
|
||||
.addMapping("type", "key", "type=double")
|
||||
.execute().actionGet();
|
||||
createIdx("type=double");
|
||||
|
||||
indexData();
|
||||
|
||||
@ -490,59 +434,4 @@ public class ShardSizeTermsStatsFacetTests extends ElasticsearchIntegrationTest
|
||||
assertThat(entry.getCount(), equalTo(expected.get(entry.getTermAsNumber().intValue())));
|
||||
}
|
||||
}
|
||||
|
||||
private void indexData() throws Exception {
|
||||
|
||||
/*
|
||||
|
||||
|
||||
|| || size = 3, shard_size = 5 || shard_size = size = 3 ||
|
||||
||==========||==================================================||===============================================||
|
||||
|| shard 1: || "1" - 5 | "2" - 4 | "3" - 3 | "4" - 2 | "5" - 1 || "1" - 5 | "3" - 3 | "2" - 4 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| shard 2: || "1" - 3 | "2" - 1 | "3" - 5 | "4" - 2 | "5" - 1 || "1" - 3 | "3" - 5 | "4" - 2 ||
|
||||
||----------||--------------------------------------------------||-----------------------------------------------||
|
||||
|| reduced: || "1" - 8 | "2" - 5 | "3" - 8 | "4" - 4 | "5" - 2 || ||
|
||||
|| || || "1" - 8, "3" - 8, "2" - 4 <= WRONG ||
|
||||
|| || "1" - 8 | "3" - 8 | "2" - 5 <= CORRECT || ||
|
||||
|
||||
|
||||
*/
|
||||
|
||||
|
||||
indexDoc("1", "1", 5);
|
||||
indexDoc("1", "2", 4);
|
||||
indexDoc("1", "3", 3);
|
||||
indexDoc("1", "4", 2);
|
||||
indexDoc("1", "5", 1);
|
||||
|
||||
// total docs in shard "1" = 15
|
||||
|
||||
indexDoc("2", "1", 3);
|
||||
indexDoc("2", "2", 1);
|
||||
indexDoc("2", "3", 5);
|
||||
indexDoc("2", "4", 2);
|
||||
indexDoc("2", "5", 1);
|
||||
|
||||
// total docs in shard "2" = 12
|
||||
|
||||
client().admin().indices().prepareFlush("idx").execute().actionGet();
|
||||
client().admin().indices().prepareRefresh("idx").execute().actionGet();
|
||||
|
||||
long totalOnOne = client().prepareSearch("idx").setTypes("type").setRouting("1").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnOne, is(15l));
|
||||
long totalOnTwo = client().prepareSearch("idx").setTypes("type").setRouting("2").setQuery(matchAllQuery()).execute().actionGet().getHits().getTotalHits();
|
||||
assertThat(totalOnTwo, is(12l));
|
||||
}
|
||||
|
||||
private void indexDoc(String shard, String key, int times) throws Exception {
|
||||
for (int i = 0; i < times; i++) {
|
||||
client().prepareIndex("idx", "type").setRouting(shard).setCreate(true).setSource(jsonBuilder()
|
||||
.startObject()
|
||||
.field("key", key)
|
||||
.field("value", 1)
|
||||
.endObject()).execute().actionGet();
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -28,7 +28,6 @@ import org.elasticsearch.common.bytes.BytesReference;
|
||||
import org.elasticsearch.common.collect.MapBuilder;
|
||||
import org.elasticsearch.common.joda.Joda;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
import org.elasticsearch.rest.RestStatus;
|
||||
import org.elasticsearch.search.sort.SortOrder;
|
||||
@ -44,6 +43,7 @@ import java.util.Map;
|
||||
import static org.elasticsearch.client.Requests.refreshRequest;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertFailures;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
|
||||
@ -52,14 +52,6 @@ import static org.hamcrest.Matchers.*;
|
||||
*/
|
||||
public class SearchFieldsTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Override
|
||||
public Settings indexSettings() {
|
||||
return ImmutableSettings.builder()
|
||||
.put("index.number_of_shards", 1) // why just one?
|
||||
.put("index.number_of_replicas", 0)
|
||||
.build();
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testStoredFields() throws Exception {
|
||||
createIndex("test");
|
||||
@ -358,10 +350,9 @@ public class SearchFieldsTests extends ElasticsearchIntegrationTest {
|
||||
.setRefresh(true)
|
||||
.get();
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch("my-index").setTypes("my-type1").addField("field1").get();
|
||||
assertThat(searchResponse.getShardFailures().length, equalTo(1));
|
||||
assertThat(searchResponse.getShardFailures()[0].status(), equalTo(RestStatus.BAD_REQUEST));
|
||||
assertThat(searchResponse.getShardFailures()[0].reason(), containsString("field [field1] isn't a leaf field"));
|
||||
assertFailures(client().prepareSearch("my-index").setTypes("my-type1").addField("field1"),
|
||||
RestStatus.BAD_REQUEST,
|
||||
containsString("field [field1] isn't a leaf field"));
|
||||
}
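// The assertFailures(...) call above replaces the three manual assertions on
// searchResponse.getShardFailures(). A rough sketch of the check it is assumed to perform
// (the helper itself lives in ElasticsearchAssertions, outside this diff): either the
// request fails outright with a SearchPhaseExecutionException carrying the expected status
// and reason, or it returns partial results whose shard failures all match, e.g.
//
//     for (ShardSearchFailure failure : response.getShardFailures()) {
//         assertThat(failure.status(), equalTo(RestStatus.BAD_REQUEST));
//         assertThat(failure.reason(), containsString("field [field1] isn't a leaf field"));
//     }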
|
||||
|
||||
@Test
|
||||
|
@ -42,8 +42,7 @@ import java.util.List;
|
||||
import static org.elasticsearch.client.Requests.indexRequest;
|
||||
import static org.elasticsearch.client.Requests.searchRequest;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.functionScoreQuery;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.*;
|
||||
import static org.elasticsearch.index.query.functionscore.ScoreFunctionBuilders.*;
|
||||
import static org.elasticsearch.search.builder.SearchSourceBuilder.searchSource;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
|
||||
@ -87,17 +86,16 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
|
||||
}
|
||||
IndexRequestBuilder[] builders = indexBuilders.toArray(new IndexRequestBuilder[indexBuilders.size()]);
|
||||
|
||||
indexRandom(false, builders);
|
||||
refresh();
|
||||
indexRandom(true, builders);
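// indexRandom(true, ...) is expected to refresh the index once the documents are in,
// which is why the explicit refresh() that used to follow indexRandom(false, ...) is
// no longer needed here.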
|
||||
|
||||
// Test Gauss
|
||||
List<Float> lonlat = new ArrayList<Float>();
|
||||
lonlat.add(new Float(20));
|
||||
lonlat.add(new Float(11));
|
||||
lonlat.add(20f);
|
||||
lonlat.add(11f);
|
||||
|
||||
ActionFuture<SearchResponse> response = client().search(
|
||||
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
|
||||
searchSource().explain(false).query(termQuery("test", "value"))));
|
||||
searchSource().explain(false).query(constantScoreQuery(termQuery("test", "value")))));
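// Wrapping the term query in constantScoreQuery(...) appears to be what keeps these score
// assertions stable now that the shard count is randomized: a plain term query is scored
// with per-shard term statistics, so its score could vary with how the documents happen to
// be distributed, while the constant-score wrapper gives every match the same base score.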
|
||||
SearchResponse sr = response.actionGet();
|
||||
SearchHits sh = sr.getHits();
|
||||
assertThat(sh.getTotalHits(), equalTo((long) (numDummyDocs + 2)));
|
||||
@ -105,7 +103,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
|
||||
response = client().search(
|
||||
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
|
||||
searchSource().explain(true).query(
|
||||
functionScoreQuery(termQuery("test", "value"), gaussDecayFunction("loc", lonlat, "1000km")))));
|
||||
functionScoreQuery(constantScoreQuery(termQuery("test", "value")), gaussDecayFunction("loc", lonlat, "1000km")))));
|
||||
sr = response.actionGet();
|
||||
sh = sr.getHits();
|
||||
assertThat(sh.getTotalHits(), equalTo((long) (numDummyDocs + 2)));
|
||||
@ -116,7 +114,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
response = client().search(
|
||||
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
|
||||
searchSource().explain(false).query(termQuery("test", "value"))));
|
||||
searchSource().explain(false).query(constantScoreQuery(termQuery("test", "value")))));
|
||||
sr = response.actionGet();
|
||||
sh = sr.getHits();
|
||||
assertThat(sh.getTotalHits(), equalTo((long) (numDummyDocs + 2)));
|
||||
@ -124,7 +122,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
|
||||
response = client().search(
|
||||
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
|
||||
searchSource().explain(true).query(
|
||||
functionScoreQuery(termQuery("test", "value"), linearDecayFunction("loc", lonlat, "1000km")))));
|
||||
functionScoreQuery(constantScoreQuery(termQuery("test", "value")), linearDecayFunction("loc", lonlat, "1000km")))));
|
||||
sr = response.actionGet();
|
||||
sh = sr.getHits();
|
||||
assertThat(sh.getTotalHits(), equalTo((long) (numDummyDocs + 2)));
|
||||
@ -135,7 +133,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
response = client().search(
|
||||
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
|
||||
searchSource().explain(false).query(termQuery("test", "value"))));
|
||||
searchSource().explain(false).query(constantScoreQuery(termQuery("test", "value")))));
|
||||
sr = response.actionGet();
|
||||
sh = sr.getHits();
|
||||
assertThat(sh.getTotalHits(), equalTo((long) (numDummyDocs + 2)));
|
||||
@ -143,7 +141,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
|
||||
response = client().search(
|
||||
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
|
||||
searchSource().explain(true).query(
|
||||
functionScoreQuery(termQuery("test", "value"), exponentialDecayFunction("loc", lonlat, "1000km")))));
|
||||
functionScoreQuery(constantScoreQuery(termQuery("test", "value")), exponentialDecayFunction("loc", lonlat, "1000km")))));
|
||||
sr = response.actionGet();
|
||||
sh = sr.getHits();
assertThat(sh.getTotalHits(), equalTo((long) (numDummyDocs + 2)));

@@ -175,8 +173,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
}
IndexRequestBuilder[] builders = indexBuilders.toArray(new IndexRequestBuilder[indexBuilders.size()]);

indexRandom(false, builders);
refresh();
indexRandom(true, builders);

// Test Gauss

@@ -257,13 +254,12 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
.endObject().endObject()));
IndexRequestBuilder[] builders = indexBuilders.toArray(new IndexRequestBuilder[indexBuilders.size()]);

indexRandom(false, builders);
refresh();
indexRandom(true, builders);

// Test Gauss
List<Float> lonlat = new ArrayList<Float>();
lonlat.add(new Float(20));
lonlat.add(new Float(11));
lonlat.add(20f);
lonlat.add(11f);

ActionFuture<SearchResponse> response = client().search(
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
@@ -308,8 +304,8 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
.endObject()));
IndexRequestBuilder[] builders = indexBuilders.toArray(new IndexRequestBuilder[indexBuilders.size()]);

indexRandom(false, builders);
refresh();
indexRandom(true, builders);

GeoPoint point = new GeoPoint(20, 11);
ActionFuture<SearchResponse> response = client().search(
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
@@ -349,8 +345,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
.setSource(jsonBuilder().startObject().field("test", "value").field("num", 1.0).endObject()));
IndexRequestBuilder[] builders = indexBuilders.toArray(new IndexRequestBuilder[indexBuilders.size()]);

indexRandom(false, builders);
refresh();
indexRandom(true, builders);

// function score should return 0.5 for this function

@@ -486,11 +481,9 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {

}


@Test(expected = ElasticsearchIllegalStateException.class)
public void testExceptionThrownIfScaleRefNotBetween0And1() throws Exception {
DecayFunctionBuilder gfb = new GaussDecayFunctionBuilder("num1", "2013-05-28", "1d").setDecay(100);

}

@Test
@@ -500,21 +493,20 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
"type1",
jsonBuilder().startObject().startObject("type1").startObject("properties").startObject("test").field("type", "string")
.endObject().startObject("num1").field("type", "date").endObject().startObject("num2").field("type", "double")
.endObject().endObject().endObject().endObject()));
.endObject().endObject().endObject().endObject())
);

ensureYellow();

client().index(
indexRequest("test")
.type("type1")
.id("1")
indexRequest("test").type("type1").id("1")
.source(jsonBuilder().startObject().field("test", "value").field("num1", "2013-05-27").field("num2", "1.0")
.endObject())).actionGet();
client().index(
indexRequest("test").type("type1").id("2")
.source(jsonBuilder().startObject().field("test", "value").field("num2", "1.0").endObject())).actionGet();
client().index(
indexRequest("test")
.type("type1")
.id("3")
indexRequest("test").type("type1").id("3")
.source(jsonBuilder().startObject().field("test", "value").field("num1", "2013-05-30").field("num2", "1.0")
.endObject())).actionGet();
client().index(
@@ -525,11 +517,12 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {

ActionFuture<SearchResponse> response = client().search(
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
searchSource().explain(false).query(
functionScoreQuery(termQuery("test", "value")).add(linearDecayFunction("num1", "2013-05-28", "+3d"))
searchSource().explain(true).query(
functionScoreQuery(constantScoreQuery(termQuery("test", "value"))).add(linearDecayFunction("num1", "2013-05-28", "+3d"))
.add(linearDecayFunction("num2", "0.0", "1")).scoreMode("multiply"))));

SearchResponse sr = response.actionGet();

assertNoFailures(sr);
SearchHits sh = sr.getHits();
assertThat(sh.hits().length, equalTo(4));
@@ -619,8 +612,8 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
IndexRequestBuilder[] builders = indexBuilders.toArray(new IndexRequestBuilder[indexBuilders.size()]);
indexRandom(true, builders);
List<Float> lonlat = new ArrayList<Float>();
lonlat.add(new Float(100));
lonlat.add(new Float(110));
lonlat.add(100f);
lonlat.add(110f);
ActionFuture<SearchResponse> response = client().search(
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
searchSource().size(numDocs).query(
@@ -640,9 +633,7 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
}
for (int i = 0; i < numDocs - 1; i++) {
assertThat(scores[i], lessThan(scores[i + 1]));

}

}

@Test(expected = SearchPhaseExecutionException.class)
@@ -659,8 +650,8 @@ public class DecayFunctionScoreTests extends ElasticsearchIntegrationTest {
.endObject())).actionGet();
refresh();
List<Float> lonlat = new ArrayList<Float>();
lonlat.add(new Float(100));
lonlat.add(new Float(110));
lonlat.add(100f);
lonlat.add(110f);
ActionFuture<SearchResponse> response = client().search(
searchRequest().searchType(SearchType.QUERY_THEN_FETCH).source(
searchSource()
@@ -42,11 +42,10 @@ public class RandomScoreFunctionTests extends ElasticsearchIntegrationTest {
public void consistentHitsWithSameSeed() throws Exception {
final int replicas = between(0, 2); // needed for green status!
cluster().ensureAtLeastNumNodes(replicas + 1);
assertAcked(client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.setSettings(
ImmutableSettings.builder().put("index.number_of_shards", between(2, 5))
.put("index.number_of_replicas", replicas)
.build()));
ImmutableSettings.builder().put(indexSettings())
.put("index.number_of_replicas", replicas)));
ensureGreen(); // make sure we are done otherwise preference could change?
int docCount = atLeast(100);
for (int i = 0; i < docCount; i++) {
@@ -86,7 +85,7 @@ public class RandomScoreFunctionTests extends ElasticsearchIntegrationTest {
public void distribution() throws Exception {
int count = 10000;

prepareCreate("test").execute().actionGet();
assertAcked(prepareCreate("test"));
ensureGreen();

for (int i = 0; i < count; i++) {
@@ -148,7 +147,6 @@ public class RandomScoreFunctionTests extends ElasticsearchIntegrationTest {
}

System.out.println("mean: " + sum / (double) count);

}

}
@@ -20,16 +20,16 @@
package org.elasticsearch.search.geo;

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.FilterBuilders.geoBoundingBoxFilter;
import static org.elasticsearch.index.query.QueryBuilders.*;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.anyOf;
import static org.hamcrest.Matchers.equalTo;

@@ -40,16 +40,11 @@ public class GeoBoundingBoxTests extends ElasticsearchIntegrationTest {

@Test
public void simpleBoundingBoxTest() throws Exception {
try {
client().admin().indices().prepareDelete("test").execute().actionGet();
} catch (Exception e) {
// ignore
}
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("location").field("type", "geo_point").field("lat_lon", true).endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", xContentBuilder));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.field("name", "New York")
@@ -115,16 +110,11 @@ public class GeoBoundingBoxTests extends ElasticsearchIntegrationTest {

@Test
public void limitsBoundingBoxTest() throws Exception {
try {
client().admin().indices().prepareDelete("test").execute().actionGet();
} catch (Exception e) {
// ignore
}
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("location").field("type", "geo_point").field("lat_lon", true).endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).setSettings(settingsBuilder().put("index.number_of_shards", "1")).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", xContentBuilder));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.startObject("location").field("lat", 40).field("lon", -20).endObject()
@@ -166,7 +156,7 @@ public class GeoBoundingBoxTests extends ElasticsearchIntegrationTest {
.startObject("location").field("lat", -10).field("lon", 170).endObject()
.endObject()).execute().actionGet();

client().admin().indices().prepareRefresh().execute().actionGet();
refresh();

SearchResponse searchResponse = client().prepareSearch()
.setQuery(filteredQuery(matchAllQuery(), geoBoundingBoxFilter("location").topLeft(41, -11).bottomRight(40, 9)))
@@ -223,16 +213,11 @@ public class GeoBoundingBoxTests extends ElasticsearchIntegrationTest {

@Test
public void limit2BoundingBoxTest() throws Exception {
try {
client().admin().indices().prepareDelete("test").execute().actionGet();
} catch (Exception e) {
// ignore
}
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("location").field("type", "geo_point").field("lat_lon", true).endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).setSettings(settingsBuilder().put("index.number_of_shards", "1")).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", xContentBuilder));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.field("userid", 880)
@@ -19,9 +19,7 @@

package org.elasticsearch.search.geo;

import org.elasticsearch.action.search.SearchPhaseExecutionException;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.Priority;
import org.elasticsearch.common.geo.GeoDistance;
import org.elasticsearch.common.geo.GeoHashUtils;
import org.elasticsearch.common.unit.DistanceUnit;
@@ -38,9 +36,7 @@ import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;

import java.io.IOException;
import java.util.Arrays;

import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.FilterBuilders.*;
import static org.elasticsearch.index.query.QueryBuilders.filteredQuery;
@@ -54,12 +50,13 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {

@Test
public void simpleDistanceTests() throws Exception {
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("location").field("type", "geo_point").field("lat_lon", true)
.startObject("fielddata").field("format", randomNumericFieldDataFormat()).endObject().endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", xContentBuilder));
ensureGreen();

indexRandom(true, client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.field("name", "New York")
.startObject("location").field("lat", 40.7143528).field("lon", -74.0059731).endObject()
@@ -206,16 +203,13 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {

@Test
public void testDistanceSortingMVFields() throws Exception {
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("locations").field("type", "geo_point").field("lat_lon", true)
.startObject("fielddata").field("format", randomNumericFieldDataFormat()).endObject().endObject().endObject()
.endObject().endObject().string();

client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
.addMapping("type1", mapping)
.execute().actionGet();
client().admin().cluster().prepareHealth("test").setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test")
.addMapping("type1", xContentBuilder));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.field("names", "New York")
@@ -325,29 +319,21 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {
assertThat(((Number) searchResponse.getHits().getAt(2).sortValues()[0]).doubleValue(), closeTo(1157.0d, 10d));
assertThat(((Number) searchResponse.getHits().getAt(3).sortValues()[0]).doubleValue(), closeTo(0d, 10d));

try {
client().prepareSearch("test").setQuery(matchAllQuery())
.addSort(SortBuilders.geoDistanceSort("locations").point(40.7143528, -74.0059731).sortMode("sum"))
.execute().actionGet();
fail("Expected error");
} catch (SearchPhaseExecutionException e) {
assertThat(e.shardFailures()[0].status(), equalTo(RestStatus.BAD_REQUEST));
}
assertFailures(client().prepareSearch("test").setQuery(matchAllQuery())
.addSort(SortBuilders.geoDistanceSort("locations").point(40.7143528, -74.0059731).sortMode("sum")),
RestStatus.BAD_REQUEST,
containsString("sort_mode [sum] isn't supported for sorting by geo distance"));
}

@Test
// Regression bug: https://github.com/elasticsearch/elasticsearch/issues/2851
public void testDistanceSortingWithMissingGeoPoint() throws Exception {
String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("locations").field("type", "geo_point").field("lat_lon", true)
.startObject("fielddata").field("format", randomNumericFieldDataFormat()).endObject().endObject().endObject()
.endObject().endObject().string();

client().admin().indices().prepareCreate("test")
.setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
.addMapping("type1", mapping)
.execute().actionGet();
client().admin().cluster().prepareHealth("test").setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", xContentBuilder));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.field("names", "Times Square", "Tribeca")
@@ -363,7 +349,7 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {
.field("names", "Wall Street", "Soho")
.endObject()).execute().actionGet();

client().admin().indices().prepareRefresh().execute().actionGet();
refresh();

// Order: Asc
SearchResponse searchResponse = client().prepareSearch("test").setQuery(matchAllQuery())
@@ -394,18 +380,18 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {
double target_lat = 32.81;
double target_long = -117.21;

String mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties").startObject("location").field("type", "geo_point").field("lat_lon", true).endObject().endObject()
.endObject().endObject().string();
client().admin().indices().prepareCreate("test").addMapping("type1", mapping).execute().actionGet();
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", xContentBuilder));
ensureGreen();

client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject()
.field("name", "TestPosition")
.startObject("location").field("lat", source_lat).field("lon", source_long).endObject()
.endObject()).execute().actionGet();

client().admin().indices().prepareRefresh().execute().actionGet();
refresh();

SearchResponse searchResponse1 = client().prepareSearch().addField("_source").addScriptField("distance", "doc['location'].arcDistance(" + target_lat + "," + target_long + ")").execute().actionGet();
Double resultDistance1 = searchResponse1.getHits().getHits()[0].getFields().get("distance").getValue();
@@ -438,30 +424,27 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {
SearchResponse searchResponse8 = client().prepareSearch().addField("_source").addScriptField("distance", "doc['location'].distanceInMiles(" + target_lat + "," + target_long + ")").execute().actionGet();
Double resultDistance8 = searchResponse8.getHits().getHits()[0].getFields().get("distance").getValue();
assertThat(resultDistance8, closeTo(GeoDistance.PLANE.calculate(source_lat, source_long, target_lat, target_long, DistanceUnit.MILES), 0.0001d));

}

@Test
public void testDistanceSortingNestedFields() throws Exception {
String mapping = XContentFactory.jsonBuilder().startObject().startObject("company")
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder().startObject().startObject("company")
.startObject("properties")
.startObject("name").field("type", "string").endObject()
.startObject("branches")
.field("type", "nested")
.startObject("properties")
.startObject("name").field("type", "string").endObject()
.startObject("location").field("type", "geo_point").field("lat_lon", true)
.startObject("fielddata").field("format", randomNumericFieldDataFormat()).endObject().endObject()
.endObject()
.field("type", "nested")
.startObject("properties")
.startObject("name").field("type", "string").endObject()
.startObject("location").field("type", "geo_point").field("lat_lon", true)
.startObject("fielddata").field("format", randomNumericFieldDataFormat()).endObject().endObject()
.endObject()
.endObject()
.endObject().endObject().string();
.endObject()
.endObject().endObject();

assertAcked(prepareCreate("companies").addMapping("company", xContentBuilder));
ensureGreen();

client().admin().indices().prepareCreate("companies")
.setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
.addMapping("company", mapping)
.execute().actionGet();
client().admin().cluster().prepareHealth("companies").setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
indexRandom(true, client().prepareIndex("companies", "company", "1").setSource(jsonBuilder().startObject()
.field("name", "company 1")
.startArray("branches")
@@ -598,14 +581,10 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {
assertThat(((Number) searchResponse.getHits().getAt(2).sortValues()[0]).doubleValue(), equalTo(Double.MAX_VALUE));
assertThat(((Number) searchResponse.getHits().getAt(3).sortValues()[0]).doubleValue(), equalTo(Double.MAX_VALUE));

try {
client().prepareSearch("companies").setQuery(matchAllQuery())
.addSort(SortBuilders.geoDistanceSort("branches.location").point(40.7143528, -74.0059731).sortMode("sum"))
.execute().actionGet();
fail("Expected error");
} catch (SearchPhaseExecutionException e) {
assertThat(e.shardFailures()[0].status(), equalTo(RestStatus.BAD_REQUEST));
}
assertFailures(client().prepareSearch("companies").setQuery(matchAllQuery())
.addSort(SortBuilders.geoDistanceSort("branches.location").point(40.7143528, -74.0059731).sortMode("sum")),
RestStatus.BAD_REQUEST,
containsString("sort_mode [sum] isn't supported for sorting by geo distance"));
}

/**
@@ -637,12 +616,10 @@ public class GeoDistanceTests extends ElasticsearchIntegrationTest {
.startObject()
.field("pin", GeoHashUtils.encode(lat, lon))
.endObject();

ensureYellow();

client().admin().indices().prepareCreate("locations").addMapping("location", mapping).execute().actionGet();

assertAcked(prepareCreate("locations").addMapping("location", mapping));
client().prepareIndex("locations", "location", "1").setCreate(true).setSource(source).execute().actionGet();
client().admin().indices().prepareRefresh("locations").execute().actionGet();
refresh();
client().prepareGet("locations", "location", "1").execute().actionGet();

SearchResponse result = client().prepareSearch("locations")
@@ -24,9 +24,9 @@ import com.google.common.collect.Iterables;
import org.apache.lucene.util.LuceneTestCase.Slow;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.*;
import org.elasticsearch.client.Requests;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchType;
import org.elasticsearch.common.settings.ImmutableSettings.Builder;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
@@ -48,14 +48,15 @@ import java.util.Map;

import static org.elasticsearch.action.search.SearchType.QUERY_THEN_FETCH;
import static org.elasticsearch.client.Requests.searchRequest;
import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.FilterBuilders.*;
import static org.elasticsearch.index.query.FilterBuilders.missingFilter;
import static org.elasticsearch.index.query.FilterBuilders.typeFilter;
import static org.elasticsearch.index.query.QueryBuilders.*;
import static org.elasticsearch.search.builder.SearchSourceBuilder.highlight;
import static org.elasticsearch.search.builder.SearchSourceBuilder.searchSource;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
import static org.elasticsearch.test.hamcrest.RegexMatcher.matches;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.hamcrest.Matchers.*;

/**
@@ -66,9 +67,8 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
@Test
// see #3486
public void testHighTermFrequencyDoc() throws ElasticsearchException, IOException {
assertAcked(client().admin().indices().prepareCreate("test")
.addMapping("test", "name", "type=string,term_vector=with_positions_offsets,store=" + (randomBoolean() ? "yes" : "no"))
.setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", between(1, 5))));
assertAcked(prepareCreate("test")
.addMapping("test", "name", "type=string,term_vector=with_positions_offsets,store=" + (randomBoolean() ? "yes" : "no")));
ensureYellow();
StringBuilder builder = new StringBuilder();
for (int i = 0; i < 6000; i++) {
@@ -105,8 +105,8 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
.endObject()
.endObject()
.endObject())
.setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1)
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.tokenizer.autocomplete.max_gram", 20)
.put("analysis.tokenizer.autocomplete.min_gram", 1)
.put("analysis.tokenizer.autocomplete.token_chars", "letter,digit")
@@ -147,8 +147,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
assertAcked(prepareCreate("test")
.addMapping("test", "body", "type=string,index_analyzer=custom_analyzer,search_analyzer=custom_analyzer,term_vector=with_positions_offsets")
.setSettings(
ImmutableSettings.settingsBuilder().put("index.number_of_shards", 1)
.put("index.number_of_replicas", 0)
settingsBuilder().put(indexSettings())
.put("analysis.filter.wordDelimiter.type", "word_delimiter")
.put("analysis.filter.wordDelimiter.type.split_on_numerics", false)
.put("analysis.filter.wordDelimiter.generate_word_parts", true)
@@ -174,12 +173,12 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
@Test
public void testNgramHighlightingPreLucene42() throws ElasticsearchException, IOException {

assertAcked(client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("test",
"name", "type=string,index_analyzer=name_index_analyzer,search_analyzer=name_search_analyzer," + randomStoreField() + "term_vector=with_positions_offsets",
"name2", "type=string,index_analyzer=name2_index_analyzer,search_analyzer=name_search_analyzer," + randomStoreField() + "term_vector=with_positions_offsets")
.setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.filter.my_ngram.max_gram", 20)
.put("analysis.filter.my_ngram.version", "4.1")
.put("analysis.filter.my_ngram.min_gram", 1)
@@ -203,38 +202,50 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
refresh();

SearchResponse search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "logica m"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"));
assertHighlight(search, 1, "name", 0, equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>"));
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));
assertHighlight(search, 1, "name", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));

search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "logica ma"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"));
assertHighlight(search, 1, "name", 0, equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>"));
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 1, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));

search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "logica"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, equalTo("<em>logica</em>cmg ehemals avinci - the know how company"));
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));

search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name2", "logica m"))).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"));
assertHighlight(search, 1, "name2", 0, equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>"));
assertHighlight(search, 0, "name2", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));
assertHighlight(search, 1, "name2", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));

search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name2", "logica ma"))).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"));
assertHighlight(search, 1, "name2", 0, equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>"));
assertHighlight(search, 0, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 1, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));

search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name2", "logica"))).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, equalTo("<em>logica</em>cmg ehemals avinci - the know how company"));
assertHighlight(search, 1, "name2", 0, equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>"));

assertHighlight(search, 0, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 1, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
}

@Test
public void testNgramHighlighting() throws ElasticsearchException, IOException {
assertAcked(client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("test",
"name", "type=string,index_analyzer=name_index_analyzer,search_analyzer=name_search_analyzer,term_vector=with_positions_offsets",
"name2", "type=string,index_analyzer=name2_index_analyzer,search_analyzer=name_search_analyzer,term_vector=with_positions_offsets")
.setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 2)
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.filter.my_ngram.max_gram", 20)
.put("analysis.filter.my_ngram.min_gram", 1)
.put("analysis.filter.my_ngram.type", "ngram")
@@ -272,7 +283,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testEnsureNoNegativeOffsets() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1",
"no_long_term", "type=string,term_vector=with_positions_offsets",
"long_term", "type=string,term_vector=with_positions_offsets"));
@@ -306,7 +317,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testSourceLookupHighlightingUsingPlainHighlighter() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
// we don't store title and don't use term vector, now lets see if it works...
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "no").endObject()
@@ -346,7 +357,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testSourceLookupHighlightingUsingFastVectorHighlighter() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
// we don't store title, now lets see if it works...
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "with_positions_offsets").endObject()
@@ -386,7 +397,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testSourceLookupHighlightingUsingPostingsHighlighter() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
// we don't store title, now lets see if it works...
.startObject("title").field("type", "string").field("store", "no").field("index_options", "offsets").endObject()
@@ -438,7 +449,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testHighlightIssue1994() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=no", "titleTV", "type=string,store=no,term_vector=with_positions_offsets"));
ensureYellow();

@@ -522,9 +533,8 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
@Test
public void testForceSourceWithSourceDisabled() throws Exception {

assertAcked(client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
//just to make sure that we hit the stored fields rather than the _source
.startObject("_source").field("enabled", false).endObject()
.startObject("properties")
.startObject("field1").field("type", "string").field("store", "yes").field("index_options", "offsets")
@@ -544,43 +554,35 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
.get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));

searchResponse = client().prepareSearch("test")
assertFailures(client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("plain").forceSource(true))
.get();
assertThat(searchResponse.getFailedShards(), equalTo(1));
assertThat(searchResponse.getShardFailures().length, equalTo(1));
assertThat(searchResponse.getShardFailures()[0].reason(), containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("plain").forceSource(true)),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));

searchResponse = client().prepareSearch("test")
assertFailures(client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("fvh").forceSource(true))
.get();
assertThat(searchResponse.getFailedShards(), equalTo(1));
assertThat(searchResponse.getShardFailures().length, equalTo(1));
assertThat(searchResponse.getShardFailures()[0].reason(), containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("fvh").forceSource(true)),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));

searchResponse = client().prepareSearch("test")
assertFailures(client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("postings").forceSource(true))
.get();
assertThat(searchResponse.getFailedShards(), equalTo(1));
assertThat(searchResponse.getShardFailures().length, equalTo(1));
assertThat(searchResponse.getShardFailures()[0].reason(), containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("postings").forceSource(true)),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));

SearchSourceBuilder searchSource = SearchSourceBuilder.searchSource().query(termQuery("field1", "quick"))
.highlight(highlight().forceSource(true).field("field1"));
searchResponse = client().search(Requests.searchRequest("test").source(searchSource)).get();
assertThat(searchResponse.getFailedShards(), equalTo(1));
assertThat(searchResponse.getShardFailures().length, equalTo(1));
assertThat(searchResponse.getShardFailures()[0].reason(), containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
assertFailures(client().prepareSearch("test").setSource(searchSource.buildAsBytes()),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));

searchSource = SearchSourceBuilder.searchSource().query(termQuery("field1", "quick"))
.highlight(highlight().forceSource(true).field("field*"));
searchResponse = client().search(Requests.searchRequest("test").source(searchSource)).get();
assertThat(searchResponse.getFailedShards(), equalTo(1));
assertThat(searchResponse.getShardFailures().length, equalTo(1));
assertThat(searchResponse.getShardFailures()[0].reason(), matches("source is forced for fields \\[field\\d, field\\d\\] but type \\[type1\\] has disabled _source"));
assertFailures(client().prepareSearch("test").setSource(searchSource.buildAsBytes()),
RestStatus.BAD_REQUEST,
matches("source is forced for fields \\[field\\d, field\\d\\] but type \\[type1\\] has disabled _source"));
}

@Test
@@ -654,7 +656,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testFastVectorHighlighter() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();

client().prepareIndex("test", "type1")
@@ -710,7 +712,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
*/
@Test(timeout=120000)
public void testFVHManyMatches() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();

// Index one megabyte of "t " over and over and over again
@@ -739,7 +741,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
}

private void checkMatchedFieldsCase(boolean requireFieldMatch) throws Exception {
client().admin().indices().prepareCreate("test")
assertAcked(prepareCreate("test")
.addMapping("type1", XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties")
.startObject("foo")
@@ -774,7 +776,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
.endObject()
.endObject()
.endObject()
.endObject()).execute().actionGet();
.endObject()));
ensureGreen();

index("test", "type1", "1",
@@ -882,15 +884,14 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
assertHighlight(resp, 0, "foo", 0, equalTo("<em>weird</em>"));
assertHighlight(resp, 0, "bar", 0, equalTo("<em>resul</em>t"));

//But be careful. It'll blow up if there is a result paste the end of the field.
resp = req.setQuery(queryString("result").field("foo").field("foo.plain").field("bar").field("bar.plain")).get();
assertThat("Expected ShardFailures", resp.getShardFailures().length, greaterThan(0));
assertFailures(req.setQuery(queryString("result").field("foo").field("foo.plain").field("bar").field("bar.plain")),
RestStatus.INTERNAL_SERVER_ERROR, containsString("String index out of range"));
}

@Test
@Slow
public void testFastVectorHighlighterManyDocs() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();

int COUNT = between(20, 100);
@@ -949,7 +950,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testSameContent() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets"));
ensureYellow();

@@ -972,7 +973,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testFastVectorHighlighterOffsetParameter() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets").get());
ensureYellow();

@@ -996,7 +997,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testEscapeHtml() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes"));
ensureYellow();

@@ -1020,7 +1021,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testEscapeHtml_vector() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets"));
ensureYellow();

@@ -1044,7 +1045,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testMultiMapperVectorWithStore() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "yes").field("term_vector", "with_positions_offsets").field("analyzer", "classic").endObject()
@@ -1075,7 +1076,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testMultiMapperVectorFromSource() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "with_positions_offsets").field("analyzer", "classic").endObject()
@@ -1108,7 +1109,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testMultiMapperNoVectorWithStore() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "yes").field("term_vector", "no").field("analyzer", "classic").endObject()
@@ -1141,7 +1142,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testMultiMapperNoVectorFromSource() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "no").field("analyzer", "classic").endObject()
@@ -1173,7 +1174,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testFastVectorHighlighterShouldFailIfNoTermVectors() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=no"));
ensureGreen();

@@ -1190,30 +1191,24 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
.get();
assertNoFailures(search);

search = client().prepareSearch()
assertFailures(client().prepareSearch()
.setQuery(matchPhraseQuery("title", "this is a test"))
.addHighlightedField("title", 50, 1, 10)
.setHighlighterType("fast-vector-highlighter")
.execute().actionGet();
assertThat(search.getFailedShards(), equalTo(2));
for (ShardSearchFailure shardSearchFailure : search.getShardFailures()) {
assertThat(shardSearchFailure.reason(), containsString("the field [title] should be indexed with term vector with position offsets to be used with fast vector highlighter"));
}
.setHighlighterType("fast-vector-highlighter"),
RestStatus.BAD_REQUEST,
containsString("the field [title] should be indexed with term vector with position offsets to be used with fast vector highlighter"));

search = client().prepareSearch()
assertFailures(client().prepareSearch()
.setQuery(matchPhraseQuery("title", "this is a test"))
.addHighlightedField("tit*", 50, 1, 10)
.setHighlighterType("fast-vector-highlighter")
.execute().actionGet();
assertThat(search.getFailedShards(), equalTo(2));
for (ShardSearchFailure shardSearchFailure : search.getShardFailures()) {
assertThat(shardSearchFailure.reason(), containsString("the field [title] should be indexed with term vector with position offsets to be used with fast vector highlighter"));
}
.setHighlighterType("fast-vector-highlighter"),
RestStatus.BAD_REQUEST,
containsString("the field [title] should be indexed with term vector with position offsets to be used with fast vector highlighter"));
}

@Test
public void testDisableFastVectorHighlighter() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets,analyzer=classic"));
ensureGreen();

@@ -1259,8 +1254,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testFSHHighlightAllMvFragments() throws Exception {
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
assertAcked(prepareCreate("test")
.addMapping("type1", "tags", "type=string,term_vector=with_positions_offsets"));
ensureGreen();
client().prepareIndex("test", "type1", "1")
@@ -1298,7 +1292,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testBoostingQueryTermVector() throws ElasticsearchException, IOException {
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog")
.get();
@@ -1338,7 +1332,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testCommonTermsTermVector() throws ElasticsearchException, IOException {
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();

client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog").get();
@@ -1355,13 +1349,14 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {

@Test
public void testPhrasePrefix() throws ElasticsearchException, IOException {
Builder builder = ImmutableSettings.builder();
builder.put("index.analysis.analyzer.synonym.tokenizer", "whitespace");
builder.putArray("index.analysis.analyzer.synonym.filter", "synonym", "lowercase");
builder.put("index.analysis.filter.synonym.type", "synonym");
builder.putArray("index.analysis.filter.synonym.synonyms", "quick => fast");
Builder builder = settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.synonym.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.synonym.filter", "synonym", "lowercase")
.put("index.analysis.filter.synonym.type", "synonym")
.putArray("index.analysis.filter.synonym.synonyms", "quick => fast");

assertAcked(client().admin().indices().prepareCreate("test").setSettings(builder.build()).addMapping("type1", type1TermVectorMapping())
assertAcked(prepareCreate("test").setSettings(builder.build()).addMapping("type1", type1TermVectorMapping())
.addMapping("type2", "_all", "store=yes,termVector=with_positions_offsets",
"field4", "type=string,term_vector=with_positions_offsets,analyzer=synonym",
"field3", "type=string,analyzer=synonym"));
@@ -1375,23 +1370,21 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
logger.info("--> highlighting and searching on field0");
SearchSourceBuilder source = searchSource()
.query(matchPhrasePrefixQuery("field0", "quick bro"))
.from(0).size(60).explain(true)
.highlight(highlight().field("field0").order("score").preTags("<x>").postTags("</x>"));

SearchResponse searchResponse = client().search(searchRequest("test").source(source).searchType(QUERY_THEN_FETCH)).actionGet();
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();

assertHighlight(searchResponse, 0, "field0", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog"));

logger.info("--> highlighting and searching on field1");
source = searchSource()
.query(matchPhrasePrefixQuery("field1", "quick bro"))
.from(0).size(60).explain(true)
.highlight(highlight().field("field1").order("score").preTags("<x>").postTags("</x>"));

searchResponse = client().search(searchRequest("test").source(source).searchType(QUERY_THEN_FETCH)).actionGet();
searchResponse = client().search(searchRequest("test").source(source)).actionGet();

assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <x>quick browse</x> button is a fancy thing, right bro?"));
assertHighlight(searchResponse, 1, "field1", 0, 1, equalTo("The <x>quick brown</x> fox jumps over the lazy dog"));
assertHighlight(searchResponse, 0, "field1", 0, 1, anyOf(equalTo("The <x>quick browse</x> button is a fancy thing, right bro?"), equalTo("The <x>quick brown</x> fox jumps over the lazy dog")));
assertHighlight(searchResponse, 1, "field1", 0, 1, anyOf(equalTo("The <x>quick browse</x> button is a fancy thing, right bro?"), equalTo("The <x>quick brown</x> fox jumps over the lazy dog")));

// with synonyms
client().prepareIndex("test", "type2", "0")
@@ -1402,33 +1395,32 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
.setSource("field4", "a quick fast blue car").get();
refresh();

source = searchSource().postFilter(typeFilter("type2")).query(matchPhrasePrefixQuery("field3", "fast bro")).from(0).size(60).explain(true)
source = searchSource().postFilter(typeFilter("type2")).query(matchPhrasePrefixQuery("field3", "fast bro"))
.highlight(highlight().field("field3").order("score").preTags("<x>").postTags("</x>"));

searchResponse = client().search(searchRequest("test").source(source).searchType(QUERY_THEN_FETCH)).actionGet();
searchResponse = client().search(searchRequest("test").source(source)).actionGet();

assertHighlight(searchResponse, 0, "field3", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog"));

logger.info("--> highlighting and searching on field4");
source = searchSource().postFilter(typeFilter("type2")).query(matchPhrasePrefixQuery("field4", "the fast bro")).from(0).size(60).explain(true)
source = searchSource().postFilter(typeFilter("type2")).query(matchPhrasePrefixQuery("field4", "the fast bro"))
.highlight(highlight().field("field4").order("score").preTags("<x>").postTags("</x>"));
searchResponse = client().search(searchRequest("test").source(source).searchType(QUERY_THEN_FETCH)).actionGet();
searchResponse = client().search(searchRequest("test").source(source)).actionGet();

assertHighlight(searchResponse, 0, "field4", 0, 1, equalTo("<x>The quick browse</x> button is a fancy thing, right bro?"));
assertHighlight(searchResponse, 1, "field4", 0, 1, equalTo("<x>The quick brown</x> fox jumps over the lazy dog"));
assertHighlight(searchResponse, 0, "field4", 0, 1, anyOf(equalTo("<x>The quick browse</x> button is a fancy thing, right bro?"), equalTo("<x>The quick brown</x> fox jumps over the lazy dog")));
assertHighlight(searchResponse, 1, "field4", 0, 1, anyOf(equalTo("<x>The quick browse</x> button is a fancy thing, right bro?"), equalTo("<x>The quick brown</x> fox jumps over the lazy dog")));

logger.info("--> highlighting and searching on field4");
source = searchSource().postFilter(typeFilter("type2")).query(matchPhrasePrefixQuery("field4", "a fast quick blue ca")).from(0).size(60).explain(true)
source = searchSource().postFilter(typeFilter("type2")).query(matchPhrasePrefixQuery("field4", "a fast quick blue ca"))
.highlight(highlight().field("field4").order("score").preTags("<x>").postTags("</x>"));
searchResponse = client().search(searchRequest("test").source(source).searchType(QUERY_THEN_FETCH)).actionGet();
searchResponse = client().search(searchRequest("test").source(source)).actionGet();

assertHighlight(searchResponse, 0, "field4", 0, 1, equalTo("<x>a quick fast blue car</x>"));
}

@Test
public void testPlainHighlightDifferentFragmenter() throws Exception {
assertAcked(prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder()
.put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
assertAcked(prepareCreate("test")
.addMapping("type1", "tags", "type=string"));
ensureGreen();
client().prepareIndex("test", "type1", "1")
@@ -1453,15 +1445,12 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
assertHighlight(response, 0, "tags", 0, equalTo("this is a really <em>long</em> <em>tag</em> i would like to highlight"));
assertHighlight(response, 0, "tags", 1, 2, equalTo("here is another one that is very <em>long</em> <em>tag</em> and has the tag token near the end"));

try {
client().prepareSearch("test")
assertFailures(client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("tags", "long tag").type(MatchQueryBuilder.Type.PHRASE))
|
||||
.addHighlightedField(new HighlightBuilder.Field("tags")
|
||||
.fragmentSize(-1).numOfFragments(2).fragmenter("invalid")).get();
|
||||
fail("Shouldn't get here");
|
||||
} catch (SearchPhaseExecutionException e) {
|
||||
assertThat(e.shardFailures()[0].status(), equalTo(RestStatus.BAD_REQUEST));
|
||||
}
|
||||
.fragmentSize(-1).numOfFragments(2).fragmenter("invalid")),
|
||||
RestStatus.BAD_REQUEST,
|
||||
containsString("unknown fragmenter option [invalid] for the field [tags]"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@ -1483,7 +1472,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testFastVectorHighlighterMultipleFields() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,term_vectors=with_positions_offsets", "field2", "type=string,term_vectors=with_positions_offsets"));
|
||||
ensureGreen();
|
||||
|
||||
@ -1501,8 +1490,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testMissingStoredField() throws Exception {
|
||||
assertAcked(prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder()
|
||||
.put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "highlight_field", "type=string,store=yes"));
|
||||
ensureGreen();
|
||||
client().prepareIndex("test", "type1", "1")
|
||||
@ -1551,7 +1539,8 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
// https://github.com/elasticsearch/elasticsearch/issues/3200
|
||||
public void testResetTwice() throws Exception {
|
||||
assertAcked(prepareCreate("test")
|
||||
.setSettings(ImmutableSettings.builder()
|
||||
.setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put("analysis.analyzer.my_analyzer.type", "pattern")
|
||||
.put("analysis.analyzer.my_analyzer.pattern", "\\s+")
|
||||
.build())
|
||||
@ -1903,7 +1892,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighter() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1")
|
||||
@ -1959,7 +1948,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterMultipleFields() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()).get());
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()).get());
|
||||
ensureGreen();
|
||||
|
||||
index("test", "type1", "1", "field1", "The <b>quick<b> brown fox. Second sentence.", "field2", "The <b>slow<b> brown fox. Second sentence.");
|
||||
@ -1976,7 +1965,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterNumberOfFragments() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1", "1")
|
||||
@ -2027,7 +2016,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterRequireFieldMatch() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1")
|
||||
@ -2096,7 +2085,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
.startObject("field2").field("type", "string").field("index_options", "offsets").field("term_vector", "with_positions_offsets").endObject()
|
||||
.endObject()
|
||||
.endObject().endObject();
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", mapping));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", mapping));
|
||||
ensureGreen();
|
||||
client().prepareIndex("test", "type1")
|
||||
.setSource("field1", "The quick brown fox jumps over",
|
||||
@ -2121,7 +2110,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterOrderByScore() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1")
|
||||
@ -2160,7 +2149,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterEscapeHtml() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "title", "type=string," + randomStoreField() + "index_options=offsets"));
|
||||
ensureYellow();
|
||||
|
||||
@ -2183,7 +2172,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterMultiMapperWithStore() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
|
||||
//just to make sure that we hit the stored fields rather than the _source
|
||||
.startObject("_source").field("enabled", false).endObject()
|
||||
@ -2220,7 +2209,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterMultiMapperFromSource() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("title").field("type", "multi_field").startObject("fields")
|
||||
.startObject("title").field("type", "string").field("store", "no").field("index_options", "offsets").field("analyzer", "classic").endObject()
|
||||
@ -2250,7 +2239,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterShouldFailIfNoOffsets() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2))
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
|
||||
.startObject("title").field("type", "string").field("store", "yes").field("index_options", "docs").endObject()
|
||||
.endObject().endObject().endObject()));
|
||||
@ -2269,42 +2258,33 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
.get();
|
||||
assertNoFailures(search);
|
||||
|
||||
search = client().prepareSearch()
|
||||
assertFailures(client().prepareSearch()
|
||||
.setQuery(matchQuery("title", "this is a test"))
|
||||
.addHighlightedField("title")
|
||||
.setHighlighterType("postings-highlighter")
|
||||
.get();
|
||||
assertThat(search.getFailedShards(), equalTo(2));
|
||||
for (ShardSearchFailure shardSearchFailure : search.getShardFailures()) {
|
||||
assertThat(shardSearchFailure.reason(), containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
|
||||
}
|
||||
.setHighlighterType("postings-highlighter"),
|
||||
RestStatus.BAD_REQUEST,
|
||||
containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
|
||||
|
||||
search = client().prepareSearch()
|
||||
|
||||
|
||||
assertFailures(client().prepareSearch()
|
||||
.setQuery(matchQuery("title", "this is a test"))
|
||||
.addHighlightedField("title")
|
||||
.setHighlighterType("postings")
|
||||
.get();
|
||||
.setHighlighterType("postings"),
|
||||
RestStatus.BAD_REQUEST,
|
||||
containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
|
||||
|
||||
assertThat(search.getFailedShards(), equalTo(2));
|
||||
for (ShardSearchFailure shardSearchFailure : search.getShardFailures()) {
|
||||
assertThat(shardSearchFailure.reason(), containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
|
||||
}
|
||||
|
||||
search = client().prepareSearch()
|
||||
assertFailures(client().prepareSearch()
|
||||
.setQuery(matchQuery("title", "this is a test"))
|
||||
.addHighlightedField("tit*")
|
||||
.setHighlighterType("postings")
|
||||
.get();
|
||||
|
||||
assertThat(search.getFailedShards(), equalTo(2));
|
||||
for (ShardSearchFailure shardSearchFailure : search.getShardFailures()) {
|
||||
assertThat(shardSearchFailure.reason(), containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
|
||||
}
|
||||
.setHighlighterType("postings"),
|
||||
RestStatus.BAD_REQUEST,
|
||||
containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterBoostingQuery() throws ElasticsearchException, IOException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.")
|
||||
.get();
|
||||
@ -2321,7 +2301,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterCommonTermsQuery() throws ElasticsearchException, IOException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
|
||||
@ -2350,7 +2330,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterPrefixQuery() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
|
||||
@ -2368,7 +2348,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterFuzzyQuery() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
|
||||
@ -2384,7 +2364,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterRegexpQuery() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
|
||||
@ -2402,7 +2382,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterWildcardQuery() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
|
||||
@ -2428,7 +2408,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterTermRangeQuery() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "aaab").get();
|
||||
@ -2444,7 +2424,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterQueryString() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
|
||||
@ -2463,7 +2443,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testPostingsHighlighterRegexpQueryWithinConstantScoreQuery() throws Exception {
|
||||
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
|
||||
@ -2482,7 +2462,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testPostingsHighlighterMultiTermQueryMultipleLevels() throws Exception {
|
||||
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
|
||||
@ -2504,7 +2484,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testPostingsHighlighterPrefixQueryWithinBooleanQuery() throws Exception {
|
||||
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
|
||||
@ -2523,7 +2503,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testPostingsHighlighterQueryStringWithinFilteredQuery() throws Exception {
|
||||
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
|
||||
@ -2542,7 +2522,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
@Slow
|
||||
public void testPostingsHighlighterManyDocs() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
ensureGreen();
|
||||
|
||||
int COUNT = between(20, 100);
|
||||
@ -2590,7 +2570,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test //https://github.com/elasticsearch/elasticsearch/issues/4116
|
||||
public void testPostingsHighlighterCustomIndexName() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,index_options=offsets,index_name=my_field"));
|
||||
ensureGreen();
|
||||
|
||||
@ -2613,7 +2593,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testFastVectorHighlighterCustomIndexName() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,term_vector=with_positions_offsets,index_name=my_field"));
|
||||
ensureGreen();
|
||||
|
||||
@ -2636,7 +2616,7 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testPlainHighlighterCustomIndexName() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,index_name=my_field"));
|
||||
ensureGreen();
|
||||
|
||||
@ -2659,13 +2639,13 @@ public class HighlighterSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testFastVectorHighlighterPhraseBoost() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
|
||||
phraseBoostTestCase("fvh");
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testPostingsHighlighterPhraseBoost() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
|
||||
phraseBoostTestCase("postings");
|
||||
}
|
||||
|
||||
|
@ -21,17 +21,14 @@ package org.elasticsearch.search.indicesboost;
|
||||
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.action.search.SearchType;
|
||||
import org.elasticsearch.cluster.metadata.IndexMetaData;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.settings.Settings;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.elasticsearch.test.hamcrest.ElasticsearchAssertions;
|
||||
import org.junit.Test;
|
||||
|
||||
import static org.elasticsearch.client.Requests.*;
|
||||
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
|
||||
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
|
||||
import static org.elasticsearch.search.builder.SearchSourceBuilder.searchSource;
|
||||
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
|
||||
/**
|
||||
@ -39,14 +36,9 @@ import static org.hamcrest.Matchers.equalTo;
|
||||
*/
|
||||
public class SimpleIndicesBoostSearchTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
private static final Settings DEFAULT_SETTINGS = ImmutableSettings.settingsBuilder()
|
||||
.put(IndexMetaData.SETTING_NUMBER_OF_SHARDS, 1)
|
||||
.put(IndexMetaData.SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
.build();
|
||||
|
||||
@Test
|
||||
public void testIndicesBoost() throws Exception {
|
||||
ElasticsearchAssertions.assertHitCount(client().prepareSearch().setQuery(termQuery("test", "value")).get(), 0);
|
||||
assertHitCount(client().prepareSearch().setQuery(termQuery("test", "value")).get(), 0);
|
||||
|
||||
try {
|
||||
client().prepareSearch("test").setQuery(termQuery("test", "value")).execute().actionGet();
|
||||
@ -55,13 +47,13 @@ public class SimpleIndicesBoostSearchTests extends ElasticsearchIntegrationTest
|
||||
// ignore, no indices
|
||||
}
|
||||
|
||||
client().admin().indices().create(createIndexRequest("test1").settings(DEFAULT_SETTINGS)).actionGet();
|
||||
client().admin().indices().create(createIndexRequest("test2").settings(DEFAULT_SETTINGS)).actionGet();
|
||||
createIndex("test1", "test2");
|
||||
ensureGreen();
|
||||
client().index(indexRequest("test1").type("type1").id("1")
|
||||
.source(jsonBuilder().startObject().field("test", "value check").endObject())).actionGet();
|
||||
client().index(indexRequest("test2").type("type1").id("1")
|
||||
.source(jsonBuilder().startObject().field("test", "value beck").endObject())).actionGet();
|
||||
client().admin().indices().refresh(refreshRequest()).actionGet();
|
||||
refresh();
|
||||
|
||||
float indexBoost = 1.1f;
|
||||
|
||||
|
@ -24,7 +24,6 @@ import org.elasticsearch.action.admin.cluster.health.ClusterHealthStatus;
|
||||
import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.action.search.SearchType;
|
||||
import org.elasticsearch.client.Client;
|
||||
import org.elasticsearch.common.Priority;
|
||||
import org.elasticsearch.rest.RestStatus;
|
||||
import org.elasticsearch.test.ElasticsearchIntegrationTest;
|
||||
import org.junit.Test;
|
||||
@ -38,11 +37,11 @@ public class SearchPreferenceTests extends ElasticsearchIntegrationTest {
|
||||
@Test // see #2896
|
||||
public void testStopOneNodePreferenceWithRedState() throws InterruptedException {
|
||||
client().admin().indices().prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", cluster().size()+2).put("index.number_of_replicas", 0)).execute().actionGet();
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
for (int i = 0; i < 10; i++) {
|
||||
client().prepareIndex("test", "type1", ""+i).setSource("field1", "value1").execute().actionGet();
|
||||
}
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
cluster().stopRandomNode();
|
||||
client().admin().cluster().prepareHealth().setWaitForStatus(ClusterHealthStatus.RED).execute().actionGet();
|
||||
String[] preferences = new String[] {"_primary", "_local", "_primary_first", "_only_local", "_prefer_node:somenode", "_prefer_node:server2"};
|
||||
@ -55,15 +54,14 @@ public class SearchPreferenceTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(pref, searchResponse.getFailedShards(), greaterThanOrEqualTo(0));
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@Test
|
||||
public void noPreferenceRandom() throws Exception {
|
||||
client().admin().indices().prepareCreate("test").setSettings(settingsBuilder().put("index.number_of_shards", 1).put("index.number_of_replicas", 1)).execute().actionGet();
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "value1").execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
final Client client = cluster().smartClient();
|
||||
SearchResponse searchResponse = client.prepareSearch("test").setQuery(matchAllQuery()).execute().actionGet();
|
||||
@ -77,10 +75,10 @@ public class SearchPreferenceTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void simplePreferenceTests() throws Exception {
|
||||
createIndex("test");
|
||||
client().admin().cluster().prepareHealth().setWaitForEvents(Priority.LANGUID).setWaitForGreenStatus().execute().actionGet();
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "type1").setSource("field1", "value1").execute().actionGet();
|
||||
client().admin().indices().prepareRefresh().execute().actionGet();
|
||||
refresh();
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch().setQuery(matchAllQuery()).setPreference("_local").execute().actionGet();
|
||||
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
|
||||
|
@ -52,6 +52,7 @@ public class MultiMatchQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Before
|
||||
public void init() throws Exception {
|
||||
CreateIndexRequestBuilder builder = prepareCreate("test").setSettings(settingsBuilder()
|
||||
.put(indexSettings())
|
||||
.put(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
.put(SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
.put("index.analysis.analyzer.perfect_match.type", "custom")
|
||||
|
@ -30,6 +30,7 @@ import org.elasticsearch.action.search.SearchResponse;
|
||||
import org.elasticsearch.action.search.SearchType;
|
||||
import org.elasticsearch.action.search.ShardSearchFailure;
|
||||
import org.elasticsearch.cluster.metadata.IndexMetaData;
|
||||
import org.elasticsearch.common.settings.ImmutableSettings;
|
||||
import org.elasticsearch.common.xcontent.XContentFactory;
|
||||
import org.elasticsearch.index.mapper.MapperParsingException;
|
||||
import org.elasticsearch.index.query.*;
|
||||
@ -65,10 +66,12 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testOmitNormsOnAll() throws ExecutionException, InterruptedException, IOException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
|
||||
.startObject("_all").field("omit_norms", true).endObject()
|
||||
.endObject().endObject()));
|
||||
.endObject().endObject())
|
||||
.setSettings(IndexMetaData.SETTING_NUMBER_OF_SHARDS, between(3, DEFAULT_MAX_NUM_SHARDS)));
|
||||
ensureGreen();
|
||||
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("field1", "the quick brown fox jumps"),
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "quick brown"),
|
||||
client().prepareIndex("test", "type1", "3").setSource("field1", "quick"));
|
||||
@ -80,7 +83,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
assertThat(hits[0].score(), allOf(equalTo(hits[1].getScore()), equalTo(hits[2].getScore())));
|
||||
wipeIndices("test");
|
||||
|
||||
assertAcked(client().admin().indices().prepareCreate("test"));
|
||||
createIndex("test");
|
||||
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("field1", "the quick brown fox jumps"),
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "quick brown"),
|
||||
client().prepareIndex("test", "type1", "3").setSource("field1", "quick"));
|
||||
@ -104,7 +107,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test // see https://github.com/elasticsearch/elasticsearch/issues/3177
|
||||
public void testIssue3177() {
|
||||
assertAcked(prepareCreate("test").setSettings(settingsBuilder().put(SETTING_NUMBER_OF_SHARDS, 1)));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "value2").get();
|
||||
client().prepareIndex("test", "type1", "3").setSource("field1", "value3").get();
|
||||
@ -137,7 +140,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void passQueryAsStringTest() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1_1", "field2", "value2_1").setRefresh(true).get();
|
||||
|
||||
@ -147,20 +150,19 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testIndexOptions() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,index_options=docs")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,index_options=docs"));
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "quick brown fox", "field2", "quick brown fox").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "quick lazy huge brown fox", "field2", "quick lazy huge brown fox").setRefresh(true).get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "quick lazy huge brown fox", "field2", "quick lazy huge brown fox").get();
|
||||
refresh();
|
||||
|
||||
SearchResponse searchResponse = client().prepareSearch().setQuery(matchQuery("field2", "quick brown").type(MatchQueryBuilder.Type.PHRASE).slop(0)).get();
|
||||
assertHitCount(searchResponse, 1l);
|
||||
try {
|
||||
client().prepareSearch().setQuery(matchQuery("field1", "quick brown").type(MatchQueryBuilder.Type.PHRASE).slop(0)).get();
|
||||
} catch (SearchPhaseExecutionException e) {
|
||||
assertTrue("wrong exception message " + e.getMessage(), e.getMessage().endsWith("IllegalStateException[field \"field1\" was indexed without position data; cannot run PhraseQuery (term=quick)]; }"));
|
||||
}
|
||||
|
||||
assertFailures(client().prepareSearch().setQuery(matchQuery("field1", "quick brown").type(Type.PHRASE).slop(0)),
|
||||
RestStatus.INTERNAL_SERVER_ERROR,
|
||||
containsString("field \"field1\" was indexed without position data; cannot run PhraseQuery (term=quick"));
|
||||
}
|
||||
|
||||
@Test // see #3521
|
||||
@ -235,7 +237,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test // see #3521
|
||||
public void testAllDocsQueryString() throws InterruptedException, ExecutionException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_REPLICAS, 0));
|
||||
createIndex("test");
|
||||
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("foo", "bar"),
|
||||
client().prepareIndex("test", "type1", "2").setSource("foo", "bar")
|
||||
);
|
||||
@ -384,7 +386,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void queryStringAnalyzedWildcard() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value_1", "field2", "value_2").get();
|
||||
refresh();
|
||||
@ -407,7 +409,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testLowercaseExpandedTerms() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value_1", "field2", "value_2").get();
|
||||
refresh();
|
||||
@ -428,7 +430,8 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testDateRangeInQueryString() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
String aMonthAgo = ISODateTimeFormat.yearMonthDay().print(new DateTime(DateTimeZone.UTC).minusMonths(1));
|
||||
String aMonthFromNow = ISODateTimeFormat.yearMonthDay().print(new DateTime(DateTimeZone.UTC).plusMonths(1));
|
||||
@ -443,9 +446,10 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
try {
|
||||
client().prepareSearch().setQuery(queryString("future:[now/D TO now+2M/d]").lowercaseExpandedTerms(false)).get();
|
||||
fail("D is an unsupported unit in date math");
|
||||
} catch (Exception e) {
|
||||
// expected
|
||||
fail("expected SearchPhaseExecutionException (total failure)");
|
||||
} catch (SearchPhaseExecutionException e) {
|
||||
assertThat(e.status(), equalTo(RestStatus.BAD_REQUEST));
|
||||
assertThat(e.getMessage(), containsString("unit [D] not supported for date math"));
|
||||
}
|
||||
}
|
||||
|
||||
@ -460,7 +464,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
}
|
||||
|
||||
private void typeFilterTests(String index) throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
|
||||
.startObject("_type").field("index", index).endObject()
|
||||
.endObject().endObject())
|
||||
@ -493,7 +497,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
}
|
||||
|
||||
private void idsFilterTests(String index) throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
|
||||
.startObject("_id").field("index", index).endObject()
|
||||
.endObject().endObject()));
|
||||
@ -543,7 +547,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void filterExistsMissingTests() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
indexRandom(true,
|
||||
client().prepareIndex("test", "type1", "1").setSource(jsonBuilder().startObject().startObject("obj1").field("obj1_val", "1").endObject().field("x1", "x_1").field("field1", "value1_1").field("field2", "value2_1").endObject()),
|
||||
@ -610,7 +614,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void passQueryOrFilterAsJSONStringTest() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1_1", "field2", "value2_1").setRefresh(true).get();
|
||||
|
||||
@ -645,7 +649,8 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testMatchQueryNumeric() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("long", 1l, "double", 1.0d),
|
||||
client().prepareIndex("test", "type1", "2").setSource("long", 2l, "double", 2.0d),
|
||||
@ -668,7 +673,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testMultiMatchQuery() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
|
||||
indexRandom(true,
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1", "field2", "value4", "field3", "value3"),
|
||||
@ -734,7 +739,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testMatchQueryZeroTermsQuery() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,analyzer=classic", "field2", "type=string,analyzer=classic"));
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "value2").get();
|
||||
@ -758,7 +763,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
}
|
||||
|
||||
public void testMultiMatchQueryZeroTermsQuery() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", "field1", "type=string,analyzer=classic", "field2", "type=string,analyzer=classic"));
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", "value1", "field2", "value2").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field1", "value3", "field2", "value4").get();
|
||||
@ -784,7 +789,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testMultiMatchQueryMinShouldMatch() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("field1", new String[]{"value1", "value2", "value3"}).get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("field2", "value1").get();
|
||||
refresh();
|
||||
@ -830,7 +835,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testFuzzyQueryString() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("str", "kimchy", "date", "2012-02-01", "num", 12).get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("str", "shay", "date", "2012-02-05", "num", 20).get();
|
||||
refresh();
|
||||
@ -852,7 +857,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testQuotedQueryStringWithBoost() throws InterruptedException, ExecutionException {
|
||||
float boost = 10.0f;
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
indexRandom(true, client().prepareIndex("test", "type1", "1").setSource("important", "phrase match", "less_important", "nothing important"),
|
||||
client().prepareIndex("test", "type1", "2").setSource("important", "nothing important", "less_important", "phrase match")
|
||||
);
|
||||
@ -874,7 +879,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSpecialRangeSyntaxInQueryString() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("str", "kimchy", "date", "2012-02-01", "num", 12).get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("str", "shay", "date", "2012-02-05", "num", 20).get();
|
||||
refresh();
|
||||
@ -1174,8 +1179,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testNumericTermsAndRanges() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1",
|
||||
"num_byte", "type=byte", "num_short", "type=short",
|
||||
"num_integer", "type=integer", "num_long", "type=long",
|
||||
@ -1276,8 +1280,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testNumericRangeFilter_2826() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 1, SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1",
|
||||
"num_byte", "type=byte", "num_short", "type=short",
|
||||
"num_integer", "type=integer", "num_long", "type=long",
|
||||
@ -1324,8 +1327,9 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test // see #2926
|
||||
public void testMustNot() throws ElasticsearchException, IOException, ExecutionException, InterruptedException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 2, SETTING_NUMBER_OF_REPLICAS, 0));
|
||||
assertAcked(prepareCreate("test")
|
||||
//issue manifested only with shards>=2
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, between(2, DEFAULT_MAX_NUM_SHARDS)));
|
||||
ensureGreen();
|
||||
|
||||
indexRandom(true, client().prepareIndex("test", "test", "1").setSource("description", "foo other anything bar"),
|
||||
@ -1346,8 +1350,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test // see #2994
|
||||
public void testSimpleSpan() throws ElasticsearchException, IOException, ExecutionException, InterruptedException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 1, SETTING_NUMBER_OF_REPLICAS, 0));
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
indexRandom(true, client().prepareIndex("test", "test", "1").setSource("description", "foo other anything bar"),
|
||||
@ -1373,7 +1376,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSpanMultiTermQuery() throws ElasticsearchException, IOException {
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1, SETTING_NUMBER_OF_REPLICAS, 0));
|
||||
createIndex("test");
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("test", "test", "1").setSource("description", "foo other anything bar", "count", 1).get();
|
||||
@ -1406,7 +1409,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleDFSQuery() throws ElasticsearchException, IOException {
|
||||
assertAcked(prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 5, SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("s", jsonBuilder()
|
||||
.startObject()
|
||||
.startObject("s")
|
||||
@ -1484,8 +1487,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testMatchQueryWithSynonyms() throws IOException {
|
||||
CreateIndexRequestBuilder builder = prepareCreate("test").setSettings(settingsBuilder()
|
||||
.put(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
.put(SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
.put(indexSettings())
|
||||
.put("index.analysis.analyzer.index.type", "custom")
|
||||
.put("index.analysis.analyzer.index.tokenizer", "standard")
|
||||
.put("index.analysis.analyzer.index.filter", "lowercase")
|
||||
@ -1516,8 +1518,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testMatchQueryWithStackedStems() throws IOException {
|
||||
CreateIndexRequestBuilder builder = prepareCreate("test").setSettings(settingsBuilder()
|
||||
.put(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
.put(SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
.put(indexSettings())
|
||||
.put("index.analysis.analyzer.index.type", "custom")
|
||||
.put("index.analysis.analyzer.index.tokenizer", "standard")
|
||||
.put("index.analysis.analyzer.index.filter", "lowercase")
|
||||
@ -1542,8 +1543,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test
|
||||
public void testQueryStringWithSynonyms() throws IOException {
|
||||
CreateIndexRequestBuilder builder = prepareCreate("test").setSettings(settingsBuilder()
|
||||
.put(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
.put(SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
.put(indexSettings())
|
||||
.put("index.analysis.analyzer.index.type", "custom")
|
||||
.put("index.analysis.analyzer.index.tokenizer", "standard")
|
||||
.put("index.analysis.analyzer.index.filter", "lowercase")
|
||||
@ -1694,9 +1694,9 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test // https://github.com/elasticsearch/elasticsearch/issues/2416
|
||||
public void testIndicesQuerySkipParsing() throws Exception {
|
||||
createIndex("simple");
|
||||
client().admin().indices().prepareCreate("related")
|
||||
assertAcked(prepareCreate("related")
|
||||
.addMapping("child", jsonBuilder().startObject().startObject("child").startObject("_parent").field("type", "parent")
|
||||
.endObject().endObject().endObject()).get();
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("simple", "lone").setId("1").setSource("text", "value1").get();
|
||||
@ -1727,9 +1727,9 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test // https://github.com/elasticsearch/elasticsearch/issues/2416
|
||||
public void testIndicesFilterSkipParsing() throws Exception {
|
||||
createIndex("simple");
|
||||
client().admin().indices().prepareCreate("related")
|
||||
assertAcked(prepareCreate("related")
|
||||
.addMapping("child", jsonBuilder().startObject().startObject("child").startObject("_parent").field("type", "parent")
|
||||
.endObject().endObject().endObject()).get();
|
||||
.endObject().endObject().endObject()));
|
||||
ensureGreen();
|
||||
|
||||
client().prepareIndex("simple", "lone").setId("1").setSource("text", "value1").get();
|
||||
@ -1961,7 +1961,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleQueryString() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("body", "foo").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("body", "bar").get();
|
||||
client().prepareIndex("test", "type1", "3").setSource("body", "foo bar").get();
|
||||
@ -1997,7 +1997,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleQueryStringLowercasing() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("body", "Professional").get();
|
||||
refresh();
|
||||
|
||||
@ -2021,7 +2021,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testQueryStringLocale() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("body", "bılly").get();
|
||||
refresh();
|
||||
|
||||
@ -2042,7 +2042,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testNestedFieldSimpleQueryString() throws IOException {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
assertAcked(prepareCreate("test")
|
||||
.addMapping("type1", jsonBuilder()
|
||||
.startObject()
|
||||
.startObject("type1")
|
||||
@ -2082,7 +2082,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleQueryStringFlags() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test");
|
||||
client().prepareIndex("test", "type1", "1").setSource("body", "foo").get();
|
||||
client().prepareIndex("test", "type1", "2").setSource("body", "bar").get();
|
||||
client().prepareIndex("test", "type1", "3").setSource("body", "foo bar").get();
|
||||
@ -2137,8 +2137,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSimpleQueryStringLenient() {
|
||||
assertAcked(client().admin().indices().prepareCreate("test1").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
assertAcked(client().admin().indices().prepareCreate("test2").setSettings(SETTING_NUMBER_OF_SHARDS, 1));
|
||||
createIndex("test1", "test2");
|
||||
client().prepareIndex("test1", "type1", "1").setSource("field", "foo").get();
|
||||
client().prepareIndex("test2", "type1", "10").setSource("field", 5).get();
|
||||
refresh();
|
||||
@ -2156,8 +2155,9 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testRangeFilterNoCacheWithNow() throws Exception {
|
||||
assertAcked(client().admin().indices().prepareCreate("test")
|
||||
.setSettings(SETTING_NUMBER_OF_SHARDS, 1, SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
assertAcked(prepareCreate("test")
|
||||
//no replicas to make sure we always hit the very same shard and verify the caching behaviour
|
||||
.setSettings(ImmutableSettings.builder().put(indexSettings()).put(SETTING_NUMBER_OF_REPLICAS, 0))
|
||||
.addMapping("type1", "date", "type=date,format=YYYY-mm-dd"));
|
||||
ensureGreen();
|
||||
|
||||
@ -2235,7 +2235,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
|
||||
@Test
|
||||
public void testSearchEmptyDoc() {
|
||||
prepareCreate("test").setSettings("{\"index.analysis.analyzer.default.type\":\"keyword\"}").get();
|
||||
assertAcked(prepareCreate("test").setSettings("{\"index.analysis.analyzer.default.type\":\"keyword\"}"));
|
||||
client().prepareIndex("test", "type1", "1").setSource("{}").get();
|
||||
refresh();
|
||||
assertHitCount(client().prepareSearch().setQuery(matchAllQuery()).get(), 1l);
|
||||
@ -2244,8 +2244,7 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
|
||||
@Test // see #5120
|
||||
public void testNGramCopyField() {
|
||||
CreateIndexRequestBuilder builder = prepareCreate("test").setSettings(settingsBuilder()
|
||||
.put(SETTING_NUMBER_OF_SHARDS, 1)
|
||||
.put(SETTING_NUMBER_OF_REPLICAS, 0)
|
||||
.put(indexSettings())
|
||||
.put("index.analysis.analyzer.my_ngram_analyzer.type", "custom")
|
||||
.put("index.analysis.analyzer.my_ngram_analyzer.tokenizer", "my_ngram_tokenizer")
|
||||
.put("index.analysis.tokenizer.my_ngram_tokenizer.type", "nGram")
|
||||
|
@@ -41,11 +41,11 @@ import org.elasticsearch.search.rescore.RescoreBuilder.QueryRescorer;
 import org.elasticsearch.test.ElasticsearchIntegrationTest;
 import org.junit.Test;
 
+import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_REPLICAS;
+import static org.elasticsearch.cluster.metadata.IndexMetaData.SETTING_NUMBER_OF_SHARDS;
 import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
 import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.*;
-import static org.hamcrest.Matchers.equalTo;
-import static org.hamcrest.Matchers.greaterThanOrEqualTo;
-import static org.hamcrest.Matchers.notNullValue;
+import static org.hamcrest.Matchers.*;
 
 /**
  *
@@ -54,17 +54,15 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
 
     @Test
     public void testEnforceWindowSize() {
-        final int numShards = between(1, 5);
-        assertAcked(client().admin()
-                .indices()
-                .prepareCreate("test")
-                .setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", numShards)));
+        createIndex("test");
         // this
         int iters = atLeast(10);
         for (int i = 0; i < iters; i ++) {
             client().prepareIndex("test", "type", Integer.toString(i)).setSource("f", Integer.toString(i)).execute().actionGet();
         }
         refresh();
+
+        int numShards = getNumShards("test").numPrimaries;
         for (int j = 0 ; j < iters; j++) {
             SearchResponse searchResponse = client().prepareSearch()
                     .setQuery(QueryBuilders.matchAllQuery())
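For reference (not part of the diff), a sketch of how a test can assert against the now-random shard count through getNumShards instead of hard-coding a value. The index name "example", the indexed document, and the assertion are hypothetical:

    @Test
    public void exampleRandomShardCount() {                  // hypothetical test, not in this change
        createIndex("example");                              // picks up the randomized number of shards
        client().prepareIndex("example", "type1", "1").setSource("field", "value").get();
        refresh();

        int numPrimaries = getNumShards("example").numPrimaries;
        SearchResponse response = client().prepareSearch("example")
                .setQuery(QueryBuilders.matchAllQuery()).get();
        // a search fans out to one copy of each shard, so compare against the randomized primary count
        assertThat(response.getTotalShards(), equalTo(numPrimaries));
    }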
@@ -88,14 +86,12 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
 
     @Test
     public void testRescorePhrase() throws Exception {
-        client().admin()
-                .indices()
-                .prepareCreate("test")
+        assertAcked(prepareCreate("test")
                 .addMapping(
                         "type1",
                         jsonBuilder().startObject().startObject("type1").startObject("properties").startObject("field1")
                                 .field("analyzer", "whitespace").field("type", "string").endObject().endObject().endObject().endObject())
-                .setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2)).execute().actionGet();
+                .setSettings(ImmutableSettings.settingsBuilder().put(indexSettings()).put("index.number_of_shards", 2)));
 
         client().prepareIndex("test", "type1", "1").setSource("field1", "the quick brown fox").execute().actionGet();
         client().prepareIndex("test", "type1", "2").setSource("field1", "the quick lazy huge brown fox jumps over the tree").execute()
@@ -147,8 +143,7 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
                 .startObject("field1").field("type", "string").field("index_analyzer", "whitespace").field("search_analyzer", "synonym")
                 .endObject().endObject().endObject().endObject();
 
-        client().admin().indices().prepareCreate("test").addMapping("type1", mapping).setSettings(builder.put("index.number_of_shards", 1))
-                .execute().actionGet();
+        assertAcked(client().admin().indices().prepareCreate("test").addMapping("type1", mapping).setSettings(builder.put("index.number_of_shards", 1)));
 
         client().prepareIndex("test", "type1", "1").setSource("field1", "massachusetts avenue boston massachusetts").execute().actionGet();
         client().prepareIndex("test", "type1", "2").setSource("field1", "lexington avenue boston massachusetts").execute().actionGet();
@@ -198,7 +193,7 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
         assertThirdHit(searchResponse, hasId("3"));
     }
 
-    private static final void assertEquivalent(String query, SearchResponse plain, SearchResponse rescored) {
+    private static void assertEquivalent(String query, SearchResponse plain, SearchResponse rescored) {
         assertNoFailures(plain);
         assertNoFailures(rescored);
         SearchHits leftHits = plain.getHits();
@@ -218,7 +213,7 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
         }
     }
 
-    private static final void assertEquivalentOrSubstringMatch(String query, SearchResponse plain, SearchResponse rescored) {
+    private static void assertEquivalentOrSubstringMatch(String query, SearchResponse plain, SearchResponse rescored) {
         SearchHits leftHits = plain.getHits();
         SearchHits rightHits = rescored.getHits();
         assertThat(leftHits.getTotalHits(), equalTo(rightHits.getTotalHits()));
@@ -240,7 +235,7 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
     @Test
     // forces QUERY_THEN_FETCH because of https://github.com/elasticsearch/elasticsearch/issues/4829
     public void testEquivalence() throws Exception {
-        int numDocs = indexRandomNumbers("whitespace", between(1,5));
+        int numDocs = indexRandomNumbers("whitespace");
 
         final int iters = atLeast(50);
         for (int i = 0; i < iters; i++) {
@@ -310,12 +305,12 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
 
     @Test
     public void testExplain() throws Exception {
-        prepareCreate("test")
+        assertAcked(prepareCreate("test")
                 .addMapping(
                         "type1",
                         jsonBuilder().startObject().startObject("type1").startObject("properties").startObject("field1")
                                 .field("analyzer", "whitespace").field("type", "string").endObject().endObject().endObject().endObject())
-                .setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", 2)).execute().actionGet();
+        );
         ensureGreen();
         client().prepareIndex("test", "type1", "1").setSource("field1", "the quick brown fox").execute().actionGet();
         client().prepareIndex("test", "type1", "2").setSource("field1", "the quick lazy huge brown fox jumps over the tree").execute()
@@ -410,7 +405,7 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
 
     @Test
     public void testScoring() throws Exception {
-        int numDocs = indexRandomNumbers("keyword", between(1,5));
+        int numDocs = indexRandomNumbers("keyword");
 
         String[] scoreModes = new String[]{ "max", "min", "avg", "total", "multiply", "" };
         float primaryWeight = 1.1f;
@@ -531,7 +526,17 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
         assertSecondHit(response, hasScore(1001.0f)); // Not sure which one it is but it is ninety something
     }
 
+    private int indexRandomNumbers(String analyzer) throws Exception {
+        return indexRandomNumbers(analyzer, -1);
+    }
+
     private int indexRandomNumbers(String analyzer, int shards) throws Exception {
+        Builder builder = ImmutableSettings.settingsBuilder().put(indexSettings()).put(SETTING_NUMBER_OF_REPLICAS, between(0, 1));
+
+        if (shards > 0) {
+            builder.put(SETTING_NUMBER_OF_SHARDS, shards);
+        }
+
         client().admin()
                 .indices()
                 .prepareCreate("test")
@@ -539,7 +544,7 @@ public class QueryRescorerTests extends ElasticsearchIntegrationTest {
                         "type1",
                         jsonBuilder().startObject().startObject("type1").startObject("properties").startObject("field1")
                                 .field("analyzer", analyzer).field("type", "string").endObject().endObject().endObject().endObject())
-                .setSettings(ImmutableSettings.settingsBuilder().put("index.number_of_shards", shards).put("index.number_of_replicas", between(0,1))).get();
+                .setSettings(builder).get();
         int numDocs = atLeast(100);
         IndexRequestBuilder[] docs = new IndexRequestBuilder[numDocs];
         for (int i = 0; i < numDocs; i++) {
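For reference (not part of the diff), how the reworked helper above is meant to be driven; the single-shard override in the second call is a hypothetical example rather than something the tests in this change still do:

    int numDocs = indexRandomNumbers("whitespace");            // defaults to the randomized shard count from indexSettings()
    // int numDocsOnOneShard = indexRandomNumbers("keyword", 1);  // a caller could still pin the shard count explicitly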
Some files were not shown because too many files have changed in this diff.