Merge branch 'master' into ccr

* master:
  Remove reference to non-existent store type (#32418)
  [TEST] Mute failing FlushIT test
  Fix ordering of bootstrap checks in docs (#32417)
  [TEST] Mute failing InternalEngineTests#testSeqNoAndCheckpoints
  [TEST] Mute failing testConvertLongHexError
  bump lucene version after backport
  Upgrade to Lucene-7.5.0-snapshot-608f0277b0 (#32390)
  [Kerberos] Avoid vagrant update on precommit (#32416)
  TESTS: Move netty leak detection to paranoid level (#32354)
  [DOCS] Fixes formatting of scope object in job resource
  Copy missing segment attributes in getSegmentInfo (#32396)
  AbstractQueryTestCase should run without type less often (#28936)
  INGEST: Fix Deprecation Warning in Script Proc. (#32407)
  Switch x-pack/plugin to new style Requests (#32327)
  Docs: Correcting a typo in tophits (#32359)
  Build: Stop double generating buildSrc pom (#32408)
  TEST: Avoid triggering merges in FlushIT
  Fix missing JavaDoc for @throws in several places in KerberosTicketValidator.
  Switch x-pack full restart to new style Requests (#32294)
  Release requests in cors handler (#32364)
  Painless: Clean Up PainlessClass Variables (#32380)
  Docs: Fix callouts in put license HL REST docs (#32363)
  [ML] Consistent pattern for strict/lenient parser names (#32399)
  Update update-settings.asciidoc (#31378)
  Remove some dead code (#31993)
  Introduce index store plugins (#32375)
  Rank-Eval: Reduce scope of an unchecked supression
  Make sure _forcemerge respects `max_num_segments`. (#32291)
  TESTS: Fix Buf Leaks in HttpReadWriteHandlerTests (#32377)
  Only enforce password hashing check if FIPS enabled (#32383)
This commit is contained in:
Nhat Nguyen 2018-07-27 16:23:58 -04:00
commit 2f756b00f6
172 changed files with 1477 additions and 1381 deletions

View File

@ -183,4 +183,12 @@ if (project != rootProject) {
testClass = 'org.elasticsearch.gradle.test.GradleUnitTestCase'
integTestClass = 'org.elasticsearch.gradle.test.GradleIntegrationTestCase'
}
/*
* We already configure publication and we don't need or want the one that
* comes from the java-gradle-plugin.
*/
afterEvaluate {
generatePomFileForPluginMavenPublication.enabled = false
}
}

View File

@ -1,5 +1,5 @@
elasticsearch = 7.0.0-alpha1
-lucene = 7.5.0-snapshot-b9e064b935
+lucene = 7.5.0-snapshot-608f0277b0
# optional dependencies
spatial4j = 0.7

View File

@ -33,10 +33,9 @@ include-tagged::{doc-tests}/LicensingDocumentationIT.java[put-license-response]
--------------------------------------------------
<1> The status of the license
<2> Make sure that the license is valid.
-<3> Check the acknowledge flag.
-<4> It should be true if license is acknowledge.
-<5> Otherwise we can see the acknowledge messages in `acknowledgeHeader()` and check
-component-specific messages in `acknowledgeMessages()`.
+<3> Check the acknowledge flag. It should be true if license is acknowledged.
+<4> Otherwise we can see the acknowledge messages in `acknowledgeHeader()`
+<5> and check component-specific messages in `acknowledgeMessages()`.
[[java-rest-high-put-license-async]]
==== Asynchronous Execution

View File

@ -172,7 +172,7 @@ In the example below we search across crawled webpages. For each webpage we stor
belong to. By defining a `terms` aggregator on the `domain` field we group the result set of webpages by domain. The
`top_hits` aggregator is then defined as sub-aggregator, so that the top matching hits are collected per bucket.
-Also a `max` aggregator is defined which is used by the `terms` aggregator's order feature the return the buckets by
+Also a `max` aggregator is defined which is used by the `terms` aggregator's order feature to return the buckets by
relevancy order of the most relevant document in a bucket.
[source,js]

View File

@ -1,9 +1,18 @@
[[cluster-update-settings]]
== Cluster Update Settings
-Allows to update cluster wide specific settings. Settings updated can
-either be persistent (applied across restarts) or transient (will not
-survive a full cluster restart). Here is an example:
+Use this API to review and change cluster-wide settings.
+To review cluster settings:
+[source,js]
+--------------------------------------------------
+GET /_cluster/settings
+--------------------------------------------------
+// CONSOLE
+Updates to settings can be persistent, meaning they apply across restarts, or transient, where they don't
+survive a full cluster restart. Here is an example of a persistent update:
[source,js]
--------------------------------------------------
@ -16,7 +25,7 @@ PUT /_cluster/settings
--------------------------------------------------
// CONSOLE
-Or:
+This update is transient:
[source,js]
--------------------------------------------------
@ -29,8 +38,7 @@ PUT /_cluster/settings?flat_settings=true
--------------------------------------------------
// CONSOLE
-The cluster responds with the settings updated. So the response for the
-last example will be:
+The response to an update returns the changed setting, as in this response to the transient example:
[source,js]
--------------------------------------------------
@ -44,11 +52,14 @@ last example will be:
--------------------------------------------------
// TESTRESPONSE[s/\.\.\./"acknowledged": true,/]
-Resetting persistent or transient settings can be done by assigning a
-`null` value. If a transient setting is reset, the persistent setting
-is applied if available. Otherwise Elasticsearch will fallback to the setting
-defined at the configuration file or, if not existent, to the default
-value. Here is an example:
+You can reset persistent or transient settings by assigning a
+`null` value. If a transient setting is reset, the first one of these values that is defined is applied:
+* the persistent setting
+* the setting in the configuration file
+* the default value.
+This example resets a setting:
[source,js]
--------------------------------------------------
@ -61,8 +72,7 @@ PUT /_cluster/settings
--------------------------------------------------
// CONSOLE
-Reset settings will not be included in the cluster response. So
-the response for the last example will be:
+The response does not include settings that have been reset:
[source,js]
--------------------------------------------------
@ -74,8 +84,8 @@ the response for the last example will be:
--------------------------------------------------
// TESTRESPONSE[s/\.\.\./"acknowledged": true,/]
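The reset-fallback order described above (persistent setting, then the configuration file, then the default) can be sketched as a standalone Java snippet; `resolveAfterTransientReset` and the example values are hypothetical illustrations, not Elasticsearch's actual settings code:

```java
// Hypothetical sketch of the fallback applied when a transient setting is reset:
// persistent value first, then the elasticsearch.yml value, then the default.
public class SettingFallback {
    static String resolveAfterTransientReset(String persistent, String configFile, String defaultValue) {
        if (persistent != null) {
            return persistent;
        }
        if (configFile != null) {
            return configFile;
        }
        return defaultValue;
    }

    public static void main(String[] args) {
        // No persistent or config-file value: the default wins.
        System.out.println(resolveAfterTransientReset(null, null, "20mb"));
        // A persistent value takes precedence over the config file.
        System.out.println(resolveAfterTransientReset("50mb", "30mb", "20mb"));
    }
}
```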
-Settings can also be reset using simple wildcards. For instance to reset
-all dynamic `indices.recovery` setting a prefix can be used:
+You can also reset settings using wildcards. For example, to reset
+all dynamic `indices.recovery` settings:
[source,js]
--------------------------------------------------
@ -88,25 +98,19 @@ PUT /_cluster/settings
--------------------------------------------------
// CONSOLE
-Cluster wide settings can be returned using:
-[source,js]
---------------------------------------------------
-GET /_cluster/settings
---------------------------------------------------
-// CONSOLE
[float]
-=== Precedence of settings
+=== Order of Precedence
-Transient cluster settings take precedence over persistent cluster settings,
-which take precedence over settings configured in the `elasticsearch.yml`
-config file.
-For this reason it is preferrable to use the `elasticsearch.yml` file only
-for local configurations, and set all cluster-wider settings with the
+The order of precedence for cluster settings is:
+1. transient cluster settings
+2. persistent cluster settings
+3. settings in the `elasticsearch.yml` configuration file.
+It's best to use the `elasticsearch.yml` file only
+for local configurations, and set all cluster-wide settings with the
`settings` API.
-A list of dynamically updatable settings can be found in the
-<<modules,Modules>> documentation.
+You can find the list of settings that you can dynamically update in <<modules,Modules>>.

View File

@ -67,11 +67,6 @@ process equal to the size of the file being mapped. Before using this
class, be sure you have allowed plenty of
<<vm-max-map-count,virtual address space>>.
-[[default_fs]]`default_fs` deprecated[5.0.0, The `default_fs` store type is deprecated - use `fs` instead]::
-The `default` type is deprecated and is aliased to `fs` for backward
-compatibility.
=== Pre-loading data into the file system cache
NOTE: This is an expert setting, the details of which may change in the future.

View File

@ -118,6 +118,19 @@ least 4096 threads. This can be done via `/etc/security/limits.conf`
using the `nproc` setting (note that you might have to increase the
limits for the `root` user too).
+=== Max file size check
+The segment files that are the components of individual shards and the translog
+generations that are components of the translog can get large (exceeding
+multiple gigabytes). On systems where the max size of files that can be created
+by the Elasticsearch process is limited, this can lead to failed
+writes. Therefore, the safest option here is that the max file size is unlimited
+and that is what the max file size bootstrap check enforces. To pass the max
+file check, you must configure your system to allow the Elasticsearch process
+the ability to write files of unlimited size. This can be done via
+`/etc/security/limits.conf` using the `fsize` setting to `unlimited` (note that
+you might have to increase the limits for the `root` user too).
[[max-size-virtual-memory-check]]
=== Maximum size virtual memory check
@ -133,19 +146,6 @@ address space. This can be done via `/etc/security/limits.conf` using
the `as` setting to `unlimited` (note that you might have to increase
the limits for the `root` user too).
-=== Max file size check
-The segment files that are the components of individual shards and the translog
-generations that are components of the translog can get large (exceeding
-multiple gigabytes). On systems where the max size of files that can be created
-by the Elasticsearch process is limited, this can lead to failed
-writes. Therefore, the safest option here is that the max file size is unlimited
-and that is what the max file size bootstrap check enforces. To pass the max
-file check, you must configure your system to allow the Elasticsearch process
-the ability to write files of unlimited size. This can be done via
-`/etc/security/limits.conf` using the `fsize` setting to `unlimited` (note that
-you might have to increase the limits for the `root` user too).
=== Maximum map count check
Continuing from the previous <<max-size-virtual-memory-check,point>>, to

View File

@ -1,7 +1,7 @@
[[vm-max-map-count]]
=== Virtual memory
-Elasticsearch uses a <<default_fs,`mmapfs`>> directory by
+Elasticsearch uses a <<mmapfs,`mmapfs`>> directory by
default to store its indices. The default operating system limits on mmap
counts is likely to be too low, which may result in out of memory exceptions.

View File

@ -78,7 +78,6 @@ public abstract class ArrayValuesSourceParser<VS extends ValuesSource> implement
throws IOException {
List<String> fields = null;
-ValueType valueType = null;
String format = null;
Map<String, Object> missingMap = null;
Map<ParseField, Object> otherOptions = new HashMap<>();
@ -145,9 +144,6 @@ public abstract class ArrayValuesSourceParser<VS extends ValuesSource> implement
if (fields != null) {
factory.fields(fields);
}
-if (valueType != null) {
-factory.valueType(valueType);
-}
if (format != null) {
factory.format(format);
}

View File

@ -19,11 +19,6 @@
package org.elasticsearch.ingest.common;
-import org.elasticsearch.ingest.IngestDocument;
-import org.elasticsearch.ingest.Processor;
-import org.elasticsearch.ingest.RandomDocumentPicks;
-import org.elasticsearch.test.ESTestCase;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
@ -31,12 +26,17 @@ import java.util.List;
import java.util.Locale;
import java.util.Map;
+import org.elasticsearch.ingest.IngestDocument;
+import org.elasticsearch.ingest.Processor;
+import org.elasticsearch.ingest.RandomDocumentPicks;
+import org.elasticsearch.test.ESTestCase;
import static org.elasticsearch.ingest.IngestDocumentMatcher.assertIngestDocument;
import static org.elasticsearch.ingest.common.ConvertProcessor.Type;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
-import static org.hamcrest.Matchers.sameInstance;
import static org.hamcrest.Matchers.not;
+import static org.hamcrest.Matchers.sameInstance;
public class ConvertProcessorTests extends ESTestCase {
@ -138,6 +138,7 @@ public class ConvertProcessorTests extends ESTestCase {
assertThat(ingestDocument.getFieldValue(fieldName, Long.class), equalTo(10L));
}
+@AwaitsFix( bugUrl = "https://github.com/elastic/elasticsearch/issues/32370")
public void testConvertLongHexError() {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
String value = "0x" + randomAlphaOfLengthBetween(1, 10);
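The muted test above feeds a `0x`-prefixed string into the long conversion. As a standalone illustration (plain JDK parsing behavior, not the actual `ConvertProcessor` logic; `parseLongAccepts` is a hypothetical helper), `Long.parseLong` rejects the hex prefix while `Long.decode` accepts it:

```java
public class HexParseDemo {
    // Report whether plain Long.parseLong can handle the value;
    // parseLong has no support for the 0x radix prefix.
    static boolean parseLongAccepts(String value) {
        try {
            Long.parseLong(value);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String value = "0x1f";
        System.out.println(parseLongAccepts(value)); // false: parseLong rejects 0x
        System.out.println(Long.decode(value));      // 31: decode understands the prefix
    }
}
```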

View File

@ -0,0 +1 @@
bd7d8078a2d0ad11a24f54156cc015630c96858a

View File

@ -1 +0,0 @@
3ef67619b7f6d891a0bc1f8c19bf6ab6a7106de1

View File

@ -19,11 +19,11 @@
package org.elasticsearch.painless.spi;
-import org.elasticsearch.script.ScriptContext;
import java.util.List;
import java.util.Map;
+import org.elasticsearch.script.ScriptContext;
public interface PainlessExtension {
Map<ScriptContext<?>, List<Whitelist>> getContextWhitelists();

View File

@ -421,7 +421,7 @@ public final class Def {
PainlessClass struct = painlessLookup.getPainlessStructFromJavaClass(clazz);
if (struct != null) {
-MethodHandle handle = struct.getters.get(name);
+MethodHandle handle = struct.getterMethodHandles.get(name);
if (handle != null) {
return handle;
}
@ -431,7 +431,7 @@ public final class Def {
struct = painlessLookup.getPainlessStructFromJavaClass(iface);
if (struct != null) {
-MethodHandle handle = struct.getters.get(name);
+MethodHandle handle = struct.getterMethodHandles.get(name);
if (handle != null) {
return handle;
}
@ -492,7 +492,7 @@ public final class Def {
PainlessClass struct = painlessLookup.getPainlessStructFromJavaClass(clazz);
if (struct != null) {
-MethodHandle handle = struct.setters.get(name);
+MethodHandle handle = struct.setterMethodHandles.get(name);
if (handle != null) {
return handle;
}
@ -502,7 +502,7 @@ public final class Def {
struct = painlessLookup.getPainlessStructFromJavaClass(iface);
if (struct != null) {
-MethodHandle handle = struct.setters.get(name);
+MethodHandle handle = struct.setterMethodHandles.get(name);
if (handle != null) {
return handle;
}
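The lookup pattern in the hunks above — try the class itself, then fall back to each implemented interface — can be sketched independently of Painless. The registry map and string "handles" below are hypothetical stand-ins for `PainlessClass.getterMethodHandles`:

```java
import java.util.HashMap;
import java.util.Map;

public class HandleLookupDemo {
    // Look a name up on the class first, then on each directly implemented interface.
    static String find(Map<Class<?>, Map<String, String>> registry, Class<?> clazz, String name) {
        Map<String, String> own = registry.get(clazz);
        if (own != null && own.get(name) != null) {
            return own.get(name); // found on the class itself
        }
        for (Class<?> iface : clazz.getInterfaces()) {
            Map<String, String> viaIface = registry.get(iface);
            if (viaIface != null && viaIface.get(name) != null) {
                return viaIface.get(name); // fall back to an implemented interface
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<Class<?>, Map<String, String>> registry = new HashMap<>();
        Map<String, String> charSeq = new HashMap<>();
        charSeq.put("length", "CharSequence.length handle");
        registry.put(CharSequence.class, charSeq);
        // String has no entry of its own, so the CharSequence fallback is used.
        System.out.println(find(registry, String.class, "length"));
    }
}
```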

View File

@ -22,6 +22,7 @@ package org.elasticsearch.painless;
import org.elasticsearch.painless.api.Debug;
import org.elasticsearch.painless.lookup.PainlessClass;
import org.elasticsearch.painless.lookup.PainlessLookup;
+import org.elasticsearch.painless.lookup.PainlessLookupUtility;
import org.elasticsearch.script.ScriptException;
import java.util.List;
@ -58,7 +59,7 @@ public class PainlessExplainError extends Error {
javaClassName = objectToExplain.getClass().getName();
PainlessClass struct = painlessLookup.getPainlessStructFromJavaClass(objectToExplain.getClass());
if (struct != null) {
-painlessClassName = struct.name;
+painlessClassName = PainlessLookupUtility.typeToCanonicalTypeName(objectToExplain.getClass());
}
}

View File

@ -19,47 +19,38 @@
package org.elasticsearch.painless.lookup;
-import org.objectweb.asm.Type;
import java.lang.invoke.MethodHandle;
import java.util.Collections;
import java.util.Map;
public final class PainlessClass {
-public final String name;
-public final Class<?> clazz;
-public final Type type;
public final Map<String, PainlessMethod> constructors;
public final Map<String, PainlessMethod> staticMethods;
public final Map<String, PainlessMethod> methods;
-public final Map<String, PainlessField> staticMembers;
-public final Map<String, PainlessField> members;
+public final Map<String, PainlessField> staticFields;
+public final Map<String, PainlessField> fields;
-public final Map<String, MethodHandle> getters;
-public final Map<String, MethodHandle> setters;
+public final Map<String, MethodHandle> getterMethodHandles;
+public final Map<String, MethodHandle> setterMethodHandles;
public final PainlessMethod functionalMethod;
-PainlessClass(String name, Class<?> clazz, Type type,
-Map<String, PainlessMethod> constructors, Map<String, PainlessMethod> staticMethods, Map<String, PainlessMethod> methods,
-Map<String, PainlessField> staticMembers, Map<String, PainlessField> members,
-Map<String, MethodHandle> getters, Map<String, MethodHandle> setters,
-PainlessMethod functionalMethod) {
-this.name = name;
-this.clazz = clazz;
-this.type = type;
+PainlessClass(Map<String, PainlessMethod> constructors,
+Map<String, PainlessMethod> staticMethods, Map<String, PainlessMethod> methods,
+Map<String, PainlessField> staticFields, Map<String, PainlessField> fields,
+Map<String, MethodHandle> getterMethodHandles, Map<String, MethodHandle> setterMethodHandles,
+PainlessMethod functionalMethod) {
this.constructors = Collections.unmodifiableMap(constructors);
this.staticMethods = Collections.unmodifiableMap(staticMethods);
this.methods = Collections.unmodifiableMap(methods);
-this.staticMembers = Collections.unmodifiableMap(staticMembers);
-this.members = Collections.unmodifiableMap(members);
+this.staticFields = Collections.unmodifiableMap(staticFields);
+this.fields = Collections.unmodifiableMap(fields);
-this.getters = Collections.unmodifiableMap(getters);
-this.setters = Collections.unmodifiableMap(setters);
+this.getterMethodHandles = Collections.unmodifiableMap(getterMethodHandles);
+this.setterMethodHandles = Collections.unmodifiableMap(setterMethodHandles);
this.functionalMethod = functionalMethod;
}
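The constructor above defensively wraps every map in `Collections.unmodifiableMap`, so callers holding a `PainlessClass` cannot mutate its lookup tables. A minimal standalone sketch of that effect (hypothetical demo class, not Painless code):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class UnmodifiableDemo {
    // Wrap a map the same way the constructor above does, and report
    // whether the wrapped view rejects mutation.
    static boolean viewRejectsPut(Map<String, Integer> backing) {
        Map<String, Integer> view = Collections.unmodifiableMap(backing);
        try {
            view.put("y", 2);
            return false;
        } catch (UnsupportedOperationException e) {
            return true; // the unmodifiable view refuses writes
        }
    }

    public static void main(String[] args) {
        Map<String, Integer> backing = new HashMap<>();
        backing.put("x", 1);
        System.out.println(viewRejectsPut(backing)); // true
    }
}
```

Note that the view is read-through: writes to the original backing map would still be visible, which is why the builder hands its maps off only once, at `build()` time.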

View File

@ -19,52 +19,39 @@
package org.elasticsearch.painless.lookup;
-import org.objectweb.asm.Type;
import java.lang.invoke.MethodHandle;
import java.util.HashMap;
import java.util.Map;
final class PainlessClassBuilder {
-final String name;
-final Class<?> clazz;
-final Type type;
final Map<String, PainlessMethod> constructors;
final Map<String, PainlessMethod> staticMethods;
final Map<String, PainlessMethod> methods;
-final Map<String, PainlessField> staticMembers;
-final Map<String, PainlessField> members;
+final Map<String, PainlessField> staticFields;
+final Map<String, PainlessField> fields;
-final Map<String, MethodHandle> getters;
-final Map<String, MethodHandle> setters;
+final Map<String, MethodHandle> getterMethodHandles;
+final Map<String, MethodHandle> setterMethodHandles;
PainlessMethod functionalMethod;
-PainlessClassBuilder(String name, Class<?> clazz, Type type) {
-this.name = name;
-this.clazz = clazz;
-this.type = type;
+PainlessClassBuilder() {
constructors = new HashMap<>();
staticMethods = new HashMap<>();
methods = new HashMap<>();
-staticMembers = new HashMap<>();
-members = new HashMap<>();
+staticFields = new HashMap<>();
+fields = new HashMap<>();
-getters = new HashMap<>();
-setters = new HashMap<>();
+getterMethodHandles = new HashMap<>();
+setterMethodHandles = new HashMap<>();
functionalMethod = null;
}
PainlessClass build() {
-return new PainlessClass(name, clazz, type,
-constructors, staticMethods, methods,
-staticMembers, members,
-getters, setters,
-functionalMethod);
+return new PainlessClass(constructors, staticMethods, methods, staticFields, fields,
+getterMethodHandles, setterMethodHandles, functionalMethod);
}
}

View File

@ -29,8 +29,8 @@ import java.util.Map;
*/
public final class PainlessLookup {
-public Collection<PainlessClass> getStructs() {
-return classesToPainlessClasses.values();
+public Collection<Class<?>> getStructs() {
+return classesToPainlessClasses.keySet();
}
private final Map<String, Class<?>> canonicalClassNamesToClasses;

View File

@ -179,8 +179,7 @@ public class PainlessLookupBuilder {
classesToPainlessClassBuilders = new HashMap<>();
canonicalClassNamesToClasses.put(DEF_CLASS_NAME, def.class);
-classesToPainlessClassBuilders.put(def.class,
-new PainlessClassBuilder(DEF_CLASS_NAME, Object.class, org.objectweb.asm.Type.getType(Object.class)));
+classesToPainlessClassBuilders.put(def.class, new PainlessClassBuilder());
}
private Class<?> canonicalTypeNameToType(String canonicalTypeName) {
@ -234,17 +233,21 @@ public class PainlessLookupBuilder {
throw new IllegalArgumentException("invalid class name [" + canonicalClassName + "]");
}
+Class<?> existingClass = canonicalClassNamesToClasses.get(typeToCanonicalTypeName(clazz));
+if (existingClass != null && existingClass != clazz) {
+throw new IllegalArgumentException("class [" + canonicalClassName + "] " +
+"cannot represent multiple java classes with the same name from different class loaders");
+}
PainlessClassBuilder existingPainlessClassBuilder = classesToPainlessClassBuilders.get(clazz);
if (existingPainlessClassBuilder == null) {
-PainlessClassBuilder painlessClassBuilder =
-new PainlessClassBuilder(canonicalClassName, clazz, org.objectweb.asm.Type.getType(clazz));
+PainlessClassBuilder painlessClassBuilder = new PainlessClassBuilder();
canonicalClassNamesToClasses.put(canonicalClassName, clazz);
classesToPainlessClassBuilders.put(clazz, painlessClassBuilder);
-} else if (existingPainlessClassBuilder.clazz.equals(clazz) == false) {
-throw new IllegalArgumentException("class [" + canonicalClassName + "] " +
-"cannot represent multiple java classes with the same name from different class loaders");
}
String javaClassName = clazz.getName();
@ -265,7 +268,7 @@ public class PainlessLookupBuilder {
canonicalClassNamesToClasses.put(importedCanonicalClassName, clazz);
}
-} else if (importedPainlessClass.equals(clazz) == false) {
+} else if (importedPainlessClass != clazz) {
throw new IllegalArgumentException("imported class [" + importedCanonicalClassName + "] cannot represent multiple " +
"classes [" + canonicalClassName + "] and [" + typeToCanonicalTypeName(importedPainlessClass) + "]");
} else if (importClassName == false) {
@ -504,10 +507,10 @@ public class PainlessLookupBuilder {
if (painlessMethod == null) {
org.objectweb.asm.commons.Method asmMethod = org.objectweb.asm.commons.Method.getMethod(javaMethod);
-MethodHandle javaMethodHandle;
+MethodHandle methodHandle;
try {
-javaMethodHandle = MethodHandles.publicLookup().in(targetClass).unreflect(javaMethod);
+methodHandle = MethodHandles.publicLookup().in(targetClass).unreflect(javaMethod);
} catch (IllegalAccessException iae) {
throw new IllegalArgumentException("static method handle [[" + targetClass.getCanonicalName() + "], " +
"[" + methodName + "], " + typesToCanonicalTypeNames(typeParameters) + "] not found", iae);
@ -516,7 +519,7 @@ public class PainlessLookupBuilder {
painlessMethod = painlessMethodCache.computeIfAbsent(
new PainlessMethodCacheKey(targetClass, methodName, typeParameters),
key -> new PainlessMethod(methodName, targetClass, null, returnType,
-typeParameters, asmMethod, javaMethod.getModifiers(), javaMethodHandle));
+typeParameters, asmMethod, javaMethod.getModifiers(), methodHandle));
painlessClassBuilder.staticMethods.put(painlessMethodKey, painlessMethod);
} else if ((painlessMethod.name.equals(methodName) && painlessMethod.rtn == returnType &&
@ -535,18 +538,18 @@ public class PainlessLookupBuilder {
if (painlessMethod == null) {
org.objectweb.asm.commons.Method asmMethod = org.objectweb.asm.commons.Method.getMethod(javaMethod);
-MethodHandle javaMethodHandle;
+MethodHandle methodHandle;
if (augmentedClass == null) {
try {
-javaMethodHandle = MethodHandles.publicLookup().in(targetClass).unreflect(javaMethod);
+methodHandle = MethodHandles.publicLookup().in(targetClass).unreflect(javaMethod);
} catch (IllegalAccessException iae) {
throw new IllegalArgumentException("method handle [[" + targetClass.getCanonicalName() + "], " +
"[" + methodName + "], " + typesToCanonicalTypeNames(typeParameters) + "] not found", iae);
}
} else {
try {
-javaMethodHandle = MethodHandles.publicLookup().in(augmentedClass).unreflect(javaMethod);
+methodHandle = MethodHandles.publicLookup().in(augmentedClass).unreflect(javaMethod);
} catch (IllegalAccessException iae) {
throw new IllegalArgumentException("method handle [[" + targetClass.getCanonicalName() + "], " +
"[" + methodName + "], " + typesToCanonicalTypeNames(typeParameters) + "] not found " +
@ -557,7 +560,7 @@ public class PainlessLookupBuilder {
painlessMethod = painlessMethodCache.computeIfAbsent(
new PainlessMethodCacheKey(targetClass, methodName, typeParameters),
key -> new PainlessMethod(methodName, targetClass, augmentedClass, returnType,
-typeParameters, asmMethod, javaMethod.getModifiers(), javaMethodHandle));
+typeParameters, asmMethod, javaMethod.getModifiers(), methodHandle));
painlessClassBuilder.methods.put(painlessMethodKey, painlessMethod);
} else if ((painlessMethod.name.equals(methodName) && painlessMethod.rtn == returnType &&
@ -650,7 +653,7 @@ public class PainlessLookupBuilder {
throw new IllegalArgumentException("static field [[" + targetCanonicalClassName + "]. [" + fieldName + "]] must be final");
}
-PainlessField painlessField = painlessClassBuilder.staticMembers.get(painlessFieldKey);
+PainlessField painlessField = painlessClassBuilder.staticFields.get(painlessFieldKey);
if (painlessField == null) {
painlessField = painlessFieldCache.computeIfAbsent(
@ -658,7 +661,7 @@ public class PainlessLookupBuilder {
key -> new PainlessField(fieldName, javaField.getName(), targetClass,
typeParameter, javaField.getModifiers(), null, null));
-painlessClassBuilder.staticMembers.put(painlessFieldKey, painlessField);
+painlessClassBuilder.staticFields.put(painlessFieldKey, painlessField);
} else if (painlessField.clazz != typeParameter) {
throw new IllegalArgumentException("cannot have static fields " +
"[[" + targetCanonicalClassName + "], [" + fieldName + "], [" +
@ -674,7 +677,7 @@ public class PainlessLookupBuilder {
methodHandleGetter = MethodHandles.publicLookup().unreflectGetter(javaField);
} catch (IllegalAccessException iae) {
throw new IllegalArgumentException(
-"method handle getter not found for field [[" + targetCanonicalClassName + "], [" + fieldName + "]]");
+"getter method handle not found for field [[" + targetCanonicalClassName + "], [" + fieldName + "]]");
}
MethodHandle methodHandleSetter;
@ -683,10 +686,10 @@ public class PainlessLookupBuilder {
methodHandleSetter = MethodHandles.publicLookup().unreflectSetter(javaField);
} catch (IllegalAccessException iae) {
throw new IllegalArgumentException(
-"method handle setter not found for field [[" + targetCanonicalClassName + "], [" + fieldName + "]]");
+"setter method handle not found for field [[" + targetCanonicalClassName + "], [" + fieldName + "]]");
}
-PainlessField painlessField = painlessClassBuilder.members.get(painlessFieldKey);
+PainlessField painlessField = painlessClassBuilder.fields.get(painlessFieldKey);
if (painlessField == null) {
painlessField = painlessFieldCache.computeIfAbsent(
@ -694,7 +697,7 @@ public class PainlessLookupBuilder {
key -> new PainlessField(fieldName, javaField.getName(), targetClass,
typeParameter, javaField.getModifiers(), methodHandleGetter, methodHandleSetter));
-painlessClassBuilder.members.put(fieldName, painlessField);
+painlessClassBuilder.fields.put(fieldName, painlessField);
} else if (painlessField.clazz != typeParameter) {
throw new IllegalArgumentException("cannot have fields " +
"[[" + targetCanonicalClassName + "], [" + fieldName + "], [" +
@ -771,14 +774,14 @@ public class PainlessLookupBuilder {
}
}
-for (Map.Entry<String, PainlessField> painlessFieldEntry : originalPainlessClassBuilder.members.entrySet()) {
+for (Map.Entry<String, PainlessField> painlessFieldEntry : originalPainlessClassBuilder.fields.entrySet()) {
String painlessFieldKey = painlessFieldEntry.getKey();
PainlessField newPainlessField = painlessFieldEntry.getValue();
-PainlessField existingPainlessField = targetPainlessClassBuilder.members.get(painlessFieldKey);
+PainlessField existingPainlessField = targetPainlessClassBuilder.fields.get(painlessFieldKey);
if (existingPainlessField == null || existingPainlessField.target != newPainlessField.target &&
existingPainlessField.target.isAssignableFrom(newPainlessField.target)) {
-targetPainlessClassBuilder.members.put(painlessFieldKey, newPainlessField);
+targetPainlessClassBuilder.fields.put(painlessFieldKey, newPainlessField);
}
}
}
@ -796,34 +799,32 @@ public class PainlessLookupBuilder {
if (typeParametersSize == 0 && methodName.startsWith("get") && methodName.length() > 3 &&
Character.isUpperCase(methodName.charAt(3))) {
-painlessClassBuilder.getters.putIfAbsent(
+painlessClassBuilder.getterMethodHandles.putIfAbsent(
Character.toLowerCase(methodName.charAt(3)) + methodName.substring(4), painlessMethod.handle);
} else if (typeParametersSize == 0 && methodName.startsWith("is") && methodName.length() > 2 &&
Character.isUpperCase(methodName.charAt(2))) {
-painlessClassBuilder.getters.putIfAbsent(
+painlessClassBuilder.getterMethodHandles.putIfAbsent(
Character.toLowerCase(methodName.charAt(2)) + methodName.substring(3), painlessMethod.handle);
} else if (typeParametersSize == 1 && methodName.startsWith("set") && methodName.length() > 3 &&
Character.isUpperCase(methodName.charAt(3))) {
-painlessClassBuilder.setters.putIfAbsent(
+painlessClassBuilder.setterMethodHandles.putIfAbsent(
Character.toLowerCase(methodName.charAt(3)) + methodName.substring(4), painlessMethod.handle);
}
}
-for (PainlessField painlessField : painlessClassBuilder.members.values()) {
-painlessClassBuilder.getters.put(painlessField.name, painlessField.getter);
-painlessClassBuilder.setters.put(painlessField.name, painlessField.setter);
+for (PainlessField painlessField : painlessClassBuilder.fields.values()) {
+painlessClassBuilder.getterMethodHandles.put(painlessField.name, painlessField.getter);
+painlessClassBuilder.setterMethodHandles.put(painlessField.name, painlessField.setter);
}
}
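The getter/setter registration above derives a map key from a bean-style method name: the `get`/`is`/`set` prefix is stripped and the first remaining letter is lower-cased. A standalone sketch of that convention (the `propertyName` helper is a hypothetical illustration, not copied from the hunk):

```java
public class PropertyNameDemo {
    // Derive the key used for getter/setter method handles from a method name.
    static String propertyName(String methodName) {
        if ((methodName.startsWith("get") || methodName.startsWith("set"))
                && methodName.length() > 3 && Character.isUpperCase(methodName.charAt(3))) {
            return Character.toLowerCase(methodName.charAt(3)) + methodName.substring(4);
        } else if (methodName.startsWith("is")
                && methodName.length() > 2 && Character.isUpperCase(methodName.charAt(2))) {
            return Character.toLowerCase(methodName.charAt(2)) + methodName.substring(3);
        }
        return methodName; // not a bean-style accessor
    }

    public static void main(String[] args) {
        System.out.println(propertyName("getFooBar")); // fooBar
        System.out.println(propertyName("isEmpty"));   // empty
        System.out.println(propertyName("setX"));      // x
    }
}
```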
private void setFunctionalInterfaceMethods() {
for (Map.Entry<Class<?>, PainlessClassBuilder> painlessClassBuilderEntry : classesToPainlessClassBuilders.entrySet()) {
-setFunctionalInterfaceMethod(painlessClassBuilderEntry.getValue());
+setFunctionalInterfaceMethod(painlessClassBuilderEntry.getKey(), painlessClassBuilderEntry.getValue());
}
}
-private void setFunctionalInterfaceMethod(PainlessClassBuilder painlessClassBuilder) {
-Class<?> targetClass = painlessClassBuilder.clazz;
+private void setFunctionalInterfaceMethod(Class<?> targetClass, PainlessClassBuilder painlessClassBuilder) {
if (targetClass.isInterface()) {
List<java.lang.reflect.Method> javaMethods = new ArrayList<>();

View File

@ -72,8 +72,9 @@ public final class ENewObj extends AExpression {
constructor.arguments.toArray(types);
if (constructor.arguments.size() != arguments.size()) {
-throw createError(new IllegalArgumentException("When calling constructor on type [" + struct.name + "]" +
-" expected [" + constructor.arguments.size() + "] arguments, but found [" + arguments.size() + "]."));
+throw createError(new IllegalArgumentException(
+"When calling constructor on type [" + PainlessLookupUtility.typeToCanonicalTypeName(actual) + "] " +
+"expected [" + constructor.arguments.size() + "] arguments, but found [" + arguments.size() + "]."));
}
for (int argument = 0; argument < arguments.size(); ++argument) {
@ -87,7 +88,8 @@ public final class ENewObj extends AExpression {
statement = true;
} else {
-throw createError(new IllegalArgumentException("Unknown new call on type [" + struct.name + "]."));
+throw createError(new IllegalArgumentException(
+"Unknown new call on type [" + PainlessLookupUtility.typeToCanonicalTypeName(actual) + "]."));
}
}

View File

@ -63,9 +63,9 @@ public final class PBrace extends AStoreable {
} else if (prefix.actual == def.class) {
sub = new PSubDefArray(location, index);
} else if (Map.class.isAssignableFrom(prefix.actual)) {
-sub = new PSubMapShortcut(location, locals.getPainlessLookup().getPainlessStructFromJavaClass(prefix.actual), index);
+sub = new PSubMapShortcut(location, prefix.actual, index);
} else if (List.class.isAssignableFrom(prefix.actual)) {
-sub = new PSubListShortcut(location, locals.getPainlessLookup().getPainlessStructFromJavaClass(prefix.actual), index);
+sub = new PSubListShortcut(location, prefix.actual, index);
} else {
throw createError(new IllegalArgumentException("Illegal array access on type " +
"[" + PainlessLookupUtility.typeToCanonicalTypeName(prefix.actual) + "]."));


@ -84,8 +84,8 @@ public final class PCallInvoke extends AExpression {
} else if (prefix.actual == def.class) {
sub = new PSubDefCall(location, name, arguments);
} else {
throw createError(new IllegalArgumentException(
"Unknown call [" + name + "] with [" + arguments.size() + "] arguments on type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Unknown call [" + name + "] with [" + arguments.size() + "] arguments " +
"on type [" + PainlessLookupUtility.typeToCanonicalTypeName(prefix.actual) + "]."));
}
if (nullSafe) {


@ -68,7 +68,7 @@ public final class PField extends AStoreable {
sub = new PSubDefField(location, value);
} else {
PainlessClass struct = locals.getPainlessLookup().getPainlessStructFromJavaClass(prefix.actual);
PainlessField field = prefix instanceof EStatic ? struct.staticMembers.get(value) : struct.members.get(value);
PainlessField field = prefix instanceof EStatic ? struct.staticFields.get(value) : struct.fields.get(value);
if (field != null) {
sub = new PSubField(location, field);
@ -92,11 +92,11 @@ public final class PField extends AStoreable {
index.analyze(locals);
if (Map.class.isAssignableFrom(prefix.actual)) {
sub = new PSubMapShortcut(location, struct, index);
sub = new PSubMapShortcut(location, prefix.actual, index);
}
if (List.class.isAssignableFrom(prefix.actual)) {
sub = new PSubListShortcut(location, struct, index);
sub = new PSubListShortcut(location, prefix.actual, index);
}
}
}


@ -36,16 +36,16 @@ import java.util.Set;
*/
final class PSubListShortcut extends AStoreable {
private final PainlessClass struct;
private final Class<?> targetClass;
private AExpression index;
private PainlessMethod getter;
private PainlessMethod setter;
PSubListShortcut(Location location, PainlessClass struct, AExpression index) {
PSubListShortcut(Location location, Class<?> targetClass, AExpression index) {
super(location);
this.struct = Objects.requireNonNull(struct);
this.targetClass = Objects.requireNonNull(targetClass);
this.index = Objects.requireNonNull(index);
}
@ -56,16 +56,19 @@ final class PSubListShortcut extends AStoreable {
@Override
void analyze(Locals locals) {
PainlessClass struct = locals.getPainlessLookup().getPainlessStructFromJavaClass(targetClass);
String canonicalClassName = PainlessLookupUtility.typeToCanonicalTypeName(targetClass);
getter = struct.methods.get(PainlessLookupUtility.buildPainlessMethodKey("get", 1));
setter = struct.methods.get(PainlessLookupUtility.buildPainlessMethodKey("set", 2));
if (getter != null && (getter.rtn == void.class || getter.arguments.size() != 1 ||
getter.arguments.get(0) != int.class)) {
throw createError(new IllegalArgumentException("Illegal list get shortcut for type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Illegal list get shortcut for type [" + canonicalClassName + "]."));
}
if (setter != null && (setter.arguments.size() != 2 || setter.arguments.get(0) != int.class)) {
throw createError(new IllegalArgumentException("Illegal list set shortcut for type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Illegal list set shortcut for type [" + canonicalClassName + "]."));
}
if (getter != null && setter != null && (!getter.arguments.get(0).equals(setter.arguments.get(0))
@ -80,7 +83,7 @@ final class PSubListShortcut extends AStoreable {
actual = setter != null ? setter.arguments.get(1) : getter.rtn;
} else {
throw createError(new IllegalArgumentException("Illegal list shortcut for type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Illegal list shortcut for type [" + canonicalClassName + "]."));
}
}
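The PSubListShortcut and PSubMapShortcut changes above replace a pre-resolved PainlessClass field with the raw Class<?> key, deferring the metadata lookup to analyze time. A minimal sketch of that deferral pattern, with hypothetical names rather than the actual Painless types:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch (not the actual Painless API): a node stores only the
// Class<?> key and resolves per-class metadata from a lookup table when it
// is analyzed, instead of capturing resolved metadata in its constructor.
public class LazyResolveDemo {

    static class ClassMetadata {
        final String canonicalName;
        ClassMetadata(String canonicalName) { this.canonicalName = canonicalName; }
    }

    static class Lookup {
        private final Map<Class<?>, ClassMetadata> table = new HashMap<>();

        Lookup add(Class<?> clazz) {
            table.put(clazz, new ClassMetadata(clazz.getCanonicalName()));
            return this;
        }

        ClassMetadata resolve(Class<?> clazz) {
            ClassMetadata metadata = table.get(clazz);
            if (metadata == null) {
                // mirrors the "Illegal ... for type [name]" errors in the diff
                throw new IllegalArgumentException("unknown type [" + clazz.getName() + "]");
            }
            return metadata;
        }
    }

    static class ShortcutNode {
        private final Class<?> targetClass; // the key, not the resolved metadata

        ShortcutNode(Class<?> targetClass) { this.targetClass = targetClass; }

        // Resolution happens here, at analysis time.
        String analyze(Lookup lookup) {
            return lookup.resolve(targetClass).canonicalName;
        }
    }

    public static void main(String[] args) {
        Lookup lookup = new Lookup().add(java.util.List.class);
        System.out.println(new ShortcutNode(java.util.List.class).analyze(lookup)); // java.util.List
    }
}
```

Storing the key rather than the resolved struct keeps the node valid even when the lookup tables are rebuilt between construction and analysis.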


@ -35,16 +35,16 @@ import java.util.Set;
*/
final class PSubMapShortcut extends AStoreable {
private final PainlessClass struct;
private final Class<?> targetClass;
private AExpression index;
private PainlessMethod getter;
private PainlessMethod setter;
PSubMapShortcut(Location location, PainlessClass struct, AExpression index) {
PSubMapShortcut(Location location, Class<?> targetClass, AExpression index) {
super(location);
this.struct = Objects.requireNonNull(struct);
this.targetClass = Objects.requireNonNull(targetClass);
this.index = Objects.requireNonNull(index);
}
@ -55,15 +55,18 @@ final class PSubMapShortcut extends AStoreable {
@Override
void analyze(Locals locals) {
PainlessClass struct = locals.getPainlessLookup().getPainlessStructFromJavaClass(targetClass);
String canonicalClassName = PainlessLookupUtility.typeToCanonicalTypeName(targetClass);
getter = struct.methods.get(PainlessLookupUtility.buildPainlessMethodKey("get", 1));
setter = struct.methods.get(PainlessLookupUtility.buildPainlessMethodKey("put", 2));
if (getter != null && (getter.rtn == void.class || getter.arguments.size() != 1)) {
throw createError(new IllegalArgumentException("Illegal map get shortcut for type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Illegal map get shortcut for type [" + canonicalClassName + "]."));
}
if (setter != null && setter.arguments.size() != 2) {
throw createError(new IllegalArgumentException("Illegal map set shortcut for type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Illegal map set shortcut for type [" + canonicalClassName + "]."));
}
if (getter != null && setter != null &&
@ -78,7 +81,7 @@ final class PSubMapShortcut extends AStoreable {
actual = setter != null ? setter.arguments.get(1) : getter.rtn;
} else {
throw createError(new IllegalArgumentException("Illegal map shortcut for type [" + struct.name + "]."));
throw createError(new IllegalArgumentException("Illegal map shortcut for type [" + canonicalClassName + "]."));
}
}


@ -29,6 +29,7 @@ import org.elasticsearch.painless.lookup.PainlessLookup;
import org.elasticsearch.painless.lookup.PainlessLookupBuilder;
import org.elasticsearch.painless.lookup.PainlessLookupUtility;
import org.elasticsearch.painless.lookup.PainlessMethod;
import org.elasticsearch.painless.lookup.def;
import org.elasticsearch.painless.spi.Whitelist;
import java.io.IOException;
@ -71,52 +72,54 @@ public class PainlessDocGenerator {
Files.newOutputStream(indexPath, StandardOpenOption.CREATE_NEW, StandardOpenOption.WRITE),
false, StandardCharsets.UTF_8.name())) {
emitGeneratedWarning(indexStream);
List<PainlessClass> structs = PAINLESS_LOOKUP.getStructs().stream().sorted(comparing(t -> t.name)).collect(toList());
for (PainlessClass struct : structs) {
if (struct.clazz.isPrimitive()) {
List<Class<?>> classes = PAINLESS_LOOKUP.getStructs().stream().sorted(comparing(Class::getCanonicalName)).collect(toList());
for (Class<?> clazz : classes) {
PainlessClass struct = PAINLESS_LOOKUP.getPainlessStructFromJavaClass(clazz);
String canonicalClassName = PainlessLookupUtility.typeToCanonicalTypeName(clazz);
if (clazz.isPrimitive()) {
// Primitives don't have methods to reference
continue;
}
if ("def".equals(struct.name)) {
if (clazz == def.class) {
// def is special but doesn't have any methods of its own.
continue;
}
indexStream.print("include::");
indexStream.print(struct.name);
indexStream.print(canonicalClassName);
indexStream.println(".asciidoc[]");
Path typePath = apiRootPath.resolve(struct.name + ".asciidoc");
logger.info("Writing [{}.asciidoc]", struct.name);
Path typePath = apiRootPath.resolve(canonicalClassName + ".asciidoc");
logger.info("Writing [{}.asciidoc]", canonicalClassName);
try (PrintStream typeStream = new PrintStream(
Files.newOutputStream(typePath, StandardOpenOption.CREATE_NEW, StandardOpenOption.WRITE),
false, StandardCharsets.UTF_8.name())) {
emitGeneratedWarning(typeStream);
typeStream.print("[[");
emitAnchor(typeStream, struct.clazz);
emitAnchor(typeStream, clazz);
typeStream.print("]]++");
typeStream.print(struct.name);
typeStream.print(canonicalClassName);
typeStream.println("++::");
Consumer<PainlessField> documentField = field -> PainlessDocGenerator.documentField(typeStream, field);
Consumer<PainlessMethod> documentMethod = method -> PainlessDocGenerator.documentMethod(typeStream, method);
struct.staticMembers.values().stream().sorted(FIELD_NAME).forEach(documentField);
struct.members.values().stream().sorted(FIELD_NAME).forEach(documentField);
struct.staticFields.values().stream().sorted(FIELD_NAME).forEach(documentField);
struct.fields.values().stream().sorted(FIELD_NAME).forEach(documentField);
struct.staticMethods.values().stream().sorted(METHOD_NAME.thenComparing(NUMBER_OF_ARGS)).forEach(documentMethod);
struct.constructors.values().stream().sorted(NUMBER_OF_ARGS).forEach(documentMethod);
Map<String, PainlessClass> inherited = new TreeMap<>();
Map<String, Class<?>> inherited = new TreeMap<>();
struct.methods.values().stream().sorted(METHOD_NAME.thenComparing(NUMBER_OF_ARGS)).forEach(method -> {
if (method.target == struct.clazz) {
if (method.target == clazz) {
documentMethod(typeStream, method);
} else {
PainlessClass painlessClass = PAINLESS_LOOKUP.getPainlessStructFromJavaClass(method.target);
inherited.put(painlessClass.name, painlessClass);
inherited.put(canonicalClassName, method.target);
}
});
if (false == inherited.isEmpty()) {
typeStream.print("* Inherits methods from ");
boolean first = true;
for (PainlessClass inheritsFrom : inherited.values()) {
for (Class<?> inheritsFrom : inherited.values()) {
if (first) {
first = false;
} else {
@ -242,7 +245,7 @@ public class PainlessDocGenerator {
an internal link with the text.
*/
private static void emitType(PrintStream stream, Class<?> clazz) {
emitStruct(stream, PAINLESS_LOOKUP.getPainlessStructFromJavaClass(clazz));
emitStruct(stream, clazz);
while ((clazz = clazz.getComponentType()) != null) {
stream.print("[]");
}
@ -252,15 +255,17 @@ public class PainlessDocGenerator {
* Emit a {@link PainlessClass}. If the {@linkplain PainlessClass} is primitive or def this just emits the name of the struct.
* Otherwise this emits an internal link with the name.
*/
private static void emitStruct(PrintStream stream, PainlessClass struct) {
if (false == struct.clazz.isPrimitive() && false == struct.name.equals("def")) {
private static void emitStruct(PrintStream stream, Class<?> clazz) {
String canonicalClassName = PainlessLookupUtility.typeToCanonicalTypeName(clazz);
if (false == clazz.isPrimitive() && clazz != def.class) {
stream.print("<<");
emitAnchor(stream, struct.clazz);
emitAnchor(stream, clazz);
stream.print(',');
stream.print(struct.name);
stream.print(canonicalClassName);
stream.print(">>");
} else {
stream.print(struct.name);
stream.print(canonicalClassName);
}
}
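The generator loop above sorts the whitelisted classes with comparing(Class::getCanonicalName). A small sketch of that ordering; note that getCanonicalName() can return null for local and anonymous classes, so a nulls-last comparator is a safer default when the input is not a controlled whitelist:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the ordering used by the doc generator: sort Class objects by
// canonical name. getCanonicalName() returns null for local and anonymous
// classes, so sort nulls last instead of letting comparing() throw an NPE.
public class SortByCanonicalName {

    public static List<String> sortedNames(Class<?>... classes) {
        return Arrays.stream(classes)
            .sorted(Comparator.comparing((Class<?> c) -> c.getCanonicalName(),
                    Comparator.nullsLast(Comparator.<String>naturalOrder())))
            .map(Class::getCanonicalName)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(sortedNames(java.util.Map.class, String.class, java.util.List.class));
        // [java.lang.String, java.util.List, java.util.Map]
    }
}
```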


@ -460,7 +460,7 @@ public class NodeToStringTests extends ESTestCase {
public void testPSubField() {
Location l = new Location(getTestName(), 0);
PainlessClass s = painlessLookup.getPainlessStructFromJavaClass(Boolean.class);
PainlessField f = s.staticMembers.get("TRUE");
PainlessField f = s.staticFields.get("TRUE");
PSubField node = new PSubField(l, f);
node.prefix = new EStatic(l, "Boolean");
assertEquals("(PSubField (EStatic Boolean) TRUE)", node.toString());
@ -469,32 +469,28 @@ public class NodeToStringTests extends ESTestCase {
public void testPSubListShortcut() {
Location l = new Location(getTestName(), 0);
PainlessClass s = painlessLookup.getPainlessStructFromJavaClass(List.class);
PSubListShortcut node = new PSubListShortcut(l, s, new EConstant(l, 1));
PSubListShortcut node = new PSubListShortcut(l, List.class, new EConstant(l, 1));
node.prefix = new EVariable(l, "a");
assertEquals("(PSubListShortcut (EVariable a) (EConstant Integer 1))", node.toString());
assertEquals("(PSubNullSafeCallInvoke (PSubListShortcut (EVariable a) (EConstant Integer 1)))",
new PSubNullSafeCallInvoke(l, node).toString());
l = new Location(getTestName(), 0);
s = painlessLookup.getPainlessStructFromJavaClass(List.class);
node = new PSubListShortcut(l, s, new EBinary(l, Operation.ADD, new EConstant(l, 1), new EConstant(l, 4)));
node = new PSubListShortcut(l, List.class, new EBinary(l, Operation.ADD, new EConstant(l, 1), new EConstant(l, 4)));
node.prefix = new EVariable(l, "a");
assertEquals("(PSubListShortcut (EVariable a) (EBinary (EConstant Integer 1) + (EConstant Integer 4)))", node.toString());
}
public void testPSubMapShortcut() {
Location l = new Location(getTestName(), 0);
PainlessClass s = painlessLookup.getPainlessStructFromJavaClass(Map.class);
PSubMapShortcut node = new PSubMapShortcut(l, s, new EConstant(l, "cat"));
PSubMapShortcut node = new PSubMapShortcut(l, Map.class, new EConstant(l, "cat"));
node.prefix = new EVariable(l, "a");
assertEquals("(PSubMapShortcut (EVariable a) (EConstant String 'cat'))", node.toString());
assertEquals("(PSubNullSafeCallInvoke (PSubMapShortcut (EVariable a) (EConstant String 'cat')))",
new PSubNullSafeCallInvoke(l, node).toString());
l = new Location(getTestName(), 1);
s = painlessLookup.getPainlessStructFromJavaClass(Map.class);
node = new PSubMapShortcut(l, s, new EBinary(l, Operation.ADD, new EConstant(l, 1), new EConstant(l, 4)));
node = new PSubMapShortcut(l, Map.class, new EBinary(l, Operation.ADD, new EConstant(l, 1), new EConstant(l, 4)));
node.prefix = new EVariable(l, "a");
assertEquals("(PSubMapShortcut (EVariable a) (EBinary (EConstant Integer 1) + (EConstant Integer 4)))", node.toString());
}


@ -45,12 +45,10 @@ public class FeatureQueryBuilderTests extends AbstractQueryTestCase<FeatureQuery
@Override
protected void initializeAdditionalMappings(MapperService mapperService) throws IOException {
for (String type : getCurrentTypes()) {
mapperService.merge(type, new CompressedXContent(Strings.toString(PutMappingRequest.buildFromSimplifiedDef(type,
"my_feature_field", "type=feature",
"my_negative_feature_field", "type=feature,positive_score_impact=false",
"my_feature_vector_field", "type=feature_vector"))), MapperService.MergeReason.MAPPING_UPDATE);
}
mapperService.merge("_doc", new CompressedXContent(Strings.toString(PutMappingRequest.buildFromSimplifiedDef("_doc",
"my_feature_field", "type=feature",
"my_negative_feature_field", "type=feature,positive_score_impact=false",
"my_feature_vector_field", "type=feature_vector"))), MapperService.MergeReason.MAPPING_UPDATE);
}
@Override
@ -87,7 +85,7 @@ public class FeatureQueryBuilderTests extends AbstractQueryTestCase<FeatureQuery
if (mayUseNegativeField) {
fields.add("my_negative_feature_field");
}
final String field = randomFrom(fields);
return new FeatureQueryBuilder(field, function);
}
@ -99,7 +97,6 @@ public class FeatureQueryBuilderTests extends AbstractQueryTestCase<FeatureQuery
}
public void testDefaultScoreFunction() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"feature\" : {\n" +
" \"field\": \"my_feature_field\"\n" +
@ -110,7 +107,6 @@ public class FeatureQueryBuilderTests extends AbstractQueryTestCase<FeatureQuery
}
public void testIllegalField() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"feature\" : {\n" +
" \"field\": \"" + STRING_FIELD_NAME + "\"\n" +
@ -121,7 +117,6 @@ public class FeatureQueryBuilderTests extends AbstractQueryTestCase<FeatureQuery
}
public void testIllegalCombination() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"feature\" : {\n" +
" \"field\": \"my_negative_feature_field\",\n" +


@ -70,7 +70,6 @@ import java.util.Set;
* }
* </pre>
*/
@SuppressWarnings("unchecked")
public class RatedRequest implements Writeable, ToXContentObject {
private final String id;
private final List<String> summaryFields;
@ -250,6 +249,7 @@ public class RatedRequest implements Writeable, ToXContentObject {
private static final ParseField FIELDS_FIELD = new ParseField("summary_fields");
private static final ParseField TEMPLATE_ID_FIELD = new ParseField("template_id");
@SuppressWarnings("unchecked")
private static final ConstructingObjectParser<RatedRequest, Void> PARSER = new ConstructingObjectParser<>("request",
a -> new RatedRequest((String) a[0], (List<RatedDocument>) a[1], (SearchSourceBuilder) a[2], (Map<String, Object>) a[3],
(String) a[4]));


@ -24,6 +24,7 @@ import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelPromise;
import io.netty.handler.codec.http.DefaultFullHttpResponse;
import io.netty.handler.codec.http.FullHttpRequest;
import io.netty.handler.codec.http.HttpHeaderNames;
import io.netty.handler.codec.http.HttpHeaders;
import io.netty.handler.codec.http.HttpMethod;
@ -50,7 +51,7 @@ public class Netty4CorsHandler extends ChannelDuplexHandler {
private static Pattern SCHEME_PATTERN = Pattern.compile("^https?://");
private final Netty4CorsConfig config;
private HttpRequest request;
private FullHttpRequest request;
/**
* Creates a new instance with the specified {@link Netty4CorsConfig}.
@ -64,15 +65,24 @@ public class Netty4CorsHandler extends ChannelDuplexHandler {
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
if (config.isCorsSupportEnabled() && msg instanceof HttpRequest) {
request = (HttpRequest) msg;
assert msg instanceof FullHttpRequest : "Invalid message type: " + msg.getClass();
if (config.isCorsSupportEnabled()) {
request = (FullHttpRequest) msg;
if (isPreflightRequest(request)) {
handlePreflight(ctx, request);
return;
try {
handlePreflight(ctx, request);
return;
} finally {
releaseRequest();
}
}
if (config.isShortCircuit() && !validateOrigin()) {
forbidden(ctx, request);
return;
try {
forbidden(ctx, request);
return;
} finally {
releaseRequest();
}
}
}
ctx.fireChannelRead(msg);
@ -123,6 +133,11 @@ public class Netty4CorsHandler extends ChannelDuplexHandler {
}
}
private void releaseRequest() {
request.release();
request = null;
}
private static void forbidden(final ChannelHandlerContext ctx, final HttpRequest request) {
ctx.writeAndFlush(new DefaultFullHttpResponse(request.protocolVersion(), HttpResponseStatus.FORBIDDEN))
.addListener(ChannelFutureListener.CLOSE);
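The handler change above pairs every short-circuiting path (preflight, forbidden) with a release of the reference-counted request in a finally block. A stand-alone sketch of that ownership rule, using a stand-in counter instead of Netty's actual ReferenceCounted:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the leak fix above: whichever path consumes a reference-counted
// message without forwarding it downstream must release it, and the release
// sits in a finally block so it also runs if writing the response throws.
public class ReleaseOnShortCircuit {

    // Stand-in for Netty's ReferenceCounted; hypothetical, not the real API.
    static class RefCountedMessage {
        final AtomicInteger refCnt = new AtomicInteger(1);

        void release() {
            if (refCnt.decrementAndGet() < 0) {
                throw new IllegalStateException("released more times than retained");
            }
        }
    }

    /** Returns true if the message was forwarded downstream. */
    static boolean handle(RefCountedMessage msg, boolean shortCircuit) {
        if (shortCircuit) {
            try {
                // write the preflight/forbidden response here and stop the pipeline
                return false;
            } finally {
                msg.release(); // we consumed the message, so we free it
            }
        }
        return true; // forwarded: the downstream handler now owns the reference
    }

    public static void main(String[] args) {
        RefCountedMessage msg = new RefCountedMessage();
        handle(msg, true);
        System.out.println(msg.refCnt.get()); // 0: nothing leaked
    }
}
```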


@ -0,0 +1 @@
7a37816def72a748416c4ae8b0f6817e30efb99f


@ -1 +0,0 @@
332410cfb49be3ec445ccf4a10164baf8e1b2f81


@ -0,0 +1 @@
ca7437178cdbf7b8bfe0d75c75e3c8eb93925724


@ -1 +0,0 @@
b77a0a1c35a76cff47a61134da83fb1b414312cf


@ -0,0 +1 @@
3f5dec44f380d6d58bc1c8aec51964fcb5390b60


@ -1 +0,0 @@
644f7bfdd2a35784c64e45a924859d85776955a8


@ -0,0 +1 @@
453bf1d60df0415439095624e0b3e42492ad4716


@ -1 +0,0 @@
6bbe2a89387a4aa7ddca3ec519e03df75b21020a


@ -0,0 +1 @@
70095a45257bca9f46629b5fb6cedf9eff5e2b07


@ -1 +0,0 @@
dd36711a33c2ba2b32f3cf16892923c69b573075


@ -0,0 +1 @@
7199d6962d268b7877f7b5160e98e4ff21cce5c7


@ -1 +0,0 @@
1d3aa77ed0374da5f1e3224f9548caaa61494ce7


@ -0,0 +1 @@
12aff508d39d206a1aead5013ecd11882062eb06


@ -1 +0,0 @@
aeeca8f686c68c940451322104392ba3ed3d602c


@ -19,16 +19,26 @@
package org.elasticsearch.plugin.store.smb;
import org.elasticsearch.index.IndexModule;
import org.elasticsearch.index.IndexSettings;
import org.elasticsearch.index.store.IndexStore;
import org.elasticsearch.index.store.smbmmapfs.SmbMmapFsIndexStore;
import org.elasticsearch.index.store.smbsimplefs.SmbSimpleFsIndexStore;
import org.elasticsearch.plugins.IndexStorePlugin;
import org.elasticsearch.plugins.Plugin;
public class SMBStorePlugin extends Plugin {
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;
public class SMBStorePlugin extends Plugin implements IndexStorePlugin {
@Override
public void onIndexModule(IndexModule indexModule) {
indexModule.addIndexStore("smb_mmap_fs", SmbMmapFsIndexStore::new);
indexModule.addIndexStore("smb_simple_fs", SmbSimpleFsIndexStore::new);
public Map<String, Function<IndexSettings, IndexStore>> getIndexStoreFactories() {
final Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories = new HashMap<>(2);
indexStoreFactories.put("smb_mmap_fs", SmbMmapFsIndexStore::new);
indexStoreFactories.put("smb_simple_fs", SmbSimpleFsIndexStore::new);
return Collections.unmodifiableMap(indexStoreFactories);
}
}


@ -30,4 +30,12 @@ public class SmbMMapDirectoryTests extends EsBaseDirectoryTestCase {
protected Directory getDirectory(Path file) throws IOException {
return new SmbDirectoryWrapper(new MMapDirectory(file));
}
@Override
public void testCreateOutputForExistingFile() throws IOException {
/**
* This test is disabled because {@link SmbDirectoryWrapper} opens an existing file
* with an explicit StandardOpenOption.TRUNCATE_EXISTING option.
*/
}
}


@ -30,4 +30,12 @@ public class SmbSimpleFSDirectoryTests extends EsBaseDirectoryTestCase {
protected Directory getDirectory(Path file) throws IOException {
return new SmbDirectoryWrapper(new SimpleFSDirectory(file));
}
@Override
public void testCreateOutputForExistingFile() throws IOException {
/**
* This test is disabled because {@link SmbDirectoryWrapper} opens an existing file
* with an explicit StandardOpenOption.TRUNCATE_EXISTING option.
*/
}
}


@ -24,6 +24,7 @@ import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelPromise;
import io.netty.handler.codec.http.DefaultFullHttpResponse;
import io.netty.handler.codec.http.FullHttpRequest;
import io.netty.handler.codec.http.HttpHeaderNames;
import io.netty.handler.codec.http.HttpHeaders;
import io.netty.handler.codec.http.HttpMethod;
@ -50,7 +51,7 @@ public class NioCorsHandler extends ChannelDuplexHandler {
private static Pattern SCHEME_PATTERN = Pattern.compile("^https?://");
private final NioCorsConfig config;
private HttpRequest request;
private FullHttpRequest request;
/**
* Creates a new instance with the specified {@link NioCorsConfig}.
@ -64,15 +65,24 @@ public class NioCorsHandler extends ChannelDuplexHandler {
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
if (config.isCorsSupportEnabled() && msg instanceof HttpRequest) {
request = (HttpRequest) msg;
assert msg instanceof FullHttpRequest : "Invalid message type: " + msg.getClass();
if (config.isCorsSupportEnabled()) {
request = (FullHttpRequest) msg;
if (isPreflightRequest(request)) {
handlePreflight(ctx, request);
return;
try {
handlePreflight(ctx, request);
return;
} finally {
releaseRequest();
}
}
if (config.isShortCircuit() && !validateOrigin()) {
forbidden(ctx, request);
return;
try {
forbidden(ctx, request);
return;
} finally {
releaseRequest();
}
}
}
ctx.fireChannelRead(msg);
@ -109,6 +119,11 @@ public class NioCorsHandler extends ChannelDuplexHandler {
}
}
private void releaseRequest() {
request.release();
request = null;
}
private void handlePreflight(final ChannelHandlerContext ctx, final HttpRequest request) {
final HttpResponse response = new DefaultFullHttpResponse(request.protocolVersion(), HttpResponseStatus.OK, true, true);
if (setOrigin(response)) {


@ -21,6 +21,7 @@ package org.elasticsearch.http.nio;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelPromise;
import io.netty.channel.embedded.EmbeddedChannel;
import io.netty.handler.codec.http.DefaultFullHttpRequest;
import io.netty.handler.codec.http.FullHttpResponse;
@ -116,21 +117,27 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
ByteBuf buf = requestEncoder.encode(httpRequest);
int slicePoint = randomInt(buf.writerIndex() - 1);
ByteBuf slicedBuf = buf.retainedSlice(0, slicePoint);
ByteBuf slicedBuf2 = buf.retainedSlice(slicePoint, buf.writerIndex());
handler.consumeReads(toChannelBuffer(slicedBuf));
try {
handler.consumeReads(toChannelBuffer(slicedBuf));
verify(transport, times(0)).incomingRequest(any(HttpRequest.class), any(NioHttpChannel.class));
verify(transport, times(0)).incomingRequest(any(HttpRequest.class), any(NioHttpChannel.class));
handler.consumeReads(toChannelBuffer(slicedBuf2));
handler.consumeReads(toChannelBuffer(slicedBuf2));
ArgumentCaptor<HttpRequest> requestCaptor = ArgumentCaptor.forClass(HttpRequest.class);
verify(transport).incomingRequest(requestCaptor.capture(), any(NioHttpChannel.class));
ArgumentCaptor<HttpRequest> requestCaptor = ArgumentCaptor.forClass(HttpRequest.class);
verify(transport).incomingRequest(requestCaptor.capture(), any(NioHttpChannel.class));
HttpRequest nioHttpRequest = requestCaptor.getValue();
assertEquals(HttpRequest.HttpVersion.HTTP_1_1, nioHttpRequest.protocolVersion());
assertEquals(RestRequest.Method.GET, nioHttpRequest.method());
HttpRequest nioHttpRequest = requestCaptor.getValue();
assertEquals(HttpRequest.HttpVersion.HTTP_1_1, nioHttpRequest.protocolVersion());
assertEquals(RestRequest.Method.GET, nioHttpRequest.method());
} finally {
handler.close();
buf.release();
slicedBuf.release();
slicedBuf2.release();
}
}
public void testDecodeHttpRequestError() throws IOException {
@ -138,16 +145,20 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
io.netty.handler.codec.http.HttpRequest httpRequest = new DefaultFullHttpRequest(HttpVersion.HTTP_1_1, HttpMethod.GET, uri);
ByteBuf buf = requestEncoder.encode(httpRequest);
buf.setByte(0, ' ');
buf.setByte(1, ' ');
buf.setByte(2, ' ');
try {
buf.setByte(0, ' ');
buf.setByte(1, ' ');
buf.setByte(2, ' ');
handler.consumeReads(toChannelBuffer(buf));
handler.consumeReads(toChannelBuffer(buf));
ArgumentCaptor<Exception> exceptionCaptor = ArgumentCaptor.forClass(Exception.class);
verify(transport).incomingRequestError(any(HttpRequest.class), any(NioHttpChannel.class), exceptionCaptor.capture());
ArgumentCaptor<Exception> exceptionCaptor = ArgumentCaptor.forClass(Exception.class);
verify(transport).incomingRequestError(any(HttpRequest.class), any(NioHttpChannel.class), exceptionCaptor.capture());
assertTrue(exceptionCaptor.getValue() instanceof IllegalArgumentException);
assertTrue(exceptionCaptor.getValue() instanceof IllegalArgumentException);
} finally {
buf.release();
}
}
public void testDecodeHttpRequestContentLengthToLongGeneratesOutboundMessage() throws IOException {
@ -157,9 +168,11 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
HttpUtil.setKeepAlive(httpRequest, false);
ByteBuf buf = requestEncoder.encode(httpRequest);
handler.consumeReads(toChannelBuffer(buf));
try {
handler.consumeReads(toChannelBuffer(buf));
} finally {
buf.release();
}
verify(transport, times(0)).incomingRequestError(any(), any(), any());
verify(transport, times(0)).incomingRequest(any(), any());
@ -168,13 +181,17 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
FlushOperation flushOperation = flushOperations.get(0);
FullHttpResponse response = responseDecoder.decode(Unpooled.wrappedBuffer(flushOperation.getBuffersToWrite()));
assertEquals(HttpVersion.HTTP_1_1, response.protocolVersion());
assertEquals(HttpResponseStatus.REQUEST_ENTITY_TOO_LARGE, response.status());
try {
assertEquals(HttpVersion.HTTP_1_1, response.protocolVersion());
assertEquals(HttpResponseStatus.REQUEST_ENTITY_TOO_LARGE, response.status());
flushOperation.getListener().accept(null, null);
// Since we have keep-alive set to false, we should close the channel after the response has been
// flushed
verify(nioHttpChannel).close();
flushOperation.getListener().accept(null, null);
// Since we have keep-alive set to false, we should close the channel after the response has been
// flushed
verify(nioHttpChannel).close();
} finally {
response.release();
}
}
@SuppressWarnings("unchecked")
@ -189,11 +206,15 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
SocketChannelContext context = mock(SocketChannelContext.class);
HttpWriteOperation writeOperation = new HttpWriteOperation(context, httpResponse, mock(BiConsumer.class));
List<FlushOperation> flushOperations = handler.writeToBytes(writeOperation);
FullHttpResponse response = responseDecoder.decode(Unpooled.wrappedBuffer(flushOperations.get(0).getBuffersToWrite()));
assertEquals(HttpResponseStatus.OK, response.status());
assertEquals(HttpVersion.HTTP_1_1, response.protocolVersion());
FlushOperation operation = flushOperations.get(0);
FullHttpResponse response = responseDecoder.decode(Unpooled.wrappedBuffer(operation.getBuffersToWrite()));
((ChannelPromise) operation.getListener()).setSuccess();
try {
assertEquals(HttpResponseStatus.OK, response.status());
assertEquals(HttpVersion.HTTP_1_1, response.protocolVersion());
} finally {
response.release();
}
}
public void testCorsEnabledWithoutAllowOrigins() throws IOException {
@ -201,9 +222,13 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
Settings settings = Settings.builder()
.put(HttpTransportSettings.SETTING_CORS_ENABLED.getKey(), true)
.build();
io.netty.handler.codec.http.HttpResponse response = executeCorsRequest(settings, "remote-host", "request-host");
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), nullValue());
FullHttpResponse response = executeCorsRequest(settings, "remote-host", "request-host");
try {
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), nullValue());
} finally {
response.release();
}
}
public void testCorsEnabledWithAllowOrigins() throws IOException {
@ -213,11 +238,15 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
.put(SETTING_CORS_ENABLED.getKey(), true)
.put(SETTING_CORS_ALLOW_ORIGIN.getKey(), originValue)
.build();
io.netty.handler.codec.http.HttpResponse response = executeCorsRequest(settings, originValue, "request-host");
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
FullHttpResponse response = executeCorsRequest(settings, originValue, "request-host");
try {
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
} finally {
response.release();
}
}
public void testCorsAllowOriginWithSameHost() throws IOException {
@ -228,29 +257,44 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
.put(SETTING_CORS_ENABLED.getKey(), true)
.build();
FullHttpResponse response = executeCorsRequest(settings, originValue, host);
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
String allowedOrigins;
try {
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
} finally {
response.release();
}
originValue = "http://" + originValue;
response = executeCorsRequest(settings, originValue, host);
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
try {
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
} finally {
response.release();
}
originValue = originValue + ":5555";
host = host + ":5555";
response = executeCorsRequest(settings, originValue, host);
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
try {
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
} finally {
response.release();
}
originValue = originValue.replace("http", "https");
response = executeCorsRequest(settings, originValue, host);
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
try {
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
} finally {
response.release();
}
}
public void testThatStringLiteralWorksOnMatch() throws IOException {
@ -261,12 +305,16 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
.put(SETTING_CORS_ALLOW_METHODS.getKey(), "get, options, post")
.put(SETTING_CORS_ALLOW_CREDENTIALS.getKey(), true)
.build();
io.netty.handler.codec.http.HttpResponse response = executeCorsRequest(settings, originValue, "request-host");
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_CREDENTIALS), equalTo("true"));
FullHttpResponse response = executeCorsRequest(settings, originValue, "request-host");
try {
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_CREDENTIALS), equalTo("true"));
} finally {
response.release();
}
}
public void testThatAnyOriginWorks() throws IOException {
@ -275,12 +323,16 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
.put(SETTING_CORS_ENABLED.getKey(), true)
.put(SETTING_CORS_ALLOW_ORIGIN.getKey(), originValue)
.build();
io.netty.handler.codec.http.HttpResponse response = executeCorsRequest(settings, originValue, "request-host");
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_CREDENTIALS), nullValue());
FullHttpResponse response = executeCorsRequest(settings, originValue, "request-host");
try {
// inspect response and validate
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN), notNullValue());
String allowedOrigins = response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_ORIGIN);
assertThat(allowedOrigins, is(originValue));
assertThat(response.headers().get(HttpHeaderNames.ACCESS_CONTROL_ALLOW_CREDENTIALS), nullValue());
} finally {
response.release();
}
}
private FullHttpResponse executeCorsRequest(final Settings settings, final String originValue, final String host) throws IOException {
@ -300,8 +352,9 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
SocketChannelContext context = mock(SocketChannelContext.class);
List<FlushOperation> flushOperations = handler.writeToBytes(handler.createWriteOperation(context, response, (v, e) -> {}));
handler.close();
FlushOperation flushOperation = flushOperations.get(0);
((ChannelPromise) flushOperation.getListener()).setSuccess();
return responseDecoder.decode(Unpooled.wrappedBuffer(flushOperation.getBuffersToWrite()));
}
@ -314,8 +367,11 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
io.netty.handler.codec.http.HttpRequest request = new DefaultFullHttpRequest(version, method, uri);
ByteBuf buf = requestEncoder.encode(request);
handler.consumeReads(toChannelBuffer(buf));
try {
handler.consumeReads(toChannelBuffer(buf));
} finally {
buf.release();
}
ArgumentCaptor<NioHttpRequest> requestCaptor = ArgumentCaptor.forClass(NioHttpRequest.class);
verify(transport, atLeastOnce()).incomingRequest(requestCaptor.capture(), any(HttpChannel.class));

View File

@ -0,0 +1 @@
d27958843ca118db2ffd2c242ae3761bd5a47328

View File

@ -1 +0,0 @@
0cf871ea421568e15d4e1082bbd578cabef563ae

View File

@ -0,0 +1 @@
7ea220ba8e4accb8b04e280463042ad470e23bc0

View File

@ -1 +0,0 @@
996a66191c4f06769374204dd16be8b4f8d3cbd6

View File

@ -0,0 +1 @@
471096d6e92338b208aa91f3a85feb2f9cfc4afd

View File

@ -1 +0,0 @@
d954838b60bc2681f4eddc1af76958b2cb36e385

View File

@ -0,0 +1 @@
f0af947c60d24f779c22f774e81ebd7dd91cc932

View File

@ -1 +0,0 @@
0b4fa3b0e68a9933bbb4f97be4c821060ec33ead

View File

@ -0,0 +1 @@
fbc83ac5a0139ed7e7faf6c95a2718f46f28c641

View File

@ -1 +0,0 @@
1ac754cc9e64e994fcfb3a494bd22d9e63f51e57

View File

@ -0,0 +1 @@
30adfe493982b0db059dc243e269eea38d850d46

View File

@ -1 +0,0 @@
5f632e7809257ff31613e04cdc81be8181e7b76a

View File

@ -0,0 +1 @@
656f304261d9aad05070fb68593beffafe9147e3

View File

@ -1 +0,0 @@
b1414f30a441c74acdf36270a6f7197fa12b6bbe

View File

@ -0,0 +1 @@
8bf22ad81a7480c255b55bada401eb131bfdb4df

View File

@ -1 +0,0 @@
b34840bcf4d0e754df8214b4a4ea646cdb6a5dfa

View File

@ -0,0 +1 @@
edb3de4d68a34c1e1ca08f79fe4d103b10e98ad1

View File

@ -1 +0,0 @@
1e303bc0028a5d81408f41ad5d96dec1ae141dd8

View File

@ -0,0 +1 @@
7ece30d5f1e18d96f61644451c858c3d9960558f

View File

@ -1 +0,0 @@
d873d397f02fb5c1c014276f7108193a378942e1

View File

@ -0,0 +1 @@
ad3bd0c2ed96556193c7215bef328e689d0b157f

View File

@ -1 +0,0 @@
b9e10faf31108795ae6fd445de3743a5f5e7e672

View File

@ -0,0 +1 @@
8a6bd97e39ee5af60126adbe8c8375dc41b1ea8e

View File

@ -1 +0,0 @@
6e375f38d95d8ae1bacb3259c89e7ef0a89c7ac0

View File

@ -0,0 +1 @@
07e748d2d80000a7a213f3405b82b6e26b452948

View File

@ -1 +0,0 @@
d335c114a2fe51dea8fc7ddb6d60b1c5b2ae29d2

View File

@ -0,0 +1 @@
fd737bd5562f3943618ee7e73a0aaffb6319fdb2

View File

@ -1 +0,0 @@
8648cf47ca1300e578059834c9ad9c0b566ee2cf

View File

@ -0,0 +1 @@
ff3f260d1dc8c18bc67f3c33aa84a0ad290daac5

View File

@ -1 +0,0 @@
504eb2913530aaf4f314254ec121a01581a7f5d3

View File

@ -179,7 +179,7 @@ public class Version implements Comparable<Version>, ToXContentFragment {
public static final int V_6_4_0_ID = 6040099;
public static final Version V_6_4_0 = new Version(V_6_4_0_ID, org.apache.lucene.util.Version.LUCENE_7_4_0);
public static final int V_6_5_0_ID = 6050099;
public static final Version V_6_5_0 = new Version(V_6_5_0_ID, org.apache.lucene.util.Version.LUCENE_7_4_0);
public static final Version V_6_5_0 = new Version(V_6_5_0_ID, org.apache.lucene.util.Version.LUCENE_7_5_0);
public static final int V_7_0_0_alpha1_ID = 7000001;
public static final Version V_7_0_0_alpha1 =
new Version(V_7_0_0_alpha1_ID, org.apache.lucene.util.Version.LUCENE_7_5_0);

View File

@ -22,7 +22,6 @@ package org.elasticsearch.action.support.nodes;
import org.apache.logging.log4j.message.ParameterizedMessage;
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.FailedNodeException;
import org.elasticsearch.action.NoSuchNodeException;
import org.elasticsearch.action.support.ActionFilters;
import org.elasticsearch.action.support.HandledTransportAction;
import org.elasticsearch.cluster.ClusterState;
@ -179,37 +178,33 @@ public abstract class TransportNodesAction<NodesRequest extends BaseNodesRequest
final DiscoveryNode node = nodes[i];
final String nodeId = node.getId();
try {
if (node == null) {
onFailure(idx, nodeId, new NoSuchNodeException(nodeId));
} else {
TransportRequest nodeRequest = newNodeRequest(nodeId, request);
if (task != null) {
nodeRequest.setParentTask(clusterService.localNode().getId(), task.getId());
}
transportService.sendRequest(node, transportNodeAction, nodeRequest, builder.build(),
new TransportResponseHandler<NodeResponse>() {
@Override
public NodeResponse newInstance() {
return newNodeResponse();
}
@Override
public void handleResponse(NodeResponse response) {
onOperation(idx, response);
}
@Override
public void handleException(TransportException exp) {
onFailure(idx, node.getId(), exp);
}
@Override
public String executor() {
return ThreadPool.Names.SAME;
}
});
TransportRequest nodeRequest = newNodeRequest(nodeId, request);
if (task != null) {
nodeRequest.setParentTask(clusterService.localNode().getId(), task.getId());
}
transportService.sendRequest(node, transportNodeAction, nodeRequest, builder.build(),
new TransportResponseHandler<NodeResponse>() {
@Override
public NodeResponse newInstance() {
return newNodeResponse();
}
@Override
public void handleResponse(NodeResponse response) {
onOperation(idx, response);
}
@Override
public void handleException(TransportException exp) {
onFailure(idx, node.getId(), exp);
}
@Override
public String executor() {
return ThreadPool.Names.SAME;
}
});
} catch (Exception e) {
onFailure(idx, nodeId, e);
}
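The `TransportNodesAction` hunk above removes a dead `node == null` branch: the loop iterates an already-resolved node array, recording each outcome by array index via `onOperation(idx, ...)` or `onFailure(idx, ...)`. A pure-Java sketch of that index-tracked fan-out — `broadcast` and the string results are hypothetical simplifications, not the Elasticsearch API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntConsumer;

public class NodeFanOut {
    // One result slot per node; a real implementation would also track a
    // pending counter and complete the listener when all slots are filled.
    static List<String> broadcast(String[] nodes, IntConsumer send) {
        List<String> results = new ArrayList<>();
        for (int i = 0; i < nodes.length; i++) results.add(null);
        for (int i = 0; i < nodes.length; i++) {
            final int idx = i; // captured for the per-node callback
            try {
                send.accept(idx);                          // may throw synchronously
                results.set(idx, "ok:" + nodes[idx]);      // onOperation(idx, response)
            } catch (Exception e) {
                results.set(idx, "failed:" + nodes[idx]);  // onFailure(idx, nodeId, e)
            }
        }
        return results;
    }

    public static void main(String[] args) {
        String[] nodes = {"node-a", "node-b"};
        List<String> out = broadcast(nodes, idx -> {
            if (idx == 1) throw new RuntimeException("transport error");
        });
        System.out.println(out); // [ok:node-a, failed:node-b]
    }
}
```

The surrounding `try`/`catch` per iteration is what makes the explicit null check redundant: any per-node failure, whatever its cause, lands in the same indexed failure slot.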

View File

@ -18,7 +18,6 @@
*/
package org.elasticsearch.common.geo.parsers;
import org.locationtech.jts.geom.Coordinate;
import org.elasticsearch.ElasticsearchParseException;
import org.elasticsearch.common.Explicit;
import org.elasticsearch.common.geo.GeoPoint;
@ -29,6 +28,7 @@ import org.elasticsearch.common.geo.builders.ShapeBuilder;
import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.index.mapper.GeoShapeFieldMapper;
import org.locationtech.jts.geom.Coordinate;
import java.io.IOException;
import java.util.ArrayList;
@ -130,10 +130,6 @@ abstract class GeoJsonParser {
CircleBuilder.TYPE);
}
if (shapeType == null) {
throw new ElasticsearchParseException("shape type [{}] not included", shapeType);
}
if (shapeType.equals(GeoShapeType.GEOMETRYCOLLECTION)) {
return geometryCollections;
}

View File

@ -0,0 +1,111 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index;
import org.apache.lucene.index.FilterMergePolicy;
import org.apache.lucene.index.SegmentCommitInfo;
import org.apache.lucene.index.SegmentInfos;
import org.apache.lucene.index.TieredMergePolicy;
import java.io.IOException;
import java.util.Map;
/**
* Wrapper around {@link TieredMergePolicy} which doesn't respect
* {@link TieredMergePolicy#setMaxMergedSegmentMB(double)} on forced merges.
* See https://issues.apache.org/jira/browse/LUCENE-7976.
*/
final class EsTieredMergePolicy extends FilterMergePolicy {
final TieredMergePolicy regularMergePolicy;
final TieredMergePolicy forcedMergePolicy;
EsTieredMergePolicy() {
super(new TieredMergePolicy());
regularMergePolicy = (TieredMergePolicy) in;
forcedMergePolicy = new TieredMergePolicy();
forcedMergePolicy.setMaxMergedSegmentMB(Double.POSITIVE_INFINITY); // unlimited
}
@Override
public MergeSpecification findForcedMerges(SegmentInfos infos, int maxSegmentCount,
Map<SegmentCommitInfo, Boolean> segmentsToMerge, MergeContext mergeContext) throws IOException {
return forcedMergePolicy.findForcedMerges(infos, maxSegmentCount, segmentsToMerge, mergeContext);
}
@Override
public MergeSpecification findForcedDeletesMerges(SegmentInfos infos, MergeContext mergeContext) throws IOException {
return forcedMergePolicy.findForcedDeletesMerges(infos, mergeContext);
}
public void setForceMergeDeletesPctAllowed(double forceMergeDeletesPctAllowed) {
regularMergePolicy.setForceMergeDeletesPctAllowed(forceMergeDeletesPctAllowed);
forcedMergePolicy.setForceMergeDeletesPctAllowed(forceMergeDeletesPctAllowed);
}
public double getForceMergeDeletesPctAllowed() {
return forcedMergePolicy.getForceMergeDeletesPctAllowed();
}
public void setFloorSegmentMB(double mbFrac) {
regularMergePolicy.setFloorSegmentMB(mbFrac);
forcedMergePolicy.setFloorSegmentMB(mbFrac);
}
public double getFloorSegmentMB() {
return regularMergePolicy.getFloorSegmentMB();
}
public void setMaxMergeAtOnce(int maxMergeAtOnce) {
regularMergePolicy.setMaxMergeAtOnce(maxMergeAtOnce);
forcedMergePolicy.setMaxMergeAtOnce(maxMergeAtOnce);
}
public int getMaxMergeAtOnce() {
return regularMergePolicy.getMaxMergeAtOnce();
}
public void setMaxMergeAtOnceExplicit(int maxMergeAtOnceExplicit) {
regularMergePolicy.setMaxMergeAtOnceExplicit(maxMergeAtOnceExplicit);
forcedMergePolicy.setMaxMergeAtOnceExplicit(maxMergeAtOnceExplicit);
}
public int getMaxMergeAtOnceExplicit() {
return forcedMergePolicy.getMaxMergeAtOnceExplicit();
}
// only setter that must NOT delegate to the forced merge policy
public void setMaxMergedSegmentMB(double mbFrac) {
regularMergePolicy.setMaxMergedSegmentMB(mbFrac);
}
public double getMaxMergedSegmentMB() {
return regularMergePolicy.getMaxMergedSegmentMB();
}
public void setSegmentsPerTier(double segmentsPerTier) {
regularMergePolicy.setSegmentsPerTier(segmentsPerTier);
forcedMergePolicy.setSegmentsPerTier(segmentsPerTier);
}
public double getSegmentsPerTier() {
return regularMergePolicy.getSegmentsPerTier();
}
}
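`EsTieredMergePolicy` above holds two `TieredMergePolicy` delegates: a regular one, and a forced-merge one whose max segment size is pinned to infinity (so `_forcemerge` respects `max_num_segments`, #32291). Most setters fan out to both; `setMaxMergedSegmentMB` deliberately reaches only the regular delegate. A self-contained sketch of that dual-delegate pattern — `Policy` and `DualPolicy` are hypothetical stand-ins for Lucene's `TieredMergePolicy` and the wrapper:

```java
// Hypothetical stand-in for Lucene's TieredMergePolicy.
class Policy {
    private double maxMergedSegmentMB = 5 * 1024; // Lucene-style default cap
    void setMaxMergedSegmentMB(double mb) { maxMergedSegmentMB = mb; }
    double getMaxMergedSegmentMB() { return maxMergedSegmentMB; }
}

public class DualPolicy {
    final Policy regular = new Policy();
    final Policy forced = new Policy();

    DualPolicy() {
        // Forced merges ignore the configured cap (see LUCENE-7976).
        forced.setMaxMergedSegmentMB(Double.POSITIVE_INFINITY);
    }

    // The one setter that must NOT reach the forced-merge delegate.
    void setMaxMergedSegmentMB(double mb) {
        regular.setMaxMergedSegmentMB(mb);
    }

    public static void main(String[] args) {
        DualPolicy p = new DualPolicy();
        p.setMaxMergedSegmentMB(10 * 1024);
        System.out.println(p.regular.getMaxMergedSegmentMB()); // 10240.0
        System.out.println(p.forced.getMaxMergedSegmentMB());  // Infinity
    }
}
```

Routing forced-merge `findForcedMerges`/`findForcedDeletesMerges` calls to the unbounded delegate is what lets a forced merge actually collapse down to `max_num_segments` even when the regular cap would forbid such a large merge.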

View File

@ -49,6 +49,7 @@ import org.elasticsearch.indices.IndicesQueryCache;
import org.elasticsearch.indices.breaker.CircuitBreakerService;
import org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache;
import org.elasticsearch.indices.mapper.MapperRegistry;
import org.elasticsearch.plugins.IndexStorePlugin;
import org.elasticsearch.script.ScriptService;
import org.elasticsearch.threadpool.ThreadPool;
@ -74,7 +75,7 @@ import java.util.function.Function;
* {@link #addSimilarity(String, TriFunction)} while existing Providers can be referenced through Settings under the
* {@link IndexModule#SIMILARITY_SETTINGS_PREFIX} prefix along with the "type" value. For example, to reference the
* {@link BM25Similarity}, the configuration {@code "index.similarity.my_similarity.type : "BM25"} can be used.</li>
* <li>{@link IndexStore} - Custom {@link IndexStore} instances can be registered via {@link #addIndexStore(String, Function)}</li>
* <li>{@link IndexStore} - Custom {@link IndexStore} instances can be registered via {@link IndexStorePlugin}</li>
* <li>{@link IndexEventListener} - Custom {@link IndexEventListener} instances can be registered via
* {@link #addIndexEventListener(IndexEventListener)}</li>
* <li>Settings update listener - Custom settings update listener can be registered via
@ -109,7 +110,7 @@ public final class IndexModule {
private SetOnce<IndexSearcherWrapperFactory> indexSearcherWrapper = new SetOnce<>();
private final Set<IndexEventListener> indexEventListeners = new HashSet<>();
private final Map<String, TriFunction<Settings, Version, ScriptService, Similarity>> similarities = new HashMap<>();
private final Map<String, Function<IndexSettings, IndexStore>> storeTypes = new HashMap<>();
private final Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories;
private final SetOnce<BiFunction<IndexSettings, IndicesQueryCache, QueryCache>> forceQueryCacheProvider = new SetOnce<>();
private final List<SearchOperationListener> searchOperationListeners = new ArrayList<>();
private final List<IndexingOperationListener> indexOperationListeners = new ArrayList<>();
@ -119,16 +120,22 @@ public final class IndexModule {
* Construct the index module for the index with the specified index settings. The index module contains extension points for plugins
* via {@link org.elasticsearch.plugins.PluginsService#onIndexModule(IndexModule)}.
*
* @param indexSettings the index settings
* @param analysisRegistry the analysis registry
* @param engineFactory the engine factory
* @param indexSettings the index settings
* @param analysisRegistry the analysis registry
* @param engineFactory the engine factory
* @param indexStoreFactories the available store types
*/
public IndexModule(final IndexSettings indexSettings, final AnalysisRegistry analysisRegistry, final EngineFactory engineFactory) {
public IndexModule(
final IndexSettings indexSettings,
final AnalysisRegistry analysisRegistry,
final EngineFactory engineFactory,
final Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories) {
this.indexSettings = indexSettings;
this.analysisRegistry = analysisRegistry;
this.engineFactory = Objects.requireNonNull(engineFactory);
this.searchOperationListeners.add(new SearchSlowLog(indexSettings));
this.indexOperationListeners.add(new IndexingSlowLog(indexSettings));
this.indexStoreFactories = Collections.unmodifiableMap(indexStoreFactories);
}
/**
@ -245,25 +252,6 @@ public final class IndexModule {
this.indexOperationListeners.add(listener);
}
/**
* Adds an {@link IndexStore} type to this index module. Typically stores are registered with a reference to
* its constructor:
* <pre>
* indexModule.addIndexStore("my_store_type", MyStore::new);
* </pre>
*
* @param type the type to register
* @param provider the instance provider / factory method
*/
public void addIndexStore(String type, Function<IndexSettings, IndexStore> provider) {
ensureNotFrozen();
if (storeTypes.containsKey(type)) {
throw new IllegalArgumentException("key [" + type +"] already registered");
}
storeTypes.put(type, provider);
}
/**
* Registers the given {@link Similarity} with the given name.
* The function takes as parameters:<ul>
@ -360,7 +348,7 @@ public final class IndexModule {
if (Strings.isEmpty(storeType) || isBuiltinType(storeType)) {
store = new IndexStore(indexSettings);
} else {
Function<IndexSettings, IndexStore> factory = storeTypes.get(storeType);
Function<IndexSettings, IndexStore> factory = indexStoreFactories.get(storeType);
if (factory == null) {
throw new IllegalArgumentException("Unknown store type [" + storeType + "]");
}

View File

@ -435,7 +435,6 @@ public final class IndexSettings {
scopedSettings.addSettingsUpdateConsumer(MergePolicyConfig.INDEX_MERGE_POLICY_MAX_MERGE_AT_ONCE_EXPLICIT_SETTING, mergePolicyConfig::setMaxMergesAtOnceExplicit);
scopedSettings.addSettingsUpdateConsumer(MergePolicyConfig.INDEX_MERGE_POLICY_MAX_MERGED_SEGMENT_SETTING, mergePolicyConfig::setMaxMergedSegment);
scopedSettings.addSettingsUpdateConsumer(MergePolicyConfig.INDEX_MERGE_POLICY_SEGMENTS_PER_TIER_SETTING, mergePolicyConfig::setSegmentsPerTier);
scopedSettings.addSettingsUpdateConsumer(MergePolicyConfig.INDEX_MERGE_POLICY_RECLAIM_DELETES_WEIGHT_SETTING, mergePolicyConfig::setReclaimDeletesWeight);
scopedSettings.addSettingsUpdateConsumer(MergeSchedulerConfig.MAX_THREAD_COUNT_SETTING, MergeSchedulerConfig.MAX_MERGE_COUNT_SETTING,
mergeSchedulerConfig::setMaxThreadAndMergeCount);

View File

@ -115,7 +115,7 @@ import org.elasticsearch.common.unit.ByteSizeValue;
*/
public final class MergePolicyConfig {
private final TieredMergePolicy mergePolicy = new TieredMergePolicy();
private final EsTieredMergePolicy mergePolicy = new EsTieredMergePolicy();
private final Logger logger;
private final boolean mergesEnabled;
@ -150,7 +150,7 @@ public final class MergePolicyConfig {
Property.Dynamic, Property.IndexScope);
public static final Setting<Double> INDEX_MERGE_POLICY_RECLAIM_DELETES_WEIGHT_SETTING =
Setting.doubleSetting("index.merge.policy.reclaim_deletes_weight", DEFAULT_RECLAIM_DELETES_WEIGHT, 0.0d,
Property.Dynamic, Property.IndexScope);
Property.Dynamic, Property.IndexScope, Property.Deprecated);
public static final String INDEX_MERGE_ENABLED = "index.merge.enabled"; // don't convert to Setting<> and register... we only set this in tests and register via a plugin
@ -176,17 +176,12 @@ public final class MergePolicyConfig {
mergePolicy.setMaxMergeAtOnceExplicit(maxMergeAtOnceExplicit);
mergePolicy.setMaxMergedSegmentMB(maxMergedSegment.getMbFrac());
mergePolicy.setSegmentsPerTier(segmentsPerTier);
mergePolicy.setReclaimDeletesWeight(reclaimDeletesWeight);
if (logger.isTraceEnabled()) {
logger.trace("using [tiered] merge mergePolicy with expunge_deletes_allowed[{}], floor_segment[{}], max_merge_at_once[{}], max_merge_at_once_explicit[{}], max_merged_segment[{}], segments_per_tier[{}], reclaim_deletes_weight[{}]",
forceMergeDeletesPctAllowed, floorSegment, maxMergeAtOnce, maxMergeAtOnceExplicit, maxMergedSegment, segmentsPerTier, reclaimDeletesWeight);
}
}
void setReclaimDeletesWeight(Double reclaimDeletesWeight) {
mergePolicy.setReclaimDeletesWeight(reclaimDeletesWeight);
}
void setSegmentsPerTier(Double segmentsPerTier) {
mergePolicy.setSegmentsPerTier(segmentsPerTier);
}

View File

@ -819,6 +819,8 @@ public abstract class Engine implements Closeable {
} catch (IOException e) {
logger.trace(() -> new ParameterizedMessage("failed to get size for [{}]", info.info.name), e);
}
segment.segmentSort = info.info.getIndexSort();
segment.attributes = info.info.getAttributes();
segments.put(info.info.name, segment);
} else {
segment.committed = true;

View File

@ -100,6 +100,7 @@ import org.elasticsearch.index.shard.IndexShardState;
import org.elasticsearch.index.shard.IndexingOperationListener;
import org.elasticsearch.index.shard.IndexingStats;
import org.elasticsearch.index.shard.ShardId;
import org.elasticsearch.index.store.IndexStore;
import org.elasticsearch.indices.breaker.CircuitBreakerService;
import org.elasticsearch.indices.cluster.IndicesClusterStateService;
import org.elasticsearch.indices.fielddata.cache.IndicesFieldDataCache;
@ -181,6 +182,7 @@ public class IndicesService extends AbstractLifecycleComponent
private final IndicesQueryCache indicesQueryCache;
private final MetaStateService metaStateService;
private final Collection<Function<IndexSettings, Optional<EngineFactory>>> engineFactoryProviders;
private final Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories;
@Override
protected void doStart() {
@ -193,7 +195,8 @@ public class IndicesService extends AbstractLifecycleComponent
MapperRegistry mapperRegistry, NamedWriteableRegistry namedWriteableRegistry, ThreadPool threadPool,
IndexScopedSettings indexScopedSettings, CircuitBreakerService circuitBreakerService, BigArrays bigArrays,
ScriptService scriptService, Client client, MetaStateService metaStateService,
Collection<Function<IndexSettings, Optional<EngineFactory>>> engineFactoryProviders) {
Collection<Function<IndexSettings, Optional<EngineFactory>>> engineFactoryProviders,
Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories) {
super(settings);
this.threadPool = threadPool;
this.pluginsService = pluginsService;
@ -225,6 +228,7 @@ public class IndicesService extends AbstractLifecycleComponent
this.cacheCleaner = new CacheCleaner(indicesFieldDataCache, indicesRequestCache, logger, threadPool, this.cleanInterval);
this.metaStateService = metaStateService;
this.engineFactoryProviders = engineFactoryProviders;
this.indexStoreFactories = indexStoreFactories;
}
@Override
@ -464,7 +468,7 @@ public class IndicesService extends AbstractLifecycleComponent
idxSettings.getNumberOfReplicas(),
reason);
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings));
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings), indexStoreFactories);
for (IndexingOperationListener operationListener : indexingOperationListeners) {
indexModule.addIndexOperationListener(operationListener);
}
@ -524,7 +528,7 @@ public class IndicesService extends AbstractLifecycleComponent
*/
public synchronized MapperService createIndexMapperService(IndexMetaData indexMetaData) throws IOException {
final IndexSettings idxSettings = new IndexSettings(indexMetaData, this.settings, indexScopedSettings);
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings));
final IndexModule indexModule = new IndexModule(idxSettings, analysisRegistry, getEngineFactory(idxSettings), indexStoreFactories);
pluginsService.onIndexModule(indexModule);
return indexModule.newIndexMapperService(xContentRegistry, mapperRegistry, scriptService);
}

View File

@ -361,7 +361,7 @@ public final class ConfigurationUtils {
return readProcessor(processorFactories, type, (Map<String, Object>) config);
} else if (config instanceof String && "script".equals(type)) {
Map<String, Object> normalizedScript = new HashMap<>(1);
normalizedScript.put(ScriptType.INLINE.getName(), config);
normalizedScript.put(ScriptType.INLINE.getParseField().getPreferredName(), config);
return readProcessor(processorFactories, type, normalizedScript);
} else {
throw newConfigurationException(type, null, null,

View File

@ -26,8 +26,8 @@ import org.elasticsearch.Build;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.ElasticsearchTimeoutException;
import org.elasticsearch.Version;
import org.elasticsearch.action.ActionModule;
import org.elasticsearch.action.Action;
import org.elasticsearch.action.ActionModule;
import org.elasticsearch.action.search.SearchExecutionStatsCollector;
import org.elasticsearch.action.search.SearchPhaseController;
import org.elasticsearch.action.search.SearchTransportService;
@ -94,6 +94,7 @@ import org.elasticsearch.http.HttpServerTransport;
import org.elasticsearch.index.IndexSettings;
import org.elasticsearch.index.analysis.AnalysisRegistry;
import org.elasticsearch.index.engine.EngineFactory;
import org.elasticsearch.index.store.IndexStore;
import org.elasticsearch.indices.IndicesModule;
import org.elasticsearch.indices.IndicesService;
import org.elasticsearch.indices.analysis.AnalysisModule;
@ -117,6 +118,7 @@ import org.elasticsearch.plugins.AnalysisPlugin;
import org.elasticsearch.plugins.ClusterPlugin;
import org.elasticsearch.plugins.DiscoveryPlugin;
import org.elasticsearch.plugins.EnginePlugin;
import org.elasticsearch.plugins.IndexStorePlugin;
import org.elasticsearch.plugins.IngestPlugin;
import org.elasticsearch.plugins.MapperPlugin;
import org.elasticsearch.plugins.MetaDataUpgrader;
@ -407,11 +409,19 @@ public class Node implements Closeable {
enginePlugins.stream().map(plugin -> plugin::getEngineFactory))
.collect(Collectors.toList());
final Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories =
pluginsService.filterPlugins(IndexStorePlugin.class)
.stream()
.map(IndexStorePlugin::getIndexStoreFactories)
.flatMap(m -> m.entrySet().stream())
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
final IndicesService indicesService =
new IndicesService(settings, pluginsService, nodeEnvironment, xContentRegistry, analysisModule.getAnalysisRegistry(),
clusterModule.getIndexNameExpressionResolver(), indicesModule.getMapperRegistry(), namedWriteableRegistry,
threadPool, settingsModule.getIndexScopedSettings(), circuitBreakerService, bigArrays,
scriptModule.getScriptService(), client, metaStateService, engineFactoryProviders);
scriptModule.getScriptService(), client, metaStateService, engineFactoryProviders, indexStoreFactories);
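The `indexStoreFactories` map passed into `IndicesService` here is built a few lines earlier by flat-mapping each `IndexStorePlugin`'s factory map into one; with `Collectors.toMap`, two plugins registering the same store type fail fast with an `IllegalStateException` at node startup. A self-contained sketch of that collection step — `StorePlugin` and the `Supplier<String>` factories are hypothetical stand-ins for the real plugin and `Function<IndexSettings, IndexStore>` types:

```java
import java.util.List;
import java.util.Map;
import java.util.function.Supplier;
import java.util.stream.Collectors;

public class StoreFactoryCollection {
    // Hypothetical stand-in for IndexStorePlugin#getIndexStoreFactories.
    interface StorePlugin {
        Map<String, Supplier<String>> getStoreFactories();
    }

    static Map<String, Supplier<String>> collect(List<StorePlugin> plugins) {
        return plugins.stream()
            .map(StorePlugin::getStoreFactories)
            .flatMap(m -> m.entrySet().stream())
            // duplicate store types across plugins throw IllegalStateException
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }

    public static void main(String[] args) {
        StorePlugin a = () -> Map.of("smb", () -> "smb-store");
        StorePlugin b = () -> Map.of("hdfs", () -> "hdfs-store");
        Map<String, Supplier<String>> factories = collect(List.of(a, b));
        System.out.println(factories.keySet());        // e.g. [smb, hdfs]
        System.out.println(factories.get("smb").get()); // smb-store
    }
}
```

The merged map then flows through `IndicesService` into each `IndexModule`, replacing the old per-index `addIndexStore` registration removed elsewhere in this commit.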
Collection<Object> pluginComponents = pluginsService.filterPlugins(Plugin.class).stream()
.flatMap(p -> p.createComponents(client, clusterService, threadPool, resourceWatcherService,

View File

@ -0,0 +1,42 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.plugins;
import org.elasticsearch.index.IndexSettings;
import org.elasticsearch.index.store.IndexStore;
import java.util.Map;
import java.util.function.Function;
/**
* A plugin that provides alternative index store implementations.
*/
public interface IndexStorePlugin {
/**
* The index store factories for this plugin. When an index is created, the store type setting
* {@link org.elasticsearch.index.IndexModule#INDEX_STORE_TYPE_SETTING} on the index is examined; either the default or a
* built-in type is used, or the type is looked up among the index store factories provided by {@link IndexStore} plugins.
*
* @return a map from store type to an index store factory
*/
Map<String, Function<IndexSettings, IndexStore>> getIndexStoreFactories();
}
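The interface above is the new declarative extension point from "Introduce index store plugins" (#32375). A plugin now returns its factories up front rather than calling the removed `IndexModule#addIndexStore`. A runnable pure-Java analog — `Settings`, `Store`, `StorePlugin`, and `MyStorePlugin` are hypothetical stand-ins for `IndexSettings`, `IndexStore`, `IndexStorePlugin`, and a plugin implementation:

```java
import java.util.Map;
import java.util.function.Function;

public class StorePluginExample {
    // Stand-ins for IndexSettings and IndexStore.
    static class Settings {}
    static class Store {
        final String type;
        Store(String type) { this.type = type; }
    }

    // Analog of IndexStorePlugin: factories are declared up front as data,
    // instead of being registered imperatively on IndexModule.
    interface StorePlugin {
        Map<String, Function<Settings, Store>> getIndexStoreFactories();
    }

    static class MyStorePlugin implements StorePlugin {
        @Override
        public Map<String, Function<Settings, Store>> getIndexStoreFactories() {
            return Map.of("my_store_type", settings -> new Store("my_store_type"));
        }
    }

    public static void main(String[] args) {
        StorePlugin plugin = new MyStorePlugin();
        Store store = plugin.getIndexStoreFactories()
            .get("my_store_type")
            .apply(new Settings());
        System.out.println(store.type); // my_store_type
    }
}
```

Declaring factories as a map lets the node collect and validate all store types once at startup, instead of each index module mutating its own registry.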

View File

@ -1555,11 +1555,8 @@ public abstract class BlobStoreRepository extends AbstractLifecycleComponent imp
filesToRecover.add(fileInfo);
recoveryState.getIndex().addFileDetail(fileInfo.name(), fileInfo.length(), false);
if (logger.isTraceEnabled()) {
if (md == null) {
logger.trace("[{}] [{}] recovering [{}] from [{}], does not exists in local store", shardId, snapshotId, fileInfo.physicalName(), fileInfo.name());
} else {
logger.trace("[{}] [{}] recovering [{}] from [{}], exists in local store but is different", shardId, snapshotId, fileInfo.physicalName(), fileInfo.name());
}
logger.trace("[{}] [{}] recovering [{}] from [{}], exists in local store but is different", shardId, snapshotId,
fileInfo.physicalName(), fileInfo.name());
}
}

View File

@ -0,0 +1,72 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index;
import org.apache.lucene.index.TieredMergePolicy;
import org.elasticsearch.test.ESTestCase;
public class EsTieredMergePolicyTests extends ESTestCase {
public void testDefaults() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
assertEquals(
new TieredMergePolicy().getMaxMergedSegmentMB(),
policy.regularMergePolicy.getMaxMergedSegmentMB(), 0d);
assertEquals(Long.MAX_VALUE / 1024.0 / 1024.0, policy.forcedMergePolicy.getMaxMergedSegmentMB(), 0d);
}
public void testSetMaxMergedSegmentMB() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
policy.setMaxMergedSegmentMB(10 * 1024);
assertEquals(10 * 1024, policy.regularMergePolicy.getMaxMergedSegmentMB(), 0d);
assertEquals(Long.MAX_VALUE / 1024.0 / 1024.0, policy.forcedMergePolicy.getMaxMergedSegmentMB(), 0d);
}
public void testSetForceMergeDeletesPctAllowed() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
policy.setForceMergeDeletesPctAllowed(42);
assertEquals(42, policy.forcedMergePolicy.getForceMergeDeletesPctAllowed(), 0);
}
public void testSetFloorSegmentMB() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
policy.setFloorSegmentMB(42);
assertEquals(42, policy.regularMergePolicy.getFloorSegmentMB(), 0);
assertEquals(42, policy.forcedMergePolicy.getFloorSegmentMB(), 0);
}
public void testSetMaxMergeAtOnce() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
policy.setMaxMergeAtOnce(42);
assertEquals(42, policy.regularMergePolicy.getMaxMergeAtOnce());
}
public void testSetMaxMergeAtOnceExplicit() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
policy.setMaxMergeAtOnceExplicit(42);
assertEquals(42, policy.forcedMergePolicy.getMaxMergeAtOnceExplicit());
}
public void testSetSegmentsPerTier() {
EsTieredMergePolicy policy = new EsTieredMergePolicy();
policy.setSegmentsPerTier(42);
assertEquals(42, policy.regularMergePolicy.getSegmentsPerTier(), 0);
}
}


@@ -81,10 +81,13 @@ import org.elasticsearch.threadpool.ThreadPool;
import java.io.IOException;
import java.util.Collections;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.Function;
import static java.util.Collections.emptyMap;
import static org.hamcrest.Matchers.instanceOf;
public class IndexModuleTests extends ESTestCase {
private Index index;
@@ -147,7 +150,8 @@ public class IndexModuleTests extends ESTestCase {
}
public void testWrapperIsBound() throws IOException {
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new MockEngineFactory(AssertingDirectoryReader.class));
final MockEngineFactory engineFactory = new MockEngineFactory(AssertingDirectoryReader.class);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, engineFactory, Collections.emptyMap());
module.setSearcherWrapper((s) -> new Wrapper());
IndexService indexService = newIndexService(module);
@@ -164,18 +168,12 @@
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexModule.INDEX_STORE_TYPE_SETTING.getKey(), "foo_store")
.build();
IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory());
module.addIndexStore("foo_store", FooStore::new);
try {
module.addIndexStore("foo_store", FooStore::new);
fail("already registered");
} catch (IllegalArgumentException ex) {
// fine
}
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
final Map<String, Function<IndexSettings, IndexStore>> indexStoreFactories = Collections.singletonMap("foo_store", FooStore::new);
final IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), indexStoreFactories);
IndexService indexService = newIndexService(module);
assertTrue(indexService.getIndexStore() instanceof FooStore);
final IndexService indexService = newIndexService(module);
assertThat(indexService.getIndexStore(), instanceOf(FooStore.class));
indexService.close("simon says", false);
}
@@ -189,7 +187,7 @@
}
};
IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory());
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
module.addIndexEventListener(eventListener);
IndexService indexService = newIndexService(module);
IndexSettings x = indexService.getIndexSettings();
@@ -204,7 +202,7 @@
public void testListener() throws IOException {
Setting<Boolean> booleanSetting = Setting.boolSetting("index.foo.bar", false, Property.Dynamic, Property.IndexScope);
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings, booleanSetting);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory());
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
Setting<Boolean> booleanSetting2 = Setting.boolSetting("index.foo.bar.baz", false, Property.Dynamic, Property.IndexScope);
AtomicBoolean atomicBoolean = new AtomicBoolean(false);
module.addSettingsUpdateConsumer(booleanSetting, atomicBoolean::set);
@@ -223,8 +221,8 @@
}
public void testAddIndexOperationListener() throws IOException {
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings(index, settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
AtomicBoolean executed = new AtomicBoolean(false);
IndexingOperationListener listener = new IndexingOperationListener() {
@Override
@@ -254,8 +252,8 @@
}
public void testAddSearchOperationListener() throws IOException {
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings(index, settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
AtomicBoolean executed = new AtomicBoolean(false);
SearchOperationListener listener = new SearchOperationListener() {
@@ -288,8 +286,9 @@
.put("index.similarity.my_similarity.key", "there is a key")
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings("foo", settings), emptyAnalysisRegistry, new InternalEngineFactory());
new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
module.addSimilarity("test_similarity",
(providerSettings, indexCreatedVersion, scriptService) -> new TestSimilarity(providerSettings.get("key")));
@@ -303,8 +302,8 @@
}
public void testFrozen() {
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings(index, settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings(index, settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
module.freeze();
String msg = "Can't modify IndexModule once the index service has been created";
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.addSearchOperationListener(null)).getMessage());
@@ -313,7 +312,6 @@
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.addSimilarity(null, null)).getMessage());
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.setSearcherWrapper(null)).getMessage());
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.forceQueryCacheProvider(null)).getMessage());
assertEquals(msg, expectThrows(IllegalStateException.class, () -> module.addIndexStore("foo", null)).getMessage());
}
public void testSetupUnknownSimilarity() throws IOException {
@@ -322,8 +320,9 @@
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT)
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.build();
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings("foo", settings), emptyAnalysisRegistry, new InternalEngineFactory());
new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
Exception ex = expectThrows(IllegalArgumentException.class, () -> newIndexService(module));
assertEquals("Unknown Similarity type [test_similarity] for [my_similarity]", ex.getMessage());
}
@@ -334,8 +333,8 @@
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT)
.build();
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings("foo", settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
Exception ex = expectThrows(IllegalArgumentException.class, () -> newIndexService(module));
assertEquals("Similarity [my_similarity] must have an associated type", ex.getMessage());
}
@@ -344,8 +343,8 @@
Settings settings = Settings.builder()
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings("foo", settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
module.forceQueryCacheProvider((a, b) -> new CustomQueryCache());
expectThrows(AlreadySetException.class, () -> module.forceQueryCacheProvider((a, b) -> new CustomQueryCache()));
IndexService indexService = newIndexService(module);
@@ -357,8 +356,8 @@
Settings settings = Settings.builder()
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings("foo", settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
IndexService indexService = newIndexService(module);
assertTrue(indexService.cache().query() instanceof IndexQueryCache);
indexService.close("simon says", false);
@@ -369,8 +368,8 @@
.put(IndexModule.INDEX_QUERY_CACHE_ENABLED_SETTING.getKey(), false)
.put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString())
.put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT).build();
IndexModule module =
new IndexModule(IndexSettingsModule.newIndexSettings("foo", settings), emptyAnalysisRegistry, new InternalEngineFactory());
final IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("foo", settings);
IndexModule module = new IndexModule(indexSettings, emptyAnalysisRegistry, new InternalEngineFactory(), Collections.emptyMap());
module.forceQueryCacheProvider((a, b) -> new CustomQueryCache());
IndexService indexService = newIndexService(module);
assertTrue(indexService.cache().query() instanceof DisabledQueryCache);


@@ -19,7 +19,6 @@
package org.elasticsearch.index;
import org.apache.lucene.index.NoMergePolicy;
import org.apache.lucene.index.TieredMergePolicy;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue;
@@ -76,43 +75,38 @@ public class MergePolicySettingsTests extends ESTestCase {
public void testTieredMergePolicySettingsUpdate() throws IOException {
IndexSettings indexSettings = indexSettings(Settings.EMPTY);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getForceMergeDeletesPctAllowed(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED, 0.0d);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getForceMergeDeletesPctAllowed(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED, 0.0d);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_EXPUNGE_DELETES_ALLOWED_SETTING.getKey(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED + 1.0d).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getForceMergeDeletesPctAllowed(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED + 1.0d, 0.0d);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getForceMergeDeletesPctAllowed(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED + 1.0d, 0.0d);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getFloorSegmentMB(), MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMbFrac(), 0);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getFloorSegmentMB(), MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMbFrac(), 0);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_FLOOR_SEGMENT_SETTING.getKey(), new ByteSizeValue(MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMb() + 1, ByteSizeUnit.MB)).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getFloorSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMb() + 1, ByteSizeUnit.MB).getMbFrac(), 0.001);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getFloorSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMb() + 1, ByteSizeUnit.MB).getMbFrac(), 0.001);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnce(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnce(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_MAX_MERGE_AT_ONCE_SETTING.getKey(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE - 1).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnce(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE - 1);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnce(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE - 1);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnceExplicit(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnceExplicit(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_MAX_MERGE_AT_ONCE_EXPLICIT_SETTING.getKey(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT - 1).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnceExplicit(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT-1);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnceExplicit(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT-1);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergedSegmentMB(), MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getMbFrac(), 0.0001);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergedSegmentMB(), MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getMbFrac(), 0.0001);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_MAX_MERGED_SEGMENT_SETTING.getKey(), new ByteSizeValue(MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getBytes() + 1)).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergedSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getBytes() + 1).getMbFrac(), 0.0001);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergedSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getBytes() + 1).getMbFrac(), 0.0001);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getReclaimDeletesWeight(), MergePolicyConfig.DEFAULT_RECLAIM_DELETES_WEIGHT, 0);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_RECLAIM_DELETES_WEIGHT_SETTING.getKey(), MergePolicyConfig.DEFAULT_RECLAIM_DELETES_WEIGHT + 1).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getReclaimDeletesWeight(), MergePolicyConfig.DEFAULT_RECLAIM_DELETES_WEIGHT + 1, 0);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getSegmentsPerTier(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER, 0);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getSegmentsPerTier(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER, 0);
indexSettings.updateIndexMetaData(newIndexMeta("index", Settings.builder().put(MergePolicyConfig.INDEX_MERGE_POLICY_SEGMENTS_PER_TIER_SETTING.getKey(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER + 1).build()));
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getSegmentsPerTier(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER + 1, 0);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getSegmentsPerTier(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER + 1, 0);
indexSettings.updateIndexMetaData(newIndexMeta("index", EMPTY_SETTINGS)); // see if defaults are restored
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getForceMergeDeletesPctAllowed(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED, 0.0d);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getFloorSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMb(), ByteSizeUnit.MB).getMbFrac(), 0.00);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnce(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnceExplicit(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergedSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getBytes() + 1).getMbFrac(), 0.0001);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getReclaimDeletesWeight(), MergePolicyConfig.DEFAULT_RECLAIM_DELETES_WEIGHT, 0);
assertEquals(((TieredMergePolicy) indexSettings.getMergePolicy()).getSegmentsPerTier(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER, 0);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getForceMergeDeletesPctAllowed(), MergePolicyConfig.DEFAULT_EXPUNGE_DELETES_ALLOWED, 0.0d);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getFloorSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_FLOOR_SEGMENT.getMb(), ByteSizeUnit.MB).getMbFrac(), 0.00);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnce(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergeAtOnceExplicit(), MergePolicyConfig.DEFAULT_MAX_MERGE_AT_ONCE_EXPLICIT);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getMaxMergedSegmentMB(), new ByteSizeValue(MergePolicyConfig.DEFAULT_MAX_MERGED_SEGMENT.getBytes() + 1).getMbFrac(), 0.0001);
assertEquals(((EsTieredMergePolicy) indexSettings.getMergePolicy()).getSegmentsPerTier(), MergePolicyConfig.DEFAULT_SEGMENTS_PER_TIER, 0);
}
public Settings build(String value) {


@@ -19,9 +19,42 @@
package org.elasticsearch.index.engine;
import java.io.Closeable;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Base64;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.Set;
import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.BiFunction;
import java.util.function.Function;
import java.util.function.LongSupplier;
import java.util.function.Supplier;
import java.util.function.ToLongBiFunction;
import java.util.stream.Collectors;
import java.util.stream.LongStream;
import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
import com.carrotsearch.randomizedtesting.generators.RandomNumbers;
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
@@ -128,40 +161,6 @@ import org.elasticsearch.test.IndexSettingsModule;
import org.hamcrest.MatcherAssert;
import org.hamcrest.Matchers;
import java.io.Closeable;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Base64;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.Set;
import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.BiFunction;
import java.util.function.Function;
import java.util.function.LongSupplier;
import java.util.function.Supplier;
import java.util.function.ToLongBiFunction;
import java.util.stream.Collectors;
import java.util.stream.LongStream;
import static java.util.Collections.emptyMap;
import static java.util.Collections.shuffle;
import static org.elasticsearch.index.engine.Engine.Operation.Origin.LOCAL_TRANSLOG_RECOVERY;
@@ -2050,6 +2049,7 @@ public class InternalEngineTests extends EngineTestCase {
}
}
@AwaitsFix(bugUrl = "https://github.com/elastic/elasticsearch/issues/32430")
public void testSeqNoAndCheckpoints() throws IOException {
final int opCount = randomIntBetween(1, 256);
long primarySeqNo = SequenceNumbers.NO_OPS_PERFORMED;


@@ -89,7 +89,6 @@ public class DisMaxQueryBuilderTests extends AbstractQueryTestCase<DisMaxQueryBu
}
public void testToQueryInnerPrefixQuery() throws Exception {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String queryAsString = "{\n" +
" \"dis_max\":{\n" +
" \"queries\":[\n" +


@@ -68,11 +68,7 @@ public class ExistsQueryBuilderTests extends AbstractQueryTestCase<ExistsQueryBu
Collection<String> fields = context.getQueryShardContext().simpleMatchToIndexNames(fieldPattern);
Collection<String> mappedFields = fields.stream().filter((field) -> context.getQueryShardContext().getObjectMapper(field) != null
|| context.getQueryShardContext().getMapperService().fullName(field) != null).collect(Collectors.toList());
if (getCurrentTypes().length == 0) {
assertThat(query, instanceOf(MatchNoDocsQuery.class));
MatchNoDocsQuery matchNoDocsQuery = (MatchNoDocsQuery) query;
assertThat(matchNoDocsQuery.toString(null), containsString("Missing types in \"exists\" query."));
} else if (context.mapperService().getIndexSettings().getIndexVersionCreated().before(Version.V_6_1_0)) {
if (context.mapperService().getIndexSettings().getIndexVersionCreated().before(Version.V_6_1_0)) {
if (fields.size() == 1) {
assertThat(query, instanceOf(ConstantScoreQuery.class));
ConstantScoreQuery constantScoreQuery = (ConstantScoreQuery) query;


@@ -105,7 +105,6 @@ public class FuzzyQueryBuilderTests extends AbstractQueryTestCase<FuzzyQueryBuil
}
public void testToQueryWithStringField() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"fuzzy\":{\n" +
" \"" + STRING_FIELD_NAME + "\":{\n" +
@@ -128,7 +127,6 @@
}
public void testToQueryWithStringFieldDefinedFuzziness() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"fuzzy\":{\n" +
" \"" + STRING_FIELD_NAME + "\":{\n" +
@@ -151,7 +149,6 @@
}
public void testToQueryWithStringFieldDefinedWrongFuzziness() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String queryMissingFuzzinessUpLimit = "{\n" +
" \"fuzzy\":{\n" +
" \"" + STRING_FIELD_NAME + "\":{\n" +
@@ -214,7 +211,6 @@
}
public void testToQueryWithNumericField() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"fuzzy\":{\n" +
" \"" + INT_FIELD_NAME + "\":{\n" +
@@ -299,7 +295,6 @@
}
public void testToQueryWithTranspositions() throws Exception {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
Query query = new FuzzyQueryBuilder(STRING_FIELD_NAME, "text").toQuery(createShardContext());
assertThat(query, instanceOf(FuzzyQuery.class));
assertEquals(FuzzyQuery.defaultTranspositions, ((FuzzyQuery)query).getTranspositions());


@@ -39,7 +39,6 @@ import java.io.IOException;
import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.hamcrest.Matchers.startsWith;
public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBoundingBoxQueryBuilder> {
/** Randomly generate either NaN or one of the two infinity values. */
@@ -110,16 +109,12 @@ public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBo
assertEquals("cannot parse type from null string", e.getMessage());
}
@Override
public void testToQuery() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
super.testToQuery();
}
public void testExceptionOnMissingTypes() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length == 0);
QueryShardException e = expectThrows(QueryShardException.class, super::testToQuery);
assertThat(e.getMessage(), startsWith("failed to find geo_point field [mapped_geo_point"));
public void testExceptionOnMissingTypes() {
QueryShardContext context = createShardContextWithNoType();
GeoBoundingBoxQueryBuilder qb = createTestQueryBuilder();
qb.ignoreUnmapped(false);
QueryShardException e = expectThrows(QueryShardException.class, () -> qb.toQuery(context));
assertEquals("failed to find geo_point field [" + qb.fieldName() + "]", e.getMessage());
}
public void testBrokenCoordinateCannotBeSet() {
@@ -295,7 +290,6 @@
}
public void testParsingAndToQuery1() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\":{\n" +
" \"" + GEO_POINT_FIELD_NAME+ "\":{\n" +
@@ -308,7 +302,6 @@
}
public void testParsingAndToQuery2() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\":{\n" +
" \"" + GEO_POINT_FIELD_NAME+ "\":{\n" +
@@ -327,7 +320,6 @@
}
public void testParsingAndToQuery3() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\":{\n" +
" \"" + GEO_POINT_FIELD_NAME+ "\":{\n" +
@@ -340,7 +332,6 @@
}
public void testParsingAndToQuery4() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\":{\n" +
" \"" + GEO_POINT_FIELD_NAME+ "\":{\n" +
@@ -353,7 +344,6 @@
}
public void testParsingAndToQuery5() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\":{\n" +
" \"" + GEO_POINT_FIELD_NAME+ "\":{\n" +
@@ -366,7 +356,6 @@
}
public void testParsingAndToQuery6() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\":{\n" +
" \"" + GEO_POINT_FIELD_NAME+ "\":{\n" +
@@ -513,7 +502,6 @@
}
public void testHonorsCoercion() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
String query = "{\n" +
" \"geo_bounding_box\": {\n" +
" \"validation_method\": \"COERCE\",\n" +
@@ -534,7 +522,6 @@
@Override
public void testMustRewrite() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
super.testMustRewrite();
}

Some files were not shown because too many files have changed in this diff.