Merge branch 'master' into index-lifecycle

commit 1c40da8fbf
Author: Colin Goodheart-Smithe
Date:   2018-06-04 10:14:17 +01:00
GPG Key ID: F975E7BDD739B3C7 (no known key found for this signature in database)

66 changed files with 898 additions and 190 deletions

View File

@@ -5,7 +5,7 @@ setlocal enableextensions
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.common.settings.KeyStoreCli ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -6,7 +6,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_CLASSPATH_DIRECTORIES=lib/tools/plugin-cli
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.plugins.PluginCli ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -5,7 +5,7 @@ setlocal enableextensions
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.index.translog.TranslogToolCli ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -1,47 +1,23 @@
 [[mapping-field-names-field]]
 === `_field_names` field
 
-The `_field_names` field indexes the names of every field in a document that
-contains any value other than `null`. This field is used by the
+The `_field_names` field used to index the names of every field in a document that
+contains any value other than `null`. This field was used by the
 <<query-dsl-exists-query,`exists`>> query to find documents that
 either have or don't have any non-+null+ value for a particular field.
 
-The value of the `_field_names` field is accessible in queries:
-
-[source,js]
---------------------------
-# Example documents
-PUT my_index/_doc/1
-{
-  "title": "This is a document"
-}
-
-PUT my_index/_doc/2?refresh=true
-{
-  "title": "This is another document",
-  "body": "This document has a body"
-}
-
-GET my_index/_search
-{
-  "query": {
-    "terms": {
-      "_field_names": [ "title" ] <1>
-    }
-  }
-}
---------------------------
-// CONSOLE
-
-<1> Querying on the `_field_names` field (also see the <<query-dsl-exists-query,`exists`>> query)
+Now the `_field_names` field only indexes the names of fields that have
+`doc_values` and `norms` disabled. For fields which have either `doc_values`
+or `norms` enabled the <<query-dsl-exists-query,`exists`>> query will still
+be available but will not use the `_field_names` field.
 
 ==== Disabling `_field_names`
 
-Because `_field_names` introduce some index-time overhead, you might want to
-disable this field if you want to optimize for indexing speed and do not need
-`exists` queries.
+Disabling `_field_names` is often not necessary because it no longer
+carries the index overhead it once did. If you have a lot of fields
+which have `doc_values` and `norms` disabled and you do not need to
+execute `exists` queries using those fields you might want to disable
+`_field_names` by adding the following to the mappings:
 
 [source,js]
 --------------------------------------------------
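The body of that `[source,js]` block is cut off by the hunk boundary above. For context, a minimal sketch of the mapping the new paragraph describes (the index name `my_index` and single-type `_doc` mapping are illustrative; `_field_names` with `enabled: false` is the documented form):

[source,js]
--------------------------------------------------
PUT my_index
{
  "mappings": {
    "_doc": {
      "_field_names": {
        "enabled": false
      }
    }
  }
}
--------------------------------------------------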

View File

@@ -175,7 +175,7 @@ PUT /example
       "location": {
         "type": "geo_shape",
         "tree": "quadtree",
-        "precision": "1m"
+        "precision": "100m"
       }
     }
   }
@@ -186,8 +186,8 @@ PUT /example
 // TESTSETUP
 
 This mapping maps the location field to the geo_shape type using the
-quad_tree implementation and a precision of 1m. Elasticsearch translates
-this into a tree_levels setting of 26.
+quad_tree implementation and a precision of 100m. Elasticsearch translates
+this into a tree_levels setting of 20.
 
 [float]
 ===== Performance considerations
@@ -364,7 +364,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 The following is an example of a Polygon with a hole in WKT:
@@ -376,7 +375,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 *IMPORTANT NOTE:* WKT does not enforce a specific order for vertices thus
 ambiguous polygons around the dateline and poles are possible.
@@ -411,7 +409,7 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
+// TEST[catch:/mapper_parsing_exception/]
 
 An `orientation` parameter can be defined when setting the geo_shape mapping (see <<geo-shape-mapping-options>>). This will define vertex
 order for the coordinate list on the mapped geo_shape field. It can also be overridden on each document. The following is an example for
@@ -425,14 +423,12 @@ POST /example/doc
     "type" : "polygon",
     "orientation" : "clockwise",
     "coordinates" : [
-      [ [-177.0, 10.0], [176.0, 15.0], [172.0, 0.0], [176.0, -15.0], [-177.0, -10.0], [-177.0, 10.0] ],
-      [ [178.2, 8.2], [-178.8, 8.2], [-180.8, -8.8], [178.2, 8.8] ]
+      [ [100.0, 0.0], [100.0, 1.0], [101.0, 1.0], [101.0, 0.0], [100.0, 0.0] ]
     ]
   }
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 [float]
 ===== http://www.geojson.org/geojson-spec.html#id5[MultiPoint]
@@ -484,7 +480,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 The following is an example of a list of WKT linestrings:
@@ -496,7 +491,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 [float]
 ===== http://www.geojson.org/geojson-spec.html#id7[MultiPolygon]
@@ -518,7 +512,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 The following is an example of a list of WKT polygons (second polygon contains a hole):
@@ -530,7 +523,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 [float]
 ===== http://geojson.org/geojson-spec.html#geometrycollection[Geometry Collection]
@@ -557,7 +549,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 The following is an example of a collection of WKT geometry objects:
@@ -569,7 +560,6 @@ POST /example/doc
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 [float]
@@ -585,12 +575,11 @@ POST /example/doc
 {
   "location" : {
     "type" : "envelope",
-    "coordinates" : [ [-45.0, 45.0], [45.0, -45.0] ]
+    "coordinates" : [ [100.0, 1.0], [101.0, 0.0] ]
   }
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 The following is an example of an envelope using the WKT BBOX format:
@@ -600,11 +589,10 @@ The following is an example of an envelope using the WKT BBOX format:
 --------------------------------------------------
 POST /example/doc
 {
-    "location" : "BBOX (-45.0, 45.0, 45.0, -45.0)"
+    "location" : "BBOX (100.0, 102.0, 2.0, 0.0)"
 }
 --------------------------------------------------
 // CONSOLE
-// TEST[skip:https://github.com/elastic/elasticsearch/issues/23836]
 
 [float]
 ===== Circle
@@ -618,7 +606,7 @@ POST /example/doc
 {
   "location" : {
     "type" : "circle",
-    "coordinates" : [-45.0, 45.0],
+    "coordinates" : [101.0, 1.0],
     "radius" : "100m"
   }
 }
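The updated prose states that a `precision` of 100m on a `quadtree` translates to a `tree_levels` setting of 20. A sketch of the equivalent explicit mapping, reusing the index and field names from the surrounding examples (the equivalence is as stated in the text above; treat the exact level value as illustrative):

[source,js]
--------------------------------------------------
PUT /example
{
  "mappings": {
    "doc": {
      "properties": {
        "location": {
          "type": "geo_shape",
          "tree": "quadtree",
          "tree_levels": 20
        }
      }
    }
  }
}
--------------------------------------------------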

View File

@@ -96,6 +96,14 @@ The following parameters are accepted by `text` fields:
     the expense of a larger index. Accepts an
     <<index-prefix-config,`index-prefix configuration block`>>
 
+<<index-phrases,`index_phrases`>>::
+
+    If enabled, two-term word combinations ('shingles') are indexed into a separate
+    field. This allows exact phrase queries to run more efficiently, at the expense
+    of a larger index. Note that this works best when stopwords are not removed,
+    as phrases containing stopwords will not use the subsidiary field and will fall
+    back to a standard phrase query. Accepts `true` or `false` (default).
+
 <<norms,`norms`>>::
 
     Whether field-length should be taken into account when scoring queries.
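A minimal console sketch of the new parameter, following the REST test added later in this commit (index and field names are illustrative):

[source,js]
--------------------------------------------------
PUT my_index
{
  "mappings": {
    "_doc": {
      "properties": {
        "body": {
          "type": "text",
          "index_phrases": true
        }
      }
    }
  }
}

GET my_index/_search
{
  "query": {
    "match_phrase": {
      "body": "peter piper"
    }
  }
}
--------------------------------------------------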

View File

@@ -80,12 +80,12 @@ The accounting circuit breaker allows Elasticsearch to limit the memory
 usage of things held in memory that are not released when a request is
 completed. This includes things like the Lucene segment memory.
 
-`network.breaker.accounting.limit`::
+`indices.breaker.accounting.limit`::
 
     Limit for accounting breaker, defaults to 100% of JVM heap. This means that it is bound
     by the limit configured for the parent circuit breaker.
 
-`network.breaker.accounting.overhead`::
+`indices.breaker.accounting.overhead`::
 
     A constant that all accounting estimations are multiplied with to determine a
     final estimation. Defaults to 1

View File

@@ -119,7 +119,19 @@ public class Netty4HttpRequest extends RestRequest {
             return Method.OPTIONS;
         }
 
-        return Method.GET;
+        if (httpMethod == HttpMethod.PATCH) {
+            return Method.PATCH;
+        }
+
+        if (httpMethod == HttpMethod.TRACE) {
+            return Method.TRACE;
+        }
+
+        if (httpMethod == HttpMethod.CONNECT) {
+            return Method.CONNECT;
+        }
+
+        throw new IllegalArgumentException("Unexpected http method: " + httpMethod);
     }
 
     @Override

View File

@@ -105,7 +105,7 @@ final class ESLoggingHandler extends LoggingHandler {
                 context.readHeaders(in);
             }
             // now we decode the features
-            if (in.getVersion().onOrAfter(Version.V_7_0_0_alpha1)) {
+            if (in.getVersion().onOrAfter(Version.V_6_3_0)) {
                 in.readStringArray();
             }
             // now we can decode the action name

View File

@@ -84,7 +84,19 @@ public class NioHttpRequest extends RestRequest {
             return Method.OPTIONS;
         }
 
-        return Method.GET;
+        if (httpMethod == HttpMethod.PATCH) {
+            return Method.PATCH;
+        }
+
+        if (httpMethod == HttpMethod.TRACE) {
+            return Method.TRACE;
+        }
+
+        if (httpMethod == HttpMethod.CONNECT) {
+            return Method.CONNECT;
+        }
+
+        throw new IllegalArgumentException("Unexpected http method: " + httpMethod);
     }
 
     @Override

View File

@@ -0,0 +1,67 @@
+---
+"search with indexed phrases":
+  - skip:
+      version: " - 6.99.99"
+      reason: index_phrase is only available as of 7.0.0
+
+  - do:
+      indices.create:
+          index: test
+          body:
+              mappings:
+                  test:
+                      properties:
+                          text:
+                              type: text
+                              index_phrases: true
+
+  - do:
+      index:
+          index: test
+          type: test
+          id: 1
+          body: { text: "peter piper picked a peck of pickled peppers" }
+
+  - do:
+      indices.refresh:
+          index: [test]
+
+  - do:
+      search:
+          index: test
+          body:
+              query:
+                  match_phrase:
+                      text:
+                          query: "peter piper"
+
+  - match: {hits.total: 1}
+
+  - do:
+      search:
+          index: test
+          q: '"peter piper"~1'
+          df: text
+
+  - match: {hits.total: 1}
+
+  - do:
+      search:
+          index: test
+          body:
+              query:
+                  match_phrase:
+                      text: "peter piper picked"
+
+  - match: {hits.total: 1}
+
+  - do:
+      search:
+          index: test
+          body:
+              query:
+                  match_phrase:
+                      text: "piper"
+
+  - match: {hits.total: 1}

View File

@@ -22,7 +22,6 @@ package org.elasticsearch.cluster;
 import com.carrotsearch.hppc.cursors.IntObjectCursor;
 import com.carrotsearch.hppc.cursors.ObjectCursor;
 import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
-import org.elasticsearch.client.transport.TransportClient;
 import org.elasticsearch.cluster.block.ClusterBlock;
 import org.elasticsearch.cluster.block.ClusterBlocks;
@@ -50,6 +49,7 @@ import org.elasticsearch.common.io.stream.NamedWriteableAwareStreamInput;
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.common.io.stream.VersionedNamedWriteable;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.xcontent.ToXContentFragment;
 import org.elasticsearch.common.xcontent.XContentBuilder;
@@ -122,7 +122,7 @@ public class ClusterState implements ToXContentFragment, Diffable<ClusterState>
          * @param <T> the type of the custom
          * @return true if the custom should be serialized and false otherwise
          */
-        static <T extends NamedDiffable & FeatureAware> boolean shouldSerializeCustom(final StreamOutput out, final T custom) {
+        static <T extends VersionedNamedWriteable & FeatureAware> boolean shouldSerialize(final StreamOutput out, final T custom) {
            if (out.getVersion().before(custom.getMinimalSupportedVersion())) {
                return false;
            }
@@ -748,13 +748,13 @@ public class ClusterState implements ToXContentFragment, Diffable<ClusterState>
             // filter out custom states not supported by the other node
             int numberOfCustoms = 0;
             for (final ObjectCursor<Custom> cursor : customs.values()) {
-                if (FeatureAware.shouldSerializeCustom(out, cursor.value)) {
+                if (FeatureAware.shouldSerialize(out, cursor.value)) {
                     numberOfCustoms++;
                 }
             }
             out.writeVInt(numberOfCustoms);
             for (final ObjectCursor<Custom> cursor : customs.values()) {
-                if (FeatureAware.shouldSerializeCustom(out, cursor.value)) {
+                if (FeatureAware.shouldSerialize(out, cursor.value)) {
                     out.writeNamedWriteable(cursor.value);
                 }
             }

View File

@@ -19,17 +19,10 @@
 package org.elasticsearch.cluster;
 
-import org.elasticsearch.Version;
-import org.elasticsearch.common.io.stream.NamedWriteable;
+import org.elasticsearch.common.io.stream.VersionedNamedWriteable;
 
 /**
- * Diff that also support NamedWriteable interface
+ * Diff that also support {@link VersionedNamedWriteable} interface
  */
-public interface NamedDiffable<T> extends Diffable<T>, NamedWriteable {
-    /**
-     * The minimal version of the recipient this custom object can be sent to
-     */
-    default Version getMinimalSupportedVersion() {
-        return Version.CURRENT.minimumIndexCompatibilityVersion();
-    }
-}
+public interface NamedDiffable<T> extends Diffable<T>, VersionedNamedWriteable {
+}

View File

@@ -20,14 +20,15 @@
 package org.elasticsearch.cluster;
 
 import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.ClusterState.Custom;
-import org.elasticsearch.snapshots.Snapshot;
 import org.elasticsearch.common.collect.ImmutableOpenMap;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.index.shard.ShardId;
+import org.elasticsearch.snapshots.Snapshot;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -382,6 +383,11 @@ public class RestoreInProgress extends AbstractNamedDiffable<Custom> implements
         return TYPE;
     }
 
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     public static NamedDiff<Custom> readDiffFrom(StreamInput in) throws IOException {
         return readDiffFrom(Custom.class, TYPE, in);
     }

View File

@@ -395,6 +395,11 @@ public class SnapshotsInProgress extends AbstractNamedDiffable<Custom> implement
         return TYPE;
     }
 
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     public static NamedDiff<Custom> readDiffFrom(StreamInput in) throws IOException {
         return readDiffFrom(Custom.class, TYPE, in);
     }

View File

@@ -19,6 +19,7 @@
 package org.elasticsearch.cluster.metadata;
 
+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.Diff;
 import org.elasticsearch.cluster.NamedDiff;
 import org.elasticsearch.common.ParseField;
@@ -34,8 +35,6 @@ import org.elasticsearch.common.xcontent.ToXContentObject;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.index.Index;
-import org.joda.time.DateTime;
-import org.joda.time.DateTimeZone;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -44,7 +43,6 @@ import java.util.Collections;
 import java.util.EnumSet;
 import java.util.List;
 import java.util.Objects;
-import java.util.concurrent.TimeUnit;
 
 /**
  * A collection of tombstones for explicitly marking indices as deleted in the cluster state.
@@ -97,6 +95,11 @@ public final class IndexGraveyard implements MetaData.Custom {
         return TYPE;
     }
 
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     @Override
     public EnumSet<MetaData.XContentContext> context() {
         return MetaData.API_AND_GATEWAY;

View File

@@ -786,13 +786,13 @@ public class MetaData implements Iterable<IndexMetaData>, Diffable<MetaData>, To
             // filter out custom states not supported by the other node
             int numberOfCustoms = 0;
             for (final ObjectCursor<Custom> cursor : customs.values()) {
-                if (FeatureAware.shouldSerializeCustom(out, cursor.value)) {
+                if (FeatureAware.shouldSerialize(out, cursor.value)) {
                     numberOfCustoms++;
                 }
             }
             out.writeVInt(numberOfCustoms);
             for (final ObjectCursor<Custom> cursor : customs.values()) {
-                if (FeatureAware.shouldSerializeCustom(out, cursor.value)) {
+                if (FeatureAware.shouldSerialize(out, cursor.value)) {
                     out.writeNamedWriteable(cursor.value);
                 }
             }

View File

@@ -20,6 +20,7 @@
 package org.elasticsearch.cluster.metadata;
 
 import org.elasticsearch.ElasticsearchParseException;
+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.AbstractNamedDiffable;
 import org.elasticsearch.cluster.NamedDiff;
 import org.elasticsearch.cluster.metadata.MetaData.Custom;
@@ -103,6 +104,11 @@ public class RepositoriesMetaData extends AbstractNamedDiffable<Custom> implemen
         return TYPE;
     }
 
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     public RepositoriesMetaData(StreamInput in) throws IOException {
         RepositoryMetaData[] repository = new RepositoryMetaData[in.readVInt()];
         for (int i = 0; i < repository.length; i++) {

View File

@@ -0,0 +1,38 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.elasticsearch.common.io.stream;
+
+import org.elasticsearch.Version;
+
+/**
+ * A {@link NamedWriteable} that has a minimum version associated with it.
+ */
+public interface VersionedNamedWriteable extends NamedWriteable {
+
+    /**
+     * Returns the name of the writeable object
+     */
+    String getWriteableName();
+
+    /**
+     * The minimal version of the recipient this object can be sent to
+     */
+    Version getMinimalSupportedVersion();
+}

View File

@@ -19,6 +19,7 @@
 package org.elasticsearch.index.mapper;
 
+import org.apache.lucene.analysis.TokenStream;
 import org.apache.lucene.document.FieldType;
 import org.apache.lucene.index.IndexOptions;
 import org.apache.lucene.index.IndexReader;
@@ -43,6 +44,7 @@ import org.elasticsearch.index.fielddata.IndexFieldData;
 import org.elasticsearch.index.query.QueryRewriteContext;
 import org.elasticsearch.index.query.QueryShardContext;
 import org.elasticsearch.index.query.QueryShardException;
+import org.elasticsearch.index.search.MatchQuery;
 import org.elasticsearch.index.similarity.SimilarityProvider;
 import org.elasticsearch.search.DocValueFormat;
 import org.joda.time.DateTimeZone;
@@ -353,6 +355,14 @@ public abstract class MappedFieldType extends FieldType {
 
     public abstract Query existsQuery(QueryShardContext context);
 
+    public Query phraseQuery(String field, TokenStream stream, int slop, boolean enablePositionIncrements) throws IOException {
+        throw new IllegalArgumentException("Can only use phrase queries on text fields - not on [" + name + "] which is of type [" + typeName() + "]");
+    }
+
+    public Query multiPhraseQuery(String field, TokenStream stream, int slop, boolean enablePositionIncrements) throws IOException {
+        throw new IllegalArgumentException("Can only use phrase queries on text fields - not on [" + name + "] which is of type [" + typeName() + "]");
+    }
+
     /**
      * An enum used to describe the relation between the range of terms in a
      * shard when compared with a query range

View File

@@ -19,20 +19,29 @@
 package org.elasticsearch.index.mapper;
 
+import org.apache.logging.log4j.Logger;
 import org.apache.lucene.analysis.Analyzer;
 import org.apache.lucene.analysis.AnalyzerWrapper;
+import org.apache.lucene.analysis.CachingTokenFilter;
 import org.apache.lucene.analysis.TokenFilter;
+import org.apache.lucene.analysis.TokenStream;
 import org.apache.lucene.analysis.ngram.EdgeNGramTokenFilter;
+import org.apache.lucene.analysis.shingle.FixedShingleFilter;
+import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;
+import org.apache.lucene.analysis.tokenattributes.TermToBytesRefAttribute;
 import org.apache.lucene.document.Field;
 import org.apache.lucene.index.IndexOptions;
 import org.apache.lucene.index.IndexableField;
 import org.apache.lucene.index.Term;
 import org.apache.lucene.search.ConstantScoreQuery;
+import org.apache.lucene.search.MultiPhraseQuery;
 import org.apache.lucene.search.MultiTermQuery;
 import org.apache.lucene.search.NormsFieldExistsQuery;
+import org.apache.lucene.search.PhraseQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.TermQuery;
 import org.elasticsearch.common.collect.Iterators;
+import org.elasticsearch.common.logging.ESLoggerFactory;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.support.XContentMapValues;
@@ -43,7 +52,7 @@ import org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData;
 import org.elasticsearch.index.query.QueryShardContext;
 
 import java.io.IOException;
-import java.util.Collections;
+import java.util.ArrayList;
 import java.util.Iterator;
 import java.util.List;
 import java.util.Map;
@@ -54,9 +63,13 @@ import static org.elasticsearch.index.mapper.TypeParsers.parseTextField;
 /** A {@link FieldMapper} for full-text fields. */
 public class TextFieldMapper extends FieldMapper {
 
+    private static final Logger logger = ESLoggerFactory.getLogger(TextFieldMapper.class);
+
     public static final String CONTENT_TYPE = "text";
     private static final int POSITION_INCREMENT_GAP_USE_ANALYZER = -1;
+    public static final String FAST_PHRASE_SUFFIX = "._index_phrase";
 
     public static class Defaults {
         public static final double FIELDDATA_MIN_FREQUENCY = 0;
         public static final double FIELDDATA_MAX_FREQUENCY = Integer.MAX_VALUE;
@@ -105,6 +118,11 @@ public class TextFieldMapper extends FieldMapper {
             return builder;
         }
 
+        public Builder indexPhrases(boolean indexPhrases) {
+            fieldType().setIndexPhrases(indexPhrases);
+            return builder;
+        }
+
         @Override
         public Builder docValues(boolean docValues) {
             if (docValues) {
@@ -166,8 +184,16 @@ public class TextFieldMapper extends FieldMapper {
                 prefixFieldType.setAnalyzer(fieldType.indexAnalyzer());
                 prefixMapper = new PrefixFieldMapper(prefixFieldType, context.indexSettings());
             }
+            if (fieldType().indexPhrases) {
+                if (fieldType().isSearchable() == false) {
+                    throw new IllegalArgumentException("Cannot set index_phrases on unindexed field [" + name() + "]");
+                }
+                if (fieldType.indexOptions().compareTo(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS) < 0) {
+                    throw new IllegalArgumentException("Cannot set index_phrases on field [" + name() + "] if positions are not enabled");
+                }
+            }
             return new TextFieldMapper(
-                    name, fieldType, defaultFieldType, positionIncrementGap, prefixMapper,
+                    name, fieldType(), defaultFieldType, positionIncrementGap, prefixMapper,
                     context.indexSettings(), multiFieldsBuilder.build(this, context), copyTo);
         }
     }
@@ -211,12 +237,35 @@ public class TextFieldMapper extends FieldMapper {
                     builder.indexPrefixes(minChars, maxChars);
                     DocumentMapperParser.checkNoRemainingFields(propName, indexPrefix, parserContext.indexVersionCreated());
                     iterator.remove();
+                } else if (propName.equals("index_phrases")) {
+                    builder.indexPhrases(XContentMapValues.nodeBooleanValue(propNode, "index_phrases"));
+                    iterator.remove();
                 }
             }
             return builder;
         }
     }
 
+    private static class PhraseWrappedAnalyzer extends AnalyzerWrapper {
+
+        private final Analyzer delegate;
+
+        PhraseWrappedAnalyzer(Analyzer delegate) {
+            super(delegate.getReuseStrategy());
+            this.delegate = delegate;
+        }
+
+        @Override
+        protected Analyzer getWrappedAnalyzer(String fieldName) {
+            return delegate;
+        }
+
+        @Override
+        protected TokenStreamComponents wrapComponents(String fieldName, TokenStreamComponents components) {
+            return new TokenStreamComponents(components.getTokenizer(), new FixedShingleFilter(components.getTokenStream(), 2));
+        }
+    }
+
     private static class PrefixWrappedAnalyzer extends AnalyzerWrapper {
 
         private final int minChars;
@@ -242,6 +291,46 @@ public class TextFieldMapper extends FieldMapper {
         }
     }
 
+    private static final class PhraseFieldType extends StringFieldType {
+
+        final TextFieldType parent;
+
+        PhraseFieldType(TextFieldType parent) {
+            setTokenized(true);
+            setIndexOptions(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS);
+            if (parent.indexOptions() == IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS) {
+                setIndexOptions(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS);
+            }
+            if (parent.storeTermVectorOffsets()) {
+                setStoreTermVectors(true);
+                setStoreTermVectorPositions(true);
+                setStoreTermVectorOffsets(true);
+            }
+            setAnalyzer(parent.indexAnalyzer().name(), parent.indexAnalyzer().analyzer());
+            setName(parent.name() + FAST_PHRASE_SUFFIX);
+            this.parent = parent;
+        }
+
+        void setAnalyzer(String name, Analyzer delegate) {
+            setIndexAnalyzer(new NamedAnalyzer(name, AnalyzerScope.INDEX, new PhraseWrappedAnalyzer(delegate)));
+        }
+
+        @Override
+        public MappedFieldType clone() {
+            return new PhraseFieldType(parent);
+        }
+
+        @Override
+        public String typeName() {
+            return "phrase";
+        }
+
+        @Override
+        public Query existsQuery(QueryShardContext context) {
+            throw new UnsupportedOperationException();
+        }
+    }
+
     static final class PrefixFieldType extends StringFieldType {
 
         final int minChars;
@@ -310,6 +399,23 @@ public class TextFieldMapper extends FieldMapper {
         }
     }
 
+    private static final class PhraseFieldMapper extends FieldMapper {
+
+        PhraseFieldMapper(PhraseFieldType fieldType, Settings indexSettings) {
+            super(fieldType.name(), fieldType, fieldType, indexSettings, MultiFields.empty(), CopyTo.empty());
+        }
+
+        @Override
+        protected void parseCreateField(ParseContext context, List<IndexableField> fields) throws IOException {
+            throw new UnsupportedOperationException();
+        }
+
+        @Override
+        protected String contentType() {
+            return "phrase";
+        }
+    }
+
     private static final class PrefixFieldMapper extends FieldMapper {
 
         protected PrefixFieldMapper(PrefixFieldType fieldType, Settings indexSettings) {
@@ -343,6 +449,7 @@ public class TextFieldMapper extends FieldMapper {
         private double fielddataMaxFrequency;
         private int fielddataMinSegmentSize;
         private PrefixFieldType prefixFieldType;
+        private boolean indexPhrases = false;
 
         public TextFieldType() {
             setTokenized(true);
@@ -358,6 +465,7 @@ public class TextFieldMapper extends FieldMapper {
             this.fielddataMinFrequency = ref.fielddataMinFrequency;
             this.fielddataMaxFrequency = ref.fielddataMaxFrequency;
             this.fielddataMinSegmentSize = ref.fielddataMinSegmentSize;
+            this.indexPhrases = ref.indexPhrases;
             if (ref.prefixFieldType != null) {
                 this.prefixFieldType = ref.prefixFieldType.clone();
             }
@@ -374,6 +482,7 @@ public class TextFieldMapper extends FieldMapper {
             }
             TextFieldType that = (TextFieldType) o;
             return fielddata == that.fielddata
+                    && indexPhrases == that.indexPhrases
                    && Objects.equals(prefixFieldType, that.prefixFieldType)
                    && fielddataMinFrequency == that.fielddataMinFrequency
                    && fielddataMaxFrequency == that.fielddataMaxFrequency
@@ -382,7 +491,7 @@ public class TextFieldMapper extends FieldMapper {
 
         @Override
         public int hashCode() {
-            return Objects.hash(super.hashCode(), fielddata, prefixFieldType,
+            return Objects.hash(super.hashCode(), fielddata, indexPhrases, prefixFieldType,
                     fielddataMinFrequency, fielddataMaxFrequency, fielddataMinSegmentSize);
         }
 
@@ -427,6 +536,11 @@ public class TextFieldMapper extends FieldMapper {
             this.prefixFieldType = prefixFieldType;
         }
 
+        void setIndexPhrases(boolean indexPhrases) {
+            checkIfFrozen();
+            this.indexPhrases = indexPhrases;
+        }
+
         public PrefixFieldType getPrefixFieldType() {
             return this.prefixFieldType;
         }
@@ -458,6 +572,93 @@ public class TextFieldMapper extends FieldMapper {
             }
         }
 
+        @Override
+        public Query phraseQuery(String field, TokenStream stream, int slop, boolean enablePosIncrements) throws IOException {
+            if (indexPhrases && slop == 0 && hasGaps(cache(stream)) == false) {
+                stream = new FixedShingleFilter(stream, 2);
+                field = field + FAST_PHRASE_SUFFIX;
+            }
+            PhraseQuery.Builder builder = new PhraseQuery.Builder();
+            builder.setSlop(slop);
+
+            TermToBytesRefAttribute termAtt = stream.getAttribute(TermToBytesRefAttribute.class);
+            PositionIncrementAttribute posIncrAtt = stream.getAttribute(PositionIncrementAttribute.class);
+            int position = -1;
+
+            stream.reset();
+            while (stream.incrementToken()) {
+                if (enablePosIncrements) {
+                    position += posIncrAtt.getPositionIncrement();
+                }
+                else {
+                    position += 1;
+                }
+                builder.add(new Term(field, termAtt.getBytesRef()), position);
+            }
+
+            return builder.build();
+        }
+
+        @Override
+        public Query multiPhraseQuery(String field, TokenStream stream, int slop, boolean enablePositionIncrements) throws IOException {
+            if (indexPhrases && slop == 0 && hasGaps(cache(stream)) == false) {
+                stream = new FixedShingleFilter(stream, 2);
+                field = field + FAST_PHRASE_SUFFIX;
+            }
+
+            MultiPhraseQuery.Builder mpqb = new MultiPhraseQuery.Builder();
+            mpqb.setSlop(slop);
+
+            TermToBytesRefAttribute termAtt = stream.getAttribute(TermToBytesRefAttribute.class);
+            PositionIncrementAttribute posIncrAtt = stream.getAttribute(PositionIncrementAttribute.class);
+            int position = -1;
+
+            List<Term> multiTerms = new ArrayList<>();
+            stream.reset();
+            while (stream.incrementToken()) {
+                int positionIncrement = posIncrAtt.getPositionIncrement();
+
+                if (positionIncrement > 0 && multiTerms.size() > 0) {
+                    if (enablePositionIncrements) {
+                        mpqb.add(multiTerms.toArray(new Term[0]), position);
+                    } else {
+                        mpqb.add(multiTerms.toArray(new Term[0]));
+                    }
+                    multiTerms.clear();
+                }
+                position += positionIncrement;
+                multiTerms.add(new Term(field, termAtt.getBytesRef()));
+            }
+
+            if (enablePositionIncrements) {
+                mpqb.add(multiTerms.toArray(new Term[0]), position);
+            } else {
+                mpqb.add(multiTerms.toArray(new Term[0]));
+            }
+            return mpqb.build();
+        }
+
+        private static CachingTokenFilter cache(TokenStream in) {
+            if (in instanceof CachingTokenFilter) {
+                return (CachingTokenFilter) in;
+            }
+            return new CachingTokenFilter(in);
+        }
+
+        private static boolean hasGaps(CachingTokenFilter stream) throws IOException {
+            PositionIncrementAttribute posIncAtt = stream.getAttribute(PositionIncrementAttribute.class);
+            stream.reset();
+            while (stream.incrementToken()) {
+                if (posIncAtt.getPositionIncrement() > 1) {
+                    return true;
+                }
+            }
+            return false;
+        }
+
         @Override
         public IndexFieldData.Builder fielddataBuilder(String fullyQualifiedIndexName) {
             if (fielddata == false) {
@@ -472,6 +673,9 @@ public class TextFieldMapper extends FieldMapper {
         public void checkCompatibility(MappedFieldType other, List<String> conflicts) {
             super.checkCompatibility(other, conflicts);
             TextFieldType tft = (TextFieldType) other;
+            if (tft.indexPhrases != this.indexPhrases) {
+                conflicts.add("mapper [" + name() + "] has different [index_phrases] values");
+            }
             if (Objects.equals(this.prefixFieldType, tft.prefixFieldType) == false) {
                 if (this.prefixFieldType == null) {
                     conflicts.add("mapper [" + name()
@@ -490,8 +694,9 @@ public class TextFieldMapper extends FieldMapper {
 
     private int positionIncrementGap;
     private PrefixFieldMapper prefixFieldMapper;
+    private PhraseFieldMapper phraseFieldMapper;
 
-    protected TextFieldMapper(String simpleName, MappedFieldType fieldType, MappedFieldType defaultFieldType,
+    protected TextFieldMapper(String simpleName, TextFieldType fieldType, MappedFieldType defaultFieldType,
                                 int positionIncrementGap, PrefixFieldMapper prefixFieldMapper,
                                 Settings indexSettings, MultiFields multiFields, CopyTo copyTo) {
         super(simpleName, fieldType, defaultFieldType, indexSettings, multiFields, copyTo);
@@ -502,6 +707,7 @@ public class TextFieldMapper extends FieldMapper {
         }
         this.positionIncrementGap = positionIncrementGap;
         this.prefixFieldMapper = prefixFieldMapper;
+        this.phraseFieldMapper = fieldType.indexPhrases ? new PhraseFieldMapper(new PhraseFieldType(fieldType), indexSettings) : null;
     }
 
     @Override
@@ -535,15 +741,25 @@ public class TextFieldMapper extends FieldMapper {
             if (prefixFieldMapper != null) {
                 prefixFieldMapper.addField(value, fields);
             }
+            if (phraseFieldMapper != null) {
+                fields.add(new Field(phraseFieldMapper.fieldType.name(), value, phraseFieldMapper.fieldType));
+            }
         }
     }
 
     @Override
     public Iterator<Mapper> iterator() {
-        if (prefixFieldMapper == null) {
+        List<Mapper> subIterators = new ArrayList<>();
+        if (prefixFieldMapper != null) {
+            subIterators.add(prefixFieldMapper);
+        }
+        if (phraseFieldMapper != null) {
+            subIterators.add(phraseFieldMapper);
+        }
+        if (subIterators.size() == 0) {
             return super.iterator();
         }
-        return Iterators.concat(super.iterator(), Collections.singleton(prefixFieldMapper).iterator());
+        return Iterators.concat(super.iterator(), subIterators.iterator());
     }
 
     @Override
@@ -562,6 +778,10 @@ public class TextFieldMapper extends FieldMapper {
             throw new IllegalArgumentException("mapper [" + name() + "] has different index_prefix settings, current ["
                 + this.prefixFieldMapper + "], merged [" + mw.prefixFieldMapper + "]");
         }
+        else if (this.fieldType().indexPhrases != mw.fieldType().indexPhrases) {
+            throw new IllegalArgumentException("mapper [" + name() + "] has different index_phrases settings, current ["
+                + this.fieldType().indexPhrases + "], merged [" + mw.fieldType().indexPhrases + "]");
+        }
     }
 
     @Override
@@ -602,5 +822,8 @@ public class TextFieldMapper extends FieldMapper {
         if (fieldType().prefixFieldType != null) {
             fieldType().prefixFieldType.doXContent(builder);
         }
+        if (fieldType().indexPhrases) {
+            builder.field("index_phrases", fieldType().indexPhrases);
+        }
     }
 }

View File

@@ -28,6 +28,7 @@ import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.index.search.MatchQuery;
 import org.elasticsearch.index.search.MatchQuery.ZeroTermsQuery;

View File

@@ -352,16 +352,14 @@ public class MatchQuery {
 
         @Override
         protected Query analyzePhrase(String field, TokenStream stream, int slop) throws IOException {
-            if (hasPositions(mapper) == false) {
-                IllegalStateException exc =
-                    new IllegalStateException("field:[" + field + "] was indexed without position data; cannot run PhraseQuery");
+            IllegalStateException e = checkForPositions(field);
+            if (e != null) {
                 if (lenient) {
-                    return newLenientFieldQuery(field, exc);
-                } else {
-                    throw exc;
+                    return newLenientFieldQuery(field, e);
                 }
+                throw e;
             }
-            Query query = super.analyzePhrase(field, stream, slop);
+            Query query = mapper.phraseQuery(field, stream, slop, enablePositionIncrements);
             if (query instanceof PhraseQuery) {
                 // synonyms that expand to multiple terms can return a phrase query.
                 return blendPhraseQuery((PhraseQuery) query, mapper);
@@ -369,6 +367,25 @@ public class MatchQuery {
             return query;
         }
 
+        @Override
+        protected Query analyzeMultiPhrase(String field, TokenStream stream, int slop) throws IOException {
+            IllegalStateException e = checkForPositions(field);
+            if (e != null) {
+                if (lenient) {
+                    return newLenientFieldQuery(field, e);
+                }
+                throw e;
+            }
+            return mapper.multiPhraseQuery(field, stream, slop, enablePositionIncrements);
+        }
+
+        private IllegalStateException checkForPositions(String field) {
+            if (hasPositions(mapper) == false) {
+                return new IllegalStateException("field:[" + field + "] was indexed without position data; cannot run PhraseQuery");
+            }
+            return null;
+        }
+
         /**
          * Checks if graph analysis should be enabled for the field depending
          * on the provided {@link Analyzer}

View File

@@ -19,6 +19,7 @@
 package org.elasticsearch.ingest;
 
+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.Diff;
 import org.elasticsearch.cluster.DiffableUtils;
 import org.elasticsearch.cluster.NamedDiff;
@@ -69,6 +70,11 @@ public final class IngestMetadata implements MetaData.Custom {
         return TYPE;
     }
 
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     public Map<String, PipelineConfiguration> getPipelines() {
         return pipelines;
     }

View File

@@ -35,7 +35,7 @@ public class NodePersistentTasksExecutor {
         this.threadPool = threadPool;
     }
 
-    public <Params extends PersistentTaskParams> void executeTask(@Nullable Params params,
+    public <Params extends PersistentTaskParams> void executeTask(Params params,
                                                                   @Nullable Task.Status status,
                                                                   AllocatedPersistentTask task,
                                                                   PersistentTasksExecutor<Params> executor) {

View File

@@ -19,12 +19,13 @@
 package org.elasticsearch.persistent;
 
-import org.elasticsearch.common.io.stream.NamedWriteable;
+import org.elasticsearch.cluster.ClusterState;
+import org.elasticsearch.common.io.stream.VersionedNamedWriteable;
 import org.elasticsearch.common.xcontent.ToXContentObject;
 
 /**
  * Parameters used to start persistent task
  */
-public interface PersistentTaskParams extends NamedWriteable, ToXContentObject {
+public interface PersistentTaskParams extends VersionedNamedWriteable, ToXContentObject, ClusterState.FeatureAware {
 
 }

View File

@@ -29,7 +29,6 @@ import org.elasticsearch.cluster.ClusterStateUpdateTask;
 import org.elasticsearch.cluster.metadata.MetaData;
 import org.elasticsearch.cluster.node.DiscoveryNodes;
 import org.elasticsearch.cluster.service.ClusterService;
-import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.component.AbstractComponent;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.persistent.PersistentTasksCustomMetaData.Assignment;
@@ -65,7 +64,7 @@ public class PersistentTasksClusterService extends AbstractComponent implements
      * @param taskParams the task's parameters
      * @param listener the listener that will be called when task is started
      */
-    public <Params extends PersistentTaskParams> void createPersistentTask(String taskId, String taskName, @Nullable Params taskParams,
+    public <Params extends PersistentTaskParams> void createPersistentTask(String taskId, String taskName, Params taskParams,
                                                                            ActionListener<PersistentTask<?>> listener) {
         clusterService.submitStateUpdateTask("create persistent task", new ClusterStateUpdateTask() {
             @Override
@@ -225,7 +224,7 @@ public class PersistentTasksClusterService extends AbstractComponent implements
      * @return a new {@link Assignment}
      */
     private <Params extends PersistentTaskParams> Assignment createAssignment(final String taskName,
-                                                                              final @Nullable Params taskParams,
+                                                                              final Params taskParams,
                                                                               final ClusterState currentState) {
         PersistentTasksExecutor<Params> persistentTasksExecutor = registry.getPersistentTaskExecutorSafe(taskName);

View File

@@ -49,8 +49,8 @@ import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Objects;
-import java.util.Optional;
 import java.util.Set;
+import java.util.function.Function;
 import java.util.function.Predicate;
 import java.util.stream.Collectors;
@@ -264,7 +264,6 @@ public final class PersistentTasksCustomMetaData extends AbstractNamedDiffable<M
         private final String id;
         private final long allocationId;
         private final String taskName;
-        @Nullable
         private final P params;
         @Nullable
         private final Status status;
@@ -314,7 +313,11 @@ public final class PersistentTasksCustomMetaData extends AbstractNamedDiffable<M
             id = in.readString();
             allocationId = in.readLong();
             taskName = in.readString();
-            params = (P) in.readOptionalNamedWriteable(PersistentTaskParams.class);
+            if (in.getVersion().onOrAfter(Version.V_7_0_0_alpha1)) {
+                params = (P) in.readNamedWriteable(PersistentTaskParams.class);
+            } else {
+                params = (P) in.readOptionalNamedWriteable(PersistentTaskParams.class);
+            }
             status = in.readOptionalNamedWriteable(Task.Status.class);
             assignment = new Assignment(in.readOptionalString(), in.readString());
             allocationIdOnLastStatusUpdate = in.readOptionalLong();
@@ -325,7 +328,11 @@ public final class PersistentTasksCustomMetaData extends AbstractNamedDiffable<M
             out.writeString(id);
             out.writeLong(allocationId);
             out.writeString(taskName);
-            out.writeOptionalNamedWriteable(params);
+            if (out.getVersion().onOrAfter(Version.V_7_0_0_alpha1)) {
+                out.writeNamedWriteable(params);
+            } else {
+                out.writeOptionalNamedWriteable(params);
+            }
             out.writeOptionalNamedWriteable(status);
             out.writeOptionalString(assignment.executorNode);
             out.writeString(assignment.explanation);
@@ -500,7 +507,10 @@ public final class PersistentTasksCustomMetaData extends AbstractNamedDiffable<M
         @Override
         public void writeTo(StreamOutput out) throws IOException {
             out.writeLong(lastAllocationId);
-            out.writeMap(tasks, StreamOutput::writeString, (stream, value) -> value.writeTo(stream));
+            Map<String, PersistentTask<?>> filteredTasks = tasks.values().stream()
+                .filter(t -> ClusterState.FeatureAware.shouldSerialize(out, t.getParams()))
+                .collect(Collectors.toMap(PersistentTask::getId, Function.identity()));
+            out.writeMap(filteredTasks, StreamOutput::writeString, (stream, value) -> value.writeTo(stream));
         }
 
         public static NamedDiff<MetaData.Custom> readDiffFrom(StreamInput in) throws IOException {

View File

@@ -24,10 +24,10 @@ import org.elasticsearch.cluster.node.DiscoveryNode;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.component.AbstractComponent;
 import org.elasticsearch.common.settings.Settings;
-import org.elasticsearch.tasks.Task;
-import org.elasticsearch.tasks.TaskId;
 import org.elasticsearch.persistent.PersistentTasksCustomMetaData.Assignment;
 import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;
+import org.elasticsearch.tasks.Task;
+import org.elasticsearch.tasks.TaskId;
 
 import java.util.Map;
 import java.util.function.Predicate;
@@ -118,7 +118,7 @@ public abstract class PersistentTasksExecutor<Params extends PersistentTaskParam
      * NOTE: The nodeOperation has to throw an exception, trigger task.markAsCompleted() or task.completeAndNotifyIfNeeded() methods to
      * indicate that the persistent task has finished.
      */
-    protected abstract void nodeOperation(AllocatedPersistentTask task, @Nullable Params params, @Nullable Task.Status status);
+    protected abstract void nodeOperation(AllocatedPersistentTask task, Params params, @Nullable Task.Status status);
 
     public String getExecutor() {
         return executor;


@@ -21,7 +21,6 @@ package org.elasticsearch.persistent;
 import org.elasticsearch.action.Action;
 import org.elasticsearch.action.ActionListener;
 import org.elasticsearch.action.ActionRequest;
-import org.elasticsearch.action.ActionRequestBuilder;
 import org.elasticsearch.action.admin.cluster.node.tasks.cancel.CancelTasksRequest;
 import org.elasticsearch.action.admin.cluster.node.tasks.cancel.CancelTasksResponse;
 import org.elasticsearch.action.support.ContextPreservingActionListener;
@@ -69,7 +68,7 @@ public class PersistentTasksService extends AbstractComponent {
      */
     public <Params extends PersistentTaskParams> void sendStartRequest(final String taskId,
                                                                        final String taskName,
-                                                                       final @Nullable Params taskParams,
+                                                                       final Params taskParams,
                                                                        final ActionListener<PersistentTask<Params>> listener) {
         @SuppressWarnings("unchecked")
         final ActionListener<PersistentTask<?>> wrappedListener =


@@ -18,6 +18,7 @@
  */
 package org.elasticsearch.persistent;

+import org.elasticsearch.Version;
 import org.elasticsearch.action.Action;
 import org.elasticsearch.action.ActionListener;
 import org.elasticsearch.action.ActionRequestValidationException;
@@ -36,9 +37,9 @@ import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.transport.TransportService;
-import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;

 import java.io.IOException;
 import java.util.Objects;
@@ -66,7 +67,6 @@ public class StartPersistentTaskAction extends Action<StartPersistentTaskAction.
         private String taskId;

-        @Nullable
         private String taskName;

         private PersistentTaskParams params;
@@ -86,7 +86,11 @@ public class StartPersistentTaskAction extends Action<StartPersistentTaskAction.
             super.readFrom(in);
             taskId = in.readString();
             taskName = in.readString();
-            params = in.readOptionalNamedWriteable(PersistentTaskParams.class);
+            if (in.getVersion().onOrAfter(Version.V_7_0_0_alpha1)) {
+                params = in.readNamedWriteable(PersistentTaskParams.class);
+            } else {
+                params = in.readOptionalNamedWriteable(PersistentTaskParams.class);
+            }
         }

         @Override
@@ -94,7 +98,11 @@ public class StartPersistentTaskAction extends Action<StartPersistentTaskAction.
             super.writeTo(out);
             out.writeString(taskId);
             out.writeString(taskName);
-            out.writeOptionalNamedWriteable(params);
+            if (out.getVersion().onOrAfter(Version.V_7_0_0_alpha1)) {
+                out.writeNamedWriteable(params);
+            } else {
+                out.writeOptionalNamedWriteable(params);
+            }
         }

         @Override


@@ -130,7 +130,7 @@ public abstract class RestRequest implements ToXContent.Params {
     }

     public enum Method {
-        GET, POST, PUT, DELETE, OPTIONS, HEAD
+        GET, POST, PUT, DELETE, OPTIONS, HEAD, PATCH, TRACE, CONNECT
     }

     public abstract Method method();
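
With `PATCH`, `TRACE` and `CONNECT` in the enum, requests using those verbs can be parsed into a `Method` and rejected deliberately instead of failing while the method name is decoded. A hypothetical dispatch sketch (the handler strings are illustrative, not part of the commit):

[source,java]
--------------------------------------------------
// Illustrative routing: verbs without registered handlers get an explicit 405.
static String route(RestRequest.Method method) {
    switch (method) {
        case GET: case POST: case PUT: case DELETE: case OPTIONS: case HEAD:
            return "dispatch to registered handler";
        default:
            return "405 Method Not Allowed"; // PATCH, TRACE, CONNECT
    }
}
--------------------------------------------------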


@@ -383,6 +383,11 @@ public final class ScriptMetaData implements MetaData.Custom, Writeable, ToXCont
         return TYPE;
     }

+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     @Override
     public EnumSet<MetaData.XContentContext> context() {
         return MetaData.ALL_CONTEXTS;
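
`getMinimalSupportedVersion` is the contract that lets cluster-state serialization skip a custom for peers too old to decode it; returning `minimumCompatibilityVersion()` means script metadata is sent to every node the current node can talk to. The consumer-side check is roughly this (a sketch of the intent, with `custom` and `out` assumed in scope):

[source,java]
--------------------------------------------------
// Write the custom only when the destination stream is new enough to decode it;
// older peers simply never see the entry instead of choking on unknown bytes.
if (out.getVersion().onOrAfter(custom.getMinimalSupportedVersion())) {
    out.writeNamedWriteable(custom);
}
--------------------------------------------------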


@@ -1121,7 +1121,7 @@ public abstract class TcpTransport extends AbstractLifecycleComponent implements
             stream.setVersion(version);
             threadPool.getThreadContext().writeTo(stream);
-            if (version.onOrAfter(Version.V_7_0_0_alpha1)) {
+            if (version.onOrAfter(Version.V_6_3_0)) {
                 stream.writeStringArray(features);
             }
             stream.writeString(action);
@@ -1589,7 +1589,7 @@ public abstract class TcpTransport extends AbstractLifecycleComponent implements
                               int messageLengthBytes, Version version, InetSocketAddress remoteAddress, byte status)
         throws IOException {
         final Set<String> features;
-        if (version.onOrAfter(Version.V_7_0_0_alpha1)) {
+        if (version.onOrAfter(Version.V_6_3_0)) {
             features = Collections.unmodifiableSet(new TreeSet<>(Arrays.asList(stream.readStringArray())));
         } else {
             features = Collections.emptySet();
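
The gate moves from `V_7_0_0_alpha1` down to `V_6_3_0` because the features header was backported to 6.3; writer and reader have to flip on exactly the same version or the two nodes disagree on the byte layout mid-stream. The symmetric shape of the check, sketched with the surrounding variables assumed in scope:

[source,java]
--------------------------------------------------
// Writer: append the feature names only for 6.3.0+ peers.
if (version.onOrAfter(Version.V_6_3_0)) {
    stream.writeStringArray(features);
}
// Reader: consume the array under the identical condition, otherwise assume none.
final Set<String> received = version.onOrAfter(Version.V_6_3_0)
        ? Collections.unmodifiableSet(new TreeSet<>(Arrays.asList(stream.readStringArray())))
        : Collections.emptySet();
--------------------------------------------------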


@@ -308,6 +308,11 @@ public class ClusterChangedEventTests extends ESTestCase {
             return "2";
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public EnumSet<MetaData.XContentContext> context() {
             return EnumSet.of(MetaData.XContentContext.GATEWAY);
@@ -324,6 +329,11 @@ public class ClusterChangedEventTests extends ESTestCase {
             return "1";
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public EnumSet<MetaData.XContentContext> context() {
             return EnumSet.of(MetaData.XContentContext.GATEWAY);


@@ -19,6 +19,7 @@
 package org.elasticsearch.cluster;

+import org.elasticsearch.Version;
 import org.elasticsearch.action.admin.cluster.state.ClusterStateResponse;
 import org.elasticsearch.client.Client;
 import org.elasticsearch.cluster.metadata.IndexGraveyard;
@@ -73,7 +74,8 @@ import static org.hamcrest.Matchers.instanceOf;
 @ESIntegTestCase.ClusterScope(scope = TEST)
 public class ClusterStateIT extends ESIntegTestCase {

-    public abstract static class Custom implements MetaData.Custom {
+    public abstract static
+    class Custom implements MetaData.Custom {

         private static final ParseField VALUE = new ParseField("value");
@@ -131,6 +133,11 @@ public class ClusterStateIT extends ESIntegTestCase {
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public Optional<String> getRequiredFeature() {
             return Optional.of("node");
@@ -155,6 +162,11 @@ public class ClusterStateIT extends ESIntegTestCase {
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         /*
          * This custom should always be returned yet we randomize whether it has a required feature that the client is expected to have
          * versus not requiring any feature. We use a field to make the random choice exactly once.


@@ -107,7 +107,7 @@ public class FeatureAwareTests extends ESTestCase {
     }

     public void testVersion() {
-        final Version version = VersionUtils.randomVersion(random());
+        final Version version = randomValueOtherThan(VersionUtils.getFirstVersion(), () -> VersionUtils.randomVersion(random()));
         for (final Custom custom : Arrays.asList(new NoRequiredFeatureCustom(version), new RequiredFeatureCustom(version))) {
             {
                 final BytesStreamOutput out = new BytesStreamOutput();
@@ -116,7 +116,7 @@ public class FeatureAwareTests extends ESTestCase {
                 if (custom.getRequiredFeature().isPresent()) {
                     out.setFeatures(Collections.singleton(custom.getRequiredFeature().get()));
                 }
-                assertTrue(FeatureAware.shouldSerializeCustom(out, custom));
+                assertTrue(FeatureAware.shouldSerialize(out, custom));
             }
             {
                 final BytesStreamOutput out = new BytesStreamOutput();
@@ -126,7 +126,7 @@ public class FeatureAwareTests extends ESTestCase {
                 if (custom.getRequiredFeature().isPresent() && randomBoolean()) {
                     out.setFeatures(Collections.singleton(custom.getRequiredFeature().get()));
                 }
-                assertFalse(FeatureAware.shouldSerializeCustom(out, custom));
+                assertFalse(FeatureAware.shouldSerialize(out, custom));
             }
         }
     }
@@ -141,7 +141,7 @@ public class FeatureAwareTests extends ESTestCase {
                 out.setVersion(afterVersion);
                 assertTrue(custom.getRequiredFeature().isPresent());
                 out.setFeatures(Collections.singleton(custom.getRequiredFeature().get()));
-                assertTrue(FeatureAware.shouldSerializeCustom(out, custom));
+                assertTrue(FeatureAware.shouldSerialize(out, custom));
             }
             {
                 // the feature is present and the client is a transport client
@@ -149,7 +149,7 @@ public class FeatureAwareTests extends ESTestCase {
                 out.setVersion(afterVersion);
                 assertTrue(custom.getRequiredFeature().isPresent());
                 out.setFeatures(new HashSet<>(Arrays.asList(custom.getRequiredFeature().get(), TransportClient.TRANSPORT_CLIENT_FEATURE)));
-                assertTrue(FeatureAware.shouldSerializeCustom(out, custom));
+                assertTrue(FeatureAware.shouldSerialize(out, custom));
             }
         }
@@ -161,14 +161,14 @@ public class FeatureAwareTests extends ESTestCase {
             // the feature is missing but we should serialize it anyway because the client is not a transport client
             final BytesStreamOutput out = new BytesStreamOutput();
             out.setVersion(afterVersion);
-            assertTrue(FeatureAware.shouldSerializeCustom(out, custom));
+            assertTrue(FeatureAware.shouldSerialize(out, custom));
         }
         {
             // the feature is missing and we should not serialize it because the client is a transport client
             final BytesStreamOutput out = new BytesStreamOutput();
             out.setVersion(afterVersion);
             out.setFeatures(Collections.singleton(TransportClient.TRANSPORT_CLIENT_FEATURE));
-            assertFalse(FeatureAware.shouldSerializeCustom(out, custom));
+            assertFalse(FeatureAware.shouldSerialize(out, custom));
         }
     }
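
The assertions pin down the renamed `FeatureAware.shouldSerialize`: a custom is written unless its required feature is missing and the receiver is a transport client; full cluster nodes always receive it once the version gate passes. A sketch of that decision (assuming a `getFeatures()` accessor on the stream; not the method's literal body):

[source,java]
--------------------------------------------------
// Version gate first, then the feature gate, which only bites for transport clients.
static boolean shouldSerializeSketch(BytesStreamOutput out, ClusterState.FeatureAware custom) {
    if (out.getVersion().before(custom.getMinimalSupportedVersion())) {
        return false; // receiver too old to decode this custom
    }
    return custom.getRequiredFeature()
            .map(f -> out.getFeatures().contains(f)
                    || out.getFeatures().contains(TransportClient.TRANSPORT_CLIENT_FEATURE) == false)
            .orElse(true); // no required feature: always serialize
}
--------------------------------------------------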


@@ -19,6 +19,7 @@
 package org.elasticsearch.cluster;

+import org.elasticsearch.Version;
 import org.elasticsearch.action.admin.cluster.state.ClusterStateResponse;
 import org.elasticsearch.action.admin.indices.template.get.GetIndexTemplatesResponse;
 import org.elasticsearch.action.support.IndicesOptions;
@@ -37,7 +38,6 @@ import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.unit.ByteSizeValue;
-import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.index.IndexNotFoundException;
@@ -304,6 +304,11 @@ public class SimpleClusterStateIT extends ESIntegTestCase {
             return "test";
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public void writeTo(StreamOutput out) throws IOException {
             out.writeInt(value);


@@ -45,7 +45,6 @@ import org.elasticsearch.snapshots.SnapshotId;
 import org.elasticsearch.test.VersionUtils;

 import java.io.IOException;
-import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collections;
@@ -133,7 +132,7 @@ public class ClusterSerializationTests extends ESAllocationTestCase {
         // serialize with current version
         BytesStreamOutput outStream = new BytesStreamOutput();
-        Version version = VersionUtils.randomVersionBetween(random(), Version.CURRENT.minimumIndexCompatibilityVersion(), Version.CURRENT);
+        Version version = VersionUtils.randomVersionBetween(random(), Version.CURRENT.minimumCompatibilityVersion(), Version.CURRENT);
         outStream.setVersion(version);
         diffs.writeTo(outStream);
         StreamInput inStream = outStream.bytes().streamInput();


@@ -18,6 +18,7 @@
  */
 package org.elasticsearch.cluster.service;

+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.ClusterState;
 import org.elasticsearch.cluster.Diff;
 import org.elasticsearch.common.io.stream.StreamOutput;
@@ -43,6 +44,11 @@ public class ClusterSerivceTests extends ESTestCase {
             return null;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public void writeTo(StreamOutput out) throws IOException {


@@ -239,6 +239,11 @@ public class ZenDiscoveryIT extends ESIntegTestCase {
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public EnumSet<MetaData.XContentContext> context() {
             return EnumSet.of(MetaData.XContentContext.GATEWAY, MetaData.XContentContext.SNAPSHOT);


@@ -492,6 +492,11 @@ public class GatewayMetaStateTests extends ESAllocationTestCase {
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public EnumSet<MetaData.XContentContext> context() {
             return EnumSet.of(MetaData.XContentContext.GATEWAY);
@@ -510,6 +515,11 @@ public class GatewayMetaStateTests extends ESAllocationTestCase {
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         @Override
         public EnumSet<MetaData.XContentContext> context() {
             return EnumSet.of(MetaData.XContentContext.GATEWAY);


@@ -19,6 +19,8 @@
 package org.elasticsearch.index.mapper;

+import org.apache.lucene.analysis.TokenStream;
+import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
 import org.apache.lucene.document.FieldType;
 import org.apache.lucene.index.DocValuesType;
 import org.apache.lucene.index.IndexOptions;
@@ -29,6 +31,8 @@ import org.apache.lucene.index.PostingsEnum;
 import org.apache.lucene.index.Term;
 import org.apache.lucene.index.TermsEnum;
 import org.apache.lucene.search.ConstantScoreQuery;
+import org.apache.lucene.search.MultiPhraseQuery;
+import org.apache.lucene.search.PhraseQuery;
 import org.apache.lucene.search.PrefixQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.TermQuery;
@@ -38,6 +42,7 @@ import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.bytes.BytesReference;
 import org.elasticsearch.common.compress.CompressedXContent;
 import org.elasticsearch.common.lucene.uid.Versions;
+import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
@@ -47,7 +52,9 @@ import org.elasticsearch.index.VersionType;
 import org.elasticsearch.index.engine.Engine;
 import org.elasticsearch.index.mapper.MapperService.MergeReason;
 import org.elasticsearch.index.mapper.TextFieldMapper.TextFieldType;
+import org.elasticsearch.index.query.MatchPhraseQueryBuilder;
 import org.elasticsearch.index.query.QueryShardContext;
+import org.elasticsearch.index.search.MatchQuery;
 import org.elasticsearch.index.shard.IndexShard;
 import org.elasticsearch.plugins.Plugin;
 import org.elasticsearch.test.ESSingleNodeTestCase;
@@ -65,6 +72,7 @@ import static org.apache.lucene.search.MultiTermQuery.CONSTANT_SCORE_REWRITE;
 import static org.hamcrest.Matchers.containsString;
 import static org.hamcrest.Matchers.equalTo;
 import static org.hamcrest.Matchers.instanceOf;
+import static org.hamcrest.core.Is.is;

 public class TextFieldMapperTests extends ESSingleNodeTestCase {

@@ -73,7 +81,13 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {

     @Before
     public void setup() {
-        indexService = createIndex("test");
+        Settings settings = Settings.builder()
+                .put("index.analysis.filter.mySynonyms.type", "synonym")
+                .putList("index.analysis.filter.mySynonyms.synonyms", Collections.singletonList("car, auto"))
+                .put("index.analysis.analyzer.synonym.tokenizer", "standard")
+                .put("index.analysis.analyzer.synonym.filter", "mySynonyms")
+                .build();
+        indexService = createIndex("test", settings);
         parser = indexService.mapperService().documentMapperParser();
     }

@@ -670,6 +684,102 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
         }
     }

+    public void testFastPhraseMapping() throws IOException {
+        QueryShardContext queryShardContext = indexService.newQueryShardContext(
+            randomInt(20), null, () -> {
+                throw new UnsupportedOperationException();
+            }, null);
+
+        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+                .startObject("properties")
+                .startObject("field")
+                    .field("type", "text")
+                    .field("analyzer", "english")
+                    .field("index_phrases", true)
+                .endObject()
+                .startObject("synfield")
+                    .field("type", "text")
+                    .field("analyzer", "synonym")
+                    .field("index_phrases", true)
+                .endObject()
+                .endObject()
+                .endObject().endObject());
+
+        DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
+        assertEquals(mapping, mapper.mappingSource().toString());
+
+        queryShardContext.getMapperService().merge("type", new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
+
+        Query q = new MatchPhraseQueryBuilder("field", "two words").toQuery(queryShardContext);
+        assertThat(q, is(new PhraseQuery("field._index_phrase", "two word")));
+
+        Query q2 = new MatchPhraseQueryBuilder("field", "three words here").toQuery(queryShardContext);
+        assertThat(q2, is(new PhraseQuery("field._index_phrase", "three word", "word here")));
+
+        Query q3 = new MatchPhraseQueryBuilder("field", "two words").slop(1).toQuery(queryShardContext);
+        assertThat(q3, is(new PhraseQuery(1, "field", "two", "word")));
+
+        Query q4 = new MatchPhraseQueryBuilder("field", "singleton").toQuery(queryShardContext);
+        assertThat(q4, is(new TermQuery(new Term("field", "singleton"))));
+
+        Query q5 = new MatchPhraseQueryBuilder("field", "sparkle a stopword").toQuery(queryShardContext);
+        assertThat(q5,
+            is(new PhraseQuery.Builder().add(new Term("field", "sparkl")).add(new Term("field", "stopword"), 2).build()));
+
+        Query q6 = new MatchPhraseQueryBuilder("synfield", "motor car").toQuery(queryShardContext);
+        assertThat(q6, is(new MultiPhraseQuery.Builder()
+            .add(new Term[]{
+                new Term("synfield._index_phrase", "motor car"),
+                new Term("synfield._index_phrase", "motor auto")})
+            .build()));
+
+        ParsedDocument doc = mapper.parse(SourceToParse.source("test", "type", "1", BytesReference
+                .bytes(XContentFactory.jsonBuilder()
+                        .startObject()
+                        .field("field", "Some English text that is going to be very useful")
+                        .endObject()),
+            XContentType.JSON));
+
+        IndexableField[] fields = doc.rootDoc().getFields("field._index_phrase");
+        assertEquals(1, fields.length);
+
+        try (TokenStream ts = fields[0].tokenStream(queryShardContext.getMapperService().indexAnalyzer(), null)) {
+            CharTermAttribute termAtt = ts.addAttribute(CharTermAttribute.class);
+            ts.reset();
+            assertTrue(ts.incrementToken());
+            assertEquals("some english", termAtt.toString());
+        }
+
+        {
+            String badConfigMapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+                .startObject("properties").startObject("field")
+                    .field("type", "text")
+                    .field("index", "false")
+                    .field("index_phrases", true)
+                .endObject().endObject()
+                .endObject().endObject());
+            IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
+                () -> parser.parse("type", new CompressedXContent(badConfigMapping))
+            );
+            assertThat(e.getMessage(), containsString("Cannot set index_phrases on unindexed field [field]"));
+        }
+
+        {
+            String badConfigMapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+                .startObject("properties").startObject("field")
+                    .field("type", "text")
+                    .field("index_options", "freqs")
+                    .field("index_phrases", true)
+                .endObject().endObject()
+                .endObject().endObject());
+            IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
+                () -> parser.parse("type", new CompressedXContent(badConfigMapping))
+            );
+            assertThat(e.getMessage(), containsString("Cannot set index_phrases on field [field] if positions are not enabled"));
+        }
+    }
+
     public void testIndexPrefixMapping() throws IOException {
         QueryShardContext queryShardContext = indexService.newQueryShardContext(
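
For readers skimming `testFastPhraseMapping`: `index_phrases` stores two-token shingles in a hidden `field._index_phrase` sub-field, so an exact phrase query becomes a lookup of pre-joined terms ("three words here" analyzes to "three word" and "word here" above). A self-contained sketch of the shingling idea, not the mapper's actual implementation:

[source,java]
--------------------------------------------------
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Two-token shingles, as index_phrases stores them in the hidden sub-field.
public class PhraseShingleSketch {

    static List<String> twoShingles(List<String> tokens) {
        List<String> shingles = new ArrayList<>();
        for (int i = 0; i + 1 < tokens.size(); i++) {
            shingles.add(tokens.get(i) + " " + tokens.get(i + 1));
        }
        return shingles;
    }

    public static void main(String[] args) {
        // tokens of "three words here" after the english analyzer
        System.out.println(twoShingles(Arrays.asList("three", "word", "here")));
        // prints [three word, word here], matching the PhraseQuery in the test
    }
}
--------------------------------------------------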


@@ -68,6 +68,13 @@ public class TextFieldTypeTests extends FieldTypeTestCase {
                 tft.setFielddataMinSegmentSize(1000);
             }
         });
+        addModifier(new Modifier("index_phrases", false) {
+            @Override
+            public void modify(MappedFieldType ft) {
+                TextFieldMapper.TextFieldType tft = (TextFieldMapper.TextFieldType) ft;
+                tft.setIndexPhrases(true);
+            }
+        });
         addModifier(new Modifier("index_prefixes", false) {
             @Override
             public void modify(MappedFieldType ft) {


@@ -19,6 +19,8 @@
 package org.elasticsearch.persistent;

 import org.elasticsearch.ResourceNotFoundException;
+import org.elasticsearch.Version;
+import org.elasticsearch.client.transport.TransportClient;
 import org.elasticsearch.cluster.Diff;
 import org.elasticsearch.cluster.NamedDiff;
 import org.elasticsearch.cluster.metadata.MetaData;
@@ -26,8 +28,11 @@ import org.elasticsearch.cluster.metadata.MetaData.Custom;
 import org.elasticsearch.common.ParseField;
 import org.elasticsearch.common.UUIDs;
 import org.elasticsearch.common.bytes.BytesReference;
+import org.elasticsearch.common.io.stream.BytesStreamOutput;
+import org.elasticsearch.common.io.stream.NamedWriteableAwareStreamInput;
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry.Entry;
+import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.Writeable;
 import org.elasticsearch.common.xcontent.NamedXContentRegistry;
 import org.elasticsearch.common.xcontent.ToXContent;
@@ -43,13 +48,24 @@ import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestPersistentTask
 import org.elasticsearch.tasks.Task;
 import org.elasticsearch.test.AbstractDiffableSerializationTestCase;

+import java.io.IOException;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collections;
+import java.util.HashSet;
+import java.util.NoSuchElementException;
+import java.util.Optional;
+import java.util.Set;

 import static org.elasticsearch.cluster.metadata.MetaData.CONTEXT_MODE_GATEWAY;
 import static org.elasticsearch.cluster.metadata.MetaData.CONTEXT_MODE_SNAPSHOT;
 import static org.elasticsearch.persistent.PersistentTasksExecutor.NO_NODE_FOUND;
+import static org.elasticsearch.test.VersionUtils.allReleasedVersions;
+import static org.elasticsearch.test.VersionUtils.compatibleFutureVersion;
+import static org.elasticsearch.test.VersionUtils.getFirstVersion;
+import static org.elasticsearch.test.VersionUtils.getPreviousVersion;
+import static org.elasticsearch.test.VersionUtils.randomVersionBetween;
+import static org.hamcrest.Matchers.equalTo;

 public class PersistentTasksCustomMetaDataTests extends AbstractDiffableSerializationTestCase<Custom> {

@@ -228,7 +244,65 @@ public class PersistentTasksCustomMetaDataTests extends AbstractDiffableSerializ
             assertEquals(changed, builder.isChanged());
             persistentTasks = builder.build();
         }
+    }
+
+    public void testMinVersionSerialization() throws IOException {
+        PersistentTasksCustomMetaData.Builder tasks = PersistentTasksCustomMetaData.builder();
+
+        Version minVersion = allReleasedVersions().stream().filter(Version::isRelease).findFirst().orElseThrow(NoSuchElementException::new);
+        final Version streamVersion = randomVersionBetween(random(), minVersion, getPreviousVersion(Version.CURRENT));
+        tasks.addTask("test_compatible_version", TestPersistentTasksExecutor.NAME,
+            new TestParams(null, randomVersionBetween(random(), minVersion, streamVersion),
+                randomBoolean() ? Optional.empty() : Optional.of("test")),
+            randomAssignment());
+        tasks.addTask("test_incompatible_version", TestPersistentTasksExecutor.NAME,
+            new TestParams(null, randomVersionBetween(random(), compatibleFutureVersion(streamVersion), Version.CURRENT),
+                randomBoolean() ? Optional.empty() : Optional.of("test")),
+            randomAssignment());
+        final BytesStreamOutput out = new BytesStreamOutput();
+        out.setVersion(streamVersion);
+        Set<String> features = new HashSet<>();
+        final boolean transportClient = randomBoolean();
+        if (transportClient) {
+            features.add(TransportClient.TRANSPORT_CLIENT_FEATURE);
+        }
+        // if a transport client, then it must have the feature otherwise we add the feature randomly
+        if (transportClient || randomBoolean()) {
+            features.add("test");
+        }
+        out.setFeatures(features);
+        tasks.build().writeTo(out);
+
+        final StreamInput input = out.bytes().streamInput();
+        input.setVersion(streamVersion);
+        PersistentTasksCustomMetaData read =
+            new PersistentTasksCustomMetaData(new NamedWriteableAwareStreamInput(input, getNamedWriteableRegistry()));
+
+        assertThat(read.taskMap().keySet(), equalTo(Collections.singleton("test_compatible_version")));
+    }
+
+    public void testFeatureSerialization() throws IOException {
+        PersistentTasksCustomMetaData.Builder tasks = PersistentTasksCustomMetaData.builder();
+
+        Version minVersion = getFirstVersion();
+        tasks.addTask("test_compatible", TestPersistentTasksExecutor.NAME,
+            new TestParams(null, randomVersionBetween(random(), minVersion, Version.CURRENT),
+                randomBoolean() ? Optional.empty() : Optional.of("existing")),
+            randomAssignment());
+        tasks.addTask("test_incompatible", TestPersistentTasksExecutor.NAME,
+            new TestParams(null, randomVersionBetween(random(), minVersion, Version.CURRENT), Optional.of("non_existing")),
+            randomAssignment());
+        final BytesStreamOutput out = new BytesStreamOutput();
+        out.setVersion(Version.CURRENT);
+        Set<String> features = new HashSet<>();
+        features.add("existing");
+        features.add(TransportClient.TRANSPORT_CLIENT_FEATURE);
+        out.setFeatures(features);
+        tasks.build().writeTo(out);
+
+        PersistentTasksCustomMetaData read = new PersistentTasksCustomMetaData(
+            new NamedWriteableAwareStreamInput(out.bytes().streamInput(), getNamedWriteableRegistry()));
+        assertThat(read.taskMap().keySet(), equalTo(Collections.singleton("test_compatible")));
     }

     private Assignment randomAssignment() {


@@ -20,12 +20,12 @@ package org.elasticsearch.persistent;

 import org.elasticsearch.action.support.PlainActionFuture;
 import org.elasticsearch.common.UUIDs;
+import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;
+import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestParams;
+import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestPersistentTasksExecutor;
 import org.elasticsearch.plugins.Plugin;
 import org.elasticsearch.test.ESIntegTestCase;
 import org.elasticsearch.test.junit.annotations.TestLogging;
-import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;
-import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestPersistentTasksExecutor;
-import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestParams;

 import java.util.ArrayList;
 import java.util.Collection;
@@ -35,8 +35,6 @@ import java.util.List;
 import static org.hamcrest.Matchers.empty;
 import static org.hamcrest.Matchers.equalTo;
 import static org.hamcrest.Matchers.greaterThan;
-import static org.hamcrest.Matchers.notNullValue;
-import static org.hamcrest.Matchers.nullValue;

 @ESIntegTestCase.ClusterScope(scope = ESIntegTestCase.Scope.TEST, minNumDataNodes = 1)
 public class PersistentTasksExecutorFullRestartIT extends ESIntegTestCase {
@@ -65,7 +63,7 @@ public class PersistentTasksExecutorFullRestartIT extends ESIntegTestCase {
             PlainActionFuture<PersistentTask<TestParams>> future = new PlainActionFuture<>();
             futures.add(future);
             taskIds[i] = UUIDs.base64UUID();
-            service.sendStartRequest(taskIds[i], TestPersistentTasksExecutor.NAME, randomBoolean() ? null : new TestParams("Blah"), future);
+            service.sendStartRequest(taskIds[i], TestPersistentTasksExecutor.NAME, new TestParams("Blah"), future);
         }

         for (int i = 0; i < numberOfTasks; i++) {


@@ -22,8 +22,8 @@ import org.elasticsearch.common.UUIDs;
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry.Entry;
 import org.elasticsearch.persistent.StartPersistentTaskAction.Request;
-import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestPersistentTasksExecutor;
 import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestParams;
+import org.elasticsearch.persistent.TestPersistentTasksPlugin.TestPersistentTasksExecutor;
 import org.elasticsearch.test.AbstractStreamableTestCase;

 import java.util.Collections;
@@ -32,17 +32,12 @@ public class StartPersistentActionRequestTests extends AbstractStreamableTestCas

     @Override
     protected Request createTestInstance() {
-        TestParams testParams;
-        if (randomBoolean()) {
-            testParams = new TestParams();
-            if (randomBoolean()) {
-                testParams.setTestParam(randomAlphaOfLengthBetween(1, 20));
-            }
-            if (randomBoolean()) {
-                testParams.setExecutorNodeAttr(randomAlphaOfLengthBetween(1, 20));
-            }
-        } else {
-            testParams = null;
+        TestParams testParams = new TestParams();
+        if (randomBoolean()) {
+            testParams.setTestParam(randomAlphaOfLengthBetween(1, 20));
+        }
+        if (randomBoolean()) {
+            testParams.setExecutorNodeAttr(randomAlphaOfLengthBetween(1, 20));
         }
         return new Request(UUIDs.base64UUID(), randomAlphaOfLengthBetween(1, 20), testParams);
     }


@@ -19,6 +19,7 @@
 package org.elasticsearch.persistent;

+import org.elasticsearch.Version;
 import org.elasticsearch.action.Action;
 import org.elasticsearch.action.ActionListener;
 import org.elasticsearch.action.ActionRequest;
@@ -49,6 +50,8 @@ import org.elasticsearch.common.xcontent.ConstructingObjectParser;
 import org.elasticsearch.common.xcontent.NamedXContentRegistry;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.persistent.PersistentTasksCustomMetaData.Assignment;
+import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;
 import org.elasticsearch.plugins.ActionPlugin;
 import org.elasticsearch.plugins.PersistentTaskPlugin;
 import org.elasticsearch.plugins.Plugin;
@@ -57,8 +60,6 @@ import org.elasticsearch.tasks.TaskCancelledException;
 import org.elasticsearch.tasks.TaskId;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.transport.TransportService;
-import org.elasticsearch.persistent.PersistentTasksCustomMetaData.Assignment;
-import org.elasticsearch.persistent.PersistentTasksCustomMetaData.PersistentTask;

 import java.io.IOException;
 import java.util.ArrayList;
@@ -67,6 +68,7 @@ import java.util.Collections;
 import java.util.List;
 import java.util.Map;
 import java.util.Objects;
+import java.util.Optional;
 import java.util.concurrent.CountDownLatch;
 import java.util.concurrent.TimeUnit;
 import java.util.concurrent.atomic.AtomicInteger;
@@ -120,6 +122,9 @@ public class TestPersistentTasksPlugin extends Plugin implements ActionPlugin, P
             REQUEST_PARSER.declareString(constructorArg(), new ParseField("param"));
         }

+        private final Version minVersion;
+        private final Optional<String> feature;
+
         private String executorNodeAttr = null;

         private String responseNode = null;
@@ -127,17 +132,25 @@ public class TestPersistentTasksPlugin extends Plugin implements ActionPlugin, P
         private String testParam = null;

         public TestParams() {
+            this((String)null);
         }

         public TestParams(String testParam) {
+            this(testParam, Version.CURRENT, Optional.empty());
+        }
+
+        public TestParams(String testParam, Version minVersion, Optional<String> feature) {
             this.testParam = testParam;
+            this.minVersion = minVersion;
+            this.feature = feature;
         }

         public TestParams(StreamInput in) throws IOException {
             executorNodeAttr = in.readOptionalString();
             responseNode = in.readOptionalString();
             testParam = in.readOptionalString();
+            minVersion = Version.readVersion(in);
+            feature = Optional.ofNullable(in.readOptionalString());
         }

         @Override
@@ -166,6 +179,8 @@ public class TestPersistentTasksPlugin extends Plugin implements ActionPlugin, P
             out.writeOptionalString(executorNodeAttr);
             out.writeOptionalString(responseNode);
             out.writeOptionalString(testParam);
+            Version.writeVersion(minVersion, out);
+            out.writeOptionalString(feature.orElse(null));
         }

         @Override
@@ -194,6 +209,16 @@ public class TestPersistentTasksPlugin extends Plugin implements ActionPlugin, P
         public int hashCode() {
             return Objects.hash(executorNodeAttr, responseNode, testParam);
         }
+
+        @Override
+        public Version getMinimalSupportedVersion() {
+            return minVersion;
+        }
+
+        @Override
+        public Optional<String> getRequiredFeature() {
+            return feature;
+        }
     }

     public static class Status implements Task.Status {


@@ -71,7 +71,7 @@ public class EnableAssignmentDeciderIT extends ESIntegTestCase {
         final CountDownLatch latch = new CountDownLatch(numberOfTasks);
         for (int i = 0; i < numberOfTasks; i++) {
             PersistentTasksService service = internalCluster().getInstance(PersistentTasksService.class);
-            service.sendStartRequest("task_" + i, TestPersistentTasksExecutor.NAME, randomTaskParams(),
+            service.sendStartRequest("task_" + i, TestPersistentTasksExecutor.NAME, new TestParams(randomAlphaOfLength(10)),
                 new ActionListener<PersistentTask<PersistentTaskParams>>() {
                     @Override
                     public void onResponse(PersistentTask<PersistentTaskParams> task) {
@@ -163,11 +163,4 @@ public class EnableAssignmentDeciderIT extends ESIntegTestCase {
         assertAcked(client().admin().cluster().prepareUpdateSettings().setPersistentSettings(settings));
     }

-    /** Returns a random task parameter **/
-    private static PersistentTaskParams randomTaskParams() {
-        if (randomBoolean()) {
-            return null;
-        }
-        return new TestParams(randomAlphaOfLength(10));
-    }
 }


@@ -21,9 +21,9 @@ package org.elasticsearch.snapshots;

 import com.carrotsearch.hppc.IntHashSet;
 import com.carrotsearch.hppc.IntSet;
+import org.elasticsearch.Version;
 import org.elasticsearch.action.ActionFuture;
 import org.elasticsearch.action.admin.cluster.repositories.put.PutRepositoryResponse;
-import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse;
 import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotResponse;
 import org.elasticsearch.action.admin.cluster.snapshots.delete.DeleteSnapshotResponse;
 import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsResponse;
@@ -1162,6 +1162,11 @@ public class DedicatedClusterSnapshotRestoreIT extends AbstractSnapshotIntegTest
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         public static SnapshottableMetadata readFrom(StreamInput in) throws IOException {
             return readFrom(SnapshottableMetadata::new, in);
         }
@@ -1193,6 +1198,11 @@ public class DedicatedClusterSnapshotRestoreIT extends AbstractSnapshotIntegTest
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
        public static NonSnapshottableMetadata readFrom(StreamInput in) throws IOException {
            return readFrom(NonSnapshottableMetadata::new, in);
        }
@@ -1223,6 +1233,11 @@ public class DedicatedClusterSnapshotRestoreIT extends AbstractSnapshotIntegTest
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         public static SnapshottableGatewayMetadata readFrom(StreamInput in) throws IOException {
             return readFrom(SnapshottableGatewayMetadata::new, in);
         }
@@ -1253,6 +1268,11 @@ public class DedicatedClusterSnapshotRestoreIT extends AbstractSnapshotIntegTest
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         public static NonSnapshottableGatewayMetadata readFrom(StreamInput in) throws IOException {
             return readFrom(NonSnapshottableGatewayMetadata::new, in);
         }
@@ -1284,6 +1304,11 @@ public class DedicatedClusterSnapshotRestoreIT extends AbstractSnapshotIntegTest
             return TYPE;
         }

+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT;
+        }
+
         public static SnapshotableGatewayNoApiMetadata readFrom(StreamInput in) throws IOException {
             return readFrom(SnapshotableGatewayNoApiMetadata::new, in);
         }


@@ -19,22 +19,19 @@
 package org.elasticsearch.test;

-import java.util.ArrayList;
-import java.util.Collections;
-import java.util.List;
-import java.util.Locale;
-import java.util.Map;
-import java.util.Optional;
-import java.util.Random;
-import java.util.SortedSet;
-import java.util.TreeSet;
-import java.util.stream.Collectors;
-import java.util.stream.Stream;
-
 import org.elasticsearch.Version;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.collect.Tuple;

+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Random;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
 /** Utilities for selecting versions in tests */
 public class VersionUtils {

@@ -228,6 +225,13 @@ public class VersionUtils {
         return opt.get();
     }

+    /** returns the first future compatible version */
+    public static Version compatibleFutureVersion(Version version) {
+        final Optional<Version> opt = ALL_VERSIONS.stream().filter(version::before).filter(v -> v.isCompatible(version)).findAny();
+        assert opt.isPresent() : "no future compatible version for " + version;
+        return opt.get();
+    }
+
     /** Returns the maximum {@link Version} that is compatible with the given version. */
     public static Version maxCompatibleVersion(Version version) {
         final List<Version> compatible = ALL_VERSIONS.stream().filter(version::isCompatible).filter(version::onOrBefore)
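
`compatibleFutureVersion` returns a released version strictly after the argument yet still wire-compatible with it; `testMinVersionSerialization` leans on it to build params that are guaranteed to be filtered out for the chosen stream version. A usage fragment under the same assumptions as that test:

[source,java]
--------------------------------------------------
// A task whose minimal supported version is `tooNew` can never travel over a
// stream set to `streamVersion`, so serialization must drop it.
Version streamVersion = randomVersionBetween(random(), minVersion, getPreviousVersion(Version.CURRENT));
Version tooNew = compatibleFutureVersion(streamVersion);
assert tooNew.after(streamVersion) && tooNew.isCompatible(streamVersion);
--------------------------------------------------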


@@ -2329,7 +2329,7 @@ public abstract class AbstractSimpleTransportTestCase extends ESTestCase {
                 assertEquals(1, transportStats.getRxCount());
                 assertEquals(2, transportStats.getTxCount());
                 assertEquals(25, transportStats.getRxSize().getBytes());
-                assertEquals(91, transportStats.getTxSize().getBytes());
+                assertEquals(92, transportStats.getTxSize().getBytes());
             });
             sendResponseLatch.countDown();
             responseLatch.await();
@@ -2337,7 +2337,7 @@ public abstract class AbstractSimpleTransportTestCase extends ESTestCase {
             assertEquals(2, stats.getRxCount());
             assertEquals(2, stats.getTxCount());
             assertEquals(46, stats.getRxSize().getBytes());
-            assertEquals(91, stats.getTxSize().getBytes());
+            assertEquals(92, stats.getTxSize().getBytes());
         } finally {
             serviceC.close();
         }
@@ -2444,7 +2444,7 @@ public abstract class AbstractSimpleTransportTestCase extends ESTestCase {
                 assertEquals(1, transportStats.getRxCount());
                 assertEquals(2, transportStats.getTxCount());
                 assertEquals(25, transportStats.getRxSize().getBytes());
-                assertEquals(91, transportStats.getTxSize().getBytes());
+                assertEquals(92, transportStats.getTxSize().getBytes());
             });
             sendResponseLatch.countDown();
             responseLatch.await();
@@ -2459,7 +2459,7 @@ public abstract class AbstractSimpleTransportTestCase extends ESTestCase {
             // 49 bytes are the non-exception message bytes that have been received. It should include the initial
             // handshake message and the header, version, etc bytes in the exception message.
             assertEquals(failedMessage, 49 + streamOutput.bytes().length(), stats.getRxSize().getBytes());
-            assertEquals(91, stats.getTxSize().getBytes());
+            assertEquals(92, stats.getTxSize().getBytes());
         } finally {
             serviceC.close();
         }


@@ -108,6 +108,11 @@ public class LicensesMetaData extends AbstractNamedDiffable<MetaData.Custom> imp
         return TYPE;
     }

+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     @Override
     public EnumSet<MetaData.XContentContext> context() {
         return EnumSet.of(MetaData.XContentContext.GATEWAY);


@@ -40,6 +40,7 @@ import org.elasticsearch.license.LicenseService;
 import org.elasticsearch.license.LicensesMetaData;
 import org.elasticsearch.license.Licensing;
 import org.elasticsearch.license.XPackLicenseState;
+import org.elasticsearch.persistent.PersistentTaskParams;
 import org.elasticsearch.plugins.ExtensiblePlugin;
 import org.elasticsearch.plugins.ScriptPlugin;
 import org.elasticsearch.rest.RestController;
@@ -331,4 +332,12 @@ public class XPackPlugin extends XPackClientPlugin implements ScriptPlugin, Exte
     }

+    public interface XPackPersistentTaskParams extends PersistentTaskParams {
+
+        @Override
+        default Optional<String> getRequiredFeature() {
+            return XPackClientPlugin.X_PACK_FEATURE;
+        }
+    }
 }
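
The nested interface gives every X-Pack params class the feature gate for free: implementing it defaults `getRequiredFeature()` to the x-pack feature, so a transport client that never announced x-pack is never sent those tasks. A sketch of how a params class opts in (class name hypothetical, Writeable/ToXContent plumbing elided):

[source,java]
--------------------------------------------------
// Hypothetical params class: the marker interface supplies the required feature,
// while the class still declares how far back on the wire it may travel.
public static class MyTaskParams implements XPackPlugin.XPackPersistentTaskParams {

    @Override
    public Version getMinimalSupportedVersion() {
        return Version.CURRENT.minimumCompatibilityVersion();
    }

    // writeTo / getWriteableName / toXContent omitted in this sketch
}
--------------------------------------------------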


@@ -23,10 +23,10 @@ import org.elasticsearch.common.xcontent.ToXContentObject;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.tasks.Task;
+import org.elasticsearch.xpack.core.XPackPlugin;
 import org.elasticsearch.xpack.core.ml.MachineLearningField;
 import org.elasticsearch.xpack.core.ml.job.config.Job;
 import org.elasticsearch.xpack.core.ml.utils.ExceptionsHelper;
-import org.elasticsearch.persistent.PersistentTaskParams;

 import java.io.IOException;
 import java.util.Objects;
@@ -127,7 +127,7 @@ public class OpenJobAction extends Action<OpenJobAction.Request, OpenJobAction.R
         }
     }

-    public static class JobParams implements PersistentTaskParams {
+    public static class JobParams implements XPackPlugin.XPackPersistentTaskParams {

         /** TODO Remove in 7.0.0 */
         public static final ParseField IGNORE_DOWNTIME = new ParseField("ignore_downtime");
@@ -237,6 +237,11 @@ public class OpenJobAction extends Action<OpenJobAction.Request, OpenJobAction.R
         public String toString() {
             return Strings.toString(this);
         }
+
+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT.minimumCompatibilityVersion();
+        }
     }

     public static class Response extends AcknowledgedResponse {

View File

@@ -6,6 +6,7 @@
 package org.elasticsearch.xpack.core.ml.action;
 
 import org.elasticsearch.ElasticsearchParseException;
+import org.elasticsearch.Version;
 import org.elasticsearch.action.Action;
 import org.elasticsearch.action.ActionRequestBuilder;
 import org.elasticsearch.action.ActionRequestValidationException;
@@ -24,10 +25,10 @@ import org.elasticsearch.common.xcontent.ToXContentObject;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.index.mapper.DateFieldMapper;
+import org.elasticsearch.xpack.core.XPackPlugin;
 import org.elasticsearch.xpack.core.ml.datafeed.DatafeedConfig;
 import org.elasticsearch.xpack.core.ml.job.messages.Messages;
 import org.elasticsearch.xpack.core.ml.utils.ExceptionsHelper;
-import org.elasticsearch.persistent.PersistentTaskParams;
 
 import java.io.IOException;
 import java.util.Objects;
@@ -138,7 +139,7 @@ public class StartDatafeedAction extends Action<StartDatafeedAction.Request, Sta
         }
     }
 
-    public static class DatafeedParams implements PersistentTaskParams {
+    public static class DatafeedParams implements XPackPlugin.XPackPersistentTaskParams {
 
         public static ObjectParser<DatafeedParams, Void> PARSER = new ObjectParser<>(TASK_NAME, DatafeedParams::new);
@@ -231,6 +232,11 @@ public class StartDatafeedAction extends Action<StartDatafeedAction.Request, Sta
             return TASK_NAME;
         }
 
+        @Override
+        public Version getMinimalSupportedVersion() {
+            return Version.CURRENT.minimumCompatibilityVersion();
+        }
+
         @Override
         public void writeTo(StreamOutput out) throws IOException {
             out.writeString(datafeedId);

View File

@@ -5,6 +5,7 @@
  */
 package org.elasticsearch.xpack.core.rollup.job;
 
+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.AbstractDiffable;
 import org.elasticsearch.cluster.Diff;
 import org.elasticsearch.common.ParseField;
@@ -13,7 +14,7 @@ import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.xcontent.ConstructingObjectParser;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.persistent.PersistentTaskParams;
+import org.elasticsearch.xpack.core.XPackPlugin;
 
 import java.io.IOException;
 import java.util.Collections;
@@ -25,7 +26,7 @@ import java.util.Objects;
  * It holds the config (RollupJobConfig) and a map of authentication headers. Only RollupJobConfig
  * is ever serialized to the user, so the headers should never leak
  */
-public class RollupJob extends AbstractDiffable<RollupJob> implements PersistentTaskParams {
+public class RollupJob extends AbstractDiffable<RollupJob> implements XPackPlugin.XPackPersistentTaskParams {
 
     public static final String NAME = "xpack/rollup/job";
@@ -110,4 +111,9 @@ public class RollupJob extends AbstractDiffable<RollupJob> implements Persistent
     public int hashCode() {
         return Objects.hash(config, headers);
     }
+
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.V_6_3_0;
+    }
 }
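Two version-floor idioms appear among the params classes in this commit. The sketch below contrasts them; `FixedFloor` and `RollingFloor` are illustrative names only, and both method bodies are lifted verbatim from the hunks above:

[source,java]
--------------------------------------------------
import org.elasticsearch.Version;

// Fixed floor, as in RollupJob: rollup params first shipped in 6.3.0, so the
// floor never moves — any 6.3.0+ node is assumed to be able to read them.
class FixedFloor {
    public Version getMinimalSupportedVersion() {
        return Version.V_6_3_0;
    }
}

// Rolling floor, as in the license, ML, and watcher customs: the floor tracks
// the current build's wire-compatibility window instead of a fixed release.
class RollingFloor {
    public Version getMinimalSupportedVersion() {
        return Version.CURRENT.minimumCompatibilityVersion();
    }
}
--------------------------------------------------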

View File

@@ -5,6 +5,7 @@
  */
 package org.elasticsearch.xpack.core.watcher;
 
+import org.elasticsearch.Version;
 import org.elasticsearch.cluster.AbstractNamedDiffable;
 import org.elasticsearch.cluster.NamedDiff;
 import org.elasticsearch.cluster.metadata.MetaData;
@@ -38,6 +39,11 @@ public class WatcherMetaData extends AbstractNamedDiffable<MetaData.Custom> impl
         return TYPE;
     }
 
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.CURRENT.minimumCompatibilityVersion();
+    }
+
     @Override
     public EnumSet<MetaData.XContentContext> context() {
         return EnumSet.of(MetaData.XContentContext.GATEWAY);

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.core.ssl.CertificateGenerateTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.core.ssl.CertificateTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.security.authc.esnative.ESNativeRealmMigrateTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.security.authc.saml.SamlMetadataCommand ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.security.authc.esnative.tool.SetupPasswordTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.security.crypto.tool.SystemKeyTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-security-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.security.authc.file.tool.UsersTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal

View File

@@ -10,7 +10,7 @@ setlocal enableextensions
 set ES_ADDITIONAL_SOURCES=x-pack-env;x-pack-watcher-env
 call "%~dp0elasticsearch-cli.bat" ^
   org.elasticsearch.xpack.watcher.trigger.schedule.tool.CronEvalTool ^
-  %* ^
+  %%* ^
   || exit /b 1
 endlocal