This field is a specialization of the `keyword` field for the case when all documents have the same value. It typically performs more efficiently than keywords at query time by figuring out whether all or none of the documents match at rewrite time, like `term` queries on `_index`.

The name is up for discussion. I liked including `keyword` in it, so that we still have room for a `singleton_numeric` in the future. However I'm unsure whether to call it `singleton`, `constant` or something else, any opinions?

For this field there is a choice between:

1. accepting values in `_source` when they are equal to the value configured in mappings, but rejecting mapping updates
2. rejecting values in `_source` but then allowing updates to the value that is configured in the mapping

This commit implements option 1, so that it is possible to reindex from/to an index that has the field mapped as a keyword with no changes to the source.

Backport of #49713
This commit is contained in: parent bcb68c860c, commit cb868d2f5e
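
As a rough sketch of option 1 described in the commit message above (the index name `commit-example` is invented for illustration, not taken from the change itself): a document whose `_source` value matches the configured value is accepted, while a later mapping update that tries to change the value is rejected.

[source,console]
--------------------------------------------------
PUT commit-example
{
  "mappings": {
    "properties": {
      "level": { "type": "constant_keyword", "value": "debug" }
    }
  }
}

POST commit-example/_doc
{
  "level": "debug"
}

PUT commit-example/_mapping
{
  "properties": {
    "level": { "type": "constant_keyword", "value": "info" }
  }
}
--------------------------------------------------

The first two requests are expected to succeed; the final mapping update should be rejected because the configured `value` cannot be changed once it has been set.
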
@ -418,3 +418,115 @@ The <<text,`text`>> field has an <<index-prefixes,`index_prefixes`>> option that
indexes prefixes of all terms and is automatically leveraged by query parsers to
run prefix queries. If your use case involves running lots of prefix queries,
this can speed up queries significantly.

[[faster-filtering-with-constant-keyword]]
=== Use <<constant-keyword,`constant_keyword`>> to speed up filtering

There is a general rule that the cost of a filter is mostly a function of the
number of matched documents. Imagine that you have an index containing cycles.
There are a large number of bicycles and many searches perform a filter on
`cycle_type: bicycle`. This very common filter is unfortunately also very costly
since it matches most documents. There is a simple way to avoid running this
filter: move bicycles to their own index and filter bicycles by searching this
index instead of adding a filter to the query.

Unfortunately this can make client-side logic tricky, which is where
`constant_keyword` helps. By mapping `cycle_type` as a `constant_keyword` with
value `bicycle` on the index that contains bicycles, clients can keep running
the exact same queries as they used to run on the monolithic index and
Elasticsearch will do the right thing on the bicycles index by ignoring filters
on `cycle_type` if the value is `bicycle` and returning no hits otherwise.

Here is what mappings could look like:

[source,console]
--------------------------------------------------
PUT bicycles
{
  "mappings": {
    "properties": {
      "cycle_type": {
        "type": "constant_keyword",
        "value": "bicycle"
      },
      "description": {
        "type": "text"
      }
    }
  }
}

PUT other_cycles
{
  "mappings": {
    "properties": {
      "cycle_type": {
        "type": "keyword"
      },
      "description": {
        "type": "text"
      }
    }
  }
}
--------------------------------------------------

We are splitting our index into two: one that contains only bicycles, and
another one that contains other cycles: unicycles, tricycles, etc. Then at
search time, we need to search both indices, but we don't need to modify the
queries.

[source,console]
--------------------------------------------------
GET bicycles,other_cycles/_search
{
  "query": {
    "bool": {
      "must": {
        "match": {
          "description": "dutch"
        }
      },
      "filter": {
        "term": {
          "cycle_type": "bicycle"
        }
      }
    }
  }
}
--------------------------------------------------
// TEST[continued]

On the `bicycles` index, Elasticsearch will simply ignore the `cycle_type`
filter and rewrite the search request to the one below:

[source,console]
--------------------------------------------------
GET bicycles,other_cycles/_search
{
  "query": {
    "match": {
      "description": "dutch"
    }
  }
}
--------------------------------------------------
// TEST[continued]

On the `other_cycles` index, Elasticsearch will quickly figure out that
`bicycle` doesn't exist in the terms dictionary of the `cycle_type` field and
return a search response with no hits.

This is a powerful way of making queries cheaper by putting common values in a
dedicated index. This idea can also be combined across multiple fields: for
instance if you track the color of each cycle and your `bicycles` index ends up
having a majority of black bikes, you could split it into `bicycles-black`
and `bicycles-other-colors` indices.
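
For instance, here is a sketch of what the mappings of a hypothetical
`bicycles-black` index could look like (the index name and the `color` field
are made up for this example):

[source,console]
--------------------------------------------------
PUT bicycles-black
{
  "mappings": {
    "properties": {
      "cycle_type": {
        "type": "constant_keyword",
        "value": "bicycle"
      },
      "color": {
        "type": "constant_keyword",
        "value": "black"
      },
      "description": {
        "type": "text"
      }
    }
  }
}
--------------------------------------------------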

The `constant_keyword` field is not strictly required for this optimization: it
is also possible to update the client-side logic in order to route queries to
the relevant indices based on filters. However, `constant_keyword` makes this
transparent and lets you decouple search requests from the index topology in
exchange for very little overhead.

@ -59,6 +59,8 @@ string:: <<text,`text`>> and <<keyword,`keyword`>>

<<histogram>>:: `histogram` for pre-aggregated numerical values for percentiles aggregations.

<<constant-keyword>>:: Specialization of `keyword` for the case when all documents have the same value.

[float]
[[types-array-handling]]
=== Arrays

@ -130,4 +132,6 @@ include::types/text.asciidoc[]

include::types/token-count.asciidoc[]

include::types/shape.asciidoc[]

include::types/constant-keyword.asciidoc[]

docs/reference/mapping/types/constant-keyword.asciidoc (new file, 85 lines)
@ -0,0 +1,85 @@
[role="xpack"]
[testenv="basic"]

[[constant-keyword]]
=== Constant keyword datatype
++++
<titleabbrev>Constant keyword</titleabbrev>
++++

Constant keyword is a specialization of the <<keyword,`keyword`>> field for
the case where all documents in the index have the same value.

[source,console]
--------------------------------
PUT logs-debug
{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "message": {
        "type": "text"
      },
      "level": {
        "type": "constant_keyword",
        "value": "debug"
      }
    }
  }
}
--------------------------------

`constant_keyword` supports the same queries and aggregations as `keyword`
fields do, but takes advantage of the fact that all documents have the same
value per index to execute queries more efficiently.
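
For example, here is a sketch of the kind of request that benefits, continuing
the `logs-debug` example above; a `term` query on `level` can be answered per
index at rewrite time instead of being run against every document:

[source,console]
--------------------------------
GET logs-debug/_search
{
  "query": {
    "term": {
      "level": "debug"
    }
  }
}
--------------------------------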

Documents are allowed to either omit a value for the field or to provide a
value equal to the one configured in the mappings. The two indexing requests
below are equivalent:

[source,console]
--------------------------------
POST logs-debug/_doc
{
  "@timestamp": "2019-12-12",
  "message": "Starting up Elasticsearch",
  "level": "debug"
}

POST logs-debug/_doc
{
  "@timestamp": "2019-12-12",
  "message": "Starting up Elasticsearch"
}
--------------------------------
//TEST[continued]

However, providing a value that is different from the one configured in the
mapping is disallowed.

If no `value` is provided in the mappings, the field will automatically
configure itself based on the value contained in the first indexed document.
While this behavior can be convenient, note that it means that a single
document with a wrong value can cause all other documents to be rejected.

The `value` of the field cannot be changed after it has been set.
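
As a sketch of this behaviour (the `logs-dynamic` index name is invented for
illustration), a field mapped without a `value` picks it up from the first
document that provides one:

[source,console]
--------------------------------
PUT logs-dynamic
{
  "mappings": {
    "properties": {
      "level": {
        "type": "constant_keyword"
      }
    }
  }
}

POST logs-dynamic/_doc
{
  "level": "info"
}

GET logs-dynamic/_mapping
--------------------------------

After the first document above is indexed, the mapping for `level` is expected
to report `"value": "info"`, and any later document with a different `level`
would be rejected.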

[[constant-keyword-params]]
==== Parameters for constant keyword fields

The following mapping parameters are accepted:

[horizontal]

<<mapping-field-meta,`meta`>>::

    Metadata about the field.

`value`::

    The value to associate with all documents in the index. If this parameter
    is not provided, it is set based on the first document that gets indexed.

@ -91,6 +91,9 @@ public class ConstantIndexFieldData extends AbstractIndexOrdinalsFieldData {

    @Override
    public SortedSetDocValues getOrdinalsValues() {
        if (value == null) {
            return DocValues.emptySortedSet();
        }
        final BytesRef term = new BytesRef(value);
        final SortedDocValues sortedValues = new AbstractSortedDocValues() {

@ -626,6 +626,10 @@ public class XPackLicenseState {
        return allowForAllLicenses();
    }

    public boolean isConstantKeywordAllowed() {
        return allowForAllLicenses();
    }

    /**
     * @return true if security is available to be used with the current license type
     */

@ -55,6 +55,8 @@ public final class XPackField {
    public static final String ANALYTICS = "analytics";
    /** Name constant for the enrich plugin. */
    public static final String ENRICH = "enrich";
    /** Name constant for the constant-keyword plugin. */
    public static final String CONSTANT_KEYWORD = "constant_keyword";

    private XPackField() {}

x-pack/plugin/mapper-constant-keyword/build.gradle (new file, 24 lines)
@ -0,0 +1,24 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

evaluationDependsOn(xpackModule('core'))

apply plugin: 'elasticsearch.esplugin'

esplugin {
    name 'constant-keyword'
    description 'Module for the constant-keyword field type, which is a specialization of keyword for the case when all documents have the same value.'
    classname 'org.elasticsearch.xpack.constantkeyword.ConstantKeywordMapperPlugin'
    extendedPlugins = ['x-pack-core']
}
archivesBaseName = 'x-pack-constant-keyword'

dependencies {
    compileOnly project(path: xpackModule('core'), configuration: 'default')
    testCompile project(path: xpackModule('core'), configuration: 'testArtifacts')
}

integTest.enabled = false
@ -0,0 +1,29 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.constantkeyword;

import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.mapper.Mapper;
import org.elasticsearch.plugins.ActionPlugin;
import org.elasticsearch.plugins.MapperPlugin;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.xpack.constantkeyword.mapper.ConstantKeywordFieldMapper;

import java.util.Map;

import static java.util.Collections.singletonMap;

public class ConstantKeywordMapperPlugin extends Plugin implements MapperPlugin, ActionPlugin {

    public ConstantKeywordMapperPlugin(Settings settings) {}

    @Override
    public Map<String, Mapper.TypeParser> getMappers() {
        return singletonMap(ConstantKeywordFieldMapper.CONTENT_TYPE, new ConstantKeywordFieldMapper.TypeParser());
    }

}

@ -0,0 +1,308 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.constantkeyword.mapper;

import java.io.IOException;
import java.time.ZoneId;
import java.util.List;
import java.util.Map;
import java.util.Objects;

import org.apache.lucene.index.IndexOptions;
import org.apache.lucene.index.IndexableField;
import org.apache.lucene.search.MatchAllDocsQuery;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.MultiTermQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.util.BytesRef;
import org.apache.lucene.util.UnicodeUtil;
import org.apache.lucene.util.automaton.Automaton;
import org.apache.lucene.util.automaton.CharacterRunAutomaton;
import org.apache.lucene.util.automaton.LevenshteinAutomata;
import org.apache.lucene.util.automaton.RegExp;
import org.elasticsearch.common.geo.ShapeRelation;
import org.elasticsearch.common.lucene.BytesRefs;
import org.elasticsearch.common.regex.Regex;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.time.DateMathParser;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.index.fielddata.IndexFieldData;
import org.elasticsearch.index.fielddata.plain.ConstantIndexFieldData;
import org.elasticsearch.index.mapper.ConstantFieldType;
import org.elasticsearch.index.mapper.FieldMapper;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.Mapper;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.ParseContext;
import org.elasticsearch.index.mapper.TypeParsers;
import org.elasticsearch.index.query.QueryShardContext;

/**
 * A {@link FieldMapper} that assigns every document the same value.
 */
public class ConstantKeywordFieldMapper extends FieldMapper {

    public static final String CONTENT_TYPE = "constant_keyword";

    public static class Defaults {
        public static final MappedFieldType FIELD_TYPE = new ConstantKeywordFieldType();
        static {
            FIELD_TYPE.setIndexOptions(IndexOptions.NONE);
            FIELD_TYPE.freeze();
        }
    }

    public static class Builder extends FieldMapper.Builder<Builder, ConstantKeywordFieldMapper> {

        public Builder(String name) {
            super(name, Defaults.FIELD_TYPE, Defaults.FIELD_TYPE);
            builder = this;
        }

        public Builder setValue(String value) {
            fieldType().setValue(value);
            return this;
        }

        @Override
        public ConstantKeywordFieldType fieldType() {
            return (ConstantKeywordFieldType) super.fieldType();
        }

        @Override
        public ConstantKeywordFieldMapper build(BuilderContext context) {
            setupFieldType(context);
            return new ConstantKeywordFieldMapper(
                    name, fieldType, defaultFieldType,
                    context.indexSettings());
        }
    }

    public static class TypeParser implements Mapper.TypeParser {
        @Override
        public Mapper.Builder<?,?> parse(String name, Map<String, Object> node, ParserContext parserContext) throws MapperParsingException {
            Object value = null;
            if (node.containsKey("value")) {
                value = node.remove("value");
                if (value == null) {
                    throw new MapperParsingException("Property [value] of field [" + name + "] can't be [null].");
                }
                if (value instanceof Number == false && value instanceof CharSequence == false) {
                    throw new MapperParsingException("Property [value] of field [" + name +
                            "] must be a number or a string, but got [" + value + "]");
                }
            }
            ConstantKeywordFieldMapper.Builder builder = new ConstantKeywordFieldMapper.Builder(name);
            if (value != null) {
                builder.setValue(value.toString());
            }
            TypeParsers.parseMeta(builder, name, node);
            return builder;
        }
    }

    public static final class ConstantKeywordFieldType extends ConstantFieldType {

        private String value;

        public ConstantKeywordFieldType() {
            super();
        }

        protected ConstantKeywordFieldType(ConstantKeywordFieldType ref) {
            super(ref);
            this.value = ref.value;
        }

        public ConstantKeywordFieldType clone() {
            return new ConstantKeywordFieldType(this);
        }

        @Override
        public boolean equals(Object o) {
            if (super.equals(o) == false) {
                return false;
            }
            ConstantKeywordFieldType other = (ConstantKeywordFieldType) o;
            return Objects.equals(value, other.value);
        }

        @Override
        public void checkCompatibility(MappedFieldType newFT, List<String> conflicts) {
            super.checkCompatibility(newFT, conflicts);
            ConstantKeywordFieldType newConstantKeywordFT = (ConstantKeywordFieldType) newFT;
            if (this.value != null) {
                if (newConstantKeywordFT.value == null) {
                    conflicts.add("mapper [" + name() + "] cannot unset [value]");
                } else if (Objects.equals(value, newConstantKeywordFT.value) == false) {
                    conflicts.add("mapper [" + name() + "] has different [value] from the value that is configured in mappings: [" + value +
                            "] vs. [" + newConstantKeywordFT.value + "]");
                }
            }
        }

        @Override
        public int hashCode() {
            return 31 * super.hashCode() + Objects.hashCode(value);
        }

        /** Return the value that this field wraps. This may be {@code null} if the field is not configured yet. */
        public String value() {
            return value;
        }

        /** Set the value. */
        public void setValue(String value) {
            checkIfFrozen();
            this.value = Objects.requireNonNull(value);
        }

        @Override
        public String typeName() {
            return CONTENT_TYPE;
        }

        @Override
        public IndexFieldData.Builder fielddataBuilder(String fullyQualifiedIndexName) {
            return new ConstantIndexFieldData.Builder(mapperService -> value);
        }

        @Override
        protected boolean matches(String pattern, QueryShardContext context) {
            if (value == null) {
                return false;
            }
            return Regex.simpleMatch(pattern, value);
        }

        @Override
        public Query rangeQuery(
                Object lowerTerm, Object upperTerm,
                boolean includeLower, boolean includeUpper,
                ShapeRelation relation, ZoneId timeZone, DateMathParser parser,
                QueryShardContext context) {
            if (this.value == null) {
                return new MatchNoDocsQuery();
            }

            final BytesRef valueAsBytesRef = new BytesRef(value);
            if (lowerTerm != null && BytesRefs.toBytesRef(lowerTerm).compareTo(valueAsBytesRef) >= (includeLower ? 1 : 0)) {
                return new MatchNoDocsQuery();
            }
            if (upperTerm != null && valueAsBytesRef.compareTo(BytesRefs.toBytesRef(upperTerm)) >= (includeUpper ? 1 : 0)) {
                return new MatchNoDocsQuery();
            }
            return new MatchAllDocsQuery();
        }

        @Override
        public Query fuzzyQuery(Object value, Fuzziness fuzziness, int prefixLength, int maxExpansions,
                boolean transpositions, QueryShardContext context) {
            if (this.value == null) {
                return new MatchNoDocsQuery();
            }

            final String termAsString = BytesRefs.toString(value);
            final int maxEdits = fuzziness.asDistance(termAsString);

            final int[] termText = new int[termAsString.codePointCount(0, termAsString.length())];
            for (int cp, i = 0, j = 0; i < termAsString.length(); i += Character.charCount(cp)) {
                termText[j++] = cp = termAsString.codePointAt(i);
            }
            final int termLength = termText.length;

            prefixLength = Math.min(prefixLength, termLength);
            final String suffix = UnicodeUtil.newString(termText, prefixLength, termText.length - prefixLength);
            final LevenshteinAutomata builder = new LevenshteinAutomata(suffix, transpositions);
            final String prefix = UnicodeUtil.newString(termText, 0, prefixLength);
            final Automaton automaton = builder.toAutomaton(maxEdits, prefix);

            final CharacterRunAutomaton runAutomaton = new CharacterRunAutomaton(automaton);
            if (runAutomaton.run(this.value)) {
                return new MatchAllDocsQuery();
            } else {
                return new MatchNoDocsQuery();
            }
        }

        @Override
        public Query regexpQuery(String value, int flags, int maxDeterminizedStates,
                MultiTermQuery.RewriteMethod method, QueryShardContext context) {
            if (this.value == null) {
                return new MatchNoDocsQuery();
            }

            final Automaton automaton = new RegExp(value, flags).toAutomaton(maxDeterminizedStates);
            final CharacterRunAutomaton runAutomaton = new CharacterRunAutomaton(automaton);
            if (runAutomaton.run(this.value)) {
                return new MatchAllDocsQuery();
            } else {
                return new MatchNoDocsQuery();
            }
        }

    }

    ConstantKeywordFieldMapper(String simpleName, MappedFieldType fieldType, MappedFieldType defaultFieldType,
            Settings indexSettings) {
        super(simpleName, fieldType, defaultFieldType, indexSettings, MultiFields.empty(), CopyTo.empty());
    }

    @Override
    protected ConstantKeywordFieldMapper clone() {
        return (ConstantKeywordFieldMapper) super.clone();
    }

    @Override
    public ConstantKeywordFieldType fieldType() {
        return (ConstantKeywordFieldType) super.fieldType();
    }

    @Override
    protected void parseCreateField(ParseContext context, List<IndexableField> fields) throws IOException {
        String value;
        if (context.externalValueSet()) {
            value = context.externalValue().toString();
        } else {
            XContentParser parser = context.parser();
            value = parser.textOrNull();
        }

        if (value == null) {
            throw new IllegalArgumentException("[constant_keyword] field [" + name() + "] doesn't accept [null] values");
        }

        if (fieldType().value == null) {
            ConstantKeywordFieldType newFieldType = new ConstantKeywordFieldType(fieldType());
            newFieldType.setValue(value);
            newFieldType.freeze();
            Mapper update = new ConstantKeywordFieldMapper(
                    simpleName(), newFieldType, defaultFieldType, context.indexSettings().getSettings());
            context.addDynamicMapper(update);
        } else if (Objects.equals(fieldType().value, value) == false) {
            throw new IllegalArgumentException("[constant_keyword] field [" + name() +
                    "] only accepts values that are equal to the value defined in the mappings [" + fieldType().value() +
                    "], but got [" + value + "]");
        }
    }

    @Override
    protected String contentType() {
        return CONTENT_TYPE;
    }

    @Override
    protected void doXContentBody(XContentBuilder builder, boolean includeDefaults, Params params) throws IOException {
        super.doXContentBody(builder, includeDefaults, params);
        if (fieldType().value() != null) {
            builder.field("value", fieldType().value());
        }
    }
}

@ -0,0 +1,110 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.constantkeyword.mapper;

import org.elasticsearch.common.Strings;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.compress.CompressedXContent;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.IndexService;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.MapperService.MergeReason;
import org.elasticsearch.index.mapper.ParsedDocument;
import org.elasticsearch.index.mapper.SourceToParse;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.test.ESSingleNodeTestCase;
import org.elasticsearch.xpack.constantkeyword.ConstantKeywordMapperPlugin;
import org.elasticsearch.xpack.core.LocalStateCompositeXPackPlugin;

import java.util.Collection;
import java.util.Collections;

public class ConstantKeywordFieldMapperTests extends ESSingleNodeTestCase {

    @Override
    protected Collection<Class<? extends Plugin>> getPlugins() {
        return pluginList(ConstantKeywordMapperPlugin.class, LocalStateCompositeXPackPlugin.class);
    }

    public void testDefaults() throws Exception {
        IndexService indexService = createIndex("test");
        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("_doc")
                .startObject("properties").startObject("field").field("type", "constant_keyword")
                .field("value", "foo").endObject().endObject().endObject().endObject());
        DocumentMapper mapper = indexService.mapperService().merge("_doc", new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
        assertEquals(mapping, mapper.mappingSource().toString());

        BytesReference source = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().endObject());
        ParsedDocument doc = mapper.parse(new SourceToParse("test", "_doc", "1", source, XContentType.JSON));
        assertNull(doc.rootDoc().getField("field"));

        source = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", "foo").endObject());
        doc = mapper.parse(new SourceToParse("test", "_doc", "1", source, XContentType.JSON));
        assertNull(doc.rootDoc().getField("field"));

        BytesReference illegalSource = BytesReference.bytes(XContentFactory.jsonBuilder()
                .startObject().field("field", "bar").endObject());
        MapperParsingException e = expectThrows(MapperParsingException.class,
                () -> mapper.parse(new SourceToParse("test", "_doc", "1", illegalSource, XContentType.JSON)));
        assertEquals("[constant_keyword] field [field] only accepts values that are equal to the value defined in the mappings [foo], " +
                "but got [bar]", e.getCause().getMessage());
    }

    public void testDynamicValue() throws Exception {
        IndexService indexService = createIndex("test");
        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("_doc")
                .startObject("properties").startObject("field").field("type", "constant_keyword")
                .endObject().endObject().endObject().endObject());
        DocumentMapper mapper = indexService.mapperService().merge("_doc", new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
        assertEquals(mapping, mapper.mappingSource().toString());

        BytesReference source = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", "foo").endObject());
        ParsedDocument doc = mapper.parse(new SourceToParse("test", "_doc", "1", source, XContentType.JSON));
        assertNull(doc.rootDoc().getField("field"));
        assertNotNull(doc.dynamicMappingsUpdate());

        CompressedXContent mappingUpdate = new CompressedXContent(Strings.toString(doc.dynamicMappingsUpdate()));
        DocumentMapper updatedMapper = indexService.mapperService().merge("_doc", mappingUpdate, MergeReason.MAPPING_UPDATE);
        String expectedMapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("_doc")
                .startObject("properties").startObject("field").field("type", "constant_keyword")
                .field("value", "foo").endObject().endObject().endObject().endObject());
        assertEquals(expectedMapping, updatedMapper.mappingSource().toString());

        doc = updatedMapper.parse(new SourceToParse("test", "_doc", "1", source, XContentType.JSON));
        assertNull(doc.rootDoc().getField("field"));
        assertNull(doc.dynamicMappingsUpdate());
    }

    public void testMeta() throws Exception {
        IndexService indexService = createIndex("test");
        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("_doc")
                .startObject("properties").startObject("field").field("type", "constant_keyword")
                .field("meta", Collections.singletonMap("foo", "bar"))
                .endObject().endObject().endObject().endObject());

        DocumentMapper mapper = indexService.mapperService().merge("_doc",
                new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
        assertEquals(mapping, mapper.mappingSource().toString());

        String mapping2 = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("_doc")
                .startObject("properties").startObject("field").field("type", "constant_keyword")
                .endObject().endObject().endObject().endObject());
        mapper = indexService.mapperService().merge("_doc",
                new CompressedXContent(mapping2), MergeReason.MAPPING_UPDATE);
        assertEquals(mapping2, mapper.mappingSource().toString());

        String mapping3 = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("_doc")
                .startObject("properties").startObject("field").field("type", "constant_keyword")
                .field("meta", Collections.singletonMap("baz", "quux"))
                .endObject().endObject().endObject().endObject());
        mapper = indexService.mapperService().merge("_doc",
                new CompressedXContent(mapping3), MergeReason.MAPPING_UPDATE);
        assertEquals(mapping3, mapper.mappingSource().toString());
    }
}

@ -0,0 +1,130 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.constantkeyword.mapper;

import org.apache.lucene.search.MatchAllDocsQuery;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.util.automaton.RegExp;
import org.elasticsearch.common.unit.Fuzziness;
import org.elasticsearch.index.mapper.FieldTypeTestCase;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.xpack.constantkeyword.mapper.ConstantKeywordFieldMapper.ConstantKeywordFieldType;
import org.junit.Before;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class ConstantKeywordFieldTypeTests extends FieldTypeTestCase {

    @Before
    public void setupProperties() {
        addModifier(new Modifier("value", false) {
            @Override
            public void modify(MappedFieldType type) {
                ((ConstantKeywordFieldType) type).setValue("bar");
            }
        });
    }

    public void testSetValue() {
        ConstantKeywordFieldType ft1 = new ConstantKeywordFieldType();
        ft1.setName("field");
        ConstantKeywordFieldType ft2 = new ConstantKeywordFieldType();
        ft2.setName("field");
        ft2.setValue("bar");
        List<String> conflicts = new ArrayList<>();
        ft1.checkCompatibility(ft2, conflicts);
        assertEquals(Collections.emptyList(), conflicts);
    }

    public void testUnsetValue() {
        ConstantKeywordFieldType ft1 = new ConstantKeywordFieldType();
        ft1.setName("field");
        ft1.setValue("foo");
        ConstantKeywordFieldType ft2 = new ConstantKeywordFieldType();
        ft2.setName("field");
        List<String> conflicts = new ArrayList<>();
        ft1.checkCompatibility(ft2, conflicts);
        assertEquals(Collections.singletonList("mapper [field] cannot unset [value]"), conflicts);
    }

    @Override
    protected MappedFieldType createDefaultFieldType() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        ft.setValue("foo");
        return ft;
    }

    public void testTermQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.termQuery("foo", null));
        ft.setValue("foo");
        assertEquals(new MatchAllDocsQuery(), ft.termQuery("foo", null));
        assertEquals(new MatchNoDocsQuery(), ft.termQuery("bar", null));
    }

    public void testTermsQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.termsQuery(Collections.singletonList("foo"), null));
        ft.setValue("foo");
        assertEquals(new MatchAllDocsQuery(), ft.termsQuery(Collections.singletonList("foo"), null));
        assertEquals(new MatchAllDocsQuery(), ft.termsQuery(Arrays.asList("bar", "foo", "quux"), null));
        assertEquals(new MatchNoDocsQuery(), ft.termsQuery(Collections.emptyList(), null));
        assertEquals(new MatchNoDocsQuery(), ft.termsQuery(Collections.singletonList("bar"), null));
        assertEquals(new MatchNoDocsQuery(), ft.termsQuery(Arrays.asList("bar", "quux"), null));
    }

    public void testWildcardQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.wildcardQuery("f*o", null, null));
        ft.setValue("foo");
        assertEquals(new MatchAllDocsQuery(), ft.wildcardQuery("f*o", null, null));
        assertEquals(new MatchNoDocsQuery(), ft.wildcardQuery("b*r", null, null));
    }

    public void testPrefixQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.prefixQuery("fo", null, null));
        ft.setValue("foo");
        assertEquals(new MatchAllDocsQuery(), ft.prefixQuery("fo", null, null));
        assertEquals(new MatchNoDocsQuery(), ft.prefixQuery("ba", null, null));
    }

    public void testRangeQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(null, null, randomBoolean(), randomBoolean(), null, null, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(null, "foo", randomBoolean(), randomBoolean(), null, null, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery("foo", null, randomBoolean(), randomBoolean(), null, null, null, null));
        ft.setValue("foo");
        assertEquals(new MatchAllDocsQuery(), ft.rangeQuery(null, null, randomBoolean(), randomBoolean(), null, null, null, null));
        assertEquals(new MatchAllDocsQuery(), ft.rangeQuery("foo", null, true, randomBoolean(), null, null, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery("foo", null, false, randomBoolean(), null, null, null, null));
        assertEquals(new MatchAllDocsQuery(), ft.rangeQuery(null, "foo", randomBoolean(), true, null, null, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(null, "foo", randomBoolean(), false, null, null, null, null));
        assertEquals(new MatchAllDocsQuery(), ft.rangeQuery("abc", "xyz", randomBoolean(), randomBoolean(), null, null, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery("abc", "def", randomBoolean(), randomBoolean(), null, null, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery("mno", "xyz", randomBoolean(), randomBoolean(), null, null, null, null));
    }

    public void testFuzzyQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.fuzzyQuery("fooquux", Fuzziness.AUTO, 3, 50, randomBoolean(), null));
        ft.setValue("foobar");
        assertEquals(new MatchAllDocsQuery(), ft.fuzzyQuery("foobaz", Fuzziness.AUTO, 3, 50, randomBoolean(), null));
        assertEquals(new MatchNoDocsQuery(), ft.fuzzyQuery("fooquux", Fuzziness.AUTO, 3, 50, randomBoolean(), null));
    }

    public void testRegexpQuery() {
        ConstantKeywordFieldType ft = new ConstantKeywordFieldType();
        assertEquals(new MatchNoDocsQuery(), ft.regexpQuery("f..o", RegExp.ALL, 10, null, null));
        ft.setValue("foo");
        assertEquals(new MatchAllDocsQuery(), ft.regexpQuery("f.o", RegExp.ALL, 10, null, null));
        assertEquals(new MatchNoDocsQuery(), ft.regexpQuery("f..o", RegExp.ALL, 10, null, null));
    }
}

@ -0,0 +1,182 @@
setup:

  - skip:
      version: " - 7.99.99" # TODO: make it 7.6.99 after backport
      reason: "constant_keyword was added in 7.7"

  - do:
      indices.create:
        index: test1
        body:
          mappings:
            properties:
              foo:
                type: constant_keyword
                value: bar

  - do:
      indices.create:
        index: test2
        body:
          mappings:
            properties:
              foo:
                type: constant_keyword
                value: baz

  - do:
      index:
        index: test1
        id: 1
        body: {}

  - do:
      index:
        index: test1
        id: 2
        body: { "foo": "bar" }

  - do:
      index:
        index: test2
        id: 1
        body: {}

  - do:
      indices.refresh: {}

---
"Exist query":

  - do:
      search:
        index: test*
        body:
          size: 0
          query:
            exists:
              field: foo

  - match: { "hits.total.value": 3 }

---
"Term query":

  - do:
      search:
        index: test*
        pre_filter_shard_size: 1
        body:
          size: 0
          query:
            term:
              foo: bar

  - match: { "hits.total.value": 2 }
  - match: { _shards.skipped: 1 }

  - do:
      search:
        index: test*
        pre_filter_shard_size: 1
        body:
          size: 0
          query:
            term:
              foo: baz

  - match: { "hits.total.value": 1 }
  - match: { _shards.skipped: 1 }

---
"Terms query":

  - do:
      search:
        index: test*
        pre_filter_shard_size: 1
        body:
          size: 0
          query:
            terms:
              foo: [bar, quux]

  - match: { "hits.total.value": 2 }
  - match: { _shards.skipped: 1 }

---
"Prefix query":

  - do:
      search:
        index: test*
        body:
          size: 0
          query:
            prefix:
              foo: ba

  - match: { "hits.total.value": 3 }

  - do:
      search:
        index: test*
        pre_filter_shard_size: 1
        body:
          size: 0
          query:
            prefix:
              foo: baz

  - match: { "hits.total.value": 1 }
  - match: { _shards.skipped: 1 }

---
"Wildcard query":

  - do:
      search:
        index: test*
        pre_filter_shard_size: 1
        body:
          size: 0
          query:
            wildcard:
              foo: "*r*"

  - match: { "hits.total.value": 2 }
  - match: { _shards.skipped: 1 }

---
"Terms agg":

  - do:
      search:
        index: test*
        body:
          size: 0
          aggs:
            foo_terms:
              terms:
                field: foo

  - match: { aggregations.foo_terms.buckets.0.key: "bar" }
  - match: { aggregations.foo_terms.buckets.0.doc_count: 2 }
  - match: { aggregations.foo_terms.buckets.1.key: "baz" }
  - match: { aggregations.foo_terms.buckets.1.doc_count: 1 }
  - length: { aggregations.foo_terms.buckets: 2 }

---
"Sort":

  - do:
      search:
        index: test*
        body:
          sort: [ { foo: asc } ]

  - match: { "hits.total.value": 3 }
  - match: { hits.hits.0._index: test1 }
  - match: { hits.hits.1._index: test1 }
  - match: { hits.hits.2._index: test2 }

@ -0,0 +1,82 @@
---
"Dynamic mappings":

  - do:
      indices.create:
        index: test1
        body:
          mappings:
            properties:
              foo:
                type: constant_keyword

  - do:
      index:
        index: test1
        id: 1
        body: {}

  - do:
      indices.get_mapping:
        index: test1

  - match: { test1.mappings.properties.foo.type: constant_keyword }
  - is_false: test1.mappings.properties.foo.value

  - do:
      index:
        index: test1
        id: 1
        body: {}

  - do:
      indices.refresh: {}

  - do:
      indices.get_mapping:
        index: test1

  - match: { test1.mappings.properties.foo.type: constant_keyword }
  - is_false: test1.mappings.properties.foo.value

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            term:
              foo:
                value: bar

  - match: { hits.total.value: 0 }

  - do:
      search:
        index: test1
        body:
          size: 0
          aggs:
            foo_terms:
              terms:
                field: foo

  - match: { hits.total.value: 1 }
  - length: { aggregations.foo_terms.buckets: 0 }

  - do:
      index:
        index: test1
        id: 1
        body:
          foo: bar

  - do:
      indices.refresh: {}

  - do:
      indices.get_mapping:
        index: test1

  - match: { test1.mappings.properties.foo.type: constant_keyword }
  - match: { test1.mappings.properties.foo.value: bar }