Add support for field aliases. (#32172)

* Add basic support for field aliases in index mappings. (#31287)
* Allow for aliases when fetching stored fields. (#31411)
* Add tests around accessing field aliases in scripts. (#31417)
* Add documentation around field aliases. (#31538)
* Add validation for field alias mappings. (#31518)
* Return both concrete fields and aliases in DocumentFieldMappers#getMapper. (#31671)
* Make sure that field-level security is enforced when using field aliases. (#31807)
* Add more comprehensive tests for field aliases in queries + aggregations. (#31565)
* Remove the deprecated method DocumentFieldMappers#getFieldMapper. (#32148)
Julie Tibshirani 2018-07-18 09:33:09 -07:00 committed by GitHub
parent 605dc49c48
commit 15ff3da653
126 changed files with 4090 additions and 1051 deletions

View File

@@ -16,7 +16,8 @@ explicitly by setting `query`, `fielddata` or `request`.
 All caches relating to a specific field(s) can also be cleared by
 specifying `fields` parameter with a comma delimited list of the
-relevant fields.
+relevant fields. Note that the provided names must refer to concrete
+fields -- objects and field aliases are not supported.
 
 [float]
 === Multi Index
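
To illustrate the new constraint, an editor's sketch using the hypothetical `trips` index from the alias documentation added in this commit: the `fields` parameter must name the concrete field, for example `distance`, rather than its alias `route_length_miles`.

[source,js]
--------------------------------
POST /trips/_cache/clear?fields=distance
--------------------------------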

View File

@@ -124,8 +124,10 @@ fields to an existing index with the <<indices-put-mapping,PUT mapping API>>.
 Other than where documented, *existing field mappings cannot be
 updated*. Changing the mapping would mean invalidating already indexed
 documents. Instead, you should create a new index with the correct mappings
-and <<docs-reindex,reindex>> your data into that index.
+and <<docs-reindex,reindex>> your data into that index. If you only wish
+to rename a field and not change its mappings, it may make sense to introduce
+an <<alias, `alias`>> field.
 
 [float]
 == Example mapping

View File

@@ -40,6 +40,8 @@ string:: <<text,`text`>> and <<keyword,`keyword`>>
 <<parent-join>>:: Defines parent/child relation for documents within the same index
 
+<<alias>>:: Defines an alias to an existing field.
+
 <<feature>>:: Record numeric features to boost hits at query time.
 <<feature-vector>>:: Record numeric feature vectors to boost hits at query time.
@@ -58,6 +60,8 @@ the <<analysis-standard-analyzer,`standard` analyzer>>, the
 This is the purpose of _multi-fields_. Most datatypes support multi-fields
 via the <<multi-fields>> parameter.
 
+include::types/alias.asciidoc[]
+
 include::types/array.asciidoc[]
 include::types/binary.asciidoc[]

View File

@@ -0,0 +1,101 @@
[[alias]]
=== Alias datatype

An `alias` mapping defines an alternate name for a field in the index.
The alias can be used in place of the target field in <<search, search>> requests,
and selected other APIs like <<search-field-caps, field capabilities>>.

[source,js]
--------------------------------
PUT trips
{
  "mappings": {
    "_doc": {
      "properties": {
        "distance": {
          "type": "long"
        },
        "route_length_miles": {
          "type": "alias",
          "path": "distance" // <1>
        },
        "transit_mode": {
          "type": "keyword"
        }
      }
    }
  }
}

GET _search
{
  "query": {
    "range" : {
      "route_length_miles" : {
        "gte" : 39
      }
    }
  }
}
--------------------------------
// CONSOLE

<1> The path to the target field. Note that this must be the full path, including any parent
objects (e.g. `object1.object2.field`).

Almost all components of the search request accept field aliases. In particular, aliases can be
used in queries, aggregations, and sort fields, as well as when requesting `docvalue_fields`,
`stored_fields`, suggestions, and highlights. Scripts also support aliases when accessing
field values. Please see the section on <<unsupported-apis, unsupported APIs>> for exceptions.

In some parts of the search request and when requesting field capabilities, field wildcard patterns can be
provided. In these cases, the wildcard pattern will match field aliases in addition to concrete fields:

[source,js]
--------------------------------
GET trips/_field_caps?fields=route_*,transit_mode
--------------------------------
// CONSOLE
// TEST[continued]

[[alias-targets]]
==== Alias targets

There are a few restrictions on the target of an alias:

* The target must be a concrete field, and not an object or another field alias.
* The target field must exist at the time the alias is created.
* If nested objects are defined, a field alias must have the same nested scope as its target.

Additionally, a field alias can only have one target. This means that it is not possible to use a
field alias to query over multiple target fields in a single clause.
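
To search across several fields, each clause must instead name a single field (concrete or alias) explicitly; a sketch against the `trips` mappings above:

[source,js]
--------------------------------
GET trips/_search
{
  "query": {
    "bool": {
      "should": [
        { "range": { "route_length_miles": { "gte": 39 } } },
        { "term": { "transit_mode": "train" } }
      ]
    }
  }
}
--------------------------------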

[[unsupported-apis]]
==== Unsupported APIs

Writes to field aliases are not supported: attempting to use an alias in an index or update request
will result in a failure. Likewise, aliases cannot be used as the target of `copy_to`.

Because alias names are not present in the document source, aliases cannot be used when performing
source filtering. For example, the following request will return an empty result for `_source`:

[source,js]
--------------------------------
GET /_search
{
  "query" : {
    "match_all": {}
  },
  "_source": "route_length_miles"
}
--------------------------------
// CONSOLE
// TEST[continued]

Currently only the search and field capabilities APIs will accept and resolve field aliases.
Other APIs that accept field names, such as <<docs-termvectors, term vectors>>, cannot be used
with field aliases.

Finally, some queries, such as `terms`, `geo_shape`, and `more_like_this`, allow for fetching query
information from an indexed document. Because field aliases aren't supported when fetching documents,
the part of the query that specifies the lookup path cannot refer to a field by its alias.
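
As a concrete illustration of this last point, a sketch of a `terms` lookup against the `trips` example (hypothetical document id): the field being queried may be an alias, but the lookup `path` must name the concrete field `distance`, not `route_length_miles`.

[source,js]
--------------------------------
GET trips/_search
{
  "query": {
    "terms": {
      "route_length_miles": {
        "index": "trips",
        "type": "_doc",
        "id": "1",
        "path": "distance"
      }
    }
  }
}
--------------------------------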

View File

@@ -20,27 +20,52 @@
 package org.elasticsearch.script.expression;
 
 import org.elasticsearch.common.settings.Settings;
-import org.elasticsearch.index.IndexService;
-import org.elasticsearch.index.query.QueryShardContext;
+import org.elasticsearch.index.fielddata.AtomicNumericFieldData;
+import org.elasticsearch.index.fielddata.IndexNumericFieldData;
+import org.elasticsearch.index.fielddata.SortedNumericDoubleValues;
+import org.elasticsearch.index.mapper.MapperService;
+import org.elasticsearch.index.mapper.NumberFieldMapper.NumberFieldType;
+import org.elasticsearch.index.mapper.NumberFieldMapper.NumberType;
 import org.elasticsearch.script.ScriptException;
 import org.elasticsearch.script.SearchScript;
 import org.elasticsearch.search.lookup.SearchLookup;
-import org.elasticsearch.test.ESSingleNodeTestCase;
+import org.elasticsearch.test.ESTestCase;
 
+import java.io.IOException;
 import java.text.ParseException;
 import java.util.Collections;
 
-public class ExpressionTests extends ESSingleNodeTestCase {
-    ExpressionScriptEngine service;
-    SearchLookup lookup;
+import static org.mockito.Matchers.anyInt;
+import static org.mockito.Matchers.anyObject;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
+
+public class ExpressionTests extends ESTestCase {
+    private ExpressionScriptEngine service;
+    private SearchLookup lookup;
 
     @Override
     public void setUp() throws Exception {
         super.setUp();
-        IndexService index = createIndex("test", Settings.EMPTY, "type", "d", "type=double");
+
+        NumberFieldType fieldType = new NumberFieldType(NumberType.DOUBLE);
+        MapperService mapperService = mock(MapperService.class);
+        when(mapperService.fullName("field")).thenReturn(fieldType);
+        when(mapperService.fullName("alias")).thenReturn(fieldType);
+
+        SortedNumericDoubleValues doubleValues = mock(SortedNumericDoubleValues.class);
+        when(doubleValues.advanceExact(anyInt())).thenReturn(true);
+        when(doubleValues.nextValue()).thenReturn(2.718);
+
+        AtomicNumericFieldData atomicFieldData = mock(AtomicNumericFieldData.class);
+        when(atomicFieldData.getDoubleValues()).thenReturn(doubleValues);
+
+        IndexNumericFieldData fieldData = mock(IndexNumericFieldData.class);
+        when(fieldData.getFieldName()).thenReturn("field");
+        when(fieldData.load(anyObject())).thenReturn(atomicFieldData);
+
         service = new ExpressionScriptEngine(Settings.EMPTY);
-        QueryShardContext shardContext = index.newQueryShardContext(0, null, () -> 0, null);
-        lookup = new SearchLookup(index.mapperService(), shardContext::getForField, null);
+        lookup = new SearchLookup(mapperService, ignored -> fieldData, null);
     }
 
     private SearchScript.LeafFactory compile(String expression) {
@@ -50,22 +75,38 @@ public class ExpressionTests extends ESSingleNodeTestCase {
 
     public void testNeedsScores() {
         assertFalse(compile("1.2").needs_score());
-        assertFalse(compile("doc['d'].value").needs_score());
+        assertFalse(compile("doc['field'].value").needs_score());
         assertTrue(compile("1/_score").needs_score());
-        assertTrue(compile("doc['d'].value * _score").needs_score());
+        assertTrue(compile("doc['field'].value * _score").needs_score());
     }
 
     public void testCompileError() {
         ScriptException e = expectThrows(ScriptException.class, () -> {
-            compile("doc['d'].value * *@#)(@$*@#$ + 4");
+            compile("doc['field'].value * *@#)(@$*@#$ + 4");
         });
         assertTrue(e.getCause() instanceof ParseException);
     }
 
     public void testLinkError() {
         ScriptException e = expectThrows(ScriptException.class, () -> {
-            compile("doc['e'].value * 5");
+            compile("doc['nonexistent'].value * 5");
         });
         assertTrue(e.getCause() instanceof ParseException);
     }
+
+    public void testFieldAccess() throws IOException {
+        SearchScript script = compile("doc['field'].value").newInstance(null);
+        script.setDocument(1);
+
+        double result = script.runAsDouble();
+        assertEquals(2.718, result, 0.0);
+    }
+
+    public void testFieldAccessWithFieldAlias() throws IOException {
+        SearchScript script = compile("doc['alias'].value").newInstance(null);
+        script.setDocument(1);
+
+        double result = script.runAsDouble();
+        assertEquals(2.718, result, 0.0);
+    }
 }

View File

@@ -194,6 +194,10 @@ final class PercolateQuery extends Query implements Accountable {
         return candidateMatchesQuery;
     }
 
+    Query getVerifiedMatchesQuery() {
+        return verifiedMatchesQuery;
+    }
+
     // Comparing identity here to avoid being cached
     // Note that in theory if the same instance gets used multiple times it could still get cached,
     // however since we create a new query instance each time we this query this shouldn't happen and thus

View File

@@ -618,13 +618,13 @@
             docSearcher.setQueryCache(null);
         }
 
-        PercolatorFieldMapper percolatorFieldMapper = (PercolatorFieldMapper) docMapper.mappers().getMapper(field);
-        boolean mapUnmappedFieldsAsString = percolatorFieldMapper.isMapUnmappedFieldAsText();
-        QueryShardContext percolateShardContext = wrap(context);
-        String name = this.name != null ? this.name : field;
         PercolatorFieldMapper.FieldType pft = (PercolatorFieldMapper.FieldType) fieldType;
-        PercolateQuery.QueryStore queryStore = createStore(pft.queryBuilderField, percolateShardContext, mapUnmappedFieldsAsString);
+        String name = this.name != null ? this.name : pft.name();
+        QueryShardContext percolateShardContext = wrap(context);
+        PercolateQuery.QueryStore queryStore = createStore(pft.queryBuilderField,
+            percolateShardContext,
+            pft.mapUnmappedFieldsAsText);
 
         return pft.percolateQuery(name, queryStore, documents, docSearcher, context.indexVersionCreated());
     }

View File

@@ -136,6 +136,8 @@ public class PercolatorFieldMapper extends FieldMapper {
             fieldType.rangeField = rangeFieldMapper.fieldType();
             NumberFieldMapper minimumShouldMatchFieldMapper = createMinimumShouldMatchField(context);
             fieldType.minimumShouldMatchField = minimumShouldMatchFieldMapper.fieldType();
+            fieldType.mapUnmappedFieldsAsText = getMapUnmappedFieldAsText(context.indexSettings());
+
             context.path().remove();
             setupFieldType(context);
             return new PercolatorFieldMapper(name(), fieldType, defaultFieldType, context.indexSettings(),
@@ -143,6 +145,10 @@ public class PercolatorFieldMapper extends FieldMapper {
                 extractionResultField, queryBuilderField, rangeFieldMapper, minimumShouldMatchFieldMapper);
         }
 
+        private static boolean getMapUnmappedFieldAsText(Settings indexSettings) {
+            return INDEX_MAP_UNMAPPED_FIELDS_AS_TEXT_SETTING.get(indexSettings);
+        }
+
         static KeywordFieldMapper createExtractQueryFieldBuilder(String name, BuilderContext context) {
             KeywordFieldMapper.Builder queryMetaDataFieldBuilder = new KeywordFieldMapper.Builder(name);
             queryMetaDataFieldBuilder.docValues(false);
@@ -195,6 +201,7 @@ public class PercolatorFieldMapper extends FieldMapper {
         MappedFieldType minimumShouldMatchField;
         RangeFieldMapper.RangeFieldType rangeField;
+        boolean mapUnmappedFieldsAsText;
 
         FieldType() {
             setIndexOptions(IndexOptions.NONE);
@@ -209,6 +216,7 @@ public class PercolatorFieldMapper extends FieldMapper {
             queryBuilderField = ref.queryBuilderField;
             rangeField = ref.rangeField;
             minimumShouldMatchField = ref.minimumShouldMatchField;
+            mapUnmappedFieldsAsText = ref.mapUnmappedFieldsAsText;
         }
 
         @Override
@@ -327,7 +335,6 @@ public class PercolatorFieldMapper extends FieldMapper {
     }
 
-    private final boolean mapUnmappedFieldAsText;
     private final Supplier<QueryShardContext> queryShardContext;
     private KeywordFieldMapper queryTermsField;
     private KeywordFieldMapper extractionResultField;
@@ -348,14 +355,9 @@ public class PercolatorFieldMapper extends FieldMapper {
         this.extractionResultField = extractionResultField;
         this.queryBuilderField = queryBuilderField;
         this.minimumShouldMatchFieldMapper = minimumShouldMatchFieldMapper;
-        this.mapUnmappedFieldAsText = getMapUnmappedFieldAsText(indexSettings);
         this.rangeFieldMapper = rangeFieldMapper;
     }
 
-    private static boolean getMapUnmappedFieldAsText(Settings indexSettings) {
-        return INDEX_MAP_UNMAPPED_FIELDS_AS_TEXT_SETTING.get(indexSettings);
-    }
-
     @Override
     public FieldMapper updateFieldType(Map<String, MappedFieldType> fullNameToFieldType) {
         PercolatorFieldMapper updated = (PercolatorFieldMapper) super.updateFieldType(fullNameToFieldType);
@@ -402,7 +404,7 @@ public class PercolatorFieldMapper extends FieldMapper {
         Version indexVersion = context.mapperService().getIndexSettings().getIndexVersionCreated();
         createQueryBuilderField(indexVersion, queryBuilderField, queryBuilder, context);
-        Query query = toQuery(queryShardContext, mapUnmappedFieldAsText, queryBuilder);
+        Query query = toQuery(queryShardContext, isMapUnmappedFieldAsText(), queryBuilder);
         processQuery(query, context);
         return null;
     }
@@ -522,7 +524,7 @@ public class PercolatorFieldMapper extends FieldMapper {
     }
 
     boolean isMapUnmappedFieldAsText() {
-        return mapUnmappedFieldAsText;
+        return ((FieldType) fieldType).mapUnmappedFieldsAsText;
     }
 
     /**

View File

@@ -194,8 +194,7 @@ public class CandidateQueryTests extends ESSingleNodeTestCase {
         }
         Collections.sort(intValues);
 
-        MappedFieldType intFieldType = mapperService.documentMapper("type").mappers()
-            .getMapper("int_field").fieldType();
+        MappedFieldType intFieldType = mapperService.fullName("int_field");
 
         List<Supplier<Query>> queryFunctions = new ArrayList<>();
         queryFunctions.add(MatchNoDocsQuery::new);
@@ -327,8 +326,7 @@ public class CandidateQueryTests extends ESSingleNodeTestCase {
         stringValues.add("value2");
         stringValues.add("value3");
 
-        MappedFieldType intFieldType = mapperService.documentMapper("type").mappers()
-            .getMapper("int_field").fieldType();
+        MappedFieldType intFieldType = mapperService.fullName("int_field");
 
         List<int[]> ranges = new ArrayList<>();
         ranges.add(new int[]{-5, 5});
         ranges.add(new int[]{0, 10});

View File

@@ -75,7 +75,8 @@
         PercolateQueryBuilder.DOCUMENTS_FIELD.getPreferredName()
     };
 
-    private static String queryField;
+    private static String queryField = "field";
+    private static String aliasField = "alias";
     private static String docType;
 
     private String indexedDocumentIndex;
@@ -96,9 +97,11 @@
     @Override
     protected void initializeAdditionalMappings(MapperService mapperService) throws IOException {
         queryField = randomAlphaOfLength(4);
+        aliasField = randomAlphaOfLength(4);
+
         String docType = "_doc";
         mapperService.merge(docType, new CompressedXContent(Strings.toString(PutMappingRequest.buildFromSimplifiedDef(docType,
-            queryField, "type=percolator"
+            queryField, "type=percolator", aliasField, "type=alias,path=" + queryField
         ))), MapperService.MergeReason.MAPPING_UPDATE);
         mapperService.merge(docType, new CompressedXContent(Strings.toString(PutMappingRequest.buildFromSimplifiedDef(docType,
             STRING_FIELD_NAME, "type=text"
@@ -355,4 +358,21 @@
         builder = rewriteAndFetch(builder, createShardContext());
         builder.writeTo(new BytesStreamOutput(10));
     }
+
+    public void testFieldAlias() throws IOException {
+        QueryShardContext shardContext = createShardContext();
+
+        PercolateQueryBuilder builder = doCreateTestQueryBuilder(false);
+        QueryBuilder rewrittenBuilder = rewriteAndFetch(builder, shardContext);
+        PercolateQuery query = (PercolateQuery) rewrittenBuilder.toQuery(shardContext);
+
+        PercolateQueryBuilder aliasBuilder = new PercolateQueryBuilder(aliasField,
+            builder.getDocuments(),
+            builder.getXContentType());
+        QueryBuilder rewrittenAliasBuilder = rewriteAndFetch(aliasBuilder, shardContext);
+        PercolateQuery aliasQuery = (PercolateQuery) rewrittenAliasBuilder.toQuery(shardContext);
+
+        assertEquals(query.getCandidateMatchesQuery(), aliasQuery.getCandidateMatchesQuery());
+        assertEquals(query.getVerifiedMatchesQuery(), aliasQuery.getVerifiedMatchesQuery());
+    }
 }

View File

@@ -224,10 +224,10 @@ public class PercolatorFieldMapperTests extends ESSingleNodeTestCase {
     public void testExtractRanges() throws Exception {
         addQueryFieldMappings();
         BooleanQuery.Builder bq = new BooleanQuery.Builder();
-        Query rangeQuery1 = mapperService.documentMapper("doc").mappers().getMapper("number_field1").fieldType()
+        Query rangeQuery1 = mapperService.fullName("number_field1")
             .rangeQuery(10, 20, true, true, null, null, null, null);
         bq.add(rangeQuery1, Occur.MUST);
-        Query rangeQuery2 = mapperService.documentMapper("doc").mappers().getMapper("number_field1").fieldType()
+        Query rangeQuery2 = mapperService.fullName("number_field1")
             .rangeQuery(15, 20, true, true, null, null, null, null);
         bq.add(rangeQuery2, Occur.MUST);
 
@@ -255,7 +255,7 @@ public class PercolatorFieldMapperTests extends ESSingleNodeTestCase {
         // Range queries on different fields:
         bq = new BooleanQuery.Builder();
         bq.add(rangeQuery1, Occur.MUST);
-        rangeQuery2 = mapperService.documentMapper("doc").mappers().getMapper("number_field2").fieldType()
+        rangeQuery2 = mapperService.fullName("number_field2")
            .rangeQuery(15, 20, true, true, null, null, null, null);
         bq.add(rangeQuery2, Occur.MUST);

View File

@@ -39,7 +39,7 @@ import org.elasticsearch.common.xcontent.XContentType;
 import org.elasticsearch.index.IndexService;
 import org.elasticsearch.index.mapper.DocumentFieldMappers;
 import org.elasticsearch.index.mapper.DocumentMapper;
-import org.elasticsearch.index.mapper.FieldMapper;
+import org.elasticsearch.index.mapper.Mapper;
 import org.elasticsearch.index.shard.ShardId;
 import org.elasticsearch.indices.IndicesService;
 import org.elasticsearch.indices.TypeMissingException;
@@ -174,19 +174,19 @@
         final DocumentFieldMappers allFieldMappers = documentMapper.mappers();
         for (String field : request.fields()) {
             if (Regex.isMatchAllPattern(field)) {
-                for (FieldMapper fieldMapper : allFieldMappers) {
-                    addFieldMapper(fieldPredicate, fieldMapper.fieldType().name(), fieldMapper, fieldMappings, request.includeDefaults());
+                for (Mapper fieldMapper : allFieldMappers) {
+                    addFieldMapper(fieldPredicate, fieldMapper.name(), fieldMapper, fieldMappings, request.includeDefaults());
                 }
             } else if (Regex.isSimpleMatchPattern(field)) {
-                for (FieldMapper fieldMapper : allFieldMappers) {
-                    if (Regex.simpleMatch(field, fieldMapper.fieldType().name())) {
-                        addFieldMapper(fieldPredicate, fieldMapper.fieldType().name(),
+                for (Mapper fieldMapper : allFieldMappers) {
+                    if (Regex.simpleMatch(field, fieldMapper.name())) {
+                        addFieldMapper(fieldPredicate, fieldMapper.name(),
                             fieldMapper, fieldMappings, request.includeDefaults());
                     }
                 }
             } else {
                 // not a pattern
-                FieldMapper fieldMapper = allFieldMappers.getMapper(field);
+                Mapper fieldMapper = allFieldMappers.getMapper(field);
                 if (fieldMapper != null) {
                     addFieldMapper(fieldPredicate, field, fieldMapper, fieldMappings, request.includeDefaults());
                 } else if (request.probablySingleFieldRequest()) {
@@ -198,7 +198,7 @@
     }
 
     private static void addFieldMapper(Predicate<String> fieldPredicate,
-                                       String field, FieldMapper fieldMapper, Map<String, FieldMappingMetaData> fieldMappings,
+                                       String field, Mapper fieldMapper, Map<String, FieldMappingMetaData> fieldMappings,
                                        boolean includeDefaults) {
         if (fieldMappings.containsKey(field)) {
             return;
@@ -207,7 +207,7 @@
         try {
             BytesReference bytes = XContentHelper.toXContent(fieldMapper, XContentType.JSON,
                 includeDefaults ? includeDefaultsParams : ToXContent.EMPTY_PARAMS, false);
-            fieldMappings.put(field, new FieldMappingMetaData(fieldMapper.fieldType().name(), bytes));
+            fieldMappings.put(field, new FieldMappingMetaData(fieldMapper.name(), bytes));
         } catch (IOException e) {
             throw new ElasticsearchException("failed to serialize XContent of field [" + field + "]", e);
         }

View File

@@ -83,8 +83,8 @@
         for (String field : fieldNames) {
             MappedFieldType ft = mapperService.fullName(field);
             if (ft != null) {
-                FieldCapabilities fieldCap = new FieldCapabilities(field, ft.typeName(), ft.isSearchable(), ft.isAggregatable());
-                if (indicesService.isMetaDataField(field) || fieldPredicate.test(field)) {
+                if (indicesService.isMetaDataField(field) || fieldPredicate.test(ft.name())) {
+                    FieldCapabilities fieldCap = new FieldCapabilities(field, ft.typeName(), ft.isSearchable(), ft.isAggregatable());
                     responseMap.put(field, fieldCap);
                 }
             }

View File

@@ -27,8 +27,6 @@ import org.elasticsearch.common.unit.TimeValue;
 import org.elasticsearch.index.engine.Engine;
 import org.elasticsearch.index.fielddata.IndexFieldData;
 import org.elasticsearch.index.fielddata.IndexFieldDataService;
-import org.elasticsearch.index.mapper.DocumentMapper;
-import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.shard.IndexShard;
@@ -121,16 +119,12 @@ public final class IndexWarmer extends AbstractComponent {
         public TerminationHandle warmReader(final IndexShard indexShard, final Engine.Searcher searcher) {
             final MapperService mapperService = indexShard.mapperService();
             final Map<String, MappedFieldType> warmUpGlobalOrdinals = new HashMap<>();
-            DocumentMapper docMapper = mapperService.documentMapper();
-            if (docMapper != null) {
-                for (FieldMapper fieldMapper : docMapper.mappers()) {
-                    final MappedFieldType fieldType = fieldMapper.fieldType();
-                    final String indexName = fieldType.name();
-                    if (fieldType.eagerGlobalOrdinals() == false) {
-                        continue;
-                    }
-                    warmUpGlobalOrdinals.put(indexName, fieldType);
+            for (MappedFieldType fieldType : mapperService.fieldTypes()) {
+                final String indexName = fieldType.name();
+                if (fieldType.eagerGlobalOrdinals() == false) {
+                    continue;
                 }
+                warmUpGlobalOrdinals.put(indexName, fieldType);
             }
             final CountDownLatch latch = new CountDownLatch(warmUpGlobalOrdinals.size());
             for (final MappedFieldType fieldType : warmUpGlobalOrdinals.values()) {

View File

@@ -19,11 +19,8 @@
 package org.elasticsearch.index.fieldvisitor;
 
 import org.apache.lucene.index.FieldInfo;
-import org.elasticsearch.common.regex.Regex;
 
 import java.io.IOException;
-import java.util.Collections;
-import java.util.List;
 import java.util.Set;
 
 /**
@@ -35,16 +32,10 @@ import java.util.Set;
 public class CustomFieldsVisitor extends FieldsVisitor {
 
     private final Set<String> fields;
-    private final List<String> patterns;
-
-    public CustomFieldsVisitor(Set<String> fields, List<String> patterns, boolean loadSource) {
-        super(loadSource);
-        this.fields = fields;
-        this.patterns = patterns;
-    }
 
     public CustomFieldsVisitor(Set<String> fields, boolean loadSource) {
-        this(fields, Collections.emptyList(), loadSource);
+        super(loadSource);
+        this.fields = fields;
     }
 
     @Override
@@ -55,11 +46,6 @@ public class CustomFieldsVisitor extends FieldsVisitor {
         if (fields.contains(fieldInfo.name)) {
             return Status.YES;
         }
-        for (String pattern : patterns) {
-            if (Regex.simpleMatch(pattern, fieldInfo.name)) {
-                return Status.YES;
-            }
-        }
         return Status.NO;
     }
 }

View File

@@ -39,7 +39,7 @@ import org.elasticsearch.index.engine.Engine;
 import org.elasticsearch.index.fieldvisitor.CustomFieldsVisitor;
 import org.elasticsearch.index.fieldvisitor.FieldsVisitor;
 import org.elasticsearch.index.mapper.DocumentMapper;
-import org.elasticsearch.index.mapper.FieldMapper;
+import org.elasticsearch.index.mapper.Mapper;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.RoutingFieldMapper;
 import org.elasticsearch.index.mapper.SourceFieldMapper;
@@ -202,7 +202,7 @@ public final class ShardGetService extends AbstractIndexShardComponent {
         if (gFields != null && gFields.length > 0) {
             for (String field : gFields) {
-                FieldMapper fieldMapper = docMapper.mappers().getMapper(field);
+                Mapper fieldMapper = docMapper.mappers().getMapper(field);
                 if (fieldMapper == null) {
                     if (docMapper.objectMappers().get(field) != null) {
                         // Only fail if we know it is a object field, missing paths / fields shouldn't fail.

View File

@@ -28,10 +28,10 @@ import java.util.HashMap;
 import java.util.Iterator;
 import java.util.Map;
 
-public final class DocumentFieldMappers implements Iterable<FieldMapper> {
+public final class DocumentFieldMappers implements Iterable<Mapper> {
 
     /** Full field name to mapper */
-    private final Map<String, FieldMapper> fieldMappers;
+    private final Map<String, Mapper> fieldMappers;
 
     private final FieldNameAnalyzer indexAnalyzer;
     private final FieldNameAnalyzer searchAnalyzer;
@@ -44,8 +44,12 @@ public final class DocumentFieldMappers implements Iterable<FieldMapper> {
         analyzers.put(key, value);
     }
 
-    public DocumentFieldMappers(Collection<FieldMapper> mappers, Analyzer defaultIndex, Analyzer defaultSearch, Analyzer defaultSearchQuote) {
-        Map<String, FieldMapper> fieldMappers = new HashMap<>();
+    public DocumentFieldMappers(Collection<FieldMapper> mappers,
+                                Collection<FieldAliasMapper> aliasMappers,
+                                Analyzer defaultIndex,
+                                Analyzer defaultSearch,
+                                Analyzer defaultSearchQuote) {
+        Map<String, Mapper> fieldMappers = new HashMap<>();
         Map<String, Analyzer> indexAnalyzers = new HashMap<>();
         Map<String, Analyzer> searchAnalyzers = new HashMap<>();
         Map<String, Analyzer> searchQuoteAnalyzers = new HashMap<>();
@@ -56,14 +60,24 @@ public final class DocumentFieldMappers implements Iterable<FieldMapper> {
             put(searchAnalyzers, fieldType.name(), fieldType.searchAnalyzer(), defaultSearch);
             put(searchQuoteAnalyzers, fieldType.name(), fieldType.searchQuoteAnalyzer(), defaultSearchQuote);
         }
+
+        for (FieldAliasMapper aliasMapper : aliasMappers) {
+            fieldMappers.put(aliasMapper.name(), aliasMapper);
+        }
+
         this.fieldMappers = Collections.unmodifiableMap(fieldMappers);
         this.indexAnalyzer = new FieldNameAnalyzer(indexAnalyzers);
         this.searchAnalyzer = new FieldNameAnalyzer(searchAnalyzers);
         this.searchQuoteAnalyzer = new FieldNameAnalyzer(searchQuoteAnalyzers);
     }
 
-    /** Returns the mapper for the given field */
-    public FieldMapper getMapper(String field) {
+    /**
+     * Returns the leaf mapper associated with this field name. Note that the returned mapper
+     * could be either a concrete {@link FieldMapper}, or a {@link FieldAliasMapper}.
+     *
+     * To access a field's type information, {@link MapperService#fullName} should be used instead.
+     */
+    public Mapper getMapper(String field) {
         return fieldMappers.get(field);
     }
 
@@ -87,7 +101,7 @@ public final class DocumentFieldMappers implements Iterable<FieldMapper> {
         return this.searchQuoteAnalyzer;
     }
 
-    public Iterator<FieldMapper> iterator() {
+    public Iterator<Mapper> iterator() {
         return fieldMappers.values().iterator();
     }
 }

View File

@@ -133,15 +133,18 @@ public class DocumentMapper implements ToXContentFragment {
         // collect all the mappers for this type
         List<ObjectMapper> newObjectMappers = new ArrayList<>();
         List<FieldMapper> newFieldMappers = new ArrayList<>();
+        List<FieldAliasMapper> newFieldAliasMappers = new ArrayList<>();
         for (MetadataFieldMapper metadataMapper : this.mapping.metadataMappers) {
             if (metadataMapper instanceof FieldMapper) {
                 newFieldMappers.add(metadataMapper);
             }
         }
-        MapperUtils.collect(this.mapping.root, newObjectMappers, newFieldMappers);
+        MapperUtils.collect(this.mapping.root,
+            newObjectMappers, newFieldMappers, newFieldAliasMappers);
 
         final IndexAnalyzers indexAnalyzers = mapperService.getIndexAnalyzers();
         this.fieldMappers = new DocumentFieldMappers(newFieldMappers,
+            newFieldAliasMappers,
             indexAnalyzers.getDefaultIndexAnalyzer(),
             indexAnalyzers.getDefaultSearchAnalyzer(),
             indexAnalyzers.getDefaultSearchQuoteAnalyzer());

View File

@@ -459,13 +459,18 @@
     private static void parseObjectOrField(ParseContext context, Mapper mapper) throws IOException {
         if (mapper instanceof ObjectMapper) {
             parseObjectOrNested(context, (ObjectMapper) mapper);
-        } else {
-            FieldMapper fieldMapper = (FieldMapper)mapper;
+        } else if (mapper instanceof FieldMapper) {
+            FieldMapper fieldMapper = (FieldMapper) mapper;
             Mapper update = fieldMapper.parse(context);
             if (update != null) {
                 context.addDynamicMapper(update);
             }
             parseCopyFields(context, fieldMapper.copyTo().copyToFields());
+        } else if (mapper instanceof FieldAliasMapper) {
+            throw new IllegalArgumentException("Cannot write to a field alias [" + mapper.name() + "].");
+        } else {
+            throw new IllegalStateException("The provided mapper [" + mapper.name() + "] has an unrecognized type [" +
+                mapper.getClass().getSimpleName() + "].");
         }
     }
@@ -827,9 +832,16 @@
 
     /** Creates an copy of the current field with given field name and boost */
     private static void parseCopy(String field, ParseContext context) throws IOException {
-        FieldMapper fieldMapper = context.docMapper().mappers().getMapper(field);
-        if (fieldMapper != null) {
-            fieldMapper.parse(context);
+        Mapper mapper = context.docMapper().mappers().getMapper(field);
+        if (mapper != null) {
+            if (mapper instanceof FieldMapper) {
+                ((FieldMapper) mapper).parse(context);
+            } else if (mapper instanceof FieldAliasMapper) {
+                throw new IllegalArgumentException("Cannot copy to a field alias [" + mapper.name() + "].");
+            } else {
+                throw new IllegalStateException("The provided mapper [" + mapper.name() +
+                    "] has an unrecognized type [" + mapper.getClass().getSimpleName() + "].");
+            }
         } else {
             // The path of the dest field might be completely different from the current one so we need to reset it
             context = context.overridePath(new ContentPath(0));
@@ -837,8 +849,8 @@
             final String[] paths = splitAndValidatePath(field);
             final String fieldName = paths[paths.length-1];
             Tuple<Integer, ObjectMapper> parentMapperTuple = getDynamicParentMapper(context, paths, null);
-            ObjectMapper mapper = parentMapperTuple.v2();
-            parseDynamicValue(context, mapper, fieldName, context.parser().currentToken());
+            ObjectMapper objectMapper = parentMapperTuple.v2();
+            parseDynamicValue(context, objectMapper, fieldName, context.parser().currentToken());
             for (int i = 0; i < parentMapperTuple.v1(); i++) {
                 context.path().remove();
             }
@@ -849,46 +861,46 @@
                                                           ObjectMapper currentParent) {
         ObjectMapper mapper = currentParent == null ? context.root() : currentParent;
         int pathsAdded = 0;
         ObjectMapper parent = mapper;
         for (int i = 0; i < paths.length-1; i++) {
             String currentPath = context.path().pathAsText(paths[i]);
-            FieldMapper existingFieldMapper = context.docMapper().mappers().getMapper(currentPath);
+            Mapper existingFieldMapper = context.docMapper().mappers().getMapper(currentPath);
             if (existingFieldMapper != null) {
                 throw new MapperParsingException(
                     "Could not dynamically add mapping for field [{}]. Existing mapping for [{}] must be of type object but found [{}].",
-                    null, String.join(".", paths), currentPath, existingFieldMapper.fieldType.typeName());
+                    null, String.join(".", paths), currentPath, existingFieldMapper.typeName());
             }
             mapper = context.docMapper().objectMappers().get(currentPath);
             if (mapper == null) {
                 // One mapping is missing, check if we are allowed to create a dynamic one.
                 ObjectMapper.Dynamic dynamic = dynamicOrDefault(parent, context);
                 switch (dynamic) {
                     case STRICT:
                         throw new StrictDynamicMappingException(parent.fullPath(), paths[i]);
                     case TRUE:
                         Mapper.Builder builder = context.root().findTemplateBuilder(context, paths[i], XContentFieldType.OBJECT);
                         if (builder == null) {
                             builder = new ObjectMapper.Builder(paths[i]).enabled(true);
                         }
                         Mapper.BuilderContext builderContext = new Mapper.BuilderContext(context.indexSettings(), context.path());
                         mapper = (ObjectMapper) builder.build(builderContext);
                         if (mapper.nested() != ObjectMapper.Nested.NO) {
                             throw new MapperParsingException("It is forbidden to create dynamic nested objects ([" + context.path().pathAsText(paths[i])
                                 + "]) through `copy_to` or dots in field names");
                         }
                         context.addDynamicMapper(mapper);
                         break;
                     case FALSE:
                         // Should not dynamically create any more mappers so return the last mapper
                         return new Tuple<>(pathsAdded, parent);
                 }
             }
             context.path().add(paths[i]);
             pathsAdded++;
             parent = mapper;
         }
         return new Tuple<>(pathsAdded, mapper);
     }
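
Given the checks above, a document that supplies a value for an alias should now be rejected at parse time; an editor's sketch against the `trips` mappings from the docs in this commit:

[source,js]
--------------------------------
PUT trips/_doc/1
{
  "route_length_miles": 42
}
--------------------------------

This request would fail with an error along the lines of `Cannot write to a field alias [route_length_miles].`, matching the exception thrown in `parseObjectOrField`.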

View File

@@ -0,0 +1,132 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.mapper;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.support.XContentMapValues;
import java.io.IOException;
import java.util.Collections;
import java.util.Iterator;
import java.util.Map;
/**
* A mapper for field aliases.
*
* A field alias has no concrete field mappings of its own, but instead points to another field by
* its path. Once defined, an alias can be used in place of the concrete field name in search requests.
*/
public final class FieldAliasMapper extends Mapper {
public static final String CONTENT_TYPE = "alias";
public static class Names {
public static final String PATH = "path";
}
private final String name;
private final String path;
public FieldAliasMapper(String simpleName,
String name,
String path) {
super(simpleName);
this.name = name;
this.path = path;
}
@Override
public String name() {
return name;
}
@Override
public String typeName() {
return CONTENT_TYPE;
}
public String path() {
return path;
}
@Override
public Mapper merge(Mapper mergeWith) {
if (!(mergeWith instanceof FieldAliasMapper)) {
throw new IllegalArgumentException("Cannot merge a field alias mapping ["
+ name() + "] with a mapping that is not for a field alias.");
}
return mergeWith;
}
@Override
public Mapper updateFieldType(Map<String, MappedFieldType> fullNameToFieldType) {
return this;
}
@Override
public Iterator<Mapper> iterator() {
return Collections.emptyIterator();
}
@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
return builder.startObject(simpleName())
.field("type", CONTENT_TYPE)
.field(Names.PATH, path)
.endObject();
}
public static class TypeParser implements Mapper.TypeParser {
@Override
public Mapper.Builder parse(String name, Map<String, Object> node, ParserContext parserContext)
throws MapperParsingException {
FieldAliasMapper.Builder builder = new FieldAliasMapper.Builder(name);
Object pathField = node.remove(Names.PATH);
String path = XContentMapValues.nodeStringValue(pathField, null);
if (path == null) {
throw new MapperParsingException("The [path] property must be specified for field [" + name + "].");
}
return builder.path(path);
}
}
public static class Builder extends Mapper.Builder<FieldAliasMapper.Builder, FieldAliasMapper> {
private String name;
private String path;
protected Builder(String name) {
super(name);
this.name = name;
}
public String name() {
return this.name;
}
public Builder path(String path) {
this.path = path;
return this;
}
public FieldAliasMapper build(BuilderContext context) {
String fullName = context.path().pathAsText(name);
return new FieldAliasMapper(name, fullName, path);
}
}
}

View File

@@ -247,6 +247,11 @@ public abstract class FieldMapper extends Mapper implements Cloneable {
         return fieldType().name();
     }
 
+    @Override
+    public String typeName() {
+        return fieldType.typeName();
+    }
+
     public MappedFieldType fieldType() {
         return fieldType;
     }

View File

@@ -35,64 +35,115 @@ import java.util.Set;
  */
 class FieldTypeLookup implements Iterable<MappedFieldType> {
 
+    /** Full field name to field type */
     final CopyOnWriteHashMap<String, MappedFieldType> fullNameToFieldType;
+    private final CopyOnWriteHashMap<String, String> aliasToConcreteName;
 
-    /** Create a new empty instance. */
     FieldTypeLookup() {
         fullNameToFieldType = new CopyOnWriteHashMap<>();
+        aliasToConcreteName = new CopyOnWriteHashMap<>();
     }
 
-    private FieldTypeLookup(CopyOnWriteHashMap<String, MappedFieldType> fullName) {
-        this.fullNameToFieldType = fullName;
+    private FieldTypeLookup(CopyOnWriteHashMap<String, MappedFieldType> fullNameToFieldType,
+                            CopyOnWriteHashMap<String, String> aliasToConcreteName) {
+        this.fullNameToFieldType = fullNameToFieldType;
+        this.aliasToConcreteName = aliasToConcreteName;
     }
 
     /**
      * Return a new instance that contains the union of this instance and the field types
-     * from the provided fields. If a field already exists, the field type will be updated
-     * to use the new mappers field type.
+     * from the provided mappers. If a field already exists, its field type will be updated
+     * to use the new type from the given field mapper. Similarly if an alias already
+     * exists, it will be updated to reference the field type from the new mapper.
      */
-    public FieldTypeLookup copyAndAddAll(String type, Collection<FieldMapper> fieldMappers) {
+    public FieldTypeLookup copyAndAddAll(String type,
+                                         Collection<FieldMapper> fieldMappers,
+                                         Collection<FieldAliasMapper> fieldAliasMappers) {
         Objects.requireNonNull(type, "type must not be null");
         if (MapperService.DEFAULT_MAPPING.equals(type)) {
             throw new IllegalArgumentException("Default mappings should not be added to the lookup");
         }
 
         CopyOnWriteHashMap<String, MappedFieldType> fullName = this.fullNameToFieldType;
+        CopyOnWriteHashMap<String, String> aliases = this.aliasToConcreteName;
 
         for (FieldMapper fieldMapper : fieldMappers) {
             MappedFieldType fieldType = fieldMapper.fieldType();
             MappedFieldType fullNameFieldType = fullName.get(fieldType.name());
 
-            if (fullNameFieldType == null) {
-                // introduction of a new field
-                fullName = fullName.copyAndPut(fieldType.name(), fieldMapper.fieldType());
-            } else {
-                // modification of an existing field
-                checkCompatibility(fullNameFieldType, fieldType);
-                if (fieldType.equals(fullNameFieldType) == false) {
-                    fullName = fullName.copyAndPut(fieldType.name(), fieldMapper.fieldType());
-                }
+            if (!Objects.equals(fieldType, fullNameFieldType)) {
+                validateField(fullNameFieldType, fieldType, aliases);
+                fullName = fullName.copyAndPut(fieldType.name(), fieldType);
             }
         }
-        return new FieldTypeLookup(fullName);
+
+        for (FieldAliasMapper fieldAliasMapper : fieldAliasMappers) {
+            String aliasName = fieldAliasMapper.name();
+            String path = fieldAliasMapper.path();
+
+            validateAlias(aliasName, path, aliases, fullName);
+            aliases = aliases.copyAndPut(aliasName, path);
+        }
+
+        return new FieldTypeLookup(fullName, aliases);
     }
 
     /**
-     * Checks if the given field type is compatible with an existing field type.
-     * An IllegalArgumentException is thrown in case of incompatibility.
+     * Checks that the new field type is valid.
      */
-    private void checkCompatibility(MappedFieldType existingFieldType, MappedFieldType newFieldType) {
-        List<String> conflicts = new ArrayList<>();
-        existingFieldType.checkCompatibility(newFieldType, conflicts);
-        if (conflicts.isEmpty() == false) {
-            throw new IllegalArgumentException("Mapper for [" + newFieldType.name() + "] conflicts with existing mapping:\n" + conflicts.toString());
+    private void validateField(MappedFieldType existingFieldType,
+                               MappedFieldType newFieldType,
+                               CopyOnWriteHashMap<String, String> aliasToConcreteName) {
+        String fieldName = newFieldType.name();
+        if (aliasToConcreteName.containsKey(fieldName)) {
+            throw new IllegalArgumentException("The name for field [" + fieldName + "] has already" +
+                " been used to define a field alias.");
+        }
+
+        if (existingFieldType != null) {
+            List<String> conflicts = new ArrayList<>();
+            existingFieldType.checkCompatibility(newFieldType, conflicts);
+            if (conflicts.isEmpty() == false) {
+                throw new IllegalArgumentException("Mapper for [" + fieldName +
+                    "] conflicts with existing mapping:\n" + conflicts.toString());
+            }
+        }
+    }
+
+    /**
+     * Checks that the new field alias is valid.
+     *
+     * Note that this method assumes that new concrete fields have already been processed, so that it
+     * can verify that an alias refers to an existing concrete field.
+     */
+    private void validateAlias(String aliasName,
+                               String path,
+                               CopyOnWriteHashMap<String, String> aliasToConcreteName,
+                               CopyOnWriteHashMap<String, MappedFieldType> fullNameToFieldType) {
+        if (fullNameToFieldType.containsKey(aliasName)) {
+            throw new IllegalArgumentException("The name for field alias [" + aliasName + "] has already" +
+                " been used to define a concrete field.");
+        }
+
+        if (path.equals(aliasName)) {
+            throw new IllegalArgumentException("Invalid [path] value [" + path + "] for field alias [" +
+                aliasName + "]: an alias cannot refer to itself.");
+        }
+
+        if (aliasToConcreteName.containsKey(path)) {
+            throw new IllegalArgumentException("Invalid [path] value [" + path + "] for field alias [" +
+                aliasName + "]: an alias cannot refer to another alias.");
+        }
+
+        if (!fullNameToFieldType.containsKey(path)) {
+            throw new IllegalArgumentException("Invalid [path] value [" + path + "] for field alias [" +
+                aliasName + "]: an alias must refer to an existing field in the mappings.");
         }
     }
 
     /** Returns the field for the given field */
     public MappedFieldType get(String field) {
-        return fullNameToFieldType.get(field);
+        String concreteField = aliasToConcreteName.getOrDefault(field, field);
+        return fullNameToFieldType.get(concreteField);
     }
 
     /**
@@ -105,6 +156,11 @@ class FieldTypeLookup implements Iterable<MappedFieldType> {
                 fields.add(fieldType.name());
             }
         }
+        for (String aliasName : aliasToConcreteName.keySet()) {
+            if (Regex.simpleMatch(pattern, aliasName)) {
+                fields.add(aliasName);
+            }
+        }
         return fields;
     }
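
The validation above surfaces to users as mapping errors; an editor's sketch of a request that should be rejected because the alias points to another alias (hypothetical field names):

[source,js]
--------------------------------
PUT my_index
{
  "mappings": {
    "_doc": {
      "properties": {
        "distance": { "type": "long" },
        "route_length": { "type": "alias", "path": "distance" },
        "route_length_miles": { "type": "alias", "path": "route_length" }
      }
    }
  }
}
--------------------------------

This should fail with an error like `Invalid [path] value [route_length] for field alias [route_length_miles]: an alias cannot refer to another alias.`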

View File

@@ -173,6 +173,11 @@ public abstract class Mapper implements ToXContentFragment, Iterable<Mapper> {

/** Returns the canonical name which uniquely identifies the mapper against other mappers in a type. */
public abstract String name();
/**
* Returns a name representing the type of this mapper.
*/
public abstract String typeName();

/** Return the merge of {@code mergeWith} into this.
* Both {@code this} and {@code mergeWith} will be left unmodified. */
public abstract Mapper merge(Mapper mergeWith);

View File

@@ -0,0 +1,214 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.mapper;
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Set;
import java.util.stream.Stream;
/**
* A utility class that helps validate certain aspects of a mappings update.
*/
class MapperMergeValidator {
/**
* Validates the overall structure of the mapping addition, including whether
* duplicate fields are present, and if the provided fields have already been
* defined with a different data type.
*
* @param type The mapping type, for use in error messages.
* @param objectMappers The newly added object mappers.
* @param fieldMappers The newly added field mappers.
* @param fieldAliasMappers The newly added field alias mappers.
* @param fullPathObjectMappers All object mappers, indexed by their full path.
* @param fieldTypes All field and field alias mappers, collected into a lookup structure.
*/
public static void validateMapperStructure(String type,
Collection<ObjectMapper> objectMappers,
Collection<FieldMapper> fieldMappers,
Collection<FieldAliasMapper> fieldAliasMappers,
Map<String, ObjectMapper> fullPathObjectMappers,
FieldTypeLookup fieldTypes) {
checkFieldUniqueness(type, objectMappers, fieldMappers,
fieldAliasMappers, fullPathObjectMappers, fieldTypes);
checkObjectsCompatibility(objectMappers, fullPathObjectMappers);
}
private static void checkFieldUniqueness(String type,
Collection<ObjectMapper> objectMappers,
Collection<FieldMapper> fieldMappers,
Collection<FieldAliasMapper> fieldAliasMappers,
Map<String, ObjectMapper> fullPathObjectMappers,
FieldTypeLookup fieldTypes) {
// first check within mapping
Set<String> objectFullNames = new HashSet<>();
for (ObjectMapper objectMapper : objectMappers) {
String fullPath = objectMapper.fullPath();
if (objectFullNames.add(fullPath) == false) {
throw new IllegalArgumentException("Object mapper [" + fullPath + "] is defined twice in mapping for type [" + type + "]");
}
}
Set<String> fieldNames = new HashSet<>();
Stream.concat(fieldMappers.stream(), fieldAliasMappers.stream())
.forEach(mapper -> {
String name = mapper.name();
if (objectFullNames.contains(name)) {
throw new IllegalArgumentException("Field [" + name + "] is defined both as an object and a field in [" + type + "]");
} else if (fieldNames.add(name) == false) {
throw new IllegalArgumentException("Field [" + name + "] is defined twice in [" + type + "]");
}
});
// then check other types
for (String fieldName : fieldNames) {
if (fullPathObjectMappers.containsKey(fieldName)) {
throw new IllegalArgumentException("[" + fieldName + "] is defined as a field in mapping [" + type
+ "] but this name is already used for an object in other types");
}
}
for (String objectPath : objectFullNames) {
if (fieldTypes.get(objectPath) != null) {
throw new IllegalArgumentException("[" + objectPath + "] is defined as an object in mapping [" + type
+ "] but this name is already used for a field in other types");
}
}
}
private static void checkObjectsCompatibility(Collection<ObjectMapper> objectMappers,
Map<String, ObjectMapper> fullPathObjectMappers) {
for (ObjectMapper newObjectMapper : objectMappers) {
ObjectMapper existingObjectMapper = fullPathObjectMappers.get(newObjectMapper.fullPath());
if (existingObjectMapper != null) {
// simulate a merge and ignore the result, we are just interested
// in exceptions here
existingObjectMapper.merge(newObjectMapper);
}
}
}
/**
* Verifies that each field reference, e.g. the value of copy_to or the target
* of a field alias, corresponds to a valid part of the mapping.
*
* @param fieldMappers The newly added field mappers.
* @param fieldAliasMappers The newly added field alias mappers.
* @param fullPathObjectMappers All object mappers, indexed by their full path.
* @param fieldTypes All field and field alias mappers, collected into a lookup structure.
*/
public static void validateFieldReferences(List<FieldMapper> fieldMappers,
List<FieldAliasMapper> fieldAliasMappers,
Map<String, ObjectMapper> fullPathObjectMappers,
FieldTypeLookup fieldTypes) {
validateCopyTo(fieldMappers, fullPathObjectMappers, fieldTypes);
validateFieldAliasTargets(fieldAliasMappers, fullPathObjectMappers);
}
private static void validateCopyTo(List<FieldMapper> fieldMappers,
Map<String, ObjectMapper> fullPathObjectMappers,
FieldTypeLookup fieldTypes) {
for (FieldMapper mapper : fieldMappers) {
if (mapper.copyTo() != null && mapper.copyTo().copyToFields().isEmpty() == false) {
String sourceParent = parentObject(mapper.name());
if (sourceParent != null && fieldTypes.get(sourceParent) != null) {
throw new IllegalArgumentException("[copy_to] may not be used to copy from a multi-field: [" + mapper.name() + "]");
}
final String sourceScope = getNestedScope(mapper.name(), fullPathObjectMappers);
for (String copyTo : mapper.copyTo().copyToFields()) {
String copyToParent = parentObject(copyTo);
if (copyToParent != null && fieldTypes.get(copyToParent) != null) {
throw new IllegalArgumentException("[copy_to] may not be used to copy to a multi-field: [" + copyTo + "]");
}
if (fullPathObjectMappers.containsKey(copyTo)) {
throw new IllegalArgumentException("Cannot copy to field [" + copyTo + "] since it is mapped as an object");
}
final String targetScope = getNestedScope(copyTo, fullPathObjectMappers);
checkNestedScopeCompatibility(sourceScope, targetScope);
}
}
}
}
private static void validateFieldAliasTargets(List<FieldAliasMapper> fieldAliasMappers,
Map<String, ObjectMapper> fullPathObjectMappers) {
for (FieldAliasMapper mapper : fieldAliasMappers) {
String aliasName = mapper.name();
String path = mapper.path();
String aliasScope = getNestedScope(aliasName, fullPathObjectMappers);
String pathScope = getNestedScope(path, fullPathObjectMappers);
if (!Objects.equals(aliasScope, pathScope)) {
StringBuilder message = new StringBuilder("Invalid [path] value [" + path + "] for field alias [" +
aliasName + "]: an alias must have the same nested scope as its target. ");
message.append(aliasScope == null
? "The alias is not nested"
: "The alias's nested scope is [" + aliasScope + "]");
message.append(", but ");
message.append(pathScope == null
? "the target is not nested."
: "the target's nested scope is [" + pathScope + "].");
throw new IllegalArgumentException(message.toString());
}
}
}
private static String getNestedScope(String path, Map<String, ObjectMapper> fullPathObjectMappers) {
for (String parentPath = parentObject(path); parentPath != null; parentPath = parentObject(parentPath)) {
ObjectMapper objectMapper = fullPathObjectMappers.get(parentPath);
if (objectMapper != null && objectMapper.nested().isNested()) {
return parentPath;
}
}
return null;
}
private static void checkNestedScopeCompatibility(String source, String target) {
boolean targetIsParentOfSource;
if (source == null || target == null) {
targetIsParentOfSource = target == null;
} else {
targetIsParentOfSource = source.equals(target) || source.startsWith(target + ".");
}
if (targetIsParentOfSource == false) {
throw new IllegalArgumentException(
"Illegal combination of [copy_to] and [nested] mappings: [copy_to] may only copy data to the current nested " +
"document or any of its parents, however one [copy_to] directive is trying to copy data from nested object [" +
source + "] to [" + target + "]");
}
}
private static String parentObject(String field) {
int lastDot = field.lastIndexOf('.');
if (lastDot == -1) {
return null;
}
return field.substring(0, lastDot);
}
}
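Since getNestedScope and parentObject above are pure string walking over the object hierarchy, the nested-scope rule for aliases can be demonstrated without any mapper machinery. A runnable sketch, assuming a simple Set of nested object paths stands in for fullPathObjectMappers:

import java.util.Set;

// Sketch of getNestedScope/parentObject from MapperMergeValidator: an alias and
// its target must resolve to the same nearest enclosing nested object.
class NestedScopeCheck {
    static String parentObject(String field) {
        int lastDot = field.lastIndexOf('.');
        return lastDot == -1 ? null : field.substring(0, lastDot);
    }

    // Walk up the object hierarchy until we hit a nested object, if any.
    static String getNestedScope(String path, Set<String> nestedObjectPaths) {
        for (String parent = parentObject(path); parent != null; parent = parentObject(parent)) {
            if (nestedObjectPaths.contains(parent)) {
                return parent;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Set<String> nested = Set.of("user");
        // Same scope: both paths live under the nested object "user".
        System.out.println(getNestedScope("user.full_name", nested)); // user
        System.out.println(getNestedScope("user.first", nested));     // user
        // Different scope: a top-level alias may not point into "user".
        System.out.println(getNestedScope("name_alias", nested));     // null
    }
}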

View File

@@ -21,7 +21,6 @@ package org.elasticsearch.index.mapper;

import com.carrotsearch.hppc.ObjectHashSet;
import com.carrotsearch.hppc.cursors.ObjectCursor;
import org.apache.logging.log4j.message.ParameterizedMessage;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.DelegatingAnalyzerWrapper;
@@ -395,15 +394,17 @@ public class MapperService extends AbstractIndexComponent implements Closeable {
// check basic sanity of the new mapping
List<ObjectMapper> objectMappers = new ArrayList<>();
List<FieldMapper> fieldMappers = new ArrayList<>();
List<FieldAliasMapper> fieldAliasMappers = new ArrayList<>();
Collections.addAll(fieldMappers, newMapper.mapping().metadataMappers);
MapperUtils.collect(newMapper.mapping().root(), objectMappers, fieldMappers, fieldAliasMappers);

MapperMergeValidator.validateMapperStructure(newMapper.type(), objectMappers, fieldMappers,
fieldAliasMappers, fullPathObjectMappers, fieldTypes);
checkPartitionedIndexConstraints(newMapper);

// update lookup data-structures
// this will in particular make sure that the merged fields are compatible with other types
fieldTypes = fieldTypes.copyAndAddAll(newMapper.type(), fieldMappers, fieldAliasMappers);

for (ObjectMapper objectMapper : objectMappers) {
if (fullPathObjectMappers == this.fullPathObjectMappers) {
@@ -417,7 +418,8 @@ public class MapperService extends AbstractIndexComponent implements Closeable {
}
}

MapperMergeValidator.validateFieldReferences(fieldMappers, fieldAliasMappers,
fullPathObjectMappers, fieldTypes);

if (reason == MergeReason.MAPPING_UPDATE) {
// this check will only be performed on the master node when there is
@@ -482,7 +484,7 @@ public class MapperService extends AbstractIndexComponent implements Closeable {
if (mapper != null) {
List<FieldMapper> fieldMappers = new ArrayList<>();
Collections.addAll(fieldMappers, mapper.mapping().metadataMappers);
MapperUtils.collect(mapper.root(), new ArrayList<>(), fieldMappers, new ArrayList<>());
for (FieldMapper fieldMapper : fieldMappers) {
assert fieldMapper.fieldType() == fieldTypes.get(fieldMapper.name()) : fieldMapper.name();
}
@@ -503,56 +505,6 @@ public class MapperService extends AbstractIndexComponent implements Closeable {
return true;
}
private static void checkFieldUniqueness(String type, Collection<ObjectMapper> objectMappers, Collection<FieldMapper> fieldMappers,
Map<String, ObjectMapper> fullPathObjectMappers, FieldTypeLookup fieldTypes) {
// first check within mapping
final Set<String> objectFullNames = new HashSet<>();
for (ObjectMapper objectMapper : objectMappers) {
final String fullPath = objectMapper.fullPath();
if (objectFullNames.add(fullPath) == false) {
throw new IllegalArgumentException("Object mapper [" + fullPath + "] is defined twice in mapping for type [" + type + "]");
}
}
final Set<String> fieldNames = new HashSet<>();
for (FieldMapper fieldMapper : fieldMappers) {
final String name = fieldMapper.name();
if (objectFullNames.contains(name)) {
throw new IllegalArgumentException("Field [" + name + "] is defined both as an object and a field in [" + type + "]");
} else if (fieldNames.add(name) == false) {
throw new IllegalArgumentException("Field [" + name + "] is defined twice in [" + type + "]");
}
}
// then check other types
for (String fieldName : fieldNames) {
if (fullPathObjectMappers.containsKey(fieldName)) {
throw new IllegalArgumentException("[" + fieldName + "] is defined as a field in mapping [" + type
+ "] but this name is already used for an object in other types");
}
}
for (String objectPath : objectFullNames) {
if (fieldTypes.get(objectPath) != null) {
throw new IllegalArgumentException("[" + objectPath + "] is defined as an object in mapping [" + type
+ "] but this name is already used for a field in other types");
}
}
}
private static void checkObjectsCompatibility(Collection<ObjectMapper> objectMappers,
Map<String, ObjectMapper> fullPathObjectMappers) {
for (ObjectMapper newObjectMapper : objectMappers) {
ObjectMapper existingObjectMapper = fullPathObjectMappers.get(newObjectMapper.fullPath());
if (existingObjectMapper != null) {
// simulate a merge and ignore the result, we are just interested
// in exceptions here
existingObjectMapper.merge(newObjectMapper);
}
}
}
private void checkNestedFieldsLimit(Map<String, ObjectMapper> fullPathObjectMappers) {
long allowedNestedFields = indexSettings.getValue(INDEX_MAPPING_NESTED_FIELDS_LIMIT_SETTING);
long actualNestedFields = 0;
@@ -609,66 +561,6 @@ public class MapperService extends AbstractIndexComponent implements Closeable {
}
}
private static void validateCopyTo(List<FieldMapper> fieldMappers, Map<String, ObjectMapper> fullPathObjectMappers,
FieldTypeLookup fieldTypes) {
for (FieldMapper mapper : fieldMappers) {
if (mapper.copyTo() != null && mapper.copyTo().copyToFields().isEmpty() == false) {
String sourceParent = parentObject(mapper.name());
if (sourceParent != null && fieldTypes.get(sourceParent) != null) {
throw new IllegalArgumentException("[copy_to] may not be used to copy from a multi-field: [" + mapper.name() + "]");
}
final String sourceScope = getNestedScope(mapper.name(), fullPathObjectMappers);
for (String copyTo : mapper.copyTo().copyToFields()) {
String copyToParent = parentObject(copyTo);
if (copyToParent != null && fieldTypes.get(copyToParent) != null) {
throw new IllegalArgumentException("[copy_to] may not be used to copy to a multi-field: [" + copyTo + "]");
}
if (fullPathObjectMappers.containsKey(copyTo)) {
throw new IllegalArgumentException("Cannot copy to field [" + copyTo + "] since it is mapped as an object");
}
final String targetScope = getNestedScope(copyTo, fullPathObjectMappers);
checkNestedScopeCompatibility(sourceScope, targetScope);
}
}
}
}
private static String getNestedScope(String path, Map<String, ObjectMapper> fullPathObjectMappers) {
for (String parentPath = parentObject(path); parentPath != null; parentPath = parentObject(parentPath)) {
ObjectMapper objectMapper = fullPathObjectMappers.get(parentPath);
if (objectMapper != null && objectMapper.nested().isNested()) {
return parentPath;
}
}
return null;
}
private static void checkNestedScopeCompatibility(String source, String target) {
boolean targetIsParentOfSource;
if (source == null || target == null) {
targetIsParentOfSource = target == null;
} else {
targetIsParentOfSource = source.equals(target) || source.startsWith(target + ".");
}
if (targetIsParentOfSource == false) {
throw new IllegalArgumentException(
"Illegal combination of [copy_to] and [nested] mappings: [copy_to] may only copy data to the current nested " +
"document or any of its parents, however one [copy_to] directive is trying to copy data from nested object [" +
source + "] to [" + target + "]");
}
}
private static String parentObject(String field) {
int lastDot = field.lastIndexOf('.');
if (lastDot == -1) {
return null;
}
return field.substring(0, lastDot);
}
public DocumentMapper parse(String mappingType, CompressedXContent mappingSource, boolean applyDefault) throws MapperParsingException {
return documentParser.parse(mappingType, mappingSource, applyDefault ? defaultMappingSource : null);
}
@@ -729,6 +621,13 @@ public class MapperService extends AbstractIndexComponent implements Closeable {
return fieldTypes.simpleMatchToFullName(pattern);
}
/**
* Returns all mapped field types.
*/
public Iterable<MappedFieldType> fieldTypes() {
return fieldTypes;
}
public ObjectMapper getObjectMapper(String name) {
return fullPathObjectMappers.get(name);
}

View File

@@ -24,17 +24,28 @@ import java.util.Collection;

enum MapperUtils {
;

/**
* Splits the provided mapper and its descendants into object, field, and field alias mappers.
*/
public static void collect(Mapper mapper, Collection<ObjectMapper> objectMappers,
Collection<FieldMapper> fieldMappers,
Collection<FieldAliasMapper> fieldAliasMappers) {
if (mapper instanceof RootObjectMapper) {
// root mapper isn't really an object mapper
} else if (mapper instanceof ObjectMapper) {
objectMappers.add((ObjectMapper) mapper);
} else if (mapper instanceof FieldMapper) {
fieldMappers.add((FieldMapper) mapper);
} else if (mapper instanceof FieldAliasMapper) {
fieldAliasMappers.add((FieldAliasMapper) mapper);
} else {
throw new IllegalStateException("Unrecognized mapper type [" +
mapper.getClass().getSimpleName() + "].");
}

for (Mapper child : mapper) {
collect(child, objectMappers, fieldMappers, fieldAliasMappers);
}
}
}
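The collect method above is a plain recursive partition that now also fails loudly on unknown mapper types. A toy illustration over an invented Node hierarchy (not the Mapper classes):

import java.util.ArrayList;
import java.util.List;

// Toy version of the recursive three-way split in MapperUtils.collect.
// Node/ObjectNode/FieldNode/AliasNode are invented for illustration.
class CollectSketch {
    abstract static class Node {
        final List<Node> children = new ArrayList<>();
    }
    static class ObjectNode extends Node {}
    static class FieldNode extends Node {}
    static class AliasNode extends Node {}

    static void collect(Node node, List<ObjectNode> objects, List<FieldNode> fields, List<AliasNode> aliases) {
        if (node instanceof ObjectNode) {
            objects.add((ObjectNode) node);
        } else if (node instanceof FieldNode) {
            fields.add((FieldNode) node);
        } else if (node instanceof AliasNode) {
            aliases.add((AliasNode) node);
        } else {
            // Like the new MapperUtils.collect, refuse to silently drop unknown node types.
            throw new IllegalStateException("Unrecognized node type [" + node.getClass().getSimpleName() + "].");
        }
        for (Node child : node.children) {
            collect(child, objects, fields, aliases);
        }
    }

    public static void main(String[] args) {
        ObjectNode root = new ObjectNode();
        root.children.add(new FieldNode());
        root.children.add(new AliasNode());
        List<ObjectNode> objects = new ArrayList<>();
        List<FieldNode> fields = new ArrayList<>();
        List<AliasNode> aliases = new ArrayList<>();
        collect(root, objects, fields, aliases);
        System.out.println(objects.size() + " " + fields.size() + " " + aliases.size()); // 1 1 1
    }
}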

View File

@@ -359,6 +359,11 @@ public class ObjectMapper extends Mapper implements Cloneable {
return this.fullPath;
}
@Override
public String typeName() {
return CONTENT_TYPE;
}
public boolean isEnabled() {
return this.enabled;
}

View File

@@ -149,7 +149,7 @@ public class ExistsQueryBuilder extends AbstractQueryBuilder<ExistsQueryBuilder> {
}

if (context.indexVersionCreated().before(Version.V_6_1_0)) {
return newLegacyExistsQuery(context, fields);
}

if (fields.size() == 1) {
@@ -164,22 +164,28 @@ public class ExistsQueryBuilder extends AbstractQueryBuilder<ExistsQueryBuilder> {
return new ConstantScoreQuery(boolFilterBuilder.build());
}

private static Query newLegacyExistsQuery(QueryShardContext context, Collection<String> fields) {
// We create TermsQuery directly here rather than using FieldNamesFieldType.termsQuery()
// so we don't end up with deprecation warnings
if (fields.size() == 1) {
Query filter = newLegacyExistsQuery(context, fields.iterator().next());
return new ConstantScoreQuery(filter);
}

BooleanQuery.Builder boolFilterBuilder = new BooleanQuery.Builder();
for (String field : fields) {
Query filter = newLegacyExistsQuery(context, field);
boolFilterBuilder.add(filter, BooleanClause.Occur.SHOULD);
}
return new ConstantScoreQuery(boolFilterBuilder.build());
}
private static Query newLegacyExistsQuery(QueryShardContext context, String field) {
MappedFieldType fieldType = context.fieldMapper(field);
String fieldName = fieldType != null ? fieldType.name() : field;
return new TermQuery(new Term(FieldNamesFieldMapper.NAME, fieldName));
}
private static Query newFieldExistsQuery(QueryShardContext context, String field) {
MappedFieldType fieldType = context.getMapperService().fullName(field);
if (fieldType == null) {
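The resolve-then-build step in newLegacyExistsQuery, translating a possibly aliased name to the concrete field before constructing the _field_names term, is the same pattern this commit applies in several query builders. A tiny sketch of just that step, with a plain Map standing in for context.fieldMapper (not the ES API):

import java.util.Map;

// Sketch: on pre-6.1 indices, exists is a term query on the _field_names metadata
// field, so an alias must first be translated to the concrete name that was indexed.
class LegacyExistsSketch {
    static final String FIELD_NAMES_FIELD = "_field_names";

    // Stand-in for context.fieldMapper(field): aliases resolve to the concrete name.
    static String resolveConcreteName(Map<String, String> aliasToConcrete, String field) {
        return aliasToConcrete.getOrDefault(field, field);
    }

    public static void main(String[] args) {
        Map<String, String> aliases = Map.of("host_id", "host.identifier");
        String concrete = resolveConcreteName(aliases, "host_id");
        // The term that would feed into new Term(FIELD_NAMES_FIELD, concrete):
        System.out.println(FIELD_NAMES_FIELD + ":" + concrete); // _field_names:host.identifier
    }
}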

View File

@@ -44,6 +44,7 @@ import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.LoggingDeprecationHandler;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.query.support.QueryParsers;

import java.io.IOException;
@@ -202,14 +203,18 @@ public class SpanMultiTermQueryBuilder extends AbstractQueryBuilder<SpanMultiTermQueryBuilder> {
multiTermQueryBuilder.getClass().getName() + ", should be " + MultiTermQuery.class.getName()
+ " but was " + subQuery.getClass().getName());
}

PrefixQueryBuilder prefixBuilder = (PrefixQueryBuilder) multiTermQueryBuilder;
MappedFieldType fieldType = context.fieldMapper(prefixBuilder.fieldName());
String fieldName = fieldType != null ? fieldType.name() : prefixBuilder.fieldName();

if (context.getIndexSettings().getIndexVersionCreated().before(Version.V_6_4_0)) {
/**
* Indices created in this version do not index positions on the prefix field
* so we cannot use it to match positional queries. Instead, we explicitly create the prefix
* query on the main field to avoid the rewrite.
*/
PrefixQuery prefixQuery = new PrefixQuery(new Term(fieldName, prefixBuilder.value()));
if (prefixBuilder.rewrite() != null) {
MultiTermQuery.RewriteMethod rewriteMethod =
QueryParsers.parseRewriteMethod(prefixBuilder.rewrite(), null, LoggingDeprecationHandler.INSTANCE);
@@ -218,15 +223,14 @@ public class SpanMultiTermQueryBuilder extends AbstractQueryBuilder<SpanMultiTermQueryBuilder> {
subQuery = prefixQuery;
spanQuery = new SpanMultiTermQueryWrapper<>(prefixQuery);
} else {
/**
* Prefixes are indexed in a different field so we mask the term query with the original field
* name. This is required because span_near and span_or queries don't work across different fields.
* The masking is safe because the prefix field is indexed using the same content as the original field
* and the prefix analyzer preserves positions.
*/
SpanTermQuery spanTermQuery = new SpanTermQuery(((TermQuery) subQuery).getTerm());
spanQuery = new FieldMaskingSpanQuery(spanTermQuery, fieldName);
}
} else {
if (subQuery instanceof MultiTermQuery == false) {

View File

@@ -30,6 +30,7 @@ import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentLocation;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.index.mapper.MappedFieldType;
import java.io.IOException;
import java.util.ArrayList;
@@ -218,7 +219,8 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
}
String spanNearFieldName = null;
if (isGap) {
String fieldName = ((SpanGapQueryBuilder) queryBuilder).fieldName();
spanNearFieldName = queryFieldName(context, fieldName);
} else {
spanNearFieldName = ((SpanQuery) query).getField();
}
@@ -241,7 +243,9 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
isGap = queryBuilder instanceof SpanGapQueryBuilder;
if (isGap) {
String fieldName = ((SpanGapQueryBuilder) queryBuilder).fieldName();
String spanGapFieldName = queryFieldName(context, fieldName);
if (!spanNearFieldName.equals(spanGapFieldName)) {
throw new IllegalArgumentException("[span_near] clauses must have same field");
}
int gap = ((SpanGapQueryBuilder) queryBuilder).width();
@@ -255,6 +259,11 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
return builder.build();
}
private String queryFieldName(QueryShardContext context, String fieldName) {
MappedFieldType fieldType = context.fieldMapper(fieldName);
return fieldType != null ? fieldType.name() : fieldName;
}
@Override
protected int doHashCode() {
return Objects.hash(clauses, slop, inOrder);
@@ -273,11 +282,11 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
}

/**
* SpanGapQueryBuilder enables gaps in a SpanNearQuery.
* Since SpanGapQuery is private to SpanNearQuery, SpanGapQueryBuilder cannot
* be used to generate a Query (SpanGapQuery) like another QueryBuilder.
* Instead, it just identifies a span_gap clause so that SpanNearQuery.addGap(int)
* can be invoked for it.
* This QueryBuilder is only applicable as a clause in SpanNearQueryBuilder, but
* that restriction is not yet enforced.
*/
@@ -286,9 +295,9 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
/** Name of field to match against. */
private final String fieldName;

/** Width of the gap introduced. */
private final int width;

/**
* Constructs a new SpanGapQueryBuilder term query.
@@ -301,7 +310,7 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
throw new IllegalArgumentException("[span_gap] field name is null or empty");
}
// Lucene does not place any restriction on the value of width.
// TODO: determine whether it theoretically makes sense to apply restrictions.
this.fieldName = fieldName;
this.width = width;
}
@@ -396,7 +405,7 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
fieldName = currentFieldName;
} else if (token.isValue()) {
width = parser.intValue();
}
}
SpanGapQueryBuilder result = new SpanGapQueryBuilder(fieldName, width);
return result;
@@ -420,7 +429,7 @@ public class SpanNearQueryBuilder extends AbstractQueryBuilder<SpanNearQueryBuilder> {
return Objects.hash(getClass(), fieldName, width);
}

@Override
public final String toString() {
return Strings.toString(this, true, true);

View File

@@ -221,7 +221,7 @@ public final class TermsSetQueryBuilder extends AbstractQueryBuilder<TermsSetQueryBuilder> {
}

@Override
protected Query doToQuery(QueryShardContext context) {
if (values.isEmpty()) {
return Queries.newMatchNoDocsQuery("No terms supplied for \"" + getName() + "\" query.");
}
@@ -230,6 +230,15 @@ public final class TermsSetQueryBuilder extends AbstractQueryBuilder<TermsSetQueryBuilder> {
throw new BooleanQuery.TooManyClauses();
}
List<Query> queries = createTermQueries(context);
LongValuesSource longValuesSource = createValuesSource(context);
return new CoveringQuery(queries, longValuesSource);
}
/**
* Visible only for testing purposes.
*/
List<Query> createTermQueries(QueryShardContext context) {
final MappedFieldType fieldType = context.fieldMapper(fieldName);
final List<Query> queries = new ArrayList<>(values.size());
for (Object value : values) {
@@ -239,7 +248,11 @@ public final class TermsSetQueryBuilder extends AbstractQueryBuilder<TermsSetQueryBuilder> {
queries.add(new TermQuery(new Term(fieldName, BytesRefs.toBytesRef(value))));
}
}
return queries;
}

private LongValuesSource createValuesSource(QueryShardContext context) {
LongValuesSource longValuesSource;
if (minimumShouldMatchField != null) {
MappedFieldType msmFieldType = context.fieldMapper(minimumShouldMatchField);
if (msmFieldType == null) {
@@ -253,13 +266,13 @@ public final class TermsSetQueryBuilder extends AbstractQueryBuilder<TermsSetQueryBuilder> {
SearchScript.TERMS_SET_QUERY_CONTEXT);
Map<String, Object> params = new HashMap<>();
params.putAll(minimumShouldMatchScript.getParams());
params.put("num_terms", values.size());
SearchScript.LeafFactory leafFactory = factory.newFactory(params, context.lookup());
longValuesSource = new ScriptLongValueSource(minimumShouldMatchScript, leafFactory);
} else {
throw new IllegalStateException("No minimum should match has been specified");
}
return longValuesSource;
}

static final class ScriptLongValueSource extends LongValuesSource {
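After the refactor, doToQuery is the composition of the two pieces CoveringQuery consumes: the per-value term queries and a per-document minimum number of matches. A plain-Java sketch of those covering semantics, with no Lucene types and invented names:

import java.util.List;
import java.util.Set;

// Sketch of what CoveringQuery(queries, longValuesSource) computes: a document
// matches when the number of matching term queries reaches the per-document
// minimum-should-match value. docTerms stands in for the indexed terms and
// requiredMatches for the value produced by the LongValuesSource.
class CoveringSketch {
    static boolean matches(Set<String> docTerms, List<String> queryValues, long requiredMatches) {
        long matched = queryValues.stream().filter(docTerms::contains).count();
        return matched >= requiredMatches;
    }

    public static void main(String[] args) {
        Set<String> doc = Set.of("java", "elasticsearch");
        List<String> values = List.of("java", "kotlin", "elasticsearch");
        System.out.println(matches(doc, values, 2)); // true
        System.out.println(matches(doc, values, 3)); // false
    }
}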

View File

@@ -25,6 +25,8 @@ import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.FieldMapper;
import org.elasticsearch.index.mapper.IpFieldMapper;
import org.elasticsearch.index.mapper.KeywordFieldMapper;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.Mapper;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.mapper.MetadataFieldMapper;
import org.elasticsearch.index.mapper.NumberFieldMapper;
@@ -88,10 +90,10 @@ public final class QueryParserHelper {
* @param mapperService The mapper service where to find the mapping.
* @param field The field name to search.
*/
public static Mapper getFieldMapper(MapperService mapperService, String field) {
DocumentMapper mapper = mapperService.documentMapper();
if (mapper != null) {
Mapper fieldMapper = mapper.mappers().getMapper(field);
if (fieldMapper != null) {
return fieldMapper;
}
@@ -167,23 +169,27 @@ public final class QueryParserHelper {
if (fieldSuffix != null && context.fieldMapper(fieldName + fieldSuffix) != null) {
fieldName = fieldName + fieldSuffix;
}

MappedFieldType fieldType = context.getMapperService().fullName(fieldName);
if (fieldType == null) {
// Note that we don't ignore unmapped fields.
fields.put(fieldName, weight);
continue;
}

// Ignore fields that are not in the allowed mapper types. Some
// types do not support term queries, and thus we cannot generate
// a special query for them.
String mappingType = fieldType.typeName();
if (acceptAllTypes == false && ALLOWED_QUERY_MAPPER_TYPES.contains(mappingType) == false) {
continue;
}

// Ignore metadata fields.
Mapper mapper = getFieldMapper(context.getMapperService(), fieldName);
if (acceptMetadataField == false && mapper instanceof MetadataFieldMapper) {
continue;
}

fields.put(fieldName, weight);
}
checkForTooManyFields(fields);
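The reordering above changes observable behavior: an unmapped field is kept under its resolved name, and the type filter now runs before the metadata-field check. A sketch of the resulting decision sequence; FieldInfo and both maps are illustrative stand-ins for the mapper lookups:

import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch of the per-field decision sequence in QueryParserHelper: keep unmapped
// fields, drop types that cannot support term queries, then drop metadata fields.
class FieldExpansionSketch {
    record FieldInfo(String typeName, boolean isMetadata) {}

    static final Set<String> ALLOWED_TYPES = Set.of("text", "keyword", "long", "double", "date", "ip");

    static Map<String, Float> expand(Map<String, FieldInfo> mappings, Map<String, Float> requested) {
        Map<String, Float> fields = new HashMap<>();
        for (Map.Entry<String, Float> entry : requested.entrySet()) {
            String fieldName = entry.getKey();
            FieldInfo info = mappings.get(fieldName);
            if (info == null) {
                fields.put(fieldName, entry.getValue()); // unmapped fields are not ignored
                continue;
            }
            if (!ALLOWED_TYPES.contains(info.typeName())) {
                continue; // type cannot support term queries
            }
            if (info.isMetadata()) {
                continue; // metadata fields are skipped
            }
            fields.put(fieldName, entry.getValue());
        }
        return fields;
    }

    public static void main(String[] args) {
        Map<String, FieldInfo> mappings = Map.of(
            "title", new FieldInfo("text", false),
            "_routing", new FieldInfo("keyword", true),
            "location", new FieldInfo("geo_point", false));
        Map<String, Float> requested = Map.of(
            "title", 1.0f, "_routing", 1.0f, "location", 1.0f, "unmapped_field", 1.0f);
        System.out.println(expand(mappings, requested)); // e.g. {title=1.0, unmapped_field=1.0} (order may vary)
    }
}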

View File

@@ -36,6 +36,7 @@ import org.elasticsearch.index.mapper.BinaryFieldMapper;
import org.elasticsearch.index.mapper.BooleanFieldMapper;
import org.elasticsearch.index.mapper.CompletionFieldMapper;
import org.elasticsearch.index.mapper.DateFieldMapper;
import org.elasticsearch.index.mapper.FieldAliasMapper;
import org.elasticsearch.index.mapper.FieldNamesFieldMapper;
import org.elasticsearch.index.mapper.GeoPointFieldMapper;
import org.elasticsearch.index.mapper.GeoShapeFieldMapper;
@@ -129,7 +130,9 @@ public class IndicesModule extends AbstractModule {
mappers.put(ObjectMapper.CONTENT_TYPE, new ObjectMapper.TypeParser());
mappers.put(ObjectMapper.NESTED_CONTENT_TYPE, new ObjectMapper.TypeParser());
mappers.put(CompletionFieldMapper.CONTENT_TYPE, new CompletionFieldMapper.TypeParser());
mappers.put(FieldAliasMapper.CONTENT_TYPE, new FieldAliasMapper.TypeParser());
mappers.put(GeoPointFieldMapper.CONTENT_TYPE, new GeoPointFieldMapper.TypeParser());
if (ShapesAvailability.JTS_AVAILABLE && ShapesAvailability.SPATIAL4J_AVAILABLE) {
mappers.put(GeoShapeFieldMapper.CONTENT_TYPE, new GeoShapeFieldMapper.TypeParser());
}

View File

@@ -85,6 +85,12 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFactory {
AggregatorFactories.Builder subFactoriesBuilder,
Map<String, Object> metaData) throws IOException {
super(name, config, context, parent, subFactoriesBuilder, metaData);
if (!config.unmapped()) {
this.fieldType = config.fieldContext().fieldType();
this.indexedFieldName = fieldType.name();
}
this.includeExclude = includeExclude;
this.executionHint = executionHint;
this.filter = filterBuilder == null
@@ -98,15 +104,6 @@ public class SignificantTermsAggregatorFactory extends ValuesSourceAggregatorFactory {
: searcher.count(filter);
this.bucketCountThresholds = bucketCountThresholds;
this.significanceHeuristic = significanceHeuristic;
setFieldInfo(context);
}
private void setFieldInfo(SearchContext context) {
if (!config.unmapped()) {
this.indexedFieldName = config.fieldContext().field();
fieldType = context.smartNameFieldType(indexedFieldName);
}
}

/**

View File

@@ -343,13 +343,10 @@ public class SignificantTextAggregationBuilder {
protected AggregatorFactory<?> doBuild(SearchContext context, AggregatorFactory<?> parent,
Builder subFactoriesBuilder) throws IOException {
SignificanceHeuristic executionHeuristic = this.significanceHeuristic.rewrite(context);
String[] execFieldNames = sourceFieldNames;
if (execFieldNames == null) {
execFieldNames = new String[] { fieldName };
}
return new SignificantTextAggregatorFactory(name, includeExclude, filterBuilder,
bucketCountThresholds, executionHeuristic, context, parent, subFactoriesBuilder,
fieldName, sourceFieldNames, filterDuplicateText, metaData);
}

@Override

View File

@@ -71,12 +71,19 @@ public class SignificantTextAggregatorFactory extends AggregatorFactory<SignificantTextAggregatorFactory> {
AggregatorFactories.Builder subFactoriesBuilder, String fieldName, String[] sourceFieldNames,
boolean filterDuplicateText, Map<String, Object> metaData) throws IOException {
super(name, context, parent, subFactoriesBuilder, metaData);
// Note that if the field is unmapped (its field type is null), we don't fail,
// and just use the given field name as a placeholder.
this.fieldType = context.getQueryShardContext().fieldMapper(fieldName);
this.indexedFieldName = fieldType != null ? fieldType.name() : fieldName;
this.sourceFieldNames = sourceFieldNames == null
? new String[] { indexedFieldName }
: sourceFieldNames;
this.includeExclude = includeExclude;
this.filter = filterBuilder == null
? null
: filterBuilder.toQuery(context.getQueryShardContext());
this.filterDuplicateText = filterDuplicateText;
IndexSearcher searcher = context.searcher();
// Important - need to use the doc count that includes deleted docs
@@ -86,11 +93,8 @@ public class SignificantTextAggregatorFactory extends AggregatorFactory<SignificantTextAggregatorFactory> {
: searcher.count(filter);
this.bucketCountThresholds = bucketCountThresholds;
this.significanceHeuristic = significanceHeuristic;
}

/**
* Get the number of docs in the superset.
*/
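The constructor now resolves the field type exactly once, which is also the point where an alias collapses to its concrete name; an unmapped field keeps the requested name as a placeholder instead of failing. A sketch of that setup logic, with invented FieldResolver and FieldType stand-ins for the QueryShardContext lookup and MappedFieldType:

// Sketch of the constructor-time resolution above, under assumed names.
class SignificantTextSetupSketch {
    record FieldType(String concreteName) {}

    interface FieldResolver {
        FieldType lookup(String fieldName); // may return null for unmapped fields
    }

    final FieldType fieldType;
    final String indexedFieldName;
    final String[] sourceFieldNames;

    SignificantTextSetupSketch(FieldResolver resolver, String fieldName, String[] sourceFieldNames) {
        // Resolve the (possibly aliased) field once, up front.
        this.fieldType = resolver.lookup(fieldName);
        this.indexedFieldName = fieldType != null ? fieldType.concreteName() : fieldName;
        // If no explicit source fields were given, read from the resolved field itself.
        this.sourceFieldNames = sourceFieldNames == null
            ? new String[] { indexedFieldName }
            : sourceFieldNames;
    }

    public static void main(String[] args) {
        FieldResolver resolver = name -> name.equals("comment_alias") ? new FieldType("comment") : null;
        System.out.println(new SignificantTextSetupSketch(resolver, "comment_alias", null).indexedFieldName); // comment
        System.out.println(new SignificantTextSetupSketch(resolver, "missing", null).indexedFieldName);       // missing
    }
}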
@@ -133,13 +137,13 @@ public class SignificantTextAggregatorFactory extends AggregatorFactory<SignificantTextAggregatorFactory> {
}
return context.searcher().count(query);
}

public long getBackgroundFrequency(BytesRef termBytes) throws IOException {
String value = format.format(termBytes).toString();
return getBackgroundFrequency(value);
}

@Override
public void close() {
try {
@@ -154,11 +158,11 @@ public class SignificantTextAggregatorFactory extends AggregatorFactory<SignificantTextAggregatorFactory> {
@Override
protected Aggregator createInternal(Aggregator parent, boolean collectsFromSingleBucket,
List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData)
throws IOException {
if (collectsFromSingleBucket == false) {
return asMultiBucketAggregator(this, context, parent);
}

numberOfAggregatorsCreated++;
BucketCountThresholds bucketCountThresholds = new BucketCountThresholds(this.bucketCountThresholds);
if (bucketCountThresholds.getShardSize() == SignificantTextAggregationBuilder.DEFAULT_BUCKET_COUNT_THRESHOLDS.getShardSize()) {
@@ -166,7 +170,7 @@ public class SignificantTextAggregatorFactory extends AggregatorFactory<SignificantTextAggregatorFactory> {
// Use default heuristic to avoid any wrong-ranking caused by
// distributed counting but request double the usual amount.
// We typically need more than the number of "top" terms requested
// by other aggregations as the significance algorithm is in less
// of a position to down-select at shard-level - some of the things
// we want to find have only one occurrence on each shard and as
// such are impossible to differentiate from non-significant terms
@@ -177,9 +181,9 @@ public class SignificantTextAggregatorFactory extends AggregatorFactory<SignificantTextAggregatorFactory> {
// TODO - need to check with mapping that this is indeed a text field....
IncludeExclude.StringFilter incExcFilter = includeExclude == null ? null :
includeExclude.convertToStringFilter(DocValueFormat.RAW);
return new SignificantTextAggregator(name, factories, context, parent, pipelineAggregators, bucketCountThresholds,
incExcFilter, significanceHeuristic, this, indexedFieldName, sourceFieldNames, filterDuplicateText, metaData);

View File

@@ -31,7 +31,6 @@ import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.collect.Tuple;
import org.elasticsearch.common.document.DocumentField;
import org.elasticsearch.common.lucene.search.Queries;
import org.elasticsearch.common.regex.Regex;
import org.elasticsearch.common.text.Text;
import org.elasticsearch.common.xcontent.XContentHelper;
import org.elasticsearch.common.xcontent.XContentType;
@@ -55,7 +54,7 @@ import org.elasticsearch.search.lookup.SourceLookup;
import org.elasticsearch.tasks.TaskCancelledException;

import java.io.IOException;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
@@ -84,8 +83,7 @@ public class FetchPhase implements SearchPhase {
@Override
public void execute(SearchContext context) {
final FieldsVisitor fieldsVisitor;
Map<String, Set<String>> storedToRequestedFields = new HashMap<>();
StoredFieldsContext storedFieldsContext = context.storedFieldsContext();

if (storedFieldsContext == null) {
@@ -98,39 +96,36 @@ public class FetchPhase implements SearchPhase {
// disable stored fields entirely
fieldsVisitor = null;
} else {
for (String fieldNameOrPattern : context.storedFieldsContext().fieldNames()) {
if (fieldNameOrPattern.equals(SourceFieldMapper.NAME)) {
FetchSourceContext fetchSourceContext = context.hasFetchSourceContext() ? context.fetchSourceContext()
: FetchSourceContext.FETCH_SOURCE;
context.fetchSourceContext(new FetchSourceContext(true, fetchSourceContext.includes(), fetchSourceContext.excludes()));
continue;
}

Collection<String> fieldNames = context.mapperService().simpleMatchToFullName(fieldNameOrPattern);
for (String fieldName : fieldNames) {
MappedFieldType fieldType = context.smartNameFieldType(fieldName);
if (fieldType == null) {
// Only fail if we know it is an object field; missing paths / fields shouldn't fail.
if (context.getObjectMapper(fieldName) != null) {
throw new IllegalArgumentException("field [" + fieldName + "] isn't a leaf field");
}
} else {
String storedField = fieldType.name();
Set<String> requestedFields = storedToRequestedFields.computeIfAbsent(
storedField, key -> new HashSet<>());
requestedFields.add(fieldName);
}
}
}
boolean loadSource = context.sourceRequested();
if (storedToRequestedFields.isEmpty()) {
// empty list specified, default to disable _source if no explicit indication
fieldsVisitor = new FieldsVisitor(loadSource);
} else {
fieldsVisitor = new CustomFieldsVisitor(storedToRequestedFields.keySet(), loadSource);
}
}
@@ -149,10 +144,11 @@ public class FetchPhase implements SearchPhase {
final SearchHit searchHit;
int rootDocId = findRootDocumentIfNested(context, subReaderContext, subDocId);
if (rootDocId != -1) {
searchHit = createNestedSearchHit(context, docId, subDocId, rootDocId,
storedToRequestedFields, subReaderContext);
} else {
searchHit = createSearchHit(context, fieldsVisitor, docId, subDocId,
storedToRequestedFields, subReaderContext);
}

hits[index] = searchHit;
@@ -190,21 +186,18 @@ public class FetchPhase implements SearchPhase {
return -1;
}

private SearchHit createSearchHit(SearchContext context,
FieldsVisitor fieldsVisitor,
int docId,
int subDocId,
Map<String, Set<String>> storedToRequestedFields,
LeafReaderContext subReaderContext) {
if (fieldsVisitor == null) {
return new SearchHit(docId);
}

Map<String, DocumentField> searchFields = getSearchFields(context, fieldsVisitor, subDocId,
storedToRequestedFields, subReaderContext);

DocumentMapper documentMapper = context.mapperService().documentMapper(fieldsVisitor.uid().type());
Text typeText;
@@ -223,9 +216,40 @@ public class FetchPhase implements SearchPhase {
return searchHit;
}

private Map<String, DocumentField> getSearchFields(SearchContext context,
FieldsVisitor fieldsVisitor,
int subDocId,
Map<String, Set<String>> storedToRequestedFields,
LeafReaderContext subReaderContext) {
loadStoredFields(context, subReaderContext, fieldsVisitor, subDocId);
fieldsVisitor.postProcess(context.mapperService());
if (fieldsVisitor.fields().isEmpty()) {
return null;
}
Map<String, DocumentField> searchFields = new HashMap<>(fieldsVisitor.fields().size());
for (Map.Entry<String, List<Object>> entry : fieldsVisitor.fields().entrySet()) {
String storedField = entry.getKey();
List<Object> storedValues = entry.getValue();
if (storedToRequestedFields.containsKey(storedField)) {
for (String requestedField : storedToRequestedFields.get(storedField)) {
searchFields.put(requestedField, new DocumentField(requestedField, storedValues));
}
} else {
searchFields.put(storedField, new DocumentField(storedField, storedValues));
}
}
return searchFields;
}
private SearchHit createNestedSearchHit(SearchContext context,
int nestedTopDocId,
int nestedSubDocId,
int rootSubDocId,
Map<String, Set<String>> storedToRequestedFields,
LeafReaderContext subReaderContext) throws IOException {
// Also if highlighting is requested on nested documents we need to fetch the _source from the root document, // Also if highlighting is requested on nested documents we need to fetch the _source from the root document,
// otherwise highlighting will attempt to fetch the _source from the nested doc, which will fail, // otherwise highlighting will attempt to fetch the _source from the nested doc, which will fail,
// because the entire _source is only stored with the root document. // because the entire _source is only stored with the root document.
@ -244,9 +268,13 @@ public class FetchPhase implements SearchPhase {
source = null; source = null;
} }
Map<String, DocumentField> searchFields = null;
if (context.hasStoredFields() && !context.storedFieldsContext().fieldNames().isEmpty()) {
FieldsVisitor nestedFieldsVisitor = new CustomFieldsVisitor(storedToRequestedFields.keySet(), false);
searchFields = getSearchFields(context, nestedFieldsVisitor, nestedSubDocId,
storedToRequestedFields, subReaderContext);
}
Map<String, DocumentField> searchFields =
getSearchFields(context, nestedSubDocId, fieldNames, fieldNamePatterns, subReaderContext);
DocumentMapper documentMapper = context.mapperService().documentMapper(uid.type()); DocumentMapper documentMapper = context.mapperService().documentMapper(uid.type());
SourceLookup sourceLookup = context.lookup().source(); SourceLookup sourceLookup = context.lookup().source();
sourceLookup.setSegmentAndDocument(subReaderContext, nestedSubDocId); sourceLookup.setSegmentAndDocument(subReaderContext, nestedSubDocId);
@ -307,26 +335,6 @@ public class FetchPhase implements SearchPhase {
return new SearchHit(nestedTopDocId, uid.id(), documentMapper.typeText(), nestedIdentity, searchFields); return new SearchHit(nestedTopDocId, uid.id(), documentMapper.typeText(), nestedIdentity, searchFields);
} }
private Map<String, DocumentField> getSearchFields(SearchContext context, int nestedSubDocId, Set<String> fieldNames,
List<String> fieldNamePatterns, LeafReaderContext subReaderContext) {
Map<String, DocumentField> searchFields = null;
if (context.hasStoredFields() && !context.storedFieldsContext().fieldNames().isEmpty()) {
FieldsVisitor nestedFieldsVisitor = new CustomFieldsVisitor(fieldNames == null ? Collections.emptySet() : fieldNames,
fieldNamePatterns == null ? Collections.emptyList() : fieldNamePatterns, false);
if (nestedFieldsVisitor != null) {
loadStoredFields(context, subReaderContext, nestedFieldsVisitor, nestedSubDocId);
nestedFieldsVisitor.postProcess(context.mapperService());
if (!nestedFieldsVisitor.fields().isEmpty()) {
searchFields = new HashMap<>(nestedFieldsVisitor.fields().size());
for (Map.Entry<String, List<Object>> entry : nestedFieldsVisitor.fields().entrySet()) {
searchFields.put(entry.getKey(), new DocumentField(entry.getKey(), entry.getValue()));
}
}
}
}
return searchFields;
}
private SearchHit.NestedIdentity getInternalNestedIdentity(SearchContext context, int nestedSubDocId, private SearchHit.NestedIdentity getInternalNestedIdentity(SearchContext context, int nestedSubDocId,
LeafReaderContext subReaderContext, LeafReaderContext subReaderContext,
MapperService mapperService, MapperService mapperService,
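
The core of the fetch change above is the storedToRequestedFields map: stored-field values are loaded once under their concrete Lucene names, then re-keyed under every name the request actually used, so an alias and its target can both appear in the response. A minimal standalone sketch of that re-keying step (the map shape follows the diff, but the Elasticsearch types are replaced with plain collections, so this is an illustration rather than the real FetchPhase code):

    import java.util.*;

    class StoredFieldRekeyingSketch {
        // Re-key values loaded for concrete stored fields under every name that was
        // actually requested, so an alias like "nickname" -> "user.name" surfaces
        // the same values under both names when both were asked for.
        static Map<String, List<Object>> rekey(Map<String, Set<String>> storedToRequestedFields,
                                               Map<String, List<Object>> loadedValues) {
            Map<String, List<Object>> result = new HashMap<>();
            for (Map.Entry<String, List<Object>> entry : loadedValues.entrySet()) {
                Set<String> requested = storedToRequestedFields.getOrDefault(
                    entry.getKey(), Collections.singleton(entry.getKey()));
                for (String name : requested) {
                    result.put(name, entry.getValue());
                }
            }
            return result;
        }

        public static void main(String[] args) {
            // "user.name" is stored once, but was requested both directly and via an alias.
            Map<String, Set<String>> mapping = Map.of("user.name", Set.of("user.name", "nickname"));
            Map<String, List<Object>> loaded = Map.of("user.name", List.of("jdoe"));
            System.out.println(rekey(mapping, loaded));
        }
    }

Running main prints {user.name=[jdoe], nickname=[jdoe]} (map order may vary), which is the alias-aware behavior the new getSearchFields implements.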

@@ -100,7 +100,7 @@ public class HighlightPhase extends AbstractComponent implements FetchSubPhase {
                 if (highlightQuery == null) {
                     highlightQuery = context.parsedQuery().query();
                 }
-                HighlighterContext highlighterContext = new HighlighterContext(fieldName,
+                HighlighterContext highlighterContext = new HighlighterContext(fieldType.name(),
                     field, fieldType, context, hitContext, highlightQuery);

                 if ((highlighter.canHighlight(fieldType) == false) && fieldNameContainsWildcards) {

@@ -109,7 +109,11 @@ public class HighlightPhase extends AbstractComponent implements FetchSubPhase {
                 }
                 HighlightField highlightField = highlighter.highlight(highlighterContext);
                 if (highlightField != null) {
-                    highlightFields.put(highlightField.name(), highlightField);
+                    // Note that we make sure to use the original field name in the response. This is because the
+                    // original field could be an alias, and highlighter implementations may instead reference the
+                    // concrete field it points to.
+                    highlightFields.put(fieldName,
+                        new HighlightField(fieldName, highlightField.fragments()));
                 }
             }
         }
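
The inline comment in this hunk carries the key idea: the highlighter runs against the concrete field behind an alias, but the response entry must use the name the request supplied. A hypothetical, self-contained illustration of that re-wrapping (the record types below are stand-ins, not the actual HighlightField class):

    import java.util.List;

    class HighlightRenamingSketch {
        // Stand-ins for the highlight result types.
        record Fragment(String text) {}
        record Highlighted(String name, List<Fragment> fragments) {}

        // The highlighter may report the concrete field's name; the response should
        // carry the name the request used (possibly an alias).
        static Highlighted underRequestedName(String requestedName, Highlighted fromHighlighter) {
            return new Highlighted(requestedName, fromHighlighter.fragments());
        }

        public static void main(String[] args) {
            Highlighted raw = new Highlighted("user.name", List.of(new Fragment("<em>jdoe</em>")));
            System.out.println(underRequestedName("nickname", raw).name()); // nickname
        }
    }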

@@ -148,7 +148,7 @@ public class LeafFieldsLookup implements Map {
             reader.document(docId, fieldVisitor);
             fieldVisitor.postProcess(mapperService);
             List<Object> storedFields = fieldVisitor.fields().get(data.fieldType().name());
-            data.fields(singletonMap(name, storedFields));
+            data.fields(singletonMap(fieldName, storedFields));
         } catch (IOException e) {
             throw new ElasticsearchParseException("failed to load field [{}]", e, name);
         }

@@ -27,7 +27,6 @@ import org.elasticsearch.common.io.stream.NamedWriteable;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.lucene.BytesRefs;
-import org.elasticsearch.common.xcontent.ToXContent.Params;
 import org.elasticsearch.common.xcontent.ToXContentFragment;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;

@@ -321,7 +320,7 @@ public abstract class SuggestionBuilder<T extends SuggestionBuilder<T>> implemen
             suggestionContext.setAnalyzer(luceneAnalyzer);
         }

-        suggestionContext.setField(field);
+        suggestionContext.setField(fieldType.name());

         if (size != null) {
             suggestionContext.setSize(size);

@@ -29,8 +29,8 @@ import org.elasticsearch.common.unit.DistanceUnit;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.common.xcontent.XContentParser.Token;
-import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.GeoPointFieldMapper;
+import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.index.mapper.ParseContext;
 import org.elasticsearch.index.mapper.ParseContext.Document;

@@ -138,8 +138,8 @@ public class GeoContextMapping extends ContextMapping<GeoQueryContext> {
     @Override
     public Set<CharSequence> parseContext(ParseContext parseContext, XContentParser parser) throws IOException, ElasticsearchParseException {
         if (fieldName != null) {
-            FieldMapper mapper = parseContext.docMapper().mappers().getMapper(fieldName);
-            if (!(mapper instanceof GeoPointFieldMapper)) {
+            MappedFieldType fieldType = parseContext.mapperService().fullName(fieldName);
+            if (!(fieldType instanceof GeoPointFieldMapper.GeoPointFieldType)) {
                 throw new ElasticsearchParseException("referenced field must be mapped to geo_point");
             }
         }
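
Switching from the concrete-mapper lookup to MapperService#fullName is what lets a completion context path refer to an alias: the field-type lookup resolves alias names to the target field's type, while a concrete-mapper lookup would find nothing under the alias name. A rough sketch of the difference, with toy maps standing in for the mapper service (none of the real API surface is reproduced here):

    import java.util.Map;

    class AliasResolutionSketch {
        // Toy stand-ins: a concrete field-type registry plus an alias table.
        static final Map<String, String> ALIASES = Map.of("birth-place-alias", "birth-place");
        static final Map<String, String> CONCRETE_TYPES = Map.of("birth-place", "geo_point");

        // Mimics the spirit of MapperService#fullName: alias names resolve to the
        // field type of the concrete field they point to.
        static String fullName(String field) {
            String concrete = ALIASES.getOrDefault(field, field);
            return CONCRETE_TYPES.get(concrete);
        }

        public static void main(String[] args) {
            // Both names resolve to the same geo_point type, so a completion
            // context "path" may name either one.
            System.out.println(fullName("birth-place"));       // geo_point
            System.out.println(fullName("birth-place-alias")); // geo_point
        }
    }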

@@ -21,12 +21,11 @@ package org.elasticsearch.index.analysis;
 import org.apache.lucene.analysis.Analyzer;
 import org.elasticsearch.Version;
 import org.elasticsearch.cluster.metadata.IndexMetaData;
-import org.elasticsearch.common.Strings;
-import org.elasticsearch.common.compress.CompressedXContent;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
-import org.elasticsearch.index.mapper.DocumentMapper;
-import org.elasticsearch.index.mapper.FieldMapper;
+import org.elasticsearch.index.mapper.MappedFieldType;
+import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.indices.analysis.PreBuiltAnalyzers;
 import org.elasticsearch.plugins.Plugin;
 import org.elasticsearch.test.ESSingleNodeTestCase;

@@ -84,14 +83,14 @@ public class PreBuiltAnalyzerTests extends ESSingleNodeTestCase {
         NamedAnalyzer namedAnalyzer = new PreBuiltAnalyzerProvider(analyzerName, AnalyzerScope.INDEX, randomPreBuiltAnalyzer.getAnalyzer(randomVersion)).get();

-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties").startObject("field").field("type", "text").field("analyzer", analyzerName).endObject().endObject()
-            .endObject().endObject());
+            .endObject().endObject();

-        DocumentMapper docMapper = createIndex("test", indexSettings).mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
+        MapperService mapperService = createIndex("test", indexSettings, "type", mapping).mapperService();

-        FieldMapper fieldMapper = docMapper.mappers().getMapper("field");
-        assertThat(fieldMapper.fieldType().searchAnalyzer(), instanceOf(NamedAnalyzer.class));
+        MappedFieldType fieldType = mapperService.fullName("field");
+        assertThat(fieldType.searchAnalyzer(), instanceOf(NamedAnalyzer.class));

-        NamedAnalyzer fieldMapperNamedAnalyzer = fieldMapper.fieldType().searchAnalyzer();
+        NamedAnalyzer fieldMapperNamedAnalyzer = fieldType.searchAnalyzer();
         assertThat(fieldMapperNamedAnalyzer.analyzer(), is(namedAnalyzer.analyzer()));
     }

@@ -27,6 +27,8 @@ import org.elasticsearch.common.compress.CompressedXContent;
 import org.elasticsearch.common.compress.CompressorFactory;
 import org.elasticsearch.common.io.stream.BytesStreamOutput;
 import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentType;
 import org.elasticsearch.plugins.Plugin;

@@ -49,32 +51,32 @@ public class BinaryFieldMapperTests extends ESSingleNodeTestCase {
     }

     public void testDefaultMapping() throws Exception {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
                 .startObject("properties")
                     .startObject("field")
                         .field("type", "binary")
                     .endObject()
                 .endObject()
-            .endObject().endObject());
+            .endObject().endObject();

-        DocumentMapper mapper = createIndex("test").mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
+        MapperService mapperService = createIndex("test", Settings.EMPTY, "type", mapping).mapperService();
+        MappedFieldType fieldType = mapperService.fullName("field");

-        FieldMapper fieldMapper = mapper.mappers().getMapper("field");
-        assertThat(fieldMapper, instanceOf(BinaryFieldMapper.class));
-        assertThat(fieldMapper.fieldType().stored(), equalTo(false));
+        assertThat(fieldType, instanceOf(BinaryFieldMapper.BinaryFieldType.class));
+        assertThat(fieldType.stored(), equalTo(false));
     }

     public void testStoredValue() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
                 .startObject("properties")
                     .startObject("field")
                         .field("type", "binary")
                         .field("store", true)
                     .endObject()
                 .endObject()
-            .endObject().endObject());
+            .endObject().endObject();

-        DocumentMapper mapper = createIndex("test").mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
+        MapperService mapperService = createIndex("test", Settings.EMPTY, "type", mapping).mapperService();

         // case 1: a simple binary value
         final byte[] binaryValue1 = new byte[100];

@@ -89,13 +91,14 @@ public class BinaryFieldMapperTests extends ESSingleNodeTestCase {
         assertTrue(CompressorFactory.isCompressed(new BytesArray(binaryValue2)));

         for (byte[] value : Arrays.asList(binaryValue1, binaryValue2)) {
-            ParsedDocument doc = mapper.parse(SourceToParse.source("test", "type", "id",
+            ParsedDocument doc = mapperService.documentMapper().parse(SourceToParse.source("test", "type", "id",
                 BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", value).endObject()),
                 XContentType.JSON));
             BytesRef indexedValue = doc.rootDoc().getBinaryValue("field");
             assertEquals(new BytesRef(value), indexedValue);
-            FieldMapper fieldMapper = mapper.mappers().getMapper("field");
-            Object originalValue = fieldMapper.fieldType().valueForDisplay(indexedValue);
+
+            MappedFieldType fieldType = mapperService.fullName("field");
+            Object originalValue = fieldType.valueForDisplay(indexedValue);
             assertEquals(new BytesArray(value), originalValue);
         }
     }

@@ -52,7 +52,6 @@ import static org.hamcrest.Matchers.containsString;
 public class BooleanFieldMapperTests extends ESSingleNodeTestCase {
     private IndexService indexService;
     private DocumentMapperParser parser;
-    private DocumentMapperParser preEs6Parser;

     @Before
     public void setup() {

@@ -101,7 +100,7 @@ public class BooleanFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = parser.parse("type", new CompressedXContent(mapping));
-        FieldMapper mapper = defaultMapper.mappers().getMapper("field");
+        Mapper mapper = defaultMapper.mappers().getMapper("field");
         XContentBuilder builder = XContentFactory.jsonBuilder().startObject();
         mapper.toXContent(builder, ToXContent.EMPTY_PARAMS);
         builder.endObject();

@@ -32,6 +32,7 @@ import org.apache.lucene.util.automaton.RegExp;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.bytes.BytesReference;
 import org.elasticsearch.common.compress.CompressedXContent;
+import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.unit.Fuzziness;
 import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;

@@ -61,10 +62,9 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");
         assertThat(fieldMapper, instanceOf(CompletionFieldMapper.class));
-
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        MappedFieldType completionFieldType = ((CompletionFieldMapper) fieldMapper).fieldType();

         NamedAnalyzer indexAnalyzer = completionFieldType.indexAnalyzer();
         assertThat(indexAnalyzer.name(), equalTo("simple"));

@@ -94,10 +94,9 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");
         assertThat(fieldMapper, instanceOf(CompletionFieldMapper.class));
-
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        MappedFieldType completionFieldType = ((CompletionFieldMapper) fieldMapper).fieldType();

         NamedAnalyzer indexAnalyzer = completionFieldType.indexAnalyzer();
         assertThat(indexAnalyzer.name(), equalTo("simple"));

@@ -129,12 +128,11 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");
         assertThat(fieldMapper, instanceOf(CompletionFieldMapper.class));
-        CompletionFieldMapper completionFieldMapper = (CompletionFieldMapper) fieldMapper;

         XContentBuilder builder = jsonBuilder().startObject();
-        completionFieldMapper.toXContent(builder, ToXContent.EMPTY_PARAMS).endObject();
+        fieldMapper.toXContent(builder, ToXContent.EMPTY_PARAMS).endObject();
         builder.close();
         Map<String, Object> serializedMap = createParser(JsonXContent.jsonXContent, BytesReference.bytes(builder)).map();
         Map<String, Object> configMap = (Map<String, Object>) serializedMap.get("completion");

@@ -153,15 +151,15 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");

         ParsedDocument parsedDocument = defaultMapper.parse(SourceToParse.source("test", "type1", "1", BytesReference
             .bytes(XContentFactory.jsonBuilder()
                 .startObject()
                     .field("completion", "suggestion")
                 .endObject()),
             XContentType.JSON));

-        IndexableField[] fields = parsedDocument.rootDoc().getFields(completionFieldType.name());
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
         assertSuggestFields(fields, 1);
     }

@@ -192,15 +190,15 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");

         ParsedDocument parsedDocument = defaultMapper.parse(SourceToParse.source("test", "type1", "1", BytesReference
             .bytes(XContentFactory.jsonBuilder()
                 .startObject()
                     .array("completion", "suggestion1", "suggestion2")
                 .endObject()),
             XContentType.JSON));

-        IndexableField[] fields = parsedDocument.rootDoc().getFields(completionFieldType.name());
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
         assertSuggestFields(fields, 2);
     }

@@ -212,8 +210,8 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");

         ParsedDocument parsedDocument = defaultMapper.parse(SourceToParse.source("test", "type1", "1", BytesReference
             .bytes(XContentFactory.jsonBuilder()
                 .startObject()

@@ -223,7 +221,7 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
                 .endObject()
                 .endObject()),
             XContentType.JSON));

-        IndexableField[] fields = parsedDocument.rootDoc().getFields(completionFieldType.name());
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
         assertSuggestFields(fields, 1);
     }

@@ -235,8 +233,8 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");

         ParsedDocument parsedDocument = defaultMapper.parse(SourceToParse.source("test", "type1", "1", BytesReference
             .bytes(XContentFactory.jsonBuilder()
                 .startObject()

@@ -246,10 +244,50 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
                 .endObject()
                 .endObject()),
             XContentType.JSON));

-        IndexableField[] fields = parsedDocument.rootDoc().getFields(completionFieldType.name());
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
         assertSuggestFields(fields, 3);
     }

+    public void testParsingWithGeoFieldAlias() throws Exception {
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject()
+            .startObject("type1")
+                .startObject("properties")
+                    .startObject("completion")
+                        .field("type", "completion")
+                        .startObject("contexts")
+                            .field("name", "location")
+                            .field("type", "geo")
+                            .field("path", "alias")
+                        .endObject()
+                    .endObject()
+                    .startObject("birth-place")
+                        .field("type", "geo_point")
+                    .endObject()
+                    .startObject("alias")
+                        .field("type", "alias")
+                        .field("path", "birth-place")
+                    .endObject()
+                .endObject()
+            .endObject()
+        .endObject();
+        MapperService mapperService = createIndex("test", Settings.EMPTY, "type1", mapping).mapperService();
+        Mapper fieldMapper = mapperService.documentMapper().mappers().getMapper("completion");
+
+        ParsedDocument parsedDocument = mapperService.documentMapper().parse(SourceToParse.source("test", "type1", "1",
+            BytesReference.bytes(XContentFactory.jsonBuilder()
+                .startObject()
+                    .startObject("completion")
+                        .field("input", "suggestion")
+                        .startObject("contexts")
+                            .field("location", "37.77,-122.42")
+                        .endObject()
+                    .endObject()
+                .endObject()), XContentType.JSON));
+
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
+        assertSuggestFields(fields, 1);
+    }
+
     public void testParsingFull() throws Exception {
         String mapping = Strings.toString(jsonBuilder().startObject().startObject("type1")
             .startObject("properties").startObject("completion")

@@ -258,8 +296,8 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");

         ParsedDocument parsedDocument = defaultMapper.parse(SourceToParse.source("test", "type1", "1", BytesReference
             .bytes(XContentFactory.jsonBuilder()
                 .startObject()

@@ -279,7 +317,7 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
                 .endArray()
                 .endObject()),
             XContentType.JSON));

-        IndexableField[] fields = parsedDocument.rootDoc().getFields(completionFieldType.name());
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
         assertSuggestFields(fields, 3);
     }

@@ -291,8 +329,8 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
-        MappedFieldType completionFieldType = fieldMapper.fieldType();
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");

         ParsedDocument parsedDocument = defaultMapper.parse(SourceToParse.source("test", "type1", "1", BytesReference
             .bytes(XContentFactory.jsonBuilder()
                 .startObject()

@@ -312,7 +350,7 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
                 .endArray()
                 .endObject()),
             XContentType.JSON));

-        IndexableField[] fields = parsedDocument.rootDoc().getFields(completionFieldType.name());
+        IndexableField[] fields = parsedDocument.rootDoc().getFields(fieldMapper.name());
         assertSuggestFields(fields, 6);
     }

@@ -420,7 +458,7 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");
         CompletionFieldMapper completionFieldMapper = (CompletionFieldMapper) fieldMapper;
         Query prefixQuery = completionFieldMapper.fieldType().prefixQuery(new BytesRef("co"));
         assertThat(prefixQuery, instanceOf(PrefixCompletionQuery.class));

@@ -434,7 +472,7 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");
         CompletionFieldMapper completionFieldMapper = (CompletionFieldMapper) fieldMapper;
         Query prefixQuery = completionFieldMapper.fieldType().fuzzyQuery("co",
             Fuzziness.fromEdits(FuzzyCompletionQuery.DEFAULT_MAX_EDITS), FuzzyCompletionQuery.DEFAULT_NON_FUZZY_PREFIX,

@@ -451,7 +489,7 @@ public class CompletionFieldMapperTests extends ESSingleNodeTestCase {
             .endObject().endObject());

         DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));

-        FieldMapper fieldMapper = defaultMapper.mappers().getMapper("completion");
+        Mapper fieldMapper = defaultMapper.mappers().getMapper("completion");
         CompletionFieldMapper completionFieldMapper = (CompletionFieldMapper) fieldMapper;
         Query prefixQuery = completionFieldMapper.fieldType()
             .regexpQuery(new BytesRef("co"), RegExp.ALL, Operations.DEFAULT_MAX_DETERMINIZED_STATES);

@@ -72,7 +72,7 @@ public class CopyToMapperTests extends ESSingleNodeTestCase {
         IndexService index = createIndex("test");
         client().admin().indices().preparePutMapping("test").setType("type1").setSource(mapping, XContentType.JSON).get();
         DocumentMapper docMapper = index.mapperService().documentMapper("type1");
-        FieldMapper fieldMapper = docMapper.mappers().getMapper("copy_test");
+        Mapper fieldMapper = docMapper.mappers().getMapper("copy_test");

         // Check json serialization
         TextFieldMapper stringFieldMapper = (TextFieldMapper) fieldMapper;

@@ -123,7 +123,7 @@ public class CopyToMapperTests extends ESSingleNodeTestCase {
         docMapper = index.mapperService().documentMapper("type1");
         fieldMapper = docMapper.mappers().getMapper("new_field");
-        assertThat(fieldMapper.fieldType().typeName(), equalTo("long"));
+        assertThat(fieldMapper.typeName(), equalTo("long"));
     }

     public void testCopyToFieldsInnerObjectParsing() throws Exception {

@@ -308,13 +308,15 @@ public class CopyToMapperTests extends ESSingleNodeTestCase {
         MapperService mapperService = createIndex("test").mapperService();
         DocumentMapper docMapperBefore = mapperService.merge("type1", new CompressedXContent(mappingBefore), MapperService.MergeReason.MAPPING_UPDATE);

-        assertEquals(Arrays.asList("foo", "bar"), docMapperBefore.mappers().getMapper("copy_test").copyTo().copyToFields());
+        FieldMapper fieldMapperBefore = (FieldMapper) docMapperBefore.mappers().getMapper("copy_test");
+        assertEquals(Arrays.asList("foo", "bar"), fieldMapperBefore.copyTo().copyToFields());

         DocumentMapper docMapperAfter = mapperService.merge("type1", new CompressedXContent(mappingAfter), MapperService.MergeReason.MAPPING_UPDATE);

-        assertEquals(Arrays.asList("baz", "bar"), docMapperAfter.mappers().getMapper("copy_test").copyTo().copyToFields());
-        assertEquals(Arrays.asList("foo", "bar"), docMapperBefore.mappers().getMapper("copy_test").copyTo().copyToFields());
+        FieldMapper fieldMapperAfter = (FieldMapper) docMapperAfter.mappers().getMapper("copy_test");
+
+        assertEquals(Arrays.asList("baz", "bar"), fieldMapperAfter.copyTo().copyToFields());
+        assertEquals(Arrays.asList("foo", "bar"), fieldMapperBefore.copyTo().copyToFields());
     }

     public void testCopyToNestedField() throws Exception {

@@ -382,11 +382,11 @@ public class DateFieldMapperTests extends ESSingleNodeTestCase {
                 .startObject("properties")
                     .startObject("release_date").field("type", "date").field("format", "yyyy/MM/dd").endObject()
                 .endObject().endObject().endObject());
-        DocumentMapper initMapper = indexService.mapperService().merge("movie", new CompressedXContent(initMapping),
+        indexService.mapperService().merge("movie", new CompressedXContent(initMapping),
             MapperService.MergeReason.MAPPING_UPDATE);

-        assertThat(initMapper.mappers().getMapper("release_date"), notNullValue());
-        assertFalse(initMapper.mappers().getMapper("release_date").fieldType().stored());
+        assertThat(indexService.mapperService().fullName("release_date"), notNullValue());
+        assertFalse(indexService.mapperService().fullName("release_date").stored());

         String updateFormatMapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("movie")
             .startObject("properties")

@@ -39,6 +39,7 @@ import org.elasticsearch.index.query.QueryShardContext;
 import java.io.IOException;
 import java.io.StringReader;
 import java.util.Arrays;
+import java.util.Collections;
 import java.util.List;

 public class DocumentFieldMapperTests extends LuceneTestCase {

@@ -138,7 +139,12 @@ public class DocumentFieldMapperTests extends LuceneTestCase {
         Analyzer defaultSearch = new FakeAnalyzer("default_search");
         Analyzer defaultSearchQuote = new FakeAnalyzer("default_search_quote");

-        DocumentFieldMappers documentFieldMappers = new DocumentFieldMappers(Arrays.asList(fieldMapper1, fieldMapper2), defaultIndex, defaultSearch, defaultSearchQuote);
+        DocumentFieldMappers documentFieldMappers = new DocumentFieldMappers(
+            Arrays.asList(fieldMapper1, fieldMapper2),
+            Collections.emptyList(),
+            defaultIndex,
+            defaultSearch,
+            defaultSearchQuote);

         assertAnalyzes(documentFieldMappers.indexAnalyzer(), "field1", "index");
         assertAnalyzes(documentFieldMappers.searchAnalyzer(), "field1", "search");
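
Per this commit's description, DocumentFieldMappers now carries field aliases alongside concrete mappers (the test above passes an empty alias list as the new second argument), and lookups by name resolve both kinds. A speculative sketch of such a combined lookup, using illustrative stand-in types rather than the actual mapper classes:

    import java.util.*;

    class FieldLookupSketch {
        // Minimal stand-ins for a concrete field mapper and an alias mapper:
        // both expose a name; the alias additionally knows the path it points to.
        record Concrete(String name) {}
        record Alias(String name, String path) {}

        // One lookup map is built over concrete mappers and alias mappers alike,
        // so getMapper-style access works for either kind of name.
        static Map<String, Object> buildLookup(List<Concrete> concrete, List<Alias> aliases) {
            Map<String, Object> byName = new HashMap<>();
            concrete.forEach(m -> byName.put(m.name(), m));
            aliases.forEach(a -> byName.put(a.name(), a));
            return byName;
        }

        public static void main(String[] args) {
            Map<String, Object> lookup = buildLookup(
                List.of(new Concrete("field1"), new Concrete("field2")),
                List.of(new Alias("field1-alias", "field1")));
            System.out.println(lookup.containsKey("field1-alias")); // true
        }
    }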

@@ -23,9 +23,10 @@ import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.bytes.BytesArray;
 import org.elasticsearch.common.bytes.BytesReference;
 import org.elasticsearch.common.compress.CompressedXContent;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentType;
-import org.elasticsearch.index.analysis.NamedAnalyzer;
 import org.elasticsearch.test.ESSingleNodeTestCase;

 import java.io.IOException;

@@ -104,38 +105,50 @@ public class DocumentMapperMergeTests extends ESSingleNodeTestCase {
     }

     public void testMergeSearchAnalyzer() throws Exception {
-        DocumentMapperParser parser = createIndex("test").mapperService().documentMapperParser();
-        String mapping1 = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
-            .startObject("properties").startObject("field").field("type", "text").field("analyzer", "standard").field("search_analyzer", "whitespace").endObject().endObject()
-            .endObject().endObject());
+        XContentBuilder mapping1 = XContentFactory.jsonBuilder().startObject().startObject("type")
+            .startObject("properties").startObject("field")
+                .field("type", "text")
+                .field("analyzer", "standard")
+                .field("search_analyzer", "whitespace")
+            .endObject().endObject()
+        .endObject().endObject();
+
+        MapperService mapperService = createIndex("test", Settings.EMPTY, "type", mapping1).mapperService();
+        assertThat(mapperService.fullName("field").searchAnalyzer().name(), equalTo("whitespace"));

         String mapping2 = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
-            .startObject("properties").startObject("field").field("type", "text").field("analyzer", "standard").field("search_analyzer", "keyword").endObject().endObject()
-            .endObject().endObject());
+            .startObject("properties").startObject("field")
+                .field("type", "text")
+                .field("analyzer", "standard")
+                .field("search_analyzer", "keyword")
+            .endObject().endObject()
+        .endObject().endObject());

-        DocumentMapper existing = parser.parse("type", new CompressedXContent(mapping1));
-        DocumentMapper changed = parser.parse("type", new CompressedXContent(mapping2));
-
-        assertThat(((NamedAnalyzer) existing.mappers().getMapper("field").fieldType().searchAnalyzer()).name(), equalTo("whitespace"));
-        DocumentMapper merged = existing.merge(changed.mapping());
-
-        assertThat(((NamedAnalyzer) merged.mappers().getMapper("field").fieldType().searchAnalyzer()).name(), equalTo("keyword"));
+        mapperService.merge("type", new CompressedXContent(mapping2), MapperService.MergeReason.MAPPING_UPDATE);
+        assertThat(mapperService.fullName("field").searchAnalyzer().name(), equalTo("keyword"));
     }

     public void testChangeSearchAnalyzerToDefault() throws Exception {
-        MapperService mapperService = createIndex("test").mapperService();
-        String mapping1 = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
-            .startObject("properties").startObject("field").field("type", "text").field("analyzer", "standard").field("search_analyzer", "whitespace").endObject().endObject()
-            .endObject().endObject());
+        XContentBuilder mapping1 = XContentFactory.jsonBuilder().startObject().startObject("type")
+            .startObject("properties").startObject("field")
+                .field("type", "text")
+                .field("analyzer", "standard")
+                .field("search_analyzer", "whitespace")
+            .endObject().endObject()
+        .endObject().endObject();
+
+        MapperService mapperService = createIndex("test", Settings.EMPTY, "type", mapping1).mapperService();
+        assertThat(mapperService.fullName("field").searchAnalyzer().name(), equalTo("whitespace"));

         String mapping2 = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
-            .startObject("properties").startObject("field").field("type", "text").field("analyzer", "standard").endObject().endObject()
-            .endObject().endObject());
+            .startObject("properties").startObject("field")
+                .field("type", "text")
+                .field("analyzer", "standard")
+            .endObject().endObject()
+        .endObject().endObject());

-        DocumentMapper existing = mapperService.merge("type", new CompressedXContent(mapping1), MapperService.MergeReason.MAPPING_UPDATE);
-        DocumentMapper merged = mapperService.merge("type", new CompressedXContent(mapping2), MapperService.MergeReason.MAPPING_UPDATE);
-
-        assertThat(((NamedAnalyzer) existing.mappers().getMapper("field").fieldType().searchAnalyzer()).name(), equalTo("whitespace"));
-        assertThat(((NamedAnalyzer) merged.mappers().getMapper("field").fieldType().searchAnalyzer()).name(), equalTo("standard"));
+        mapperService.merge("type", new CompressedXContent(mapping2), MapperService.MergeReason.MAPPING_UPDATE);
+        assertThat(mapperService.fullName("field").searchAnalyzer().name(), equalTo("standard"));
     }

     public void testConcurrentMergeTest() throws Throwable {

@@ -1001,7 +1001,7 @@ public class DocumentParserTests extends ESSingleNodeTestCase {
         BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/simple/test1.json"));
         Document doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();

-        assertThat(doc.get(docMapper.mappers().getMapper("name.first").fieldType().name()), equalTo("shay"));
+        assertThat(doc.get(docMapper.mappers().getMapper("name.first").name()), equalTo("shay"));
         doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
     }

@@ -1014,8 +1014,8 @@ public class DocumentParserTests extends ESSingleNodeTestCase {
         DocumentMapper builtDocMapper = parser.parse("person", new CompressedXContent(builtMapping));
         BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/simple/test1.json"));

         Document doc = builtDocMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
-        assertThat(doc.getBinaryValue(docMapper.idFieldMapper().fieldType().name()), equalTo(Uid.encodeId("1")));
-        assertThat(doc.get(docMapper.mappers().getMapper("name.first").fieldType().name()), equalTo("shay"));
+        assertThat(doc.getBinaryValue(docMapper.idFieldMapper().name()), equalTo(Uid.encodeId("1")));
+        assertThat(doc.get(docMapper.mappers().getMapper("name.first").name()), equalTo("shay"));
     }

     public void testSimpleParser() throws Exception {

@@ -1026,8 +1026,8 @@ public class DocumentParserTests extends ESSingleNodeTestCase {
         BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/simple/test1.json"));

         Document doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
-        assertThat(doc.getBinaryValue(docMapper.idFieldMapper().fieldType().name()), equalTo(Uid.encodeId("1")));
-        assertThat(doc.get(docMapper.mappers().getMapper("name.first").fieldType().name()), equalTo("shay"));
+        assertThat(doc.getBinaryValue(docMapper.idFieldMapper().name()), equalTo(Uid.encodeId("1")));
+        assertThat(doc.get(docMapper.mappers().getMapper("name.first").name()), equalTo("shay"));
     }

     public void testSimpleParserNoTypeNoId() throws Exception {

@@ -1035,8 +1035,8 @@ public class DocumentParserTests extends ESSingleNodeTestCase {
         DocumentMapper docMapper = createIndex("test").mapperService().documentMapperParser().parse("person", new CompressedXContent(mapping));
         BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/simple/test1-notype-noid.json"));
         Document doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
-        assertThat(doc.getBinaryValue(docMapper.idFieldMapper().fieldType().name()), equalTo(Uid.encodeId("1")));
-        assertThat(doc.get(docMapper.mappers().getMapper("name.first").fieldType().name()), equalTo("shay"));
+        assertThat(doc.getBinaryValue(docMapper.idFieldMapper().name()), equalTo(Uid.encodeId("1")));
+        assertThat(doc.get(docMapper.mappers().getMapper("name.first").name()), equalTo("shay"));
     }

     public void testAttributes() throws Exception {

@@ -1389,4 +1389,98 @@ public class DocumentParserTests extends ESSingleNodeTestCase {
             client().prepareIndex("idx", "type").setSource(bytes2, XContentType.JSON).get());
         assertThat(ExceptionsHelper.detailedMessage(err), containsString("field name cannot be an empty string"));
     }
+
+    public void testWriteToFieldAlias() throws Exception {
+        String mapping = Strings.toString(XContentFactory.jsonBuilder()
+            .startObject()
+                .startObject("type")
+                    .startObject("properties")
+                        .startObject("alias-field")
+                            .field("type", "alias")
+                            .field("path", "concrete-field")
+                        .endObject()
+                        .startObject("concrete-field")
+                            .field("type", "keyword")
+                        .endObject()
+                    .endObject()
+                .endObject()
+            .endObject());
+
+        DocumentMapperParser mapperParser = createIndex("test").mapperService().documentMapperParser();
+        DocumentMapper mapper = mapperParser.parse("type", new CompressedXContent(mapping));
+
+        BytesReference bytes = BytesReference.bytes(XContentFactory.jsonBuilder()
+            .startObject()
+                .field("alias-field", "value")
+            .endObject());
+
+        MapperParsingException exception = expectThrows(MapperParsingException.class,
+            () -> mapper.parse(SourceToParse.source("test", "type", "1", bytes, XContentType.JSON)));
+
+        assertEquals("Cannot write to a field alias [alias-field].", exception.getCause().getMessage());
+    }
+
+    public void testCopyToFieldAlias() throws Exception {
+        String mapping = Strings.toString(XContentFactory.jsonBuilder()
+            .startObject()
+                .startObject("type")
+                    .startObject("properties")
+                        .startObject("alias-field")
+                            .field("type", "alias")
+                            .field("path", "concrete-field")
+                        .endObject()
+                        .startObject("concrete-field")
+                            .field("type", "keyword")
+                        .endObject()
+                        .startObject("text-field")
+                            .field("type", "text")
+                            .field("copy_to", "alias-field")
+                        .endObject()
+                    .endObject()
+                .endObject()
+            .endObject());
+
+        DocumentMapperParser mapperParser = createIndex("test").mapperService().documentMapperParser();
+        DocumentMapper mapper = mapperParser.parse("type", new CompressedXContent(mapping));
+
+        BytesReference bytes = BytesReference.bytes(XContentFactory.jsonBuilder()
+            .startObject()
+                .field("text-field", "value")
+            .endObject());
+
+        MapperParsingException exception = expectThrows(MapperParsingException.class,
+            () -> mapper.parse(SourceToParse.source("test", "type", "1", bytes, XContentType.JSON)));
+
+        assertEquals("Cannot copy to a field alias [alias-field].", exception.getCause().getMessage());
+    }
+
+    public void testDynamicDottedFieldNameWithFieldAlias() throws Exception {
+        String mapping = Strings.toString(XContentFactory.jsonBuilder()
+            .startObject()
+                .startObject("type")
+                    .startObject("properties")
+                        .startObject("alias-field")
+                            .field("type", "alias")
+                            .field("path", "concrete-field")
+                        .endObject()
+                        .startObject("concrete-field")
+                            .field("type", "keyword")
+                        .endObject()
+                    .endObject()
+                .endObject()
+            .endObject());
+
+        DocumentMapperParser mapperParser = createIndex("test").mapperService().documentMapperParser();
+        DocumentMapper mapper = mapperParser.parse("type", new CompressedXContent(mapping));
+
+        BytesReference bytes = BytesReference.bytes(XContentFactory.jsonBuilder()
+            .startObject()
+                .startObject("alias-field.dynamic-field")
+                    .field("type", "keyword")
+                .endObject()
+            .endObject());
+
+        MapperParsingException exception = expectThrows(MapperParsingException.class,
+            () -> mapper.parse(SourceToParse.source("test", "type", "1", bytes, XContentType.JSON)));
+
+        assertEquals("Could not dynamically add mapping for field [alias-field.dynamic-field]. "
+            + "Existing mapping for [alias-field] must be of type object but found [alias].", exception.getMessage());
+    }
 }
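
The three tests added above pin down the write path: an alias can never receive a value, whether written directly, targeted by copy_to, or reached through a dynamic dotted name. A compact sketch of such a guard, assuming the parser can ask whether a mapper is an alias (the method and type names here are illustrative, not the actual DocumentParser internals):

    class AliasWriteGuardSketch {
        // Illustrative mapper model: "alias" mappers carry no values of their own.
        record MapperInfo(String name, boolean isAlias) {}

        // Mimics the parse-time check: writing into an alias is a hard error.
        static void checkWritable(MapperInfo mapper) {
            if (mapper.isAlias()) {
                throw new IllegalArgumentException(
                    "Cannot write to a field alias [" + mapper.name() + "].");
            }
        }

        public static void main(String[] args) {
            checkWritable(new MapperInfo("concrete-field", false)); // fine
            try {
                checkWritable(new MapperInfo("alias-field", true));
            } catch (IllegalArgumentException e) {
                System.out.println(e.getMessage());
            }
        }
    }

The error string matches the message asserted in testWriteToFieldAlias above.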
@@ -45,7 +45,9 @@ public class DoubleIndexingDocTests extends ESSingleNodeTestCase {
.endObject().endObject());
IndexService index = createIndex("test");
client().admin().indices().preparePutMapping("test").setType("type").setSource(mapping, XContentType.JSON).get();
- DocumentMapper mapper = index.mapperService().documentMapper("type");
+ MapperService mapperService = index.mapperService();
+ DocumentMapper mapper = mapperService.documentMapper();
QueryShardContext context = index.newQueryShardContext(0, null, () -> 0L, null);
ParsedDocument doc = mapper.parse(SourceToParse.source("test", "type", "1", BytesReference
@@ -61,7 +63,6 @@ public class DoubleIndexingDocTests extends ESSingleNodeTestCase {
assertNotNull(doc.dynamicMappingsUpdate());
client().admin().indices().preparePutMapping("test").setType("type")
.setSource(doc.dynamicMappingsUpdate().toString(), XContentType.JSON).get();
- mapper = index.mapperService().documentMapper("type");
writer.addDocument(doc.rootDoc());
writer.addDocument(doc.rootDoc());
@@ -69,25 +70,25 @@ public class DoubleIndexingDocTests extends ESSingleNodeTestCase {
IndexReader reader = DirectoryReader.open(writer);
IndexSearcher searcher = new IndexSearcher(reader);
- TopDocs topDocs = searcher.search(mapper.mappers().getMapper("field1").fieldType().termQuery("value1", context), 10);
+ TopDocs topDocs = searcher.search(mapperService.fullName("field1").termQuery("value1", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
- topDocs = searcher.search(mapper.mappers().getMapper("field2").fieldType().termQuery("1", context), 10);
+ topDocs = searcher.search(mapperService.fullName("field2").termQuery("1", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
- topDocs = searcher.search(mapper.mappers().getMapper("field3").fieldType().termQuery("1.1", context), 10);
+ topDocs = searcher.search(mapperService.fullName("field3").termQuery("1.1", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
- topDocs = searcher.search(mapper.mappers().getMapper("field4").fieldType().termQuery("2010-01-01", context), 10);
+ topDocs = searcher.search(mapperService.fullName("field4").termQuery("2010-01-01", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
- topDocs = searcher.search(mapper.mappers().getMapper("field5").fieldType().termQuery("1", context), 10);
+ topDocs = searcher.search(mapperService.fullName("field5").termQuery("1", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
- topDocs = searcher.search(mapper.mappers().getMapper("field5").fieldType().termQuery("2", context), 10);
+ topDocs = searcher.search(mapperService.fullName("field5").termQuery("2", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
- topDocs = searcher.search(mapper.mappers().getMapper("field5").fieldType().termQuery("3", context), 10);
+ topDocs = searcher.search(mapperService.fullName("field5").termQuery("3", context), 10);
assertThat(topDocs.totalHits, equalTo(2L));
writer.close();
reader.close();
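The change in this file is the pattern repeated through most of the tests below: field types are now resolved through the index-level `MapperService#fullName`, which follows field aliases, instead of through `DocumentFieldMappers#getMapper` on a single document mapper. Condensed from the hunk above:

    // Before: per-document-mapper lookup, which cannot see field aliases.
    TopDocs before = searcher.search(
        mapper.mappers().getMapper("field1").fieldType().termQuery("value1", context), 10);
    // After: index-level lookup that resolves aliases to their concrete field type.
    TopDocs after = searcher.search(
        mapperService.fullName("field1").termQuery("value1", context), 10);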


@@ -625,11 +625,11 @@ public class DynamicMappingTests extends ESSingleNodeTestCase {
.setSource(doc.dynamicMappingsUpdate().toString(), XContentType.JSON).get();
defaultMapper = index.mapperService().documentMapper("type");
- FieldMapper mapper = defaultMapper.mappers().getMapper("s_long");
- assertThat(mapper.fieldType().typeName(), equalTo("long"));
+ Mapper mapper = defaultMapper.mappers().getMapper("s_long");
+ assertThat(mapper.typeName(), equalTo("long"));
mapper = defaultMapper.mappers().getMapper("s_double");
- assertThat(mapper.fieldType().typeName(), equalTo("float"));
+ assertThat(mapper.typeName(), equalTo("float"));
}
public void testNumericDetectionDefault() throws Exception {
@@ -652,7 +652,7 @@ public class DynamicMappingTests extends ESSingleNodeTestCase {
.setSource(doc.dynamicMappingsUpdate().toString(), XContentType.JSON).get());
defaultMapper = index.mapperService().documentMapper("type");
- FieldMapper mapper = defaultMapper.mappers().getMapper("s_long");
+ Mapper mapper = defaultMapper.mappers().getMapper("s_long");
assertThat(mapper, instanceOf(TextFieldMapper.class));
mapper = defaultMapper.mappers().getMapper("s_double");
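The other recurring change: `DocumentFieldMappers#getMapper` now returns the `Mapper` supertype rather than `FieldMapper`, since a name may resolve to either a concrete field or a field alias. Call sites that need subtype accessors check the runtime type and cast, as the geo tests below do (this fragment is taken from those tests):

    // getMapper may hand back either a FieldMapper or a FieldAliasMapper,
    // so callers narrow the type before using mapper-specific methods.
    Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
    assertThat(fieldMapper, instanceOf(GeoPointFieldMapper.class));
    Object nullValue = ((GeoPointFieldMapper) fieldMapper).fieldType().nullValue();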


@@ -30,11 +30,11 @@ import org.elasticsearch.common.xcontent.json.JsonXContent;
import org.elasticsearch.index.IndexService;
import org.elasticsearch.index.mapper.ParseContext.Document;
import org.elasticsearch.test.ESSingleNodeTestCase;
- import org.hamcrest.Matchers;
import static org.elasticsearch.test.StreamsUtils.copyToBytesFromClasspath;
import static org.elasticsearch.test.StreamsUtils.copyToStringFromClasspath;
import static org.hamcrest.Matchers.equalTo;
+ import static org.hamcrest.Matchers.notNullValue;
public class DynamicTemplatesTests extends ESSingleNodeTestCase {
public void testMatchTypeOnly() throws Exception {
@@ -45,7 +45,9 @@ public class DynamicTemplatesTests extends ESSingleNodeTestCase {
.endObject().endObject().endArray().endObject().endObject();
IndexService index = createIndex("test");
client().admin().indices().preparePutMapping("test").setType("person").setSource(builder).get();
- DocumentMapper docMapper = index.mapperService().documentMapper("person");
+ MapperService mapperService = index.mapperService();
+ DocumentMapper docMapper = mapperService.documentMapper("person");
builder = JsonXContent.contentBuilder();
builder.startObject().field("s", "hello").field("l", 1).endObject();
ParsedDocument parsedDoc = docMapper.parse(SourceToParse.source("test", "person", "1", BytesReference.bytes(builder),
@@ -53,14 +55,11 @@ public class DynamicTemplatesTests extends ESSingleNodeTestCase {
client().admin().indices().preparePutMapping("test").setType("person")
.setSource(parsedDoc.dynamicMappingsUpdate().toString(), XContentType.JSON).get();
- docMapper = index.mapperService().documentMapper("person");
- DocumentFieldMappers mappers = docMapper.mappers();
- assertThat(mappers.getMapper("s"), Matchers.notNullValue());
- assertEquals(IndexOptions.NONE, mappers.getMapper("s").fieldType().indexOptions());
- assertThat(mappers.getMapper("l"), Matchers.notNullValue());
- assertNotSame(IndexOptions.NONE, mappers.getMapper("l").fieldType().indexOptions());
+ assertThat(mapperService.fullName("s"), notNullValue());
+ assertEquals(IndexOptions.NONE, mapperService.fullName("s").indexOptions());
+ assertThat(mapperService.fullName("l"), notNullValue());
+ assertNotSame(IndexOptions.NONE, mapperService.fullName("l").indexOptions());
}
@@ -84,7 +83,7 @@ public class DynamicTemplatesTests extends ESSingleNodeTestCase {
assertNotSame(IndexOptions.NONE, f.fieldType().indexOptions());
assertThat(f.fieldType().tokenized(), equalTo(false));
- FieldMapper fieldMapper = docMapper.mappers().getMapper("name");
+ Mapper fieldMapper = docMapper.mappers().getMapper("name");
assertNotNull(fieldMapper);
f = doc.getField("multi1");
@@ -143,7 +142,7 @@ public class DynamicTemplatesTests extends ESSingleNodeTestCase {
assertNotSame(IndexOptions.NONE, f.fieldType().indexOptions());
assertThat(f.fieldType().tokenized(), equalTo(false));
- FieldMapper fieldMapper = docMapper.mappers().getMapper("name");
+ Mapper fieldMapper = docMapper.mappers().getMapper("name");
assertNotNull(fieldMapper);
f = doc.getField("multi1");


@@ -0,0 +1,167 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.mapper;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.compress.CompressedXContent;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.IndexService;
import org.elasticsearch.index.mapper.MapperService.MergeReason;
import org.elasticsearch.test.ESSingleNodeTestCase;
import org.junit.Before;
import java.io.IOException;
public class FieldAliasMapperTests extends ESSingleNodeTestCase {
private MapperService mapperService;
private DocumentMapperParser parser;
@Before
public void setup() {
IndexService indexService = createIndex("test");
mapperService = indexService.mapperService();
parser = mapperService.documentMapperParser();
}
public void testParsing() throws IOException {
String mapping = Strings.toString(XContentFactory.jsonBuilder()
.startObject()
.startObject("type")
.startObject("properties")
.startObject("alias-field")
.field("type", "alias")
.field("path", "concrete-field")
.endObject()
.startObject("concrete-field")
.field("type", "keyword")
.endObject()
.endObject()
.endObject()
.endObject());
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
assertEquals(mapping, mapper.mappingSource().toString());
}
public void testParsingWithMissingPath() throws IOException {
String mapping = Strings.toString(XContentFactory.jsonBuilder()
.startObject()
.startObject("type")
.startObject("properties")
.startObject("alias-field")
.field("type", "alias")
.endObject()
.endObject()
.endObject()
.endObject());
MapperParsingException exception = expectThrows(MapperParsingException.class,
() -> parser.parse("type", new CompressedXContent(mapping)));
assertEquals("The [path] property must be specified for field [alias-field].", exception.getMessage());
}
public void testParsingWithExtraArgument() throws IOException {
String mapping = Strings.toString(XContentFactory.jsonBuilder()
.startObject()
.startObject("type")
.startObject("properties")
.startObject("alias-field")
.field("type", "alias")
.field("path", "concrete-field")
.field("extra-field", "extra-value")
.endObject()
.endObject()
.endObject()
.endObject());
MapperParsingException exception = expectThrows(MapperParsingException.class,
() -> parser.parse("type", new CompressedXContent(mapping)));
assertEquals("Mapping definition for [alias-field] has unsupported parameters: [extra-field : extra-value]",
exception.getMessage());
}
public void testMerge() throws IOException {
String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject()
.startObject("type")
.startObject("properties")
.startObject("first-field")
.field("type", "keyword")
.endObject()
.startObject("alias-field")
.field("type", "alias")
.field("path", "first-field")
.endObject()
.endObject()
.endObject()
.endObject());
mapperService.merge("type", new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
MappedFieldType firstFieldType = mapperService.fullName("alias-field");
assertEquals("first-field", firstFieldType.name());
assertTrue(firstFieldType instanceof KeywordFieldMapper.KeywordFieldType);
String newMapping = Strings.toString(XContentFactory.jsonBuilder().startObject()
.startObject("type")
.startObject("properties")
.startObject("second-field")
.field("type", "text")
.endObject()
.startObject("alias-field")
.field("type", "alias")
.field("path", "second-field")
.endObject()
.endObject()
.endObject()
.endObject());
mapperService.merge("type", new CompressedXContent(newMapping), MergeReason.MAPPING_UPDATE);
MappedFieldType secondFieldType = mapperService.fullName("alias-field");
assertEquals("second-field", secondFieldType.name());
assertTrue(secondFieldType instanceof TextFieldMapper.TextFieldType);
}
public void testMergeFailure() throws IOException {
String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject()
.startObject("type")
.startObject("properties")
.startObject("concrete-field")
.field("type", "text")
.endObject()
.startObject("alias-field")
.field("type", "alias")
.field("path", "concrete-field")
.endObject()
.endObject()
.endObject()
.endObject());
mapperService.merge("type", new CompressedXContent(mapping), MergeReason.MAPPING_UPDATE);
String newMapping = Strings.toString(XContentFactory.jsonBuilder().startObject()
.startObject("type")
.startObject("properties")
.startObject("alias-field")
.field("type", "keyword")
.endObject()
.endObject()
.endObject()
.endObject());
IllegalArgumentException exception = expectThrows(IllegalArgumentException.class,
() -> mapperService.merge("type", new CompressedXContent(newMapping), MergeReason.MAPPING_UPDATE));
assertEquals("Cannot merge a field alias mapping [alias-field] with a mapping that is not for a field alias.",
exception.getMessage());
}
}
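For reference, the XContentBuilder calls in these tests serialize to a mapping of the following shape (rendered here as a Java string literal for readability; the field names come from the tests above):

    String aliasMapping =
        "{\"type\": {\"properties\": {"
      + "  \"alias-field\": {\"type\": \"alias\", \"path\": \"concrete-field\"},"
      + "  \"concrete-field\": {\"type\": \"keyword\"}"
      + "}}}";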


@@ -28,10 +28,11 @@ import org.elasticsearch.test.ESTestCase;
import java.util.Arrays;
import java.util.Collection;
- import java.util.Collections;
import java.util.Iterator;
import java.util.List;
+ import static java.util.Collections.emptyList;
public class FieldTypeLookupTests extends ESTestCase {
public void testEmpty() {
@@ -48,7 +49,7 @@ public class FieldTypeLookupTests extends ESTestCase {
public void testDefaultMapping() {
FieldTypeLookup lookup = new FieldTypeLookup();
try {
- lookup.copyAndAddAll(MapperService.DEFAULT_MAPPING, Collections.emptyList());
+ lookup.copyAndAddAll(MapperService.DEFAULT_MAPPING, emptyList(), emptyList());
fail();
} catch (IllegalArgumentException expected) {
assertEquals("Default mappings should not be added to the lookup", expected.getMessage());
@@ -58,7 +59,7 @@ public class FieldTypeLookupTests extends ESTestCase {
public void testAddNewField() {
FieldTypeLookup lookup = new FieldTypeLookup();
MockFieldMapper f = new MockFieldMapper("foo");
- FieldTypeLookup lookup2 = lookup.copyAndAddAll("type", newList(f));
+ FieldTypeLookup lookup2 = lookup.copyAndAddAll("type", newList(f), emptyList());
assertNull(lookup.get("foo"));
assertNull(lookup.get("bar"));
assertEquals(f.fieldType(), lookup2.get("foo"));
@@ -70,68 +71,203 @@ public class FieldTypeLookupTests extends ESTestCase {
MockFieldMapper f = new MockFieldMapper("foo");
MockFieldMapper f2 = new MockFieldMapper("foo");
FieldTypeLookup lookup = new FieldTypeLookup();
- lookup = lookup.copyAndAddAll("type1", newList(f));
- FieldTypeLookup lookup2 = lookup.copyAndAddAll("type2", newList(f2));
+ lookup = lookup.copyAndAddAll("type1", newList(f), emptyList());
+ FieldTypeLookup lookup2 = lookup.copyAndAddAll("type2", newList(f2), emptyList());
assertEquals(1, size(lookup2.iterator()));
assertSame(f.fieldType(), lookup2.get("foo"));
assertEquals(f2.fieldType(), lookup2.get("foo"));
}
- public void testCheckCompatibilityMismatchedTypes() {
+ public void testMismatchedFieldTypes() {
FieldMapper f1 = new MockFieldMapper("foo");
FieldTypeLookup lookup = new FieldTypeLookup();
- lookup = lookup.copyAndAddAll("type", newList(f1));
+ lookup = lookup.copyAndAddAll("type", newList(f1), emptyList());
OtherFakeFieldType ft2 = new OtherFakeFieldType();
ft2.setName("foo");
FieldMapper f2 = new MockFieldMapper("foo", ft2);
try {
- lookup.copyAndAddAll("type2", newList(f2));
+ lookup.copyAndAddAll("type2", newList(f2), emptyList());
fail("expected type mismatch");
} catch (IllegalArgumentException e) {
assertTrue(e.getMessage().contains("cannot be changed from type [faketype] to [otherfaketype]"));
}
}
- public void testCheckCompatibilityConflict() {
+ public void testConflictingFieldTypes() {
FieldMapper f1 = new MockFieldMapper("foo");
FieldTypeLookup lookup = new FieldTypeLookup();
- lookup = lookup.copyAndAddAll("type", newList(f1));
+ lookup = lookup.copyAndAddAll("type", newList(f1), emptyList());
MappedFieldType ft2 = new MockFieldMapper.FakeFieldType();
ft2.setName("foo");
ft2.setBoost(2.0f);
FieldMapper f2 = new MockFieldMapper("foo", ft2);
- lookup.copyAndAddAll("type", newList(f2)); // boost is updateable, so ok since we are implicitly updating all types
- lookup.copyAndAddAll("type2", newList(f2)); // boost is updateable, so ok if forcing
+ lookup.copyAndAddAll("type", newList(f2), emptyList()); // boost is updateable, so ok since we are implicitly updating all types
+ lookup.copyAndAddAll("type2", newList(f2), emptyList()); // boost is updateable, so ok if forcing
// now with a non changeable setting
MappedFieldType ft3 = new MockFieldMapper.FakeFieldType();
ft3.setName("foo");
ft3.setStored(true);
FieldMapper f3 = new MockFieldMapper("foo", ft3);
try {
- lookup.copyAndAddAll("type2", newList(f3));
+ lookup.copyAndAddAll("type2", newList(f3), emptyList());
fail("expected conflict");
} catch (IllegalArgumentException e) {
assertTrue(e.getMessage().contains("has different [store] values"));
}
}
- public void testSimpleMatchFullNames() {
- MockFieldMapper f1 = new MockFieldMapper("foo");
- MockFieldMapper f2 = new MockFieldMapper("bar");
+ public void testAddFieldAlias() {
+ MockFieldMapper field = new MockFieldMapper("foo");
+ FieldAliasMapper alias = new FieldAliasMapper("alias", "alias", "foo");
FieldTypeLookup lookup = new FieldTypeLookup();
- lookup = lookup.copyAndAddAll("type", newList(f1, f2));
+ lookup = lookup.copyAndAddAll("type", newList(field), newList(alias));
MappedFieldType aliasType = lookup.get("alias");
assertEquals(field.fieldType(), aliasType);
}
public void testUpdateFieldAlias() {
// Add an alias 'alias' to the concrete field 'foo'.
MockFieldMapper.FakeFieldType fieldType1 = new MockFieldMapper.FakeFieldType();
MockFieldMapper field1 = new MockFieldMapper("foo", fieldType1);
FieldAliasMapper alias1 = new FieldAliasMapper("alias", "alias", "foo");
FieldTypeLookup lookup = new FieldTypeLookup();
lookup = lookup.copyAndAddAll("type", newList(field1), newList(alias1));
// Check that the alias refers to 'foo'.
MappedFieldType aliasType1 = lookup.get("alias");
assertEquals(fieldType1, aliasType1);
// Update the alias to refer to a new concrete field 'bar'.
MockFieldMapper.FakeFieldType fieldType2 = new MockFieldMapper.FakeFieldType();
fieldType2.setStored(!fieldType1.stored());
MockFieldMapper field2 = new MockFieldMapper("bar", fieldType2);
FieldAliasMapper alias2 = new FieldAliasMapper("alias", "alias", "bar");
lookup = lookup.copyAndAddAll("type", newList(field2), newList(alias2));
// Check that the alias now refers to 'bar'.
MappedFieldType aliasType2 = lookup.get("alias");
assertEquals(fieldType2, aliasType2);
}
public void testUpdateConcreteFieldWithAlias() {
// Add an alias 'alias' to the concrete field 'foo'.
FieldAliasMapper alias1 = new FieldAliasMapper("alias", "alias", "foo");
MockFieldMapper.FakeFieldType fieldType1 = new MockFieldMapper.FakeFieldType();
fieldType1.setBoost(1.0f);
MockFieldMapper field1 = new MockFieldMapper("foo", fieldType1);
FieldTypeLookup lookup = new FieldTypeLookup();
lookup = lookup.copyAndAddAll("type", newList(field1), newList(alias1));
// Check that the alias maps to this field type.
MappedFieldType aliasType1 = lookup.get("alias");
assertEquals(fieldType1, aliasType1);
// Update the boost for field 'foo'.
MockFieldMapper.FakeFieldType fieldType2 = new MockFieldMapper.FakeFieldType();
fieldType2.setBoost(2.0f);
MockFieldMapper field2 = new MockFieldMapper("foo", fieldType2);
lookup = lookup.copyAndAddAll("type", newList(field2), emptyList());
// Check that the alias maps to the new field type.
MappedFieldType aliasType2 = lookup.get("alias");
assertEquals(fieldType2, aliasType2);
}
public void testAliasThatRefersToAlias() {
MockFieldMapper field = new MockFieldMapper("foo");
FieldAliasMapper alias = new FieldAliasMapper("alias", "alias", "foo");
FieldTypeLookup lookup = new FieldTypeLookup()
.copyAndAddAll("type", newList(field), newList(alias));
FieldAliasMapper invalidAlias = new FieldAliasMapper("invalid-alias", "invalid-alias", "alias");
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> lookup.copyAndAddAll("type", emptyList(), newList(invalidAlias)));
assertEquals("Invalid [path] value [alias] for field alias [invalid-alias]: an alias" +
" cannot refer to another alias.", e.getMessage());
}
public void testAliasThatRefersToItself() {
FieldAliasMapper invalidAlias = new FieldAliasMapper("invalid-alias", "invalid-alias", "invalid-alias");
FieldTypeLookup lookup = new FieldTypeLookup();
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> lookup.copyAndAddAll("type", emptyList(), newList(invalidAlias)));
assertEquals("Invalid [path] value [invalid-alias] for field alias [invalid-alias]: an alias" +
" cannot refer to itself.", e.getMessage());
}
public void testAliasWithNonExistentPath() {
FieldAliasMapper invalidAlias = new FieldAliasMapper("invalid-alias", "invalid-alias", "non-existent");
FieldTypeLookup lookup = new FieldTypeLookup();
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> lookup.copyAndAddAll("type", emptyList(), newList(invalidAlias)));
assertEquals("Invalid [path] value [non-existent] for field alias [invalid-alias]: an alias" +
" must refer to an existing field in the mappings.", e.getMessage());
}
public void testAddAliasWithPreexistingField() {
MockFieldMapper field = new MockFieldMapper("field");
FieldTypeLookup lookup = new FieldTypeLookup()
.copyAndAddAll("type", newList(field), emptyList());
MockFieldMapper invalidField = new MockFieldMapper("invalid");
FieldAliasMapper invalidAlias = new FieldAliasMapper("invalid", "invalid", "field");
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> lookup.copyAndAddAll("type", newList(invalidField), newList(invalidAlias)));
assertEquals("The name for field alias [invalid] has already been used to define a concrete field.",
e.getMessage());
}
public void testAddFieldWithPreexistingAlias() {
MockFieldMapper field = new MockFieldMapper("field");
FieldAliasMapper invalidAlias = new FieldAliasMapper("invalid", "invalid", "field");
FieldTypeLookup lookup = new FieldTypeLookup()
.copyAndAddAll("type", newList(field), newList(invalidAlias));
MockFieldMapper invalidField = new MockFieldMapper("invalid");
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> lookup.copyAndAddAll("type", newList(invalidField), emptyList()));
assertEquals("The name for field [invalid] has already been used to define a field alias.",
e.getMessage());
}
public void testSimpleMatchToFullName() {
MockFieldMapper field1 = new MockFieldMapper("foo");
MockFieldMapper field2 = new MockFieldMapper("bar");
FieldAliasMapper alias1 = new FieldAliasMapper("food", "food", "foo");
FieldAliasMapper alias2 = new FieldAliasMapper("barometer", "barometer", "bar");
FieldTypeLookup lookup = new FieldTypeLookup();
lookup = lookup.copyAndAddAll("type",
newList(field1, field2),
newList(alias1, alias2));
Collection<String> names = lookup.simpleMatchToFullName("b*");
assertFalse(names.contains("foo"));
+ assertFalse(names.contains("food"));
assertTrue(names.contains("bar"));
+ assertTrue(names.contains("barometer"));
}
public void testIteratorImmutable() {
MockFieldMapper f1 = new MockFieldMapper("foo");
FieldTypeLookup lookup = new FieldTypeLookup();
- lookup = lookup.copyAndAddAll("type", newList(f1));
+ lookup = lookup.copyAndAddAll("type", newList(f1), emptyList());
try {
Iterator<MappedFieldType> itr = lookup.iterator();
@@ -144,7 +280,11 @@ public class FieldTypeLookupTests extends ESTestCase {
}
}
- static List<FieldMapper> newList(FieldMapper... mapper) {
+ private static List<FieldMapper> newList(FieldMapper... mapper) {
+ return Arrays.asList(mapper);
+ }
+ private static List<FieldAliasMapper> newList(FieldAliasMapper... mapper) {
return Arrays.asList(mapper);
}


@@ -23,10 +23,7 @@ import org.apache.lucene.index.IndexableField;
import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.IndexService;
- import org.elasticsearch.index.mapper.DocumentMapper;
- import org.elasticsearch.index.mapper.FieldMapper;
import org.elasticsearch.index.mapper.ParseContext.Document;
- import org.elasticsearch.index.mapper.ParsedDocument;
import org.elasticsearch.test.ESSingleNodeTestCase;
import static org.elasticsearch.test.StreamsUtils.copyToBytesFromClasspath;
@@ -38,13 +35,14 @@ public class GenericStoreDynamicTemplateTests extends ESSingleNodeTestCase {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/dynamictemplate/genericstore/test-mapping.json");
IndexService index = createIndex("test");
client().admin().indices().preparePutMapping("test").setType("person").setSource(mapping, XContentType.JSON).get();
- DocumentMapper docMapper = index.mapperService().documentMapper("person");
+ MapperService mapperService = index.mapperService();
byte[] json = copyToBytesFromClasspath("/org/elasticsearch/index/mapper/dynamictemplate/genericstore/test-data.json");
- ParsedDocument parsedDoc = docMapper.parse(SourceToParse.source("test", "person", "1", new BytesArray(json),
- XContentType.JSON));
+ ParsedDocument parsedDoc = mapperService.documentMapper().parse(
+ SourceToParse.source("test", "person", "1", new BytesArray(json), XContentType.JSON));
client().admin().indices().preparePutMapping("test").setType("person")
.setSource(parsedDoc.dynamicMappingsUpdate().toString(), XContentType.JSON).get();
- docMapper = index.mapperService().documentMapper("person");
Document doc = parsedDoc.rootDoc();
IndexableField f = doc.getField("name");
@@ -52,8 +50,8 @@ public class GenericStoreDynamicTemplateTests extends ESSingleNodeTestCase {
assertThat(f.stringValue(), equalTo("some name"));
assertThat(f.fieldType().stored(), equalTo(true));
- FieldMapper fieldMapper = docMapper.mappers().getMapper("name");
- assertThat(fieldMapper.fieldType().stored(), equalTo(true));
+ MappedFieldType fieldType = mapperService.fullName("name");
+ assertThat(fieldType.stored(), equalTo(true));
boolean stored = false;
for (IndexableField field : doc.getFields("age")) {
@@ -61,7 +59,7 @@ public class GenericStoreDynamicTemplateTests extends ESSingleNodeTestCase {
}
assertTrue(stored);
- fieldMapper = docMapper.mappers().getMapper("age");
- assertThat(fieldMapper.fieldType().stored(), equalTo(true));
+ fieldType = mapperService.fullName("age");
+ assertThat(fieldType.stored(), equalTo(true));
}
}


@@ -287,7 +287,7 @@ public class GeoPointFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser()
.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoPointFieldMapper.class));
boolean ignoreZValue = ((GeoPointFieldMapper)fieldMapper).ignoreZValue().value();
@@ -364,10 +364,10 @@ public class GeoPointFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser()
.parse("type", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoPointFieldMapper.class));
- Object nullValue = fieldMapper.fieldType().nullValue();
+ Object nullValue = ((GeoPointFieldMapper) fieldMapper).fieldType().nullValue();
assertThat(nullValue, equalTo(new GeoPoint(1, 2)));
ParsedDocument doc = defaultMapper.parse(SourceToParse.source("test", "type", "1", BytesReference


@@ -59,7 +59,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -83,7 +83,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
ShapeBuilder.Orientation orientation = ((GeoShapeFieldMapper)fieldMapper).fieldType().orientation();
@@ -121,7 +121,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
boolean coerce = ((GeoShapeFieldMapper)fieldMapper).coerce().value();
@@ -157,7 +157,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser()
.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
boolean ignoreZValue = ((GeoShapeFieldMapper)fieldMapper).ignoreZValue().value();
@@ -191,7 +191,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
Explicit<Boolean> ignoreMalformed = ((GeoShapeFieldMapper)fieldMapper).ignoreMalformed();
@@ -225,7 +225,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -248,7 +248,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -276,7 +276,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -300,7 +300,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -326,7 +326,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -350,7 +350,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -373,7 +373,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -395,7 +395,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -418,7 +418,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -440,7 +440,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = parser.parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -475,7 +475,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
}
// verify nothing changed
- FieldMapper fieldMapper = docMapper.mappers().getMapper("shape");
+ Mapper fieldMapper = docMapper.mappers().getMapper("shape");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;
@@ -600,7 +600,7 @@ public class GeoShapeFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject());
DocumentMapper defaultMapper = createIndex("test").mapperService().documentMapperParser().parse("type1", new CompressedXContent(mapping));
- FieldMapper fieldMapper = defaultMapper.mappers().getMapper("location");
+ Mapper fieldMapper = defaultMapper.mappers().getMapper("location");
assertThat(fieldMapper, instanceOf(GeoShapeFieldMapper.class));
GeoShapeFieldMapper geoShapeFieldMapper = (GeoShapeFieldMapper) fieldMapper;


@ -38,69 +38,68 @@ public class JavaMultiFieldMergeTests extends ESSingleNodeTestCase {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping1.json"); String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping1.json");
MapperService mapperService = createIndex("test").mapperService(); MapperService mapperService = createIndex("test").mapperService();
DocumentMapper docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertThat(docMapper.mappers().getMapper("name.indexed"), nullValue()); assertThat(mapperService.fullName("name.indexed"), nullValue());
BytesReference json = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("name", "some name").endObject()); BytesReference json = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("name", "some name").endObject());
Document doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc(); Document doc = mapperService.documentMapper().parse(
SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
IndexableField f = doc.getField("name"); IndexableField f = doc.getField("name");
assertThat(f, notNullValue()); assertThat(f, notNullValue());
f = doc.getField("name.indexed"); f = doc.getField("name.indexed");
assertThat(f, nullValue()); assertThat(f, nullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping2.json"); mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping2.json");
docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed2"), nullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed2"), nullValue()); assertThat(mapperService.fullName("name.not_indexed3"), nullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed3"), nullValue());
doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc(); doc = mapperService.documentMapper().parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
f = doc.getField("name"); f = doc.getField("name");
assertThat(f, notNullValue()); assertThat(f, notNullValue());
f = doc.getField("name.indexed"); f = doc.getField("name.indexed");
assertThat(f, notNullValue()); assertThat(f, notNullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping3.json"); mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping3.json");
docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed2"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed2"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed3"), nullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed3"), nullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping4.json"); mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping4.json");
docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed2"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed2"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed3"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed3"), notNullValue());
} }
public void testUpgradeFromMultiFieldTypeToMultiFields() throws Exception { public void testUpgradeFromMultiFieldTypeToMultiFields() throws Exception {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping1.json"); String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping1.json");
MapperService mapperService = createIndex("test").mapperService(); MapperService mapperService = createIndex("test").mapperService();
DocumentMapper docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertThat(docMapper.mappers().getMapper("name.indexed"), nullValue()); assertThat(mapperService.fullName("name.indexed"), nullValue());
BytesReference json = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("name", "some name").endObject()); BytesReference json = BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("name", "some name").endObject());
Document doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc(); Document doc = mapperService.documentMapper().parse(
SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
IndexableField f = doc.getField("name"); IndexableField f = doc.getField("name");
assertThat(f, notNullValue()); assertThat(f, notNullValue());
f = doc.getField("name.indexed"); f = doc.getField("name.indexed");
@ -108,32 +107,31 @@ public class JavaMultiFieldMergeTests extends ESSingleNodeTestCase {
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade1.json"); mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade1.json");
docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed2"), nullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed2"), nullValue()); assertThat(mapperService.fullName("name.not_indexed3"), nullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed3"), nullValue());
doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc(); doc = mapperService.documentMapper().parse(
SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
f = doc.getField("name"); f = doc.getField("name");
assertThat(f, notNullValue()); assertThat(f, notNullValue());
f = doc.getField("name.indexed"); f = doc.getField("name.indexed");
assertThat(f, notNullValue()); assertThat(f, notNullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade2.json"); mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade2.json");
docMapper = mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE); mapperService.merge("person", new CompressedXContent(mapping), MapperService.MergeReason.MAPPING_UPDATE);
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed2"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed2"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed3"), nullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed3"), nullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade3.json"); mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade3.json");
@ -146,10 +144,10 @@ public class JavaMultiFieldMergeTests extends ESSingleNodeTestCase {
} }
// There are conflicts, so the `name.not_indexed3` has not been added // There are conflicts, so the `name.not_indexed3` has not been added
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed2"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed2"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed3"), nullValue()); assertThat(mapperService.fullName("name.not_indexed3"), nullValue());
} }
} }
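The switch throughout these assertions from docMapper.mappers().getMapper(...).fieldType() to mapperService.fullName(...) reflects where alias resolution now lives: fullName accepts either a concrete field name or an alias and returns the concrete field's MappedFieldType. A minimal standalone sketch of that lookup shape, with hypothetical names and types (this is not the Elasticsearch API, just the idea behind it):

import java.util.HashMap;
import java.util.Map;

public class FullNameLookupSketch {
    private final Map<String, String> aliasToConcrete = new HashMap<>();
    private final Map<String, String> concreteToType = new HashMap<>();

    void addField(String path, String type) { concreteToType.put(path, type); }
    void addAlias(String alias, String path) { aliasToConcrete.put(alias, path); }

    // analogous to MapperService#fullName: aliases resolve to their target's type
    String fullName(String name) {
        return concreteToType.get(aliasToConcrete.getOrDefault(name, name));
    }

    public static void main(String[] args) {
        FullNameLookupSketch lookup = new FullNameLookupSketch();
        lookup.addField("name.indexed", "text");
        lookup.addAlias("name_alias", "name.indexed");
        System.out.println(lookup.fullName("name_alias")); // prints: text
    }
}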
@ -0,0 +1,118 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.mapper;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.test.ESTestCase;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import static java.util.Collections.emptyList;
import static java.util.Collections.emptyMap;
import static java.util.Collections.singletonList;
public class MapperMergeValidatorTests extends ESTestCase {
public void testDuplicateFieldAliasAndObject() {
ObjectMapper objectMapper = createObjectMapper("some.path");
FieldAliasMapper aliasMapper = new FieldAliasMapper("path", "some.path", "field");
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () ->
MapperMergeValidator.validateMapperStructure("type",
singletonList(objectMapper),
emptyList(),
singletonList(aliasMapper),
emptyMap(),
new FieldTypeLookup()));
assertEquals("Field [some.path] is defined both as an object and a field in [type]", e.getMessage());
}
public void testFieldAliasWithNestedScope() {
ObjectMapper objectMapper = createNestedObjectMapper("nested");
FieldAliasMapper aliasMapper = new FieldAliasMapper("alias", "nested.alias", "nested.field");
MapperMergeValidator.validateFieldReferences(emptyList(),
singletonList(aliasMapper),
Collections.singletonMap("nested", objectMapper),
new FieldTypeLookup());
}
public void testFieldAliasWithDifferentObjectScopes() {
Map<String, ObjectMapper> fullPathObjectMappers = new HashMap<>();
fullPathObjectMappers.put("object1", createObjectMapper("object1"));
fullPathObjectMappers.put("object2", createObjectMapper("object2"));
FieldAliasMapper aliasMapper = new FieldAliasMapper("alias", "object2.alias", "object1.field");
MapperMergeValidator.validateFieldReferences(emptyList(),
singletonList(aliasMapper),
fullPathObjectMappers,
new FieldTypeLookup());
}
public void testFieldAliasWithNestedTarget() {
ObjectMapper objectMapper = createNestedObjectMapper("nested");
FieldAliasMapper aliasMapper = new FieldAliasMapper("alias", "alias", "nested.field");
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () ->
MapperMergeValidator.validateFieldReferences(emptyList(),
singletonList(aliasMapper),
Collections.singletonMap("nested", objectMapper),
new FieldTypeLookup()));
String expectedMessage = "Invalid [path] value [nested.field] for field alias [alias]: " +
"an alias must have the same nested scope as its target. The alias is not nested, " +
"but the target's nested scope is [nested].";
assertEquals(expectedMessage, e.getMessage());
}
public void testFieldAliasWithDifferentNestedScopes() {
Map<String, ObjectMapper> fullPathObjectMappers = new HashMap<>();
fullPathObjectMappers.put("nested1", createNestedObjectMapper("nested1"));
fullPathObjectMappers.put("nested2", createNestedObjectMapper("nested2"));
FieldAliasMapper aliasMapper = new FieldAliasMapper("alias", "nested2.alias", "nested1.field");
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () ->
MapperMergeValidator.validateFieldReferences(emptyList(),
singletonList(aliasMapper),
fullPathObjectMappers,
new FieldTypeLookup()));
String expectedMessage = "Invalid [path] value [nested1.field] for field alias [nested2.alias]: " +
"an alias must have the same nested scope as its target. The alias's nested scope is [nested2], " +
"but the target's nested scope is [nested1].";
assertEquals(expectedMessage, e.getMessage());
}
private static ObjectMapper createObjectMapper(String name) {
return new ObjectMapper(name, name, true,
ObjectMapper.Nested.NO,
ObjectMapper.Dynamic.FALSE, emptyMap(), Settings.EMPTY);
}
private static ObjectMapper createNestedObjectMapper(String name) {
return new ObjectMapper(name, name, true,
ObjectMapper.Nested.newNested(false, false),
ObjectMapper.Dynamic.FALSE, emptyMap(), Settings.EMPTY);
}
}
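The first test above exercises a structural rule that is easy to restate outside the mapper machinery: every full path may belong to at most one of an object, a concrete field, or a field alias. A self-contained sketch of that check, with the paths hypothetical and the error message mirrored from the test (the real validator walks mapper trees rather than path lists):

import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DuplicatePathCheckSketch {
    static void validate(String type, List<String> objectPaths, List<String> fieldAndAliasPaths) {
        Set<String> objects = new HashSet<>(objectPaths);
        for (String path : fieldAndAliasPaths) {
            if (objects.contains(path)) {
                throw new IllegalArgumentException(
                    "Field [" + path + "] is defined both as an object and a field in [" + type + "]");
            }
        }
    }

    public static void main(String[] args) {
        validate("type", Arrays.asList("some.path"), Arrays.asList("some.path")); // throws
    }
}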
@ -235,6 +235,41 @@ public class MapperServiceTests extends ESSingleNodeTestCase {
containsString("cannot have nested fields when index sort is activated")); containsString("cannot have nested fields when index sort is activated"));
} }
public void testFieldAliasWithMismatchedNestedScope() throws Throwable {
IndexService indexService = createIndex("test");
MapperService mapperService = indexService.mapperService();
CompressedXContent mapping = new CompressedXContent(BytesReference.bytes(
XContentFactory.jsonBuilder().startObject()
.startObject("properties")
.startObject("nested")
.field("type", "nested")
.startObject("properties")
.startObject("field")
.field("type", "text")
.endObject()
.endObject()
.endObject()
.endObject()
.endObject()));
mapperService.merge("type", mapping, MergeReason.MAPPING_UPDATE);
CompressedXContent mappingUpdate = new CompressedXContent(BytesReference.bytes(
XContentFactory.jsonBuilder().startObject()
.startObject("properties")
.startObject("alias")
.field("type", "alias")
.field("path", "nested.field")
.endObject()
.endObject()
.endObject()));
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> mapperService.merge("type", mappingUpdate, MergeReason.MAPPING_UPDATE));
assertThat(e.getMessage(), containsString("Invalid [path] value [nested.field] for field alias [alias]"));
}
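For contrast, a sketch of the mapping update that should merge cleanly, since the alias is declared inside the same nested object as its target; this assumes the mapperService and imports already set up in this test class:

CompressedXContent validUpdate = new CompressedXContent(BytesReference.bytes(
    XContentFactory.jsonBuilder().startObject()
        .startObject("properties")
            .startObject("nested")
                .field("type", "nested")
                .startObject("properties")
                    .startObject("alias")
                        .field("type", "alias")
                        .field("path", "nested.field")
                    .endObject()
                .endObject()
            .endObject()
        .endObject()
    .endObject()));
mapperService.merge("type", validUpdate, MergeReason.MAPPING_UPDATE); // no exception expected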
public void testForbidMultipleTypes() throws IOException { public void testForbidMultipleTypes() throws IOException {
String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type").endObject().endObject()); String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type").endObject().endObject());
MapperService mapperService = createIndex("test").mapperService(); MapperService mapperService = createIndex("test").mapperService();
@ -32,6 +32,7 @@ import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.common.xcontent.support.XContentMapValues; import org.elasticsearch.common.xcontent.support.XContentMapValues;
import org.elasticsearch.index.IndexService; import org.elasticsearch.index.IndexService;
import org.elasticsearch.index.mapper.ParseContext.Document; import org.elasticsearch.index.mapper.ParseContext.Document;
import org.elasticsearch.index.mapper.TextFieldMapper.TextFieldType;
import org.elasticsearch.test.ESSingleNodeTestCase; import org.elasticsearch.test.ESSingleNodeTestCase;
import java.io.IOException; import java.io.IOException;
@ -55,9 +56,15 @@ public class MultiFieldTests extends ESSingleNodeTestCase {
} }
private void testMultiField(String mapping) throws Exception { private void testMultiField(String mapping) throws Exception {
DocumentMapper docMapper = createIndex("test").mapperService().documentMapperParser().parse("person", new CompressedXContent(mapping)); IndexService indexService = createIndex("test");
MapperService mapperService = indexService.mapperService();
indexService.mapperService().merge("person", new CompressedXContent(mapping),
MapperService.MergeReason.MAPPING_UPDATE);
BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/multifield/test-data.json")); BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/multifield/test-data.json"));
Document doc = docMapper.parse(SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc(); Document doc = mapperService.documentMapper().parse(
SourceToParse.source("test", "person", "1", json, XContentType.JSON)).rootDoc();
IndexableField f = doc.getField("name"); IndexableField f = doc.getField("name");
assertThat(f.name(), equalTo("name")); assertThat(f.name(), equalTo("name"));
@ -84,37 +91,37 @@ public class MultiFieldTests extends ESSingleNodeTestCase {
assertThat(f.name(), equalTo("object1.multi1.string")); assertThat(f.name(), equalTo("object1.multi1.string"));
assertThat(f.binaryValue(), equalTo(new BytesRef("2010-01-01"))); assertThat(f.binaryValue(), equalTo(new BytesRef("2010-01-01")));
assertThat(docMapper.mappers().getMapper("name"), notNullValue()); assertThat(mapperService.fullName("name"), notNullValue());
assertThat(docMapper.mappers().getMapper("name"), instanceOf(TextFieldMapper.class)); assertThat(mapperService.fullName("name"), instanceOf(TextFieldType.class));
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name").indexOptions());
assertThat(docMapper.mappers().getMapper("name").fieldType().stored(), equalTo(true)); assertThat(mapperService.fullName("name").stored(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name").fieldType().tokenized(), equalTo(true)); assertThat(mapperService.fullName("name").tokenized(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name.indexed"), notNullValue()); assertThat(mapperService.fullName("name.indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.indexed"), instanceOf(TextFieldMapper.class)); assertThat(mapperService.fullName("name"), instanceOf(TextFieldType.class));
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name.indexed").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name.indexed").indexOptions());
assertThat(docMapper.mappers().getMapper("name.indexed").fieldType().stored(), equalTo(false)); assertThat(mapperService.fullName("name.indexed").stored(), equalTo(false));
assertThat(docMapper.mappers().getMapper("name.indexed").fieldType().tokenized(), equalTo(true)); assertThat(mapperService.fullName("name.indexed").tokenized(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name.not_indexed"), notNullValue()); assertThat(mapperService.fullName("name.not_indexed"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.not_indexed"), instanceOf(TextFieldMapper.class)); assertThat(mapperService.fullName("name"), instanceOf(TextFieldType.class));
assertEquals(IndexOptions.NONE, docMapper.mappers().getMapper("name.not_indexed").fieldType().indexOptions()); assertEquals(IndexOptions.NONE, mapperService.fullName("name.not_indexed").indexOptions());
assertThat(docMapper.mappers().getMapper("name.not_indexed").fieldType().stored(), equalTo(true)); assertThat(mapperService.fullName("name.not_indexed").stored(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name.not_indexed").fieldType().tokenized(), equalTo(true)); assertThat(mapperService.fullName("name.not_indexed").tokenized(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name.test1"), notNullValue()); assertThat(mapperService.fullName("name.test1"), notNullValue());
assertThat(docMapper.mappers().getMapper("name.test1"), instanceOf(TextFieldMapper.class)); assertThat(mapperService.fullName("name"), instanceOf(TextFieldType.class));
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("name.test1").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("name.test1").indexOptions());
assertThat(docMapper.mappers().getMapper("name.test1").fieldType().stored(), equalTo(true)); assertThat(mapperService.fullName("name.test1").stored(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name.test1").fieldType().tokenized(), equalTo(true)); assertThat(mapperService.fullName("name.test1").tokenized(), equalTo(true));
assertThat(docMapper.mappers().getMapper("name.test1").fieldType().eagerGlobalOrdinals(), equalTo(true)); assertThat(mapperService.fullName("name.test1").eagerGlobalOrdinals(), equalTo(true));
assertThat(docMapper.mappers().getMapper("object1.multi1"), notNullValue()); assertThat(mapperService.fullName("object1.multi1"), notNullValue());
assertThat(docMapper.mappers().getMapper("object1.multi1"), instanceOf(DateFieldMapper.class)); assertThat(mapperService.fullName("object1.multi1"), instanceOf(DateFieldMapper.DateFieldType.class));
assertThat(docMapper.mappers().getMapper("object1.multi1.string"), notNullValue()); assertThat(mapperService.fullName("object1.multi1.string"), notNullValue());
assertThat(docMapper.mappers().getMapper("object1.multi1.string"), instanceOf(KeywordFieldMapper.class)); assertThat(mapperService.fullName("object1.multi1.string"), instanceOf(KeywordFieldMapper.KeywordFieldType.class));
assertNotSame(IndexOptions.NONE, docMapper.mappers().getMapper("object1.multi1.string").fieldType().indexOptions()); assertNotSame(IndexOptions.NONE, mapperService.fullName("object1.multi1.string").indexOptions());
assertThat(docMapper.mappers().getMapper("object1.multi1.string").fieldType().tokenized(), equalTo(false)); assertThat(mapperService.fullName("object1.multi1.string").tokenized(), equalTo(false));
} }
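The mapping under test is loaded from a classpath resource; a plausible reconstruction of its shape, with the per-field options inferred from the assertions above rather than confirmed by the resource itself:

XContentBuilder mapping = XContentFactory.jsonBuilder().startObject()
    .startObject("person").startObject("properties")
        .startObject("name")
            .field("type", "text")
            .field("store", true)
            .startObject("fields")
                .startObject("indexed")
                    .field("type", "text")          // indexed, not stored
                .endObject()
                .startObject("not_indexed")
                    .field("type", "text")
                    .field("index", false)          // yields IndexOptions.NONE
                    .field("store", true)
                .endObject()
            .endObject()
        .endObject()
    .endObject().endObject()
.endObject();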
public void testBuildThenParse() throws Exception { public void testBuildThenParse() throws Exception {
@ -20,7 +20,6 @@
package org.elasticsearch.index.mapper; package org.elasticsearch.index.mapper;
import org.elasticsearch.common.compress.CompressedXContent; import org.elasticsearch.common.compress.CompressedXContent;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.test.ESSingleNodeTestCase; import org.elasticsearch.test.ESSingleNodeTestCase;
import java.io.IOException; import java.io.IOException;
@ -23,10 +23,7 @@ import org.apache.lucene.index.IndexableField;
import org.elasticsearch.common.bytes.BytesArray; import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.xcontent.XContentType; import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.IndexService; import org.elasticsearch.index.IndexService;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.FieldMapper;
import org.elasticsearch.index.mapper.ParseContext.Document; import org.elasticsearch.index.mapper.ParseContext.Document;
import org.elasticsearch.index.mapper.ParsedDocument;
import org.elasticsearch.test.ESSingleNodeTestCase; import org.elasticsearch.test.ESSingleNodeTestCase;
import static org.elasticsearch.test.StreamsUtils.copyToBytesFromClasspath; import static org.elasticsearch.test.StreamsUtils.copyToBytesFromClasspath;
@ -38,13 +35,14 @@ public class PathMatchDynamicTemplateTests extends ESSingleNodeTestCase {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/dynamictemplate/pathmatch/test-mapping.json"); String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/dynamictemplate/pathmatch/test-mapping.json");
IndexService index = createIndex("test"); IndexService index = createIndex("test");
client().admin().indices().preparePutMapping("test").setType("person").setSource(mapping, XContentType.JSON).get(); client().admin().indices().preparePutMapping("test").setType("person").setSource(mapping, XContentType.JSON).get();
DocumentMapper docMapper = index.mapperService().documentMapper("person");
MapperService mapperService = index.mapperService();
byte[] json = copyToBytesFromClasspath("/org/elasticsearch/index/mapper/dynamictemplate/pathmatch/test-data.json"); byte[] json = copyToBytesFromClasspath("/org/elasticsearch/index/mapper/dynamictemplate/pathmatch/test-data.json");
ParsedDocument parsedDoc = docMapper.parse(SourceToParse.source("test", "person", "1", new BytesArray(json), ParsedDocument parsedDoc = mapperService.documentMapper().parse(
XContentType.JSON)); SourceToParse.source("test", "person", "1", new BytesArray(json), XContentType.JSON));
client().admin().indices().preparePutMapping("test").setType("person") client().admin().indices().preparePutMapping("test").setType("person")
.setSource(parsedDoc.dynamicMappingsUpdate().toString(), XContentType.JSON).get(); .setSource(parsedDoc.dynamicMappingsUpdate().toString(), XContentType.JSON).get();
docMapper = index.mapperService().documentMapper("person");
Document doc = parsedDoc.rootDoc(); Document doc = parsedDoc.rootDoc();
IndexableField f = doc.getField("name"); IndexableField f = doc.getField("name");
@ -52,26 +50,26 @@ public class PathMatchDynamicTemplateTests extends ESSingleNodeTestCase {
assertThat(f.stringValue(), equalTo("top_level")); assertThat(f.stringValue(), equalTo("top_level"));
assertThat(f.fieldType().stored(), equalTo(false)); assertThat(f.fieldType().stored(), equalTo(false));
FieldMapper fieldMapper = docMapper.mappers().getMapper("name"); MappedFieldType fieldType = mapperService.fullName("name");
assertThat(fieldMapper.fieldType().stored(), equalTo(false)); assertThat(fieldType.stored(), equalTo(false));
f = doc.getField("obj1.name"); f = doc.getField("obj1.name");
assertThat(f.name(), equalTo("obj1.name")); assertThat(f.name(), equalTo("obj1.name"));
assertThat(f.fieldType().stored(), equalTo(true)); assertThat(f.fieldType().stored(), equalTo(true));
fieldMapper = docMapper.mappers().getMapper("obj1.name"); fieldType = mapperService.fullName("obj1.name");
assertThat(fieldMapper.fieldType().stored(), equalTo(true)); assertThat(fieldType.stored(), equalTo(true));
f = doc.getField("obj1.obj2.name"); f = doc.getField("obj1.obj2.name");
assertThat(f.name(), equalTo("obj1.obj2.name")); assertThat(f.name(), equalTo("obj1.obj2.name"));
assertThat(f.fieldType().stored(), equalTo(false)); assertThat(f.fieldType().stored(), equalTo(false));
fieldMapper = docMapper.mappers().getMapper("obj1.obj2.name"); fieldType = mapperService.fullName("obj1.obj2.name");
assertThat(fieldMapper.fieldType().stored(), equalTo(false)); assertThat(fieldType.stored(), equalTo(false));
// verify more complex path_match expressions // verify more complex path_match expressions
fieldMapper = docMapper.mappers().getMapper("obj3.obj4.prop1"); fieldType = mapperService.fullName("obj3.obj4.prop1");
assertNotNull(fieldMapper); assertNotNull(fieldType);
} }
} }
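The template JSON itself lives in a test resource, but the storage pattern the assertions check (obj1.name stored, obj1.obj2.name not) is worth spelling out: a path_match wildcard spans dots, so a path_unmatch rule is needed to carve deeper paths back out. A runnable sketch of that matching, with the rule strings assumed:

public class PathMatchSketch {
    // '*' in a path rule matches any run of characters, dots included
    static boolean simpleMatch(String pattern, String path) {
        return path.matches(pattern.replace(".", "\\.").replace("*", ".*"));
    }

    public static void main(String[] args) {
        String pathMatch = "obj1.*", pathUnmatch = "obj1.obj2.*";
        for (String path : new String[] {"name", "obj1.name", "obj1.obj2.name"}) {
            boolean stored = simpleMatch(pathMatch, path) && !simpleMatch(pathUnmatch, path);
            System.out.println(path + " stored=" + stored);
        }
        // prints: name stored=false, obj1.name stored=true, obj1.obj2.name stored=false
    }
}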
@ -28,13 +28,14 @@ import org.elasticsearch.common.Strings;
import org.elasticsearch.common.bytes.BytesReference; import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.compress.CompressedXContent; import org.elasticsearch.common.compress.CompressedXContent;
import org.elasticsearch.common.lucene.Lucene; import org.elasticsearch.common.lucene.Lucene;
import org.elasticsearch.common.util.set.Sets;
import org.elasticsearch.common.xcontent.XContentFactory; import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentType; import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.fieldvisitor.CustomFieldsVisitor; import org.elasticsearch.index.fieldvisitor.CustomFieldsVisitor;
import org.elasticsearch.index.mapper.MapperService.MergeReason; import org.elasticsearch.index.mapper.MapperService.MergeReason;
import org.elasticsearch.test.ESSingleNodeTestCase; import org.elasticsearch.test.ESSingleNodeTestCase;
import java.util.Collections; import java.util.Set;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
@ -84,9 +85,11 @@ public class StoredNumericValuesTests extends ESSingleNodeTestCase {
DirectoryReader reader = DirectoryReader.open(writer); DirectoryReader reader = DirectoryReader.open(writer);
IndexSearcher searcher = new IndexSearcher(reader); IndexSearcher searcher = new IndexSearcher(reader);
CustomFieldsVisitor fieldsVisitor = new CustomFieldsVisitor( Set<String> fieldNames = Sets.newHashSet("field1", "field2", "field3", "field4", "field5",
Collections.emptySet(), Collections.singletonList("field*"), false); "field6", "field7", "field8", "field9", "field10");
CustomFieldsVisitor fieldsVisitor = new CustomFieldsVisitor(fieldNames, false);
searcher.doc(0, fieldsVisitor); searcher.doc(0, fieldsVisitor);
fieldsVisitor.postProcess(mapperService); fieldsVisitor.postProcess(mapperService);
assertThat(fieldsVisitor.fields().size(), equalTo(10)); assertThat(fieldsVisitor.fields().size(), equalTo(10));
assertThat(fieldsVisitor.fields().get("field1").size(), equalTo(1)); assertThat(fieldsVisitor.fields().get("field1").size(), equalTo(1));
@ -490,9 +490,10 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
.endObject().endObject()); .endObject().endObject());
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
assertEquals(mapping, mapper.mappingSource().toString()); assertEquals(mapping, mapper.mappingSource().toString());
assertTrue(mapper.mappers().getMapper("field").fieldType().eagerGlobalOrdinals());
FieldMapper fieldMapper = (FieldMapper) mapper.mappers().getMapper("field");
assertTrue(fieldMapper.fieldType().eagerGlobalOrdinals());
} }
public void testFielddata() throws IOException { public void testFielddata() throws IOException {
@ -504,8 +505,10 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper disabledMapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper disabledMapper = parser.parse("type", new CompressedXContent(mapping));
assertEquals(mapping, disabledMapper.mappingSource().toString()); assertEquals(mapping, disabledMapper.mappingSource().toString());
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> {
() -> disabledMapper.mappers().getMapper("field").fieldType().fielddataBuilder("test")); FieldMapper fieldMapper = (FieldMapper) disabledMapper.mappers().getMapper("field");
fieldMapper.fieldType().fielddataBuilder("test");
});
assertThat(e.getMessage(), containsString("Fielddata is disabled")); assertThat(e.getMessage(), containsString("Fielddata is disabled"));
mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type") mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
@ -518,7 +521,9 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper enabledMapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper enabledMapper = parser.parse("type", new CompressedXContent(mapping));
assertEquals(mapping, enabledMapper.mappingSource().toString()); assertEquals(mapping, enabledMapper.mappingSource().toString());
enabledMapper.mappers().getMapper("field").fieldType().fielddataBuilder("test"); // no exception this time
FieldMapper enabledFieldMapper = (FieldMapper) enabledMapper.mappers().getMapper("field");
enabledFieldMapper.fieldType().fielddataBuilder("test"); // no exception this time
String illegalMapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type") String illegalMapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
.startObject("properties").startObject("field") .startObject("properties").startObject("field")
@ -547,7 +552,9 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
assertEquals(mapping, mapper.mappingSource().toString()); assertEquals(mapping, mapper.mappingSource().toString());
TextFieldType fieldType = (TextFieldType) mapper.mappers().getMapper("field").fieldType(); TextFieldMapper fieldMapper = (TextFieldMapper) mapper.mappers().getMapper("field");
TextFieldType fieldType = fieldMapper.fieldType();
assertThat(fieldType.fielddataMinFrequency(), equalTo(2d)); assertThat(fieldType.fielddataMinFrequency(), equalTo(2d));
assertThat(fieldType.fielddataMaxFrequency(), equalTo((double) Integer.MAX_VALUE)); assertThat(fieldType.fielddataMaxFrequency(), equalTo((double) Integer.MAX_VALUE));
assertThat(fieldType.fielddataMinSegmentSize(), equalTo(1000)); assertThat(fieldType.fielddataMinSegmentSize(), equalTo(1000));
@ -630,7 +637,7 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
FieldMapper prefix = mapper.mappers().getMapper("field._index_prefix"); FieldMapper prefix = (FieldMapper) mapper.mappers().getMapper("field._index_prefix");
FieldType ft = prefix.fieldType; FieldType ft = prefix.fieldType;
assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS, ft.indexOptions()); assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS, ft.indexOptions());
} }
@ -646,7 +653,7 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
FieldMapper prefix = mapper.mappers().getMapper("field._index_prefix"); FieldMapper prefix = (FieldMapper) mapper.mappers().getMapper("field._index_prefix");
FieldType ft = prefix.fieldType; FieldType ft = prefix.fieldType;
assertEquals(IndexOptions.DOCS, ft.indexOptions()); assertEquals(IndexOptions.DOCS, ft.indexOptions());
assertFalse(ft.storeTermVectors()); assertFalse(ft.storeTermVectors());
@ -663,7 +670,7 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
FieldMapper prefix = mapper.mappers().getMapper("field._index_prefix"); FieldMapper prefix = (FieldMapper) mapper.mappers().getMapper("field._index_prefix");
FieldType ft = prefix.fieldType; FieldType ft = prefix.fieldType;
if (indexService.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) { if (indexService.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) {
assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS, ft.indexOptions()); assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS, ft.indexOptions());
@ -684,7 +691,7 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
FieldMapper prefix = mapper.mappers().getMapper("field._index_prefix"); FieldMapper prefix = (FieldMapper) mapper.mappers().getMapper("field._index_prefix");
FieldType ft = prefix.fieldType; FieldType ft = prefix.fieldType;
if (indexService.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) { if (indexService.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) {
assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS, ft.indexOptions()); assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS, ft.indexOptions());
@ -705,7 +712,7 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping)); DocumentMapper mapper = parser.parse("type", new CompressedXContent(mapping));
FieldMapper prefix = mapper.mappers().getMapper("field._index_prefix"); FieldMapper prefix = (FieldMapper) mapper.mappers().getMapper("field._index_prefix");
FieldType ft = prefix.fieldType; FieldType ft = prefix.fieldType;
if (indexService.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) { if (indexService.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) {
assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS, ft.indexOptions()); assertEquals(IndexOptions.DOCS_AND_FREQS_AND_POSITIONS, ft.indexOptions());
@ -836,10 +843,13 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
assertThat(mapper.mappers().getMapper("field._index_prefix").toString(), containsString("prefixChars=1:10")); assertThat(mapper.mappers().getMapper("field._index_prefix").toString(), containsString("prefixChars=1:10"));
Query q = mapper.mappers().getMapper("field").fieldType().prefixQuery("goin", CONSTANT_SCORE_REWRITE, queryShardContext); FieldMapper fieldMapper = (FieldMapper) mapper.mappers().getMapper("field");
MappedFieldType fieldType = fieldMapper.fieldType;
Query q = fieldType.prefixQuery("goin", CONSTANT_SCORE_REWRITE, queryShardContext);
assertEquals(new ConstantScoreQuery(new TermQuery(new Term("field._index_prefix", "goin"))), q); assertEquals(new ConstantScoreQuery(new TermQuery(new Term("field._index_prefix", "goin"))), q);
q = mapper.mappers().getMapper("field").fieldType().prefixQuery("internationalisatio", q = fieldType.prefixQuery("internationalisatio", CONSTANT_SCORE_REWRITE, queryShardContext);
CONSTANT_SCORE_REWRITE, queryShardContext);
assertEquals(new PrefixQuery(new Term("field", "internationalisatio")), q); assertEquals(new PrefixQuery(new Term("field", "internationalisatio")), q);
ParsedDocument doc = mapper.parse(SourceToParse.source("test", "type", "1", BytesReference ParsedDocument doc = mapper.parse(SourceToParse.source("test", "type", "1", BytesReference
@ -864,17 +874,16 @@ public class TextFieldMapperTests extends ESSingleNodeTestCase {
CompressedXContent json = new CompressedXContent(mapping); CompressedXContent json = new CompressedXContent(mapping);
DocumentMapper mapper = parser.parse("type", json); DocumentMapper mapper = parser.parse("type", json);
Query q1 = mapper.mappers().getMapper("field").fieldType().prefixQuery("g", FieldMapper fieldMapper = (FieldMapper) mapper.mappers().getMapper("field");
CONSTANT_SCORE_REWRITE, queryShardContext); MappedFieldType fieldType = fieldMapper.fieldType;
Query q1 = fieldType.prefixQuery("g", CONSTANT_SCORE_REWRITE, queryShardContext);
assertThat(q1, instanceOf(PrefixQuery.class)); assertThat(q1, instanceOf(PrefixQuery.class));
Query q2 = mapper.mappers().getMapper("field").fieldType().prefixQuery("go", Query q2 = fieldType.prefixQuery("go", CONSTANT_SCORE_REWRITE, queryShardContext);
CONSTANT_SCORE_REWRITE, queryShardContext);
assertThat(q2, instanceOf(ConstantScoreQuery.class)); assertThat(q2, instanceOf(ConstantScoreQuery.class));
Query q5 = mapper.mappers().getMapper("field").fieldType().prefixQuery("going", Query q5 = fieldType.prefixQuery("going", CONSTANT_SCORE_REWRITE, queryShardContext);
CONSTANT_SCORE_REWRITE, queryShardContext);
assertThat(q5, instanceOf(ConstantScoreQuery.class)); assertThat(q5, instanceOf(ConstantScoreQuery.class));
Query q6 = mapper.mappers().getMapper("field").fieldType().prefixQuery("goings", Query q6 = fieldType.prefixQuery("goings", CONSTANT_SCORE_REWRITE, queryShardContext);
CONSTANT_SCORE_REWRITE, queryShardContext);
assertThat(q6, instanceOf(PrefixQuery.class)); assertThat(q6, instanceOf(PrefixQuery.class));
} }
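These assertions pin down the routing that index_prefixes enables. A sketch of the decision, consistent with the queries asserted above ("goin" lands inside the bounds, "internationalisatio" falls back); the bounds are parameters here, not the production defaults:

import org.apache.lucene.index.Term;
import org.apache.lucene.search.ConstantScoreQuery;
import org.apache.lucene.search.PrefixQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.TermQuery;

public class IndexPrefixRoutingSketch {
    // prefixes within the configured bounds become a cheap term query on the
    // hidden "._index_prefix" field; anything outside uses a plain PrefixQuery
    static Query prefixQuery(String field, String prefix, int minChars, int maxChars) {
        if (prefix.length() >= minChars && prefix.length() <= maxChars) {
            return new ConstantScoreQuery(new TermQuery(new Term(field + "._index_prefix", prefix)));
        }
        return new PrefixQuery(new Term(field, prefix));
    }
}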
@ -19,6 +19,7 @@
package org.elasticsearch.index.query; package org.elasticsearch.index.query;
import org.apache.lucene.index.Term;
import org.apache.lucene.queries.ExtendedCommonTermsQuery; import org.apache.lucene.queries.ExtendedCommonTermsQuery;
import org.apache.lucene.search.Query; import org.apache.lucene.search.Query;
import org.elasticsearch.common.ParsingException; import org.elasticsearch.common.ParsingException;
@ -27,6 +28,7 @@ import org.elasticsearch.test.AbstractQueryTestCase;
import java.io.IOException; import java.io.IOException;
import java.util.HashMap; import java.util.HashMap;
import java.util.List;
import java.util.Map; import java.util.Map;
import static org.elasticsearch.index.query.QueryBuilders.commonTermsQuery; import static org.elasticsearch.index.query.QueryBuilders.commonTermsQuery;
@ -39,19 +41,16 @@ public class CommonTermsQueryBuilderTests extends AbstractQueryTestCase<CommonTe
@Override @Override
protected CommonTermsQueryBuilder doCreateTestQueryBuilder() { protected CommonTermsQueryBuilder doCreateTestQueryBuilder() {
CommonTermsQueryBuilder query;
int numberOfTerms = randomIntBetween(0, 10); int numberOfTerms = randomIntBetween(0, 10);
StringBuilder text = new StringBuilder(""); StringBuilder text = new StringBuilder("");
for (int i = 0; i < numberOfTerms; i++) { for (int i = 0; i < numberOfTerms; i++) {
text.append(randomAlphaOfLengthBetween(1, 10)).append(" "); text.append(randomAlphaOfLengthBetween(1, 10)).append(" ");
} }
// mapped or unmapped field
if (randomBoolean()) { String fieldName = randomFrom(STRING_FIELD_NAME,
query = new CommonTermsQueryBuilder(STRING_FIELD_NAME, text.toString()); STRING_ALIAS_FIELD_NAME,
} else { randomAlphaOfLengthBetween(1, 10));
query = new CommonTermsQueryBuilder(randomAlphaOfLengthBetween(1, 10), text.toString()); CommonTermsQueryBuilder query = new CommonTermsQueryBuilder(fieldName, text.toString());
}
if (randomBoolean()) { if (randomBoolean()) {
query.cutoffFrequency(randomIntBetween(1, 10)); query.cutoffFrequency(randomIntBetween(1, 10));
@ -100,6 +99,14 @@ public class CommonTermsQueryBuilderTests extends AbstractQueryTestCase<CommonTe
protected void doAssertLuceneQuery(CommonTermsQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException { protected void doAssertLuceneQuery(CommonTermsQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
assertThat(query, instanceOf(ExtendedCommonTermsQuery.class)); assertThat(query, instanceOf(ExtendedCommonTermsQuery.class));
ExtendedCommonTermsQuery extendedCommonTermsQuery = (ExtendedCommonTermsQuery) query; ExtendedCommonTermsQuery extendedCommonTermsQuery = (ExtendedCommonTermsQuery) query;
List<Term> terms = extendedCommonTermsQuery.getTerms();
if (!terms.isEmpty()) {
String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
String actualFieldName = terms.iterator().next().field();
assertThat(actualFieldName, equalTo(expectedFieldName));
}
assertThat(extendedCommonTermsQuery.getHighFreqMinimumNumberShouldMatchSpec(), equalTo(queryBuilder.highFreqMinimumShouldMatch())); assertThat(extendedCommonTermsQuery.getHighFreqMinimumNumberShouldMatchSpec(), equalTo(queryBuilder.highFreqMinimumShouldMatch()));
assertThat(extendedCommonTermsQuery.getLowFreqMinimumNumberShouldMatchSpec(), equalTo(queryBuilder.lowFreqMinimumShouldMatch())); assertThat(extendedCommonTermsQuery.getLowFreqMinimumNumberShouldMatchSpec(), equalTo(queryBuilder.lowFreqMinimumShouldMatch()));
} }
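Several of these query builder tests lean on an expectedFieldName(...) helper from the shared test base class. A plausible shape for it, assuming a single string alias; the real helper may consult a map covering the other alias fields as well:

protected static String expectedFieldName(String fieldName) {
    // the alias test field resolves to its concrete target; other names pass through
    return STRING_ALIAS_FIELD_NAME.equals(fieldName) ? STRING_FIELD_NAME : fieldName;
}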
@ -76,7 +76,7 @@ public class ExistsQueryBuilderTests extends AbstractQueryTestCase<ExistsQueryBu
if (fields.size() == 1) { if (fields.size() == 1) {
assertThat(query, instanceOf(ConstantScoreQuery.class)); assertThat(query, instanceOf(ConstantScoreQuery.class));
ConstantScoreQuery constantScoreQuery = (ConstantScoreQuery) query; ConstantScoreQuery constantScoreQuery = (ConstantScoreQuery) query;
String field = fields.iterator().next(); String field = expectedFieldName(fields.iterator().next());
assertThat(constantScoreQuery.getQuery(), instanceOf(TermQuery.class)); assertThat(constantScoreQuery.getQuery(), instanceOf(TermQuery.class));
TermQuery termQuery = (TermQuery) constantScoreQuery.getQuery(); TermQuery termQuery = (TermQuery) constantScoreQuery.getQuery();
assertEquals(field, termQuery.getTerm().text()); assertEquals(field, termQuery.getTerm().text());
@ -99,7 +99,7 @@ public class ExistsQueryBuilderTests extends AbstractQueryTestCase<ExistsQueryBu
} else if (fields.size() == 1) { } else if (fields.size() == 1) {
assertThat(query, instanceOf(ConstantScoreQuery.class)); assertThat(query, instanceOf(ConstantScoreQuery.class));
ConstantScoreQuery constantScoreQuery = (ConstantScoreQuery) query; ConstantScoreQuery constantScoreQuery = (ConstantScoreQuery) query;
String field = fields.iterator().next(); String field = expectedFieldName(fields.iterator().next());
if (context.getQueryShardContext().getObjectMapper(field) != null) { if (context.getQueryShardContext().getObjectMapper(field) != null) {
assertThat(constantScoreQuery.getQuery(), instanceOf(BooleanQuery.class)); assertThat(constantScoreQuery.getQuery(), instanceOf(BooleanQuery.class));
BooleanQuery booleanQuery = (BooleanQuery) constantScoreQuery.getQuery(); BooleanQuery booleanQuery = (BooleanQuery) constantScoreQuery.getQuery();
@ -21,7 +21,6 @@ package org.elasticsearch.index.query;
import org.apache.lucene.search.Query; import org.apache.lucene.search.Query;
import org.apache.lucene.search.spans.FieldMaskingSpanQuery; import org.apache.lucene.search.spans.FieldMaskingSpanQuery;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.search.internal.SearchContext; import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.test.AbstractQueryTestCase; import org.elasticsearch.test.AbstractQueryTestCase;
@ -45,11 +44,7 @@ public class FieldMaskingSpanQueryBuilderTests extends AbstractQueryTestCase<Fie
@Override @Override
protected void doAssertLuceneQuery(FieldMaskingSpanQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException { protected void doAssertLuceneQuery(FieldMaskingSpanQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
String fieldInQuery = queryBuilder.fieldName(); String fieldInQuery = expectedFieldName(queryBuilder.fieldName());
MappedFieldType fieldType = context.getQueryShardContext().fieldMapper(fieldInQuery);
if (fieldType != null) {
fieldInQuery = fieldType.name();
}
assertThat(query, instanceOf(FieldMaskingSpanQuery.class)); assertThat(query, instanceOf(FieldMaskingSpanQuery.class));
FieldMaskingSpanQuery fieldMaskingSpanQuery = (FieldMaskingSpanQuery) query; FieldMaskingSpanQuery fieldMaskingSpanQuery = (FieldMaskingSpanQuery) query;
assertThat(fieldMaskingSpanQuery.getField(), equalTo(fieldInQuery)); assertThat(fieldMaskingSpanQuery.getField(), equalTo(fieldInQuery));
@ -41,7 +41,8 @@ public class FuzzyQueryBuilderTests extends AbstractQueryTestCase<FuzzyQueryBuil
@Override @Override
protected FuzzyQueryBuilder doCreateTestQueryBuilder() { protected FuzzyQueryBuilder doCreateTestQueryBuilder() {
FuzzyQueryBuilder query = new FuzzyQueryBuilder(STRING_FIELD_NAME, getRandomValueForFieldName(STRING_FIELD_NAME)); String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME);
FuzzyQueryBuilder query = new FuzzyQueryBuilder(fieldName, getRandomValueForFieldName(fieldName));
if (randomBoolean()) { if (randomBoolean()) {
query.fuzziness(randomFuzziness(query.fieldName())); query.fuzziness(randomFuzziness(query.fieldName()));
} }
@ -76,6 +77,11 @@ public class FuzzyQueryBuilderTests extends AbstractQueryTestCase<FuzzyQueryBuil
@Override @Override
protected void doAssertLuceneQuery(FuzzyQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException { protected void doAssertLuceneQuery(FuzzyQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
assertThat(query, instanceOf(FuzzyQuery.class)); assertThat(query, instanceOf(FuzzyQuery.class));
FuzzyQuery fuzzyQuery = (FuzzyQuery) query;
String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
String actualFieldName = fuzzyQuery.getTerm().field();
assertThat(actualFieldName, equalTo(expectedFieldName));
} }
public void testIllegalArguments() { public void testIllegalArguments() {
@ -39,6 +39,7 @@ import java.io.IOException;
import static org.hamcrest.CoreMatchers.containsString; import static org.hamcrest.CoreMatchers.containsString;
import static org.hamcrest.CoreMatchers.instanceOf; import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.CoreMatchers.notNullValue; import static org.hamcrest.CoreMatchers.notNullValue;
import static org.hamcrest.Matchers.startsWith;
public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBoundingBoxQueryBuilder> { public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBoundingBoxQueryBuilder> {
/** Randomly generate either NaN or one of the two infinity values. */ /** Randomly generate either NaN or one of the two infinity values. */
@ -46,7 +47,8 @@ public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBo
@Override @Override
protected GeoBoundingBoxQueryBuilder doCreateTestQueryBuilder() { protected GeoBoundingBoxQueryBuilder doCreateTestQueryBuilder() {
GeoBoundingBoxQueryBuilder builder = new GeoBoundingBoxQueryBuilder(GEO_POINT_FIELD_NAME); String fieldName = randomFrom(GEO_POINT_FIELD_NAME, GEO_POINT_ALIAS_FIELD_NAME);
GeoBoundingBoxQueryBuilder builder = new GeoBoundingBoxQueryBuilder(fieldName);
Rectangle box = RandomShapeGenerator.xRandomRectangle(random(), RandomShapeGenerator.xRandomPoint(random())); Rectangle box = RandomShapeGenerator.xRandomRectangle(random(), RandomShapeGenerator.xRandomPoint(random()));
if (randomBoolean()) { if (randomBoolean()) {
@ -117,7 +119,7 @@ public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBo
public void testExceptionOnMissingTypes() throws IOException { public void testExceptionOnMissingTypes() throws IOException {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length == 0); assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length == 0);
QueryShardException e = expectThrows(QueryShardException.class, super::testToQuery); QueryShardException e = expectThrows(QueryShardException.class, super::testToQuery);
assertEquals("failed to find geo_point field [mapped_geo_point]", e.getMessage()); assertThat(e.getMessage(), startsWith("failed to find geo_point field [mapped_geo_point"));
} }
public void testBrokenCoordinateCannotBeSet() { public void testBrokenCoordinateCannotBeSet() {
@ -216,13 +218,14 @@ public class GeoBoundingBoxQueryBuilderTests extends AbstractQueryTestCase<GeoBo
assertTrue("Found no indexed geo query.", query instanceof MatchNoDocsQuery); assertTrue("Found no indexed geo query.", query instanceof MatchNoDocsQuery);
} else if (query instanceof IndexOrDocValuesQuery) { // TODO: remove the if statement once we always use LatLonPoint } else if (query instanceof IndexOrDocValuesQuery) { // TODO: remove the if statement once we always use LatLonPoint
Query indexQuery = ((IndexOrDocValuesQuery) query).getIndexQuery(); Query indexQuery = ((IndexOrDocValuesQuery) query).getIndexQuery();
assertEquals(LatLonPoint.newBoxQuery(queryBuilder.fieldName(), String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
assertEquals(LatLonPoint.newBoxQuery(expectedFieldName,
queryBuilder.bottomRight().lat(), queryBuilder.bottomRight().lat(),
queryBuilder.topLeft().lat(), queryBuilder.topLeft().lat(),
queryBuilder.topLeft().lon(), queryBuilder.topLeft().lon(),
queryBuilder.bottomRight().lon()), indexQuery); queryBuilder.bottomRight().lon()), indexQuery);
Query dvQuery = ((IndexOrDocValuesQuery) query).getRandomAccessQuery(); Query dvQuery = ((IndexOrDocValuesQuery) query).getRandomAccessQuery();
assertEquals(LatLonDocValuesField.newSlowBoxQuery(queryBuilder.fieldName(), assertEquals(LatLonDocValuesField.newSlowBoxQuery(expectedFieldName,
queryBuilder.bottomRight().lat(), queryBuilder.bottomRight().lat(),
queryBuilder.topLeft().lat(), queryBuilder.topLeft().lat(),
queryBuilder.topLeft().lon(), queryBuilder.topLeft().lon(),
@ -43,7 +43,8 @@ public class GeoDistanceQueryBuilderTests extends AbstractQueryTestCase<GeoDista
@Override @Override
protected GeoDistanceQueryBuilder doCreateTestQueryBuilder() { protected GeoDistanceQueryBuilder doCreateTestQueryBuilder() {
GeoDistanceQueryBuilder qb = new GeoDistanceQueryBuilder(GEO_POINT_FIELD_NAME); String fieldName = randomFrom(GEO_POINT_FIELD_NAME, GEO_POINT_ALIAS_FIELD_NAME);
GeoDistanceQueryBuilder qb = new GeoDistanceQueryBuilder(fieldName);
String distance = "" + randomDouble(); String distance = "" + randomDouble();
if (randomBoolean()) { if (randomBoolean()) {
DistanceUnit unit = randomFrom(DistanceUnit.values()); DistanceUnit unit = randomFrom(DistanceUnit.values());
@ -130,13 +131,15 @@ public class GeoDistanceQueryBuilderTests extends AbstractQueryTestCase<GeoDista
// TODO: remove the if statement once we always use LatLonPoint // TODO: remove the if statement once we always use LatLonPoint
if (query instanceof IndexOrDocValuesQuery) { if (query instanceof IndexOrDocValuesQuery) {
Query indexQuery = ((IndexOrDocValuesQuery) query).getIndexQuery(); Query indexQuery = ((IndexOrDocValuesQuery) query).getIndexQuery();
assertEquals(LatLonPoint.newDistanceQuery(queryBuilder.fieldName(),
String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
assertEquals(LatLonPoint.newDistanceQuery(expectedFieldName,
queryBuilder.point().lat(), queryBuilder.point().lat(),
queryBuilder.point().lon(), queryBuilder.point().lon(),
queryBuilder.distance()), queryBuilder.distance()),
indexQuery); indexQuery);
Query dvQuery = ((IndexOrDocValuesQuery) query).getRandomAccessQuery(); Query dvQuery = ((IndexOrDocValuesQuery) query).getRandomAccessQuery();
assertEquals(LatLonDocValuesField.newSlowDistanceQuery(queryBuilder.fieldName(), assertEquals(LatLonDocValuesField.newSlowDistanceQuery(expectedFieldName,
queryBuilder.point().lat(), queryBuilder.point().lat(),
queryBuilder.point().lon(), queryBuilder.point().lon(),
queryBuilder.distance()), queryBuilder.distance()),
@ -44,8 +44,9 @@ import static org.hamcrest.CoreMatchers.notNullValue;
public class GeoPolygonQueryBuilderTests extends AbstractQueryTestCase<GeoPolygonQueryBuilder> { public class GeoPolygonQueryBuilderTests extends AbstractQueryTestCase<GeoPolygonQueryBuilder> {
@Override @Override
protected GeoPolygonQueryBuilder doCreateTestQueryBuilder() { protected GeoPolygonQueryBuilder doCreateTestQueryBuilder() {
String fieldName = randomFrom(GEO_POINT_FIELD_NAME, GEO_POINT_ALIAS_FIELD_NAME);
List<GeoPoint> polygon = randomPolygon(); List<GeoPoint> polygon = randomPolygon();
GeoPolygonQueryBuilder builder = new GeoPolygonQueryBuilder(GEO_POINT_FIELD_NAME, polygon); GeoPolygonQueryBuilder builder = new GeoPolygonQueryBuilder(fieldName, polygon);
if (randomBoolean()) { if (randomBoolean()) {
builder.setValidationMethod(randomFrom(GeoValidationMethod.values())); builder.setValidationMethod(randomFrom(GeoValidationMethod.values()));
} }
@ -42,13 +42,13 @@ import static org.hamcrest.Matchers.notNullValue;
public class MatchPhrasePrefixQueryBuilderTests extends AbstractQueryTestCase<MatchPhrasePrefixQueryBuilder> { public class MatchPhrasePrefixQueryBuilderTests extends AbstractQueryTestCase<MatchPhrasePrefixQueryBuilder> {
@Override @Override
protected MatchPhrasePrefixQueryBuilder doCreateTestQueryBuilder() { protected MatchPhrasePrefixQueryBuilder doCreateTestQueryBuilder() {
String fieldName = randomFrom(STRING_FIELD_NAME, BOOLEAN_FIELD_NAME, INT_FIELD_NAME, String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME, BOOLEAN_FIELD_NAME,
DOUBLE_FIELD_NAME, DATE_FIELD_NAME); INT_FIELD_NAME, DOUBLE_FIELD_NAME, DATE_FIELD_NAME);
if (fieldName.equals(DATE_FIELD_NAME)) { if (fieldName.equals(DATE_FIELD_NAME)) {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0); assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
} }
Object value; Object value;
if (fieldName.equals(STRING_FIELD_NAME)) { if (isTextField(fieldName)) {
int terms = randomIntBetween(0, 3); int terms = randomIntBetween(0, 3);
StringBuilder builder = new StringBuilder(); StringBuilder builder = new StringBuilder();
for (int i = 0; i < terms; i++) { for (int i = 0; i < terms; i++) {
@ -61,7 +61,7 @@ public class MatchPhrasePrefixQueryBuilderTests extends AbstractQueryTestCase<Ma
MatchPhrasePrefixQueryBuilder matchQuery = new MatchPhrasePrefixQueryBuilder(fieldName, value); MatchPhrasePrefixQueryBuilder matchQuery = new MatchPhrasePrefixQueryBuilder(fieldName, value);
if (randomBoolean() && fieldName.equals(STRING_FIELD_NAME)) { if (randomBoolean() && isTextField(fieldName)) {
matchQuery.analyzer(randomFrom("simple", "keyword", "whitespace")); matchQuery.analyzer(randomFrom("simple", "keyword", "whitespace"));
} }
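The isTextField(...) checks replacing fieldName.equals(STRING_FIELD_NAME) account for the alias now being among the candidates; a plausible shape for the helper, assuming the base class's field name constants:

protected static boolean isTextField(String fieldName) {
    // both the concrete text field and its alias accept analyzer and fuzziness options
    return fieldName.equals(STRING_FIELD_NAME) || fieldName.equals(STRING_ALIAS_FIELD_NAME);
}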
@ -45,13 +45,13 @@ import static org.hamcrest.Matchers.notNullValue;
public class MatchPhraseQueryBuilderTests extends AbstractQueryTestCase<MatchPhraseQueryBuilder> { public class MatchPhraseQueryBuilderTests extends AbstractQueryTestCase<MatchPhraseQueryBuilder> {
@Override @Override
protected MatchPhraseQueryBuilder doCreateTestQueryBuilder() { protected MatchPhraseQueryBuilder doCreateTestQueryBuilder() {
String fieldName = randomFrom(STRING_FIELD_NAME, BOOLEAN_FIELD_NAME, INT_FIELD_NAME, String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME, BOOLEAN_FIELD_NAME,
DOUBLE_FIELD_NAME, DATE_FIELD_NAME); INT_FIELD_NAME, DOUBLE_FIELD_NAME, DATE_FIELD_NAME);
if (fieldName.equals(DATE_FIELD_NAME)) { if (fieldName.equals(DATE_FIELD_NAME)) {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0); assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
} }
Object value; Object value;
if (fieldName.equals(STRING_FIELD_NAME)) { if (isTextField(fieldName)) {
int terms = randomIntBetween(0, 3); int terms = randomIntBetween(0, 3);
StringBuilder builder = new StringBuilder(); StringBuilder builder = new StringBuilder();
for (int i = 0; i < terms; i++) { for (int i = 0; i < terms; i++) {
@ -64,7 +64,7 @@ public class MatchPhraseQueryBuilderTests extends AbstractQueryTestCase<MatchPhr
MatchPhraseQueryBuilder matchQuery = new MatchPhraseQueryBuilder(fieldName, value); MatchPhraseQueryBuilder matchQuery = new MatchPhraseQueryBuilder(fieldName, value);
if (randomBoolean() && fieldName.equals(STRING_FIELD_NAME)) { if (randomBoolean() && isTextField(fieldName)) {
matchQuery.analyzer(randomFrom("simple", "keyword", "whitespace")); matchQuery.analyzer(randomFrom("simple", "keyword", "whitespace"));
} }
@ -19,6 +19,7 @@
package org.elasticsearch.index.query; package org.elasticsearch.index.query;
import org.apache.lucene.index.Term;
import org.apache.lucene.queries.ExtendedCommonTermsQuery; import org.apache.lucene.queries.ExtendedCommonTermsQuery;
import org.apache.lucene.search.BooleanClause; import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery; import org.apache.lucene.search.BooleanQuery;
@ -47,6 +48,7 @@ import org.hamcrest.Matcher;
import java.io.IOException; import java.io.IOException;
import java.util.HashMap; import java.util.HashMap;
import java.util.List;
import java.util.Locale; import java.util.Locale;
import java.util.Map; import java.util.Map;
@ -59,13 +61,13 @@ import static org.hamcrest.Matchers.notNullValue;
public class MatchQueryBuilderTests extends AbstractQueryTestCase<MatchQueryBuilder> { public class MatchQueryBuilderTests extends AbstractQueryTestCase<MatchQueryBuilder> {
@Override @Override
protected MatchQueryBuilder doCreateTestQueryBuilder() { protected MatchQueryBuilder doCreateTestQueryBuilder() {
String fieldName = randomFrom(STRING_FIELD_NAME, BOOLEAN_FIELD_NAME, INT_FIELD_NAME, String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME, BOOLEAN_FIELD_NAME,
DOUBLE_FIELD_NAME, DATE_FIELD_NAME); INT_FIELD_NAME, DOUBLE_FIELD_NAME, DATE_FIELD_NAME);
if (fieldName.equals(DATE_FIELD_NAME)) { if (fieldName.equals(DATE_FIELD_NAME)) {
assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0); assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
} }
Object value; Object value;
if (fieldName.equals(STRING_FIELD_NAME)) { if (isTextField(fieldName)) {
int terms = randomIntBetween(0, 3); int terms = randomIntBetween(0, 3);
StringBuilder builder = new StringBuilder(); StringBuilder builder = new StringBuilder();
for (int i = 0; i < terms; i++) { for (int i = 0; i < terms; i++) {
@ -79,11 +81,11 @@ public class MatchQueryBuilderTests extends AbstractQueryTestCase<MatchQueryBuil
MatchQueryBuilder matchQuery = new MatchQueryBuilder(fieldName, value); MatchQueryBuilder matchQuery = new MatchQueryBuilder(fieldName, value);
matchQuery.operator(randomFrom(Operator.values())); matchQuery.operator(randomFrom(Operator.values()));
if (randomBoolean() && fieldName.equals(STRING_FIELD_NAME)) { if (randomBoolean() && isTextField(fieldName)) {
matchQuery.analyzer(randomFrom("simple", "keyword", "whitespace")); matchQuery.analyzer(randomFrom("simple", "keyword", "whitespace"));
} }
if (fieldName.equals(STRING_FIELD_NAME) && randomBoolean()) { if (isTextField(fieldName) && randomBoolean()) {
matchQuery.fuzziness(randomFuzziness(fieldName)); matchQuery.fuzziness(randomFuzziness(fieldName));
} }
@ -179,6 +181,12 @@ public class MatchQueryBuilderTests extends AbstractQueryTestCase<MatchQueryBuil
if (query instanceof ExtendedCommonTermsQuery) { if (query instanceof ExtendedCommonTermsQuery) {
assertTrue(queryBuilder.cutoffFrequency() != null); assertTrue(queryBuilder.cutoffFrequency() != null);
ExtendedCommonTermsQuery ectq = (ExtendedCommonTermsQuery) query; ExtendedCommonTermsQuery ectq = (ExtendedCommonTermsQuery) query;
List<Term> terms = ectq.getTerms();
if (!terms.isEmpty()) {
Term term = terms.iterator().next();
String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
assertThat(term.field(), equalTo(expectedFieldName));
}
assertEquals(queryBuilder.cutoffFrequency(), ectq.getMaxTermFrequency(), Float.MIN_VALUE); assertEquals(queryBuilder.cutoffFrequency(), ectq.getMaxTermFrequency(), Float.MIN_VALUE);
} }
@ -195,6 +203,9 @@ public class MatchQueryBuilderTests extends AbstractQueryTestCase<MatchQueryBuil
termLcMatcher = either(termLcMatcher).or(equalTo(originalTermLc.substring(0, 1))); termLcMatcher = either(termLcMatcher).or(equalTo(originalTermLc.substring(0, 1)));
} }
assertThat(actualTermLc, termLcMatcher); assertThat(actualTermLc, termLcMatcher);
String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
assertThat(expectedFieldName, equalTo(fuzzyQuery.getTerm().field()));
assertThat(queryBuilder.prefixLength(), equalTo(fuzzyQuery.getPrefixLength())); assertThat(queryBuilder.prefixLength(), equalTo(fuzzyQuery.getPrefixLength()));
assertThat(queryBuilder.fuzzyTranspositions(), equalTo(fuzzyQuery.getTranspositions())); assertThat(queryBuilder.fuzzyTranspositions(), equalTo(fuzzyQuery.getTranspositions()));
} }
@ -91,7 +91,7 @@ public class MoreLikeThisQueryBuilderTests extends AbstractQueryTestCase<MoreLik
} }
private static String[] randomStringFields() { private static String[] randomStringFields() {
String[] mappedStringFields = new String[]{STRING_FIELD_NAME, STRING_FIELD_NAME_2}; String[] mappedStringFields = new String[]{STRING_FIELD_NAME, STRING_FIELD_NAME_2, STRING_ALIAS_FIELD_NAME};
String[] unmappedStringFields = generateRandomStringArray(2, 5, false, false); String[] unmappedStringFields = generateRandomStringArray(2, 5, false, false);
return Stream.concat(Arrays.stream(mappedStringFields), Arrays.stream(unmappedStringFields)).toArray(String[]::new); return Stream.concat(Arrays.stream(mappedStringFields), Arrays.stream(unmappedStringFields)).toArray(String[]::new);
} }
@@ -230,9 +230,10 @@ public class MultiMatchQueryBuilderTests extends AbstractQueryTestCase<MultiMatc
     assertThat(query, instanceOf(DisjunctionMaxQuery.class));
     DisjunctionMaxQuery dQuery = (DisjunctionMaxQuery) query;
     assertThat(dQuery.getTieBreakerMultiplier(), equalTo(1.0f));
-    assertThat(dQuery.getDisjuncts().size(), equalTo(2));
-    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(), equalTo(new Term(STRING_FIELD_NAME_2, "test")));
-    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(), equalTo(new Term(STRING_FIELD_NAME, "test")));
+    assertThat(dQuery.getDisjuncts().size(), equalTo(3));
+    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(), equalTo(new Term(STRING_FIELD_NAME, "test")));
+    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(), equalTo(new Term(STRING_FIELD_NAME_2, "test")));
+    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 2).getTerm(), equalTo(new Term(STRING_FIELD_NAME, "test")));
 }
 public void testToQueryFieldMissing() throws Exception {

View File

@@ -60,7 +60,9 @@ public class PrefixQueryBuilderTests extends AbstractQueryTestCase<PrefixQueryBu
     }
     private static PrefixQueryBuilder randomPrefixQuery() {
-        String fieldName = randomBoolean() ? STRING_FIELD_NAME : randomAlphaOfLengthBetween(1, 10);
+        String fieldName = randomFrom(STRING_FIELD_NAME,
+            STRING_ALIAS_FIELD_NAME,
+            randomAlphaOfLengthBetween(1, 10));
         String value = randomAlphaOfLengthBetween(1, 10);
         return new PrefixQueryBuilder(fieldName, value);
     }
@@ -69,7 +71,9 @@ public class PrefixQueryBuilderTests extends AbstractQueryTestCase<PrefixQueryBu
     protected void doAssertLuceneQuery(PrefixQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
         assertThat(query, instanceOf(PrefixQuery.class));
         PrefixQuery prefixQuery = (PrefixQuery) query;
-        assertThat(prefixQuery.getPrefix().field(), equalTo(queryBuilder.fieldName()));
+        String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
+        assertThat(prefixQuery.getPrefix().field(), equalTo(expectedFieldName));
         assertThat(prefixQuery.getPrefix().text(), equalTo(queryBuilder.value()));
     }

View File

@@ -90,12 +90,16 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
     }
     QueryStringQueryBuilder queryStringQueryBuilder = new QueryStringQueryBuilder(query);
     if (randomBoolean()) {
-        queryStringQueryBuilder.defaultField(randomBoolean() ?
-            STRING_FIELD_NAME : randomAlphaOfLengthBetween(1, 10));
+        String defaultFieldName = randomFrom(STRING_FIELD_NAME,
+            STRING_ALIAS_FIELD_NAME,
+            randomAlphaOfLengthBetween(1, 10));
+        queryStringQueryBuilder.defaultField(defaultFieldName);
     } else {
         int numFields = randomIntBetween(1, 5);
         for (int i = 0; i < numFields; i++) {
-            String fieldName = randomBoolean() ? STRING_FIELD_NAME : randomAlphaOfLengthBetween(1, 10);
+            String fieldName = randomFrom(STRING_FIELD_NAME,
+                STRING_ALIAS_FIELD_NAME,
+                randomAlphaOfLengthBetween(1, 10));
             if (randomBoolean()) {
                 queryStringQueryBuilder.field(fieldName);
             } else {
@@ -508,10 +512,12 @@ public class QueryStringQueryBuilderTests extends AbstractQueryTestCase<QueryStr
     Query query = queryStringQuery("test").field("mapped_str*").toQuery(createShardContext());
     assertThat(query, instanceOf(DisjunctionMaxQuery.class));
     DisjunctionMaxQuery dQuery = (DisjunctionMaxQuery) query;
-    assertThat(dQuery.getDisjuncts().size(), equalTo(2));
-    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(),
-        equalTo(new Term(STRING_FIELD_NAME_2, "test")));
-    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(),
-        equalTo(new Term(STRING_FIELD_NAME, "test")));
+    assertThat(dQuery.getDisjuncts().size(), equalTo(3));
+    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 0).getTerm(),
+        equalTo(new Term(STRING_FIELD_NAME, "test")));
+    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 1).getTerm(),
+        equalTo(new Term(STRING_FIELD_NAME_2, "test")));
+    assertThat(assertDisjunctionSubQuery(query, TermQuery.class, 2).getTerm(),
+        equalTo(new Term(STRING_FIELD_NAME, "test")));
 }
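With the alias added to the mapped string fields, the `mapped_str*` expansion now yields three disjuncts, and the alias contributes a term on its target field rather than on the alias name. A hedged sketch of the expected query shape, assuming the test constants resolve to `mapped_string` and `mapped_string_2` (an assumption about the test fixtures, not confirmed by this diff):

    import java.util.Arrays;

    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.DisjunctionMaxQuery;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.TermQuery;

    public class DisjunctsSketch {
        public static void main(String[] args) {
            // Three disjuncts: two concrete fields plus the alias, which ends up
            // as a second term query on its target field.
            Query expected = new DisjunctionMaxQuery(
                Arrays.asList(
                    new TermQuery(new Term("mapped_string", "test")),
                    new TermQuery(new Term("mapped_string_2", "test")),
                    new TermQuery(new Term("mapped_string", "test"))), // from the alias
                0.0f);
            System.out.println(expected);
        }
    }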

View File

@@ -22,10 +22,12 @@ package org.elasticsearch.index.query;
 import com.carrotsearch.randomizedtesting.generators.RandomNumbers;
 import com.carrotsearch.randomizedtesting.generators.RandomStrings;
-import org.elasticsearch.test.AbstractQueryTestCase;
 import java.util.Random;
+import static org.elasticsearch.test.AbstractBuilderTestCase.STRING_ALIAS_FIELD_NAME;
+import static org.elasticsearch.test.AbstractBuilderTestCase.STRING_FIELD_NAME;
+import static org.elasticsearch.test.ESTestCase.randomFrom;
 /**
  * Utility class for creating random QueryBuilders.
  * So far only leaf queries like {@link MatchAllQueryBuilder}, {@link TermQueryBuilder} or
@@ -62,9 +64,10 @@ public class RandomQueryBuilder {
     // for now, only use String Rangequeries for MultiTerm test, numeric and date makes little sense
     // see issue #12123 for discussion
     MultiTermQueryBuilder multiTermQueryBuilder;
+    String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME);
     switch(RandomNumbers.randomIntBetween(r, 0, 3)) {
         case 0:
-            RangeQueryBuilder stringRangeQuery = new RangeQueryBuilder(AbstractQueryTestCase.STRING_FIELD_NAME);
+            RangeQueryBuilder stringRangeQuery = new RangeQueryBuilder(fieldName);
             stringRangeQuery.from("a" + RandomStrings.randomAsciiOfLengthBetween(r, 1, 10));
             stringRangeQuery.to("z" + RandomStrings.randomAsciiOfLengthBetween(r, 1, 10));
             multiTermQueryBuilder = stringRangeQuery;
@@ -76,7 +79,7 @@ public class RandomQueryBuilder {
             multiTermQueryBuilder = new WildcardQueryBuilderTests().createTestQueryBuilder();
             break;
         case 3:
-            multiTermQueryBuilder = new FuzzyQueryBuilder(AbstractQueryTestCase.STRING_FIELD_NAME,
+            multiTermQueryBuilder = new FuzzyQueryBuilder(fieldName,
                 RandomStrings.randomAsciiOfLengthBetween(r, 1, 10));
             break;
         default:

View File

@@ -65,13 +65,15 @@ public class RangeQueryBuilderTests extends AbstractQueryTestCase<RangeQueryBuil
     switch (randomIntBetween(0, 2)) {
         case 0:
             // use mapped integer field for numeric range queries
-            query = new RangeQueryBuilder(randomBoolean() ? INT_FIELD_NAME : INT_RANGE_FIELD_NAME);
+            query = new RangeQueryBuilder(randomFrom(
+                INT_FIELD_NAME, INT_RANGE_FIELD_NAME, INT_ALIAS_FIELD_NAME));
             query.from(randomIntBetween(1, 100));
             query.to(randomIntBetween(101, 200));
             break;
         case 1:
             // use mapped date field, using date string representation
-            query = new RangeQueryBuilder(randomBoolean() ? DATE_FIELD_NAME : DATE_RANGE_FIELD_NAME);
+            query = new RangeQueryBuilder(randomFrom(
+                DATE_FIELD_NAME, DATE_RANGE_FIELD_NAME, DATE_ALIAS_FIELD_NAME));
             query.from(new DateTime(System.currentTimeMillis() - randomIntBetween(0, 1000000), DateTimeZone.UTC).toString());
             query.to(new DateTime(System.currentTimeMillis() + randomIntBetween(0, 1000000), DateTimeZone.UTC).toString());
             // Create timestamp option only then we have a date mapper,
@@ -87,7 +89,7 @@ public class RangeQueryBuilderTests extends AbstractQueryTestCase<RangeQueryBuil
             break;
         case 2:
         default:
-            query = new RangeQueryBuilder(STRING_FIELD_NAME);
+            query = new RangeQueryBuilder(randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME));
             query.from("a" + randomAlphaOfLengthBetween(1, 10));
             query.to("z" + randomAlphaOfLengthBetween(1, 10));
             break;
@@ -128,17 +130,18 @@ public class RangeQueryBuilderTests extends AbstractQueryTestCase<RangeQueryBuil
 @Override
 protected void doAssertLuceneQuery(RangeQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
+    String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
     if (queryBuilder.from() == null && queryBuilder.to() == null) {
         final Query expectedQuery;
         if (getCurrentTypes().length > 0) {
             if (context.mapperService().getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_1_0)
                     && context.mapperService().fullName(queryBuilder.fieldName()).hasDocValues()) {
-                expectedQuery = new ConstantScoreQuery(new DocValuesFieldExistsQuery(queryBuilder.fieldName()));
+                expectedQuery = new ConstantScoreQuery(new DocValuesFieldExistsQuery(expectedFieldName));
             } else if (context.mapperService().getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_1_0)
                     && context.mapperService().fullName(queryBuilder.fieldName()).omitNorms() == false) {
-                expectedQuery = new ConstantScoreQuery(new NormsFieldExistsQuery(queryBuilder.fieldName()));
+                expectedQuery = new ConstantScoreQuery(new NormsFieldExistsQuery(expectedFieldName));
             } else {
-                expectedQuery = new ConstantScoreQuery(new TermQuery(new Term(FieldNamesFieldMapper.NAME, queryBuilder.fieldName())));
+                expectedQuery = new ConstantScoreQuery(new TermQuery(new Term(FieldNamesFieldMapper.NAME, expectedFieldName)));
             }
         } else {
             expectedQuery = new MatchNoDocsQuery("no mappings yet");
@@ -146,18 +149,18 @@ public class RangeQueryBuilderTests extends AbstractQueryTestCase<RangeQueryBuil
         assertThat(query, equalTo(expectedQuery));
     } else if (getCurrentTypes().length == 0 ||
-            (queryBuilder.fieldName().equals(DATE_FIELD_NAME) == false
-            && queryBuilder.fieldName().equals(INT_FIELD_NAME) == false
-            && queryBuilder.fieldName().equals(DATE_RANGE_FIELD_NAME) == false
-            && queryBuilder.fieldName().equals(INT_RANGE_FIELD_NAME) == false)) {
+            (expectedFieldName.equals(DATE_FIELD_NAME) == false
+            && expectedFieldName.equals(INT_FIELD_NAME) == false
+            && expectedFieldName.equals(DATE_RANGE_FIELD_NAME) == false
+            && expectedFieldName.equals(INT_RANGE_FIELD_NAME) == false)) {
         assertThat(query, instanceOf(TermRangeQuery.class));
         TermRangeQuery termRangeQuery = (TermRangeQuery) query;
-        assertThat(termRangeQuery.getField(), equalTo(queryBuilder.fieldName()));
+        assertThat(termRangeQuery.getField(), equalTo(expectedFieldName));
         assertThat(termRangeQuery.getLowerTerm(), equalTo(BytesRefs.toBytesRef(queryBuilder.from())));
         assertThat(termRangeQuery.getUpperTerm(), equalTo(BytesRefs.toBytesRef(queryBuilder.to())));
         assertThat(termRangeQuery.includesLower(), equalTo(queryBuilder.includeLower()));
         assertThat(termRangeQuery.includesUpper(), equalTo(queryBuilder.includeUpper()));
-    } else if (queryBuilder.fieldName().equals(DATE_FIELD_NAME)) {
+    } else if (expectedFieldName.equals(DATE_FIELD_NAME)) {
         assertThat(query, instanceOf(IndexOrDocValuesQuery.class));
         query = ((IndexOrDocValuesQuery) query).getIndexQuery();
         assertThat(query, instanceOf(PointRangeQuery.class));
@@ -202,7 +205,7 @@ public class RangeQueryBuilderTests extends AbstractQueryTestCase<RangeQueryBuil
             }
         }
         assertEquals(LongPoint.newRangeQuery(DATE_FIELD_NAME, minLong, maxLong), query);
-    } else if (queryBuilder.fieldName().equals(INT_FIELD_NAME)) {
+    } else if (expectedFieldName.equals(INT_FIELD_NAME)) {
         assertThat(query, instanceOf(IndexOrDocValuesQuery.class));
         query = ((IndexOrDocValuesQuery) query).getIndexQuery();
         assertThat(query, instanceOf(PointRangeQuery.class));
@@ -225,7 +228,7 @@ public class RangeQueryBuilderTests extends AbstractQueryTestCase<RangeQueryBuil
                 maxInt--;
             }
         }
-    } else if (queryBuilder.fieldName().equals(DATE_RANGE_FIELD_NAME) || queryBuilder.fieldName().equals(INT_RANGE_FIELD_NAME)) {
+    } else if (expectedFieldName.equals(DATE_RANGE_FIELD_NAME) || expectedFieldName.equals(INT_RANGE_FIELD_NAME)) {
         // todo can't check RangeFieldQuery because its currently package private (this will change)
     } else {
         throw new UnsupportedOperationException();
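The three-way branch above reflects how an open-ended range behaves like an exists query: the expected Lucene query depends on whether the resolved field has doc values, else norms, else falls back to the `_field_names` metadata field. A condensed sketch of that decision, using the Lucene classes named in the diff (the boolean parameters are stand-ins for the mapper lookups, not the real API):

    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.ConstantScoreQuery;
    import org.apache.lucene.search.DocValuesFieldExistsQuery;
    import org.apache.lucene.search.NormsFieldExistsQuery;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.TermQuery;

    public class ExistsQuerySketch {
        // "_field_names" mirrors the diff's FieldNamesFieldMapper.NAME constant.
        static Query existsQuery(String resolvedField, boolean hasDocValues, boolean hasNorms) {
            if (hasDocValues) {
                return new ConstantScoreQuery(new DocValuesFieldExistsQuery(resolvedField));
            } else if (hasNorms) {
                return new ConstantScoreQuery(new NormsFieldExistsQuery(resolvedField));
            } else {
                return new ConstantScoreQuery(new TermQuery(new Term("_field_names", resolvedField)));
            }
        }

        public static void main(String[] args) {
            System.out.println(existsQuery("mapped_string", true, true));
        }
    }

Note that in every branch the query is built on the resolved concrete name, never on the alias.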

View File

@@ -71,7 +71,7 @@ public class RegexpQueryBuilderTests extends AbstractQueryTestCase<RegexpQueryBu
     private static RegexpQueryBuilder randomRegexpQuery() {
         // mapped or unmapped fields
-        String fieldName = randomBoolean() ? STRING_FIELD_NAME : randomAlphaOfLengthBetween(1, 10);
+        String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME, randomAlphaOfLengthBetween(1, 10));
         String value = randomAlphaOfLengthBetween(1, 10);
         return new RegexpQueryBuilder(fieldName, value);
     }
@@ -80,7 +80,9 @@ public class RegexpQueryBuilderTests extends AbstractQueryTestCase<RegexpQueryBu
     protected void doAssertLuceneQuery(RegexpQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
         assertThat(query, instanceOf(RegexpQuery.class));
         RegexpQuery regexpQuery = (RegexpQuery) query;
-        assertThat(regexpQuery.getField(), equalTo(queryBuilder.fieldName()));
+        String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
+        assertThat(regexpQuery.getField(), equalTo(expectedFieldName));
     }
     public void testIllegalArguments() {

View File

@@ -98,7 +98,8 @@ public class SimpleQueryStringBuilderTests extends AbstractQueryTestCase<SimpleQ
     Map<String, Float> fields = new HashMap<>();
     for (int i = 0; i < fieldCount; i++) {
         if (randomBoolean()) {
-            fields.put(STRING_FIELD_NAME, AbstractQueryBuilder.DEFAULT_BOOST);
+            String fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME);
+            fields.put(fieldName, AbstractQueryBuilder.DEFAULT_BOOST);
         } else {
             fields.put(STRING_FIELD_NAME_2, 2.0f / randomIntBetween(1, 20));
         }

View File

@@ -63,7 +63,11 @@ public class SpanMultiTermQueryBuilderTests extends AbstractQueryTestCase<SpanMu
             .field("type", "text")
             .startObject("index_prefixes").endObject()
         .endObject()
-        .endObject().endObject().endObject();
+        .startObject("prefix_field_alias")
+            .field("type", "alias")
+            .field("path", "prefix_field")
+        .endObject()
+        .endObject().endObject().endObject();
     mapperService.merge("_doc",
         new CompressedXContent(Strings.toString(mapping)), MapperService.MergeReason.MAPPING_UPDATE);
@@ -168,16 +172,16 @@ public class SpanMultiTermQueryBuilderTests extends AbstractQueryTestCase<SpanMu
     }
     public void testToQueryInnerSpanMultiTerm() throws IOException {
         Query query = new SpanOrQueryBuilder(createTestQueryBuilder()).toQuery(createShardContext());
         //verify that the result is still a span query, despite the boost that might get set (SpanBoostQuery rather than BoostQuery)
         assertThat(query, instanceOf(SpanQuery.class));
     }
     public void testToQueryInnerTermQuery() throws IOException {
+        String fieldName = randomFrom("prefix_field", "prefix_field_alias");
         final QueryShardContext context = createShardContext();
         if (context.getIndexSettings().getIndexVersionCreated().onOrAfter(Version.V_6_4_0)) {
-            Query query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder("prefix_field", "foo"))
+            Query query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder(fieldName, "foo"))
                 .toQuery(context);
             assertThat(query, instanceOf(FieldMaskingSpanQuery.class));
             FieldMaskingSpanQuery fieldSpanQuery = (FieldMaskingSpanQuery) query;
@@ -186,7 +190,7 @@ public class SpanMultiTermQueryBuilderTests extends AbstractQueryTestCase<SpanMu
             SpanTermQuery spanTermQuery = (SpanTermQuery) fieldSpanQuery.getMaskedQuery();
             assertThat(spanTermQuery.getTerm().text(), equalTo("foo"));
-            query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder("prefix_field", "foo"))
+            query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder(fieldName, "foo"))
                 .boost(2.0f)
                 .toQuery(context);
             assertThat(query, instanceOf(SpanBoostQuery.class));
@@ -199,7 +203,7 @@ public class SpanMultiTermQueryBuilderTests extends AbstractQueryTestCase<SpanMu
             spanTermQuery = (SpanTermQuery) fieldSpanQuery.getMaskedQuery();
             assertThat(spanTermQuery.getTerm().text(), equalTo("foo"));
         } else {
-            Query query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder("prefix_field", "foo"))
+            Query query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder(fieldName, "foo"))
                 .toQuery(context);
             assertThat(query, instanceOf(SpanMultiTermQueryWrapper.class));
             SpanMultiTermQueryWrapper wrapper = (SpanMultiTermQueryWrapper) query;
@@ -208,7 +212,7 @@ public class SpanMultiTermQueryBuilderTests extends AbstractQueryTestCase<SpanMu
             assertThat(prefixQuery.getField(), equalTo("prefix_field"));
             assertThat(prefixQuery.getPrefix().text(), equalTo("foo"));
-            query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder("prefix_field", "foo"))
+            query = new SpanMultiTermQueryBuilder(new PrefixQueryBuilder(fieldName, "foo"))
                 .boost(2.0f)
                 .toQuery(context);
             assertThat(query, instanceOf(SpanBoostQuery.class));
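The mapping hunk above registers `prefix_field_alias` with `path: prefix_field`. As a rough sketch, the alias portion of that builder chain should serialize to the JSON below (whitespace aside), using the same XContent helpers the test itself uses; the surrounding `prefix_field` definition with `index_prefixes` is omitted here:

    import org.elasticsearch.common.Strings;
    import org.elasticsearch.common.xcontent.XContentBuilder;
    import org.elasticsearch.common.xcontent.XContentFactory;

    public class AliasMappingSketch {
        public static void main(String[] args) throws Exception {
            XContentBuilder builder = XContentFactory.jsonBuilder().startObject()
                .startObject("prefix_field_alias")
                    .field("type", "alias")
                    .field("path", "prefix_field")
                .endObject()
            .endObject();
            // Expected: {"prefix_field_alias":{"type":"alias","path":"prefix_field"}}
            System.out.println(Strings.toString(builder));
        }
    }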

View File

@@ -38,12 +38,11 @@ public class SpanTermQueryBuilderTests extends AbstractTermQueryTestCase<SpanTer
     @Override
     protected SpanTermQueryBuilder doCreateTestQueryBuilder() {
-        String fieldName = null;
-        Object value;
-        if (randomBoolean()) {
-            fieldName = STRING_FIELD_NAME;
-        }
+        String fieldName = randomFrom(STRING_FIELD_NAME,
+            STRING_ALIAS_FIELD_NAME,
+            randomAlphaOfLengthBetween(1, 10));
+        Object value;
         if (frequently()) {
             value = randomAlphaOfLengthBetween(1, 10);
         } else {
@@ -51,10 +50,6 @@ public class SpanTermQueryBuilderTests extends AbstractTermQueryTestCase<SpanTer
             JsonStringEncoder encoder = JsonStringEncoder.getInstance();
             value = new String(encoder.quoteAsString(randomUnicodeOfLength(10)));
         }
-        if (fieldName == null) {
-            fieldName = randomAlphaOfLengthBetween(1, 10);
-        }
         return createQueryBuilder(fieldName, value);
     }
@@ -67,7 +62,10 @@ public class SpanTermQueryBuilderTests extends AbstractTermQueryTestCase<SpanTer
     protected void doAssertLuceneQuery(SpanTermQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
         assertThat(query, instanceOf(SpanTermQuery.class));
         SpanTermQuery spanTermQuery = (SpanTermQuery) query;
-        assertThat(spanTermQuery.getTerm().field(), equalTo(queryBuilder.fieldName()));
+        String expectedFieldName = expectedFieldName(queryBuilder.fieldName);
+        assertThat(spanTermQuery.getTerm().field(), equalTo(expectedFieldName));
         MappedFieldType mapper = context.getQueryShardContext().fieldMapper(queryBuilder.fieldName());
         if (mapper != null) {
             Term term = ((TermQuery) mapper.termQuery(queryBuilder.value(), null)).getTerm();

View File

@@ -50,7 +50,7 @@ public class TermQueryBuilderTests extends AbstractTermQueryTestCase<TermQueryBu
         break;
     case 1:
         if (randomBoolean()) {
-            fieldName = STRING_FIELD_NAME;
+            fieldName = randomFrom(STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME);
         }
         if (frequently()) {
             value = randomAlphaOfLengthBetween(1, 10);
@@ -96,7 +96,10 @@ public class TermQueryBuilderTests extends AbstractTermQueryTestCase<TermQueryBu
     MappedFieldType mapper = context.getQueryShardContext().fieldMapper(queryBuilder.fieldName());
     if (query instanceof TermQuery) {
         TermQuery termQuery = (TermQuery) query;
-        assertThat(termQuery.getTerm().field(), equalTo(queryBuilder.fieldName()));
+        String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
+        assertThat(termQuery.getTerm().field(), equalTo(expectedFieldName));
         if (mapper != null) {
             Term term = ((TermQuery) mapper.termQuery(queryBuilder.value(), null)).getTerm();
             assertThat(termQuery.getTerm(), equalTo(term));

View File

@@ -77,9 +77,13 @@ public class TermsQueryBuilderTests extends AbstractQueryTestCase<TermsQueryBuil
     // terms query or lookup query
     if (randomBoolean()) {
         // make between 0 and 5 different values of the same type
-        String fieldName;
-        fieldName = randomValueOtherThanMany(choice -> choice.equals(GEO_POINT_FIELD_NAME) || choice.equals(GEO_SHAPE_FIELD_NAME)
-            || choice.equals(INT_RANGE_FIELD_NAME) || choice.equals(DATE_RANGE_FIELD_NAME), () -> getRandomFieldName());
+        String fieldName = randomValueOtherThanMany(choice ->
+            choice.equals(GEO_POINT_FIELD_NAME) ||
+            choice.equals(GEO_POINT_ALIAS_FIELD_NAME) ||
+            choice.equals(GEO_SHAPE_FIELD_NAME) ||
+            choice.equals(INT_RANGE_FIELD_NAME) ||
+            choice.equals(DATE_RANGE_FIELD_NAME),
+            () -> getRandomFieldName());
         Object[] values = new Object[randomInt(5)];
         for (int i = 0; i < values.length; i++) {
             values[i] = getRandomValueForFieldName(fieldName);
@@ -129,7 +133,8 @@ public class TermsQueryBuilderTests extends AbstractQueryTestCase<TermsQueryBuil
         terms = queryBuilder.values();
     }
-    TermInSetQuery expected = new TermInSetQuery(queryBuilder.fieldName(),
+    String fieldName = expectedFieldName(queryBuilder.fieldName());
+    TermInSetQuery expected = new TermInSetQuery(fieldName,
         terms.stream().filter(Objects::nonNull).map(Object::toString).map(BytesRef::new).collect(Collectors.toList()));
     assertEquals(expected, query);
 }

View File

@@ -28,12 +28,14 @@ import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.IndexWriter;
 import org.apache.lucene.index.IndexWriterConfig;
 import org.apache.lucene.index.NoMergePolicy;
+import org.apache.lucene.index.Term;
 import org.apache.lucene.search.CoveringQuery;
 import org.apache.lucene.search.IndexSearcher;
 import org.apache.lucene.search.MatchNoDocsQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.Sort;
 import org.apache.lucene.search.SortField;
+import org.apache.lucene.search.TermQuery;
 import org.apache.lucene.search.TopDocs;
 import org.apache.lucene.store.Directory;
 import org.elasticsearch.action.admin.indices.mapping.put.PutMappingRequest;
@@ -83,10 +85,9 @@ public class TermsSetQueryBuilderTests extends AbstractQueryTestCase<TermsSetQue
     @Override
     protected TermsSetQueryBuilder doCreateTestQueryBuilder() {
-        String fieldName;
-        do {
-            fieldName = randomFrom(MAPPED_FIELD_NAMES);
-        } while (fieldName.equals(GEO_POINT_FIELD_NAME) || fieldName.equals(GEO_SHAPE_FIELD_NAME));
+        String fieldName = randomValueOtherThanMany(
+            value -> value.equals(GEO_POINT_FIELD_NAME) || value.equals(GEO_SHAPE_FIELD_NAME),
+            () -> randomFrom(MAPPED_FIELD_NAMES));
         List<?> randomTerms = randomValues(fieldName);
         TermsSetQueryBuilder queryBuilder = new TermsSetQueryBuilder(STRING_FIELD_NAME, randomTerms);
         if (randomBoolean()) {
@@ -261,6 +262,22 @@ public class TermsSetQueryBuilderTests extends AbstractQueryTestCase<TermsSetQue
         }
     }
+    public void testFieldAlias() {
+        List<String> randomTerms = Arrays.asList(generateRandomStringArray(5, 10, false, false));
+        TermsSetQueryBuilder queryBuilder = new TermsSetQueryBuilder(STRING_ALIAS_FIELD_NAME, randomTerms)
+            .setMinimumShouldMatchField("m_s_m");
+        QueryShardContext context = createShardContext();
+        List<Query> termQueries = queryBuilder.createTermQueries(context);
+        assertEquals(randomTerms.size(), termQueries.size());
+        String expectedFieldName = expectedFieldName(queryBuilder.getFieldName());
+        for (int i = 0; i < randomTerms.size(); i++) {
+            Term term = new Term(expectedFieldName, randomTerms.get(i));
+            assertThat(termQueries.get(i), equalTo(new TermQuery(term)));
+        }
+    }
     private static List<?> randomValues(final String fieldName) {
         final int numValues = randomIntBetween(0, 10);
         final List<Object> values = new ArrayList<>(numValues);
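The new `testFieldAlias` above checks that `createTermQueries` yields one Lucene `TermQuery` per term, each built against the alias's concrete path. A standalone sketch of that expectation (the `mapped_string` field name in the usage example is an assumption about the test fixtures):

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.TermQuery;

    public class TermQueriesSketch {
        // One TermQuery per value, all built on the resolved concrete field.
        static List<Query> termQueries(String concreteField, List<String> values) {
            List<Query> queries = new ArrayList<>();
            for (String value : values) {
                queries.add(new TermQuery(new Term(concreteField, value)));
            }
            return queries;
        }

        public static void main(String[] args) {
            System.out.println(termQueries("mapped_string", Arrays.asList("a", "b")));
        }
    }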

View File

@@ -60,21 +60,22 @@ public class WildcardQueryBuilderTests extends AbstractQueryTestCase<WildcardQue
     }
     private static WildcardQueryBuilder randomWildcardQuery() {
-        // mapped or unmapped field
+        String fieldName = randomFrom(STRING_FIELD_NAME,
+            STRING_ALIAS_FIELD_NAME,
+            randomAlphaOfLengthBetween(1, 10));
         String text = randomAlphaOfLengthBetween(1, 10);
-        if (randomBoolean()) {
-            return new WildcardQueryBuilder(STRING_FIELD_NAME, text);
-        } else {
-            return new WildcardQueryBuilder(randomAlphaOfLengthBetween(1, 10), text);
-        }
+        return new WildcardQueryBuilder(fieldName, text);
     }
     @Override
     protected void doAssertLuceneQuery(WildcardQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
         assertThat(query, instanceOf(WildcardQuery.class));
         WildcardQuery wildcardQuery = (WildcardQuery) query;
-        assertThat(wildcardQuery.getField(), equalTo(queryBuilder.fieldName()));
-        assertThat(wildcardQuery.getTerm().field(), equalTo(queryBuilder.fieldName()));
+        String expectedFieldName = expectedFieldName(queryBuilder.fieldName());
+        assertThat(wildcardQuery.getField(), equalTo(expectedFieldName));
+        assertThat(wildcardQuery.getTerm().field(), equalTo(expectedFieldName));
         assertThat(wildcardQuery.getTerm().text(), equalTo(queryBuilder.value()));
     }
@@ -138,19 +139,19 @@ public class WildcardQueryBuilderTests extends AbstractQueryTestCase<WildcardQue
         assertEquals(expected, query);
     }
 }
 public void testIndexWildcard() throws IOException {
     assumeTrue("test runs only when at least a type is registered", getCurrentTypes().length > 0);
     QueryShardContext context = createShardContext();
     String index = context.getFullyQualifiedIndexName();
     Query query = new WildcardQueryBuilder("_index", index).doToQuery(context);
     assertThat(query instanceof MatchAllDocsQuery, equalTo(true));
     query = new WildcardQueryBuilder("_index", index + "*").doToQuery(context);
     assertThat(query instanceof MatchAllDocsQuery, equalTo(true));
     query = new WildcardQueryBuilder("_index", "index_" + index + "*").doToQuery(context);
     assertThat(query instanceof MatchNoDocsQuery, equalTo(true));
 }

View File

@@ -38,11 +38,12 @@ import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.compress.CompressedXContent;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.index.IndexService;
-import org.elasticsearch.index.mapper.DocumentMapper;
 import org.elasticsearch.index.mapper.MappedFieldType;
 import org.elasticsearch.index.mapper.MapperParsingException;
+import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.plugins.Plugin;
 import org.elasticsearch.test.ESSingleNodeTestCase;
 import org.elasticsearch.test.InternalSettingsPlugin;
@@ -86,22 +87,21 @@ public class SimilarityTests extends ESSingleNodeTestCase {
     }
     public void testResolveSimilaritiesFromMapping_classic() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
             .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "classic")
             .put("index.similarity.my_similarity.discount_overlaps", false)
             .put(IndexMetaData.SETTING_VERSION_CREATED, Version.V_6_3_0) // otherwise classic is forbidden
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(), instanceOf(ClassicSimilarity.class));
-        ClassicSimilarity similarity = (ClassicSimilarity) documentMapper.mappers().getMapper("field1").fieldType().similarity().get();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(ClassicSimilarity.class));
+        ClassicSimilarity similarity = (ClassicSimilarity) mapperService.fullName("field1").similarity().get();
         assertThat(similarity.getDiscountOverlaps(), equalTo(false));
     }
@@ -117,11 +117,11 @@ public class SimilarityTests extends ESSingleNodeTestCase {
     }
     public void testResolveSimilaritiesFromMapping_bm25() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
             .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "BM25")
@@ -129,37 +129,32 @@ public class SimilarityTests extends ESSingleNodeTestCase {
             .put("index.similarity.my_similarity.b", 0.5f)
             .put("index.similarity.my_similarity.discount_overlaps", false)
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(), instanceOf(BM25Similarity.class));
-        BM25Similarity similarity = (BM25Similarity) documentMapper.mappers().getMapper("field1").fieldType().similarity().get();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(BM25Similarity.class));
+        BM25Similarity similarity = (BM25Similarity) mapperService.fullName("field1").similarity().get();
         assertThat(similarity.getK1(), equalTo(2.0f));
         assertThat(similarity.getB(), equalTo(0.5f));
         assertThat(similarity.getDiscountOverlaps(), equalTo(false));
     }
     public void testResolveSimilaritiesFromMapping_boolean() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "boolean").endObject()
             .endObject()
-            .endObject().endObject());
-        IndexService indexService = createIndex("foo", Settings.EMPTY);
-        DocumentMapper documentMapper = indexService.mapperService()
-            .documentMapperParser()
-            .parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(),
-            instanceOf(BooleanSimilarity.class));
+            .endObject().endObject();
+        MapperService mapperService = createIndex("foo", Settings.EMPTY, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(BooleanSimilarity.class));
     }
     public void testResolveSimilaritiesFromMapping_DFR() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
            .startObject("properties")
                .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
            .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "DFR")
@@ -168,11 +163,10 @@ public class SimilarityTests extends ESSingleNodeTestCase {
             .put("index.similarity.my_similarity.normalization", "h2")
             .put("index.similarity.my_similarity.normalization.h2.c", 3f)
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(), instanceOf(DFRSimilarity.class));
-        DFRSimilarity similarity = (DFRSimilarity) documentMapper.mappers().getMapper("field1").fieldType().similarity().get();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(DFRSimilarity.class));
+        DFRSimilarity similarity = (DFRSimilarity) mapperService.fullName("field1").similarity().get();
         assertThat(similarity.getBasicModel(), instanceOf(BasicModelG.class));
         assertThat(similarity.getAfterEffect(), instanceOf(AfterEffectL.class));
         assertThat(similarity.getNormalization(), instanceOf(NormalizationH2.class));
@@ -180,11 +174,11 @@ public class SimilarityTests extends ESSingleNodeTestCase {
     }
     public void testResolveSimilaritiesFromMapping_IB() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
             .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "IB")
@@ -193,11 +187,10 @@ public class SimilarityTests extends ESSingleNodeTestCase {
             .put("index.similarity.my_similarity.normalization", "h2")
             .put("index.similarity.my_similarity.normalization.h2.c", 3f)
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(), instanceOf(IBSimilarity.class));
-        IBSimilarity similarity = (IBSimilarity) documentMapper.mappers().getMapper("field1").fieldType().similarity().get();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(IBSimilarity.class));
+        IBSimilarity similarity = (IBSimilarity) mapperService.fullName("field1").similarity().get();
         assertThat(similarity.getDistribution(), instanceOf(DistributionSPL.class));
         assertThat(similarity.getLambda(), instanceOf(LambdaTTF.class));
         assertThat(similarity.getNormalization(), instanceOf(NormalizationH2.class));
@@ -205,59 +198,58 @@ public class SimilarityTests extends ESSingleNodeTestCase {
     }
     public void testResolveSimilaritiesFromMapping_DFI() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
             .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "DFI")
             .put("index.similarity.my_similarity.independence_measure", "chisquared")
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        MappedFieldType fieldType = documentMapper.mappers().getMapper("field1").fieldType();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        MappedFieldType fieldType = mapperService.fullName("field1");
         assertThat(fieldType.similarity().get(), instanceOf(DFISimilarity.class));
         DFISimilarity similarity = (DFISimilarity) fieldType.similarity().get();
         assertThat(similarity.getIndependence(), instanceOf(IndependenceChiSquared.class));
     }
     public void testResolveSimilaritiesFromMapping_LMDirichlet() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
             .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "LMDirichlet")
             .put("index.similarity.my_similarity.mu", 3000f)
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(), instanceOf(LMDirichletSimilarity.class));
-        LMDirichletSimilarity similarity = (LMDirichletSimilarity) documentMapper.mappers().getMapper("field1").fieldType().similarity().get();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(LMDirichletSimilarity.class));
+        LMDirichletSimilarity similarity = (LMDirichletSimilarity) mapperService.fullName("field1").similarity().get();
         assertThat(similarity.getMu(), equalTo(3000f));
     }
     public void testResolveSimilaritiesFromMapping_LMJelinekMercer() throws IOException {
-        String mapping = Strings.toString(XContentFactory.jsonBuilder().startObject().startObject("type")
+        XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type")
             .startObject("properties")
                 .startObject("field1").field("type", "text").field("similarity", "my_similarity").endObject()
             .endObject()
-            .endObject().endObject());
+            .endObject().endObject();
         Settings indexSettings = Settings.builder()
             .put("index.similarity.my_similarity.type", "LMJelinekMercer")
             .put("index.similarity.my_similarity.lambda", 0.7f)
             .build();
-        IndexService indexService = createIndex("foo", indexSettings);
-        DocumentMapper documentMapper = indexService.mapperService().documentMapperParser().parse("type", new CompressedXContent(mapping));
-        assertThat(documentMapper.mappers().getMapper("field1").fieldType().similarity().get(), instanceOf(LMJelinekMercerSimilarity.class));
-        LMJelinekMercerSimilarity similarity = (LMJelinekMercerSimilarity) documentMapper.mappers().getMapper("field1").fieldType().similarity().get();
+        MapperService mapperService = createIndex("foo", indexSettings, "type", mapping).mapperService();
+        assertThat(mapperService.fullName("field1").similarity().get(), instanceOf(LMJelinekMercerSimilarity.class));
+        LMJelinekMercerSimilarity similarity = (LMJelinekMercerSimilarity) mapperService.fullName("field1").similarity().get();
         assertThat(similarity.getLambda(), equalTo(0.7f));
     }
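Each test above follows the same pattern: a named similarity is registered through index-level settings, and the mapping then refers to it by name on `field1`. A small sketch of just the settings side, reusing keys that appear verbatim in the tests (the `getByPrefix` call at the end is only for illustration):

    import org.elasticsearch.common.settings.Settings;

    public class SimilaritySettingsSketch {
        public static void main(String[] args) {
            // Registers "my_similarity"; a field mapping can then declare
            // "similarity": "my_similarity" to pick it up.
            Settings indexSettings = Settings.builder()
                .put("index.similarity.my_similarity.type", "BM25")
                .put("index.similarity.my_similarity.k1", 2.0f)
                .put("index.similarity.my_similarity.b", 0.5f)
                .build();
            System.out.println(indexSettings.getByPrefix("index.similarity.my_similarity."));
        }
    }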

View File

@@ -20,6 +20,7 @@
 package org.elasticsearch.indices.mapping;
 import org.elasticsearch.action.admin.indices.mapping.get.GetFieldMappingsResponse;
+import org.elasticsearch.action.admin.indices.mapping.get.GetFieldMappingsResponse.FieldMappingMetaData;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;
@@ -68,10 +69,26 @@ public class SimpleGetFieldMappingsIT extends ESIntegTestCase {
     }
     private XContentBuilder getMappingForType(String type) throws IOException {
-        return jsonBuilder().startObject().startObject(type).startObject("properties")
-            .startObject("field1").field("type", "text").endObject()
-            .startObject("obj").startObject("properties").startObject("subfield").field("type", "keyword").endObject().endObject().endObject()
-            .endObject().endObject().endObject();
+        return jsonBuilder().startObject()
+            .startObject(type)
+                .startObject("properties")
+                    .startObject("field1")
+                        .field("type", "text")
+                    .endObject()
+                    .startObject("alias")
+                        .field("type", "alias")
+                        .field("path", "field1")
+                    .endObject()
+                    .startObject("obj")
+                        .startObject("properties")
+                            .startObject("subfield")
+                                .field("type", "keyword")
+                            .endObject()
+                        .endObject()
+                    .endObject()
+                .endObject()
+            .endObject()
+            .endObject();
     }
     public void testGetFieldMappings() throws Exception {
@@ -138,8 +155,23 @@ public class SimpleGetFieldMappingsIT extends ESIntegTestCase {
     assertThat((Map<String, Object>) response.fieldMappings("test", "type", "field1").sourceAsMap().get("field1"), hasEntry("index", Boolean.TRUE));
     assertThat((Map<String, Object>) response.fieldMappings("test", "type", "field1").sourceAsMap().get("field1"), hasEntry("type", (Object) "text"));
     assertThat((Map<String, Object>) response.fieldMappings("test", "type", "obj.subfield").sourceAsMap().get("subfield"), hasEntry("type", (Object) "keyword"));
+    }
+    @SuppressWarnings("unchecked")
+    public void testGetFieldMappingsWithFieldAlias() throws Exception {
+        assertAcked(prepareCreate("test").addMapping("type", getMappingForType("type")));
+        GetFieldMappingsResponse response = client().admin().indices().prepareGetFieldMappings()
+            .setFields("alias", "field1").get();
+        FieldMappingMetaData aliasMapping = response.fieldMappings("test", "type", "alias");
+        assertThat(aliasMapping.fullName(), equalTo("alias"));
+        assertThat(aliasMapping.sourceAsMap(), hasKey("alias"));
+        assertThat((Map<String, Object>) aliasMapping.sourceAsMap().get("alias"), hasEntry("type", "alias"));
+        FieldMappingMetaData field1Mapping = response.fieldMappings("test", "type", "field1");
+        assertThat(field1Mapping.fullName(), equalTo("field1"));
+        assertThat(field1Mapping.sourceAsMap(), hasKey("field1"));
     }
     //fix #6552
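The assertions in `testGetFieldMappingsWithFieldAlias` expect the get-field-mappings API to return the alias's own definition (type and path) rather than the target field's. A hedged, plain-Java sketch of reading that shape out of a `sourceAsMap()`-style map, mirroring the test above (the map literal stands in for the real response object):

    import java.util.HashMap;
    import java.util.Map;

    public class FieldMappingResponseSketch {
        public static void main(String[] args) {
            // Stand-in for FieldMappingMetaData#sourceAsMap() on the "alias" field.
            Map<String, Object> aliasBody = new HashMap<>();
            aliasBody.put("type", "alias");
            aliasBody.put("path", "field1");
            Map<String, Object> aliasSource = new HashMap<>();
            aliasSource.put("alias", aliasBody);

            @SuppressWarnings("unchecked")
            Map<String, Object> body = (Map<String, Object>) aliasSource.get("alias");
            System.out.println("alias".equals(body.get("type")));  // true
            System.out.println("field1".equals(body.get("path"))); // true
        }
    }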

View File

@ -116,6 +116,21 @@ public class RangeIT extends ESIntegTestCase {
.field(SINGLE_VALUED_FIELD_NAME, i * 2 - 1) .field(SINGLE_VALUED_FIELD_NAME, i * 2 - 1)
.endObject())); .endObject()));
} }
// Create two indices and add the field 'route_length_miles' as an alias in
// one, and a concrete field in the other.
prepareCreate("old_index")
.addMapping("_doc", "distance", "type=double", "route_length_miles", "type=alias,path=distance")
.get();
prepareCreate("new_index")
.addMapping("_doc", "route_length_miles", "type=double")
.get();
builders.add(client().prepareIndex("old_index", "_doc").setSource("distance", 42.0));
builders.add(client().prepareIndex("old_index", "_doc").setSource("distance", 50.5));
builders.add(client().prepareIndex("new_index", "_doc").setSource("route_length_miles", 100.2));
builders.add(client().prepareIndex("new_index", "_doc").setSource(Collections.emptyMap()));
indexRandom(true, builders); indexRandom(true, builders);
ensureSearchable(); ensureSearchable();
} }
@ -972,4 +987,72 @@ public class RangeIT extends ESIntegTestCase {
assertThat(client().admin().indices().prepareStats("cache_test_idx").setRequestCache(true).get().getTotal().getRequestCache() assertThat(client().admin().indices().prepareStats("cache_test_idx").setRequestCache(true).get().getTotal().getRequestCache()
.getMissCount(), equalTo(1L)); .getMissCount(), equalTo(1L));
} }
public void testFieldAlias() {
SearchResponse response = client().prepareSearch("old_index", "new_index")
.addAggregation(range("range")
.field("route_length_miles")
.addUnboundedTo(50.0)
.addRange(50.0, 150.0)
.addUnboundedFrom(150.0))
.execute().actionGet();
assertSearchResponse(response);
Range range = response.getAggregations().get("range");
assertThat(range, notNullValue());
assertThat(range.getName(), equalTo("range"));
List<? extends Range.Bucket> buckets = range.getBuckets();
assertThat(buckets.size(), equalTo(3));
Range.Bucket bucket = buckets.get(0);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("*-50.0"));
assertThat(bucket.getDocCount(), equalTo(1L));
bucket = buckets.get(1);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("50.0-150.0"));
assertThat(bucket.getDocCount(), equalTo(2L));
bucket = buckets.get(2);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("150.0-*"));
assertThat(bucket.getDocCount(), equalTo(0L));
}
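// Doc-count check for the buckets above: old_index holds distance values 42.0 and 50.5
// (read through the alias), new_index holds the concrete value 100.2 plus one document
// with no value at all; so *-50.0 sees only 42.0, 50.0-150.0 sees 50.5 and 100.2, and
// 150.0-* stays empty.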
public void testFieldAliasWithMissingValue() {
SearchResponse response = client().prepareSearch("old_index", "new_index")
.addAggregation(range("range")
.field("route_length_miles")
.missing(0.0)
.addUnboundedTo(50.0)
.addRange(50.0, 150.0)
.addUnboundedFrom(150.0))
.execute().actionGet();
assertSearchResponse(response);
Range range = response.getAggregations().get("range");
assertThat(range, notNullValue());
assertThat(range.getName(), equalTo("range"));
List<? extends Range.Bucket> buckets = range.getBuckets();
assertThat(buckets.size(), equalTo(3));
Range.Bucket bucket = buckets.get(0);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("*-50.0"));
assertThat(bucket.getDocCount(), equalTo(2L));
bucket = buckets.get(1);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("50.0-150.0"));
assertThat(bucket.getDocCount(), equalTo(2L));
bucket = buckets.get(2);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("150.0-*"));
assertThat(bucket.getDocCount(), equalTo(0L));
}
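// With missing(0.0), the otherwise valueless new_index document is counted as 0.0,
// which moves it into the *-50.0 bucket alongside 42.0, hence a count of 2 there
// while the other buckets are unchanged.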
}

View File

@ -66,6 +66,10 @@ public class ReverseNestedIT extends ESIntegTestCase {
"type", "type",
jsonBuilder().startObject().startObject("properties") jsonBuilder().startObject().startObject("properties")
.startObject("field1").field("type", "keyword").endObject() .startObject("field1").field("type", "keyword").endObject()
.startObject("alias")
.field("type", "alias")
.field("path", "field1")
.endObject()
.startObject("nested1").field("type", "nested").startObject("properties") .startObject("nested1").field("type", "nested").startObject("properties")
.startObject("field2").field("type", "keyword").endObject() .startObject("field2").field("type", "keyword").endObject()
.endObject().endObject() .endObject().endObject()
@ -649,4 +653,28 @@ public class ReverseNestedIT extends ESIntegTestCase {
assertThat(barCount.getValue(), equalTo(2L));
}
}
public void testFieldAlias() {
SearchResponse response = client().prepareSearch("idx1")
.addAggregation(nested("nested1", "nested1")
.subAggregation(
terms("field2").field("nested1.field2")
.subAggregation(
reverseNested("nested1_to_field1")
.subAggregation(
terms("field1").field("alias")
.collectMode(randomFrom(SubAggCollectionMode.values())))))).get();
assertSearchResponse(response);
Nested nested = response.getAggregations().get("nested1");
Terms nestedTerms = nested.getAggregations().get("field2");
Terms.Bucket bucket = nestedTerms.getBuckets().iterator().next();
ReverseNested reverseNested = bucket.getAggregations().get("nested1_to_field1");
Terms reverseNestedTerms = reverseNested.getAggregations().get("field1");
assertThat(((InternalAggregation)reverseNested).getProperty("field1"), sameInstance(reverseNestedTerms));
assertThat(reverseNestedTerms.getBuckets().size(), equalTo(6));
}
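// The alias targets the top-level "field1", so the terms sub-aggregation above should
// produce exactly what terms("field1") produces elsewhere in this class; the six
// buckets asserted here are assumed to match the field1 cardinality indexed by the
// suite's setup.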
}

View File

@ -71,8 +71,14 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.DoubleStream;
import static org.elasticsearch.search.aggregations.AggregationBuilders.max;
import static org.elasticsearch.search.aggregations.AggregationBuilders.nested;
public class NestedAggregatorTests extends AggregatorTestCase {
private static final String VALUE_FIELD_NAME = "number";
@ -84,6 +90,15 @@ public class NestedAggregatorTests extends AggregatorTestCase {
private final SeqNoFieldMapper.SequenceIDFields sequenceIDFields = SeqNoFieldMapper.SequenceIDFields.emptySeqID();
/**
* For each provided field type, we also register an alias with name <field>-alias.
*/
@Override
protected Map<String, MappedFieldType> getFieldAliases(MappedFieldType... fieldTypes) {
return Arrays.stream(fieldTypes).collect(Collectors.toMap(
ft -> ft.name() + "-alias",
Function.identity()));
}
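// Usage sketch of the override above: for a field type named "number" it yields a
// single entry mapping "number-alias" to that same MappedFieldType instance, and the
// test harness (AggregatorTestCase) is assumed to register each entry as an alias so
// that aggregating on "number-alias" resolves to the concrete "number" field.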
public void testNoDocs() throws IOException {
try (Directory directory = newDirectory()) {
@ -638,6 +653,49 @@ public class NestedAggregatorTests extends AggregatorTestCase {
}
}
public void testFieldAlias() throws IOException {
int numRootDocs = randomIntBetween(1, 20);
MappedFieldType fieldType = new NumberFieldMapper.NumberFieldType(
NumberFieldMapper.NumberType.LONG);
fieldType.setName(VALUE_FIELD_NAME);
try (Directory directory = newDirectory()) {
try (RandomIndexWriter iw = new RandomIndexWriter(random(), directory)) {
for (int i = 0; i < numRootDocs; i++) {
List<Document> documents = new ArrayList<>();
int numNestedDocs = randomIntBetween(0, 20);
generateDocuments(documents, numNestedDocs, i, NESTED_OBJECT, VALUE_FIELD_NAME);
Document document = new Document();
document.add(new Field(IdFieldMapper.NAME, Uid.encodeId(Integer.toString(i)), IdFieldMapper.Defaults.FIELD_TYPE));
document.add(new Field(TypeFieldMapper.NAME, "test",
TypeFieldMapper.Defaults.FIELD_TYPE));
document.add(sequenceIDFields.primaryTerm);
documents.add(document);
iw.addDocuments(documents);
}
iw.commit();
}
try (IndexReader indexReader = wrap(DirectoryReader.open(directory))) {
NestedAggregationBuilder agg = nested(NESTED_AGG, NESTED_OBJECT).subAggregation(
max(MAX_AGG_NAME).field(VALUE_FIELD_NAME));
NestedAggregationBuilder aliasAgg = nested(NESTED_AGG, NESTED_OBJECT).subAggregation(
max(MAX_AGG_NAME).field(VALUE_FIELD_NAME + "-alias"));
Nested nested = search(newSearcher(indexReader, false, true),
new MatchAllDocsQuery(), agg, fieldType);
Nested aliasNested = search(newSearcher(indexReader, false, true),
new MatchAllDocsQuery(), aliasAgg, fieldType);
assertTrue(nested.getDocCount() > 0);
assertEquals(nested, aliasNested);
}
}
}
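// The assertEquals above leans on InternalAggregation equality (name, doc count and
// sub-aggregation results), so identical output for the concrete field and for
// VALUE_FIELD_NAME + "-alias" shows the alias being resolved before collection.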
private double generateMaxDocs(List<Document> documents, int numNestedDocs, int id, String path, String fieldName) {
return DoubleStream.of(generateDocuments(documents, numNestedDocs, id, path, fieldName))
.max().orElse(Double.NEGATIVE_INFINITY);

View File

@ -40,7 +40,15 @@ import org.elasticsearch.search.aggregations.metrics.max.MaxAggregationBuilder;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import static org.elasticsearch.search.aggregations.AggregationBuilders.max;
import static org.elasticsearch.search.aggregations.AggregationBuilders.nested;
import static org.elasticsearch.search.aggregations.AggregationBuilders.reverseNested;
public class ReverseNestedAggregatorTests extends AggregatorTestCase {
@ -50,6 +58,15 @@ public class ReverseNestedAggregatorTests extends AggregatorTestCase {
private static final String REVERSE_AGG_NAME = "reverseNestedAgg";
private static final String MAX_AGG_NAME = "maxAgg";
/**
* For each provided field type, we also register an alias with name <field>-alias.
*/
@Override
protected Map<String, MappedFieldType> getFieldAliases(MappedFieldType... fieldTypes) {
return Arrays.stream(fieldTypes).collect(Collectors.toMap(
ft -> ft.name() + "-alias",
Function.identity()));
}
public void testNoDocs() throws IOException {
try (Directory directory = newDirectory()) {
@ -150,4 +167,63 @@ public class ReverseNestedAggregatorTests extends AggregatorTestCase {
}
}
public void testFieldAlias() throws IOException {
int numParentDocs = randomIntBetween(1, 20);
MappedFieldType fieldType = new NumberFieldMapper.NumberFieldType(
NumberFieldMapper.NumberType.LONG);
fieldType.setName(VALUE_FIELD_NAME);
try (Directory directory = newDirectory()) {
try (RandomIndexWriter iw = new RandomIndexWriter(random(), directory)) {
for (int i = 0; i < numParentDocs; i++) {
List<Document> documents = new ArrayList<>();
int numNestedDocs = randomIntBetween(0, 20);
for (int nested = 0; nested < numNestedDocs; nested++) {
Document document = new Document();
document.add(new Field(IdFieldMapper.NAME, Uid.encodeId(Integer.toString(i)),
IdFieldMapper.Defaults.NESTED_FIELD_TYPE));
document.add(new Field(TypeFieldMapper.NAME, "__" + NESTED_OBJECT,
TypeFieldMapper.Defaults.FIELD_TYPE));
documents.add(document);
}
Document document = new Document();
document.add(new Field(IdFieldMapper.NAME, Uid.encodeId(Integer.toString(i)),
IdFieldMapper.Defaults.FIELD_TYPE));
document.add(new Field(TypeFieldMapper.NAME, "test",
TypeFieldMapper.Defaults.FIELD_TYPE));
long value = randomNonNegativeLong() % 10000;
document.add(new SortedNumericDocValuesField(VALUE_FIELD_NAME, value));
document.add(SeqNoFieldMapper.SequenceIDFields.emptySeqID().primaryTerm);
documents.add(document);
iw.addDocuments(documents);
}
iw.commit();
}
try (IndexReader indexReader = wrap(DirectoryReader.open(directory))) {
MaxAggregationBuilder maxAgg = max(MAX_AGG_NAME).field(VALUE_FIELD_NAME);
MaxAggregationBuilder aliasMaxAgg = max(MAX_AGG_NAME).field(VALUE_FIELD_NAME + "-alias");
NestedAggregationBuilder agg = nested(NESTED_AGG, NESTED_OBJECT).subAggregation(
reverseNested(REVERSE_AGG_NAME).subAggregation(maxAgg));
NestedAggregationBuilder aliasAgg = nested(NESTED_AGG, NESTED_OBJECT).subAggregation(
reverseNested(REVERSE_AGG_NAME).subAggregation(aliasMaxAgg));
Nested nested = search(newSearcher(indexReader, false, true),
new MatchAllDocsQuery(), agg, fieldType);
Nested aliasNested = search(newSearcher(indexReader, false, true),
new MatchAllDocsQuery(), aliasAgg, fieldType);
ReverseNested reverseNested = nested.getAggregations().get(REVERSE_AGG_NAME);
ReverseNested aliasReverseNested = aliasNested.getAggregations().get(REVERSE_AGG_NAME);
assertTrue(reverseNested.getDocCount() > 0);
assertEquals(reverseNested, aliasReverseNested);
}
}
}
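// As in NestedAggregatorTests, equality of the two ReverseNested results is what
// verifies the alias here: both aggregations must climb back to the same parent
// documents and compute the same max over the concrete field.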
}

View File

@ -55,7 +55,13 @@ import org.hamcrest.Matchers;
import org.junit.Before;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import static org.elasticsearch.search.aggregations.AggregationBuilders.significantTerms;
public class SignificantTermsAggregatorTests extends AggregatorTestCase {
@ -70,6 +76,16 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
fieldType.setName("field"); fieldType.setName("field");
} }
/**
* For each provided field type, we also register an alias with name <field>-alias.
*/
@Override
protected Map<String, MappedFieldType> getFieldAliases(MappedFieldType... fieldTypes) {
return Arrays.stream(fieldTypes).collect(Collectors.toMap(
ft -> ft.name() + "-alias",
Function.identity()));
}
public void testParsedAsFilter() throws IOException {
IndexReader indexReader = new MultiReader();
IndexSearcher indexSearcher = newSearcher(indexReader);
@ -104,7 +120,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
IndexWriterConfig indexWriterConfig = newIndexWriterConfig();
indexWriterConfig.setMaxBufferedDocs(100);
indexWriterConfig.setRAMBufferSizeMB(100); // flush on open to have a single segment
try (Directory dir = newDirectory(); IndexWriter w = new IndexWriter(dir, indexWriterConfig)) {
addMixedTextDocs(textFieldType, w);
@ -137,7 +153,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
assertNull(terms.getBucketByKey("odd")); assertNull(terms.getBucketByKey("odd"));
assertNull(terms.getBucketByKey("common")); assertNull(terms.getBucketByKey("common"));
assertNotNull(terms.getBucketByKey("even")); assertNotNull(terms.getBucketByKey("even"));
// Search odd with regex includeexcludes // Search odd with regex includeexcludes
sigAgg.includeExclude(new IncludeExclude("o.d", null)); sigAgg.includeExclude(new IncludeExclude("o.d", null));
terms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), sigAgg, textFieldType); terms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), sigAgg, textFieldType);
@ -149,7 +165,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
// Search with string-based include/excludes
String oddStrings[] = new String[] {"odd", "weird"};
String evenStrings[] = new String[] {"even", "regular"};
sigAgg.includeExclude(new IncludeExclude(oddStrings, evenStrings));
sigAgg.significanceHeuristic(SignificanceHeuristicTests.getRandomSignificanceheuristic());
terms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), sigAgg, textFieldType);
@ -159,7 +175,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
assertNull(terms.getBucketByKey("common")); assertNull(terms.getBucketByKey("common"));
assertNull(terms.getBucketByKey("even")); assertNull(terms.getBucketByKey("even"));
assertNull(terms.getBucketByKey("regular")); assertNull(terms.getBucketByKey("regular"));
sigAgg.includeExclude(new IncludeExclude(evenStrings, oddStrings)); sigAgg.includeExclude(new IncludeExclude(evenStrings, oddStrings));
terms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), sigAgg, textFieldType); terms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), sigAgg, textFieldType);
assertEquals(0, terms.getBuckets().size()); assertEquals(0, terms.getBuckets().size());
@ -168,7 +184,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
assertNull(terms.getBucketByKey("common")); assertNull(terms.getBucketByKey("common"));
assertNull(terms.getBucketByKey("even")); assertNull(terms.getBucketByKey("even"));
assertNull(terms.getBucketByKey("regular")); assertNull(terms.getBucketByKey("regular"));
} }
} }
} }
@ -232,7 +248,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
}
}
}
/**
* Uses the significant terms aggregation on an index with unmapped field
*/
@ -266,7 +282,57 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
}
}
}
public void testFieldAlias() throws IOException {
TextFieldType textFieldType = new TextFieldType();
textFieldType.setName("text");
textFieldType.setFielddata(true);
textFieldType.setIndexAnalyzer(new NamedAnalyzer("my_analyzer", AnalyzerScope.GLOBAL, new StandardAnalyzer()));
IndexWriterConfig indexWriterConfig = newIndexWriterConfig();
indexWriterConfig.setMaxBufferedDocs(100);
indexWriterConfig.setRAMBufferSizeMB(100); // flush on open to have a single segment
try (Directory dir = newDirectory(); IndexWriter w = new IndexWriter(dir, indexWriterConfig)) {
addMixedTextDocs(textFieldType, w);
SignificantTermsAggregationBuilder agg = significantTerms("sig_text").field("text");
SignificantTermsAggregationBuilder aliasAgg = significantTerms("sig_text").field("text-alias");
String executionHint = randomExecutionHint();
agg.executionHint(executionHint);
aliasAgg.executionHint(executionHint);
if (randomBoolean()) {
// Use a background filter which just happens to be same scope as whole-index.
QueryBuilder backgroundFilter = QueryBuilders.termsQuery("text", "common");
agg.backgroundFilter(backgroundFilter);
aliasAgg.backgroundFilter(backgroundFilter);
}
try (IndexReader reader = DirectoryReader.open(w)) {
assertEquals("test expects a single segment", 1, reader.leaves().size());
IndexSearcher searcher = new IndexSearcher(reader);
SignificantTerms evenTerms = searchAndReduce(searcher, new TermQuery(new Term("text", "even")),
agg, textFieldType);
SignificantTerms aliasEvenTerms = searchAndReduce(searcher, new TermQuery(new Term("text", "even")),
aliasAgg, textFieldType);
assertFalse(evenTerms.getBuckets().isEmpty());
assertEquals(evenTerms, aliasEvenTerms);
SignificantTerms oddTerms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")),
agg, textFieldType);
SignificantTerms aliasOddTerms = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")),
aliasAgg, textFieldType);
assertFalse(oddTerms.getBuckets().isEmpty());
assertEquals(oddTerms, aliasOddTerms);
}
}
}
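// Note on the randomized background filter above: every document indexed by
// addMixedTextDocs contains the token "common", so the filter's scope coincides with
// the whole index and the significance scores should be unaffected by it, for the
// alias and the concrete field alike.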
private void addMixedTextDocs(TextFieldType textFieldType, IndexWriter w) throws IOException {
for (int i = 0; i < 10; i++) {
@ -284,7 +350,7 @@ public class SignificantTermsAggregatorTests extends AggregatorTestCase {
w.addDocument(doc);
}
}
private void addFields(Document doc, List<Field> createFields) {
for (Field field : createFields) {

View File

@ -34,19 +34,34 @@ import org.apache.lucene.store.Directory;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.index.analysis.AnalyzerScope;
import org.elasticsearch.index.analysis.NamedAnalyzer;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.TextFieldMapper.TextFieldType;
import org.elasticsearch.search.aggregations.AggregatorTestCase;
import org.elasticsearch.search.aggregations.bucket.sampler.Sampler;
import org.elasticsearch.search.aggregations.bucket.sampler.SamplerAggregationBuilder;
import org.elasticsearch.search.aggregations.bucket.significant.SignificantTerms;
import org.elasticsearch.search.aggregations.bucket.significant.SignificantTextAggregationBuilder;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import static org.elasticsearch.search.aggregations.AggregationBuilders.sampler;
import static org.elasticsearch.search.aggregations.AggregationBuilders.significantText;
public class SignificantTextAggregatorTests extends AggregatorTestCase {
/**
* For each provided field type, we also register an alias with name <field>-alias.
*/
@Override
protected Map<String, MappedFieldType> getFieldAliases(MappedFieldType... fieldTypes) {
return Arrays.stream(fieldTypes).collect(Collectors.toMap(
ft -> ft.name() + "-alias",
Function.identity()));
}
/**
* Uses the significant text aggregation to find the keywords in text fields
*/
@ -59,22 +74,7 @@ public class SignificantTextAggregatorTests extends AggregatorTestCase {
indexWriterConfig.setMaxBufferedDocs(100);
indexWriterConfig.setRAMBufferSizeMB(100); // flush on open to have a single segment
try (Directory dir = newDirectory(); IndexWriter w = new IndexWriter(dir, indexWriterConfig)) {
for (int i = 0; i < 10; i++) { indexDocuments(w, textFieldType);
Document doc = new Document();
StringBuilder text = new StringBuilder("common ");
if (i % 2 == 0) {
text.append("odd ");
} else {
text.append("even separator" + i + " duplicate duplicate duplicate duplicate duplicate duplicate ");
}
doc.add(new Field("text", text.toString(), textFieldType));
String json ="{ \"text\" : \"" + text.toString() + "\","+
" \"json_only_field\" : \"" + text.toString() + "\"" +
" }";
doc.add(new StoredField("_source", new BytesRef(json)));
w.addDocument(doc);
}
SignificantTextAggregationBuilder sigAgg = new SignificantTextAggregationBuilder("sig_text", "text").filterDuplicateText(true);
if(randomBoolean()){
@ -82,37 +82,104 @@ public class SignificantTextAggregatorTests extends AggregatorTestCase {
}
SamplerAggregationBuilder aggBuilder = new SamplerAggregationBuilder("sampler")
.subAggregation(sigAgg);
try (IndexReader reader = DirectoryReader.open(w)) {
assertEquals("test expects a single segment", 1, reader.leaves().size());
IndexSearcher searcher = new IndexSearcher(reader);
// Search "odd" which should have no duplication
Sampler sampler = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), aggBuilder, textFieldType);
SignificantTerms terms = sampler.getAggregations().get("sig_text");
assertNull(terms.getBucketByKey("even"));
assertNull(terms.getBucketByKey("duplicate"));
assertNull(terms.getBucketByKey("common"));
assertNotNull(terms.getBucketByKey("odd"));
// Search "even" which will have duplication
sampler = searchAndReduce(searcher, new TermQuery(new Term("text", "even")), aggBuilder, textFieldType);
terms = sampler.getAggregations().get("sig_text");
assertNull(terms.getBucketByKey("odd"));
assertNull(terms.getBucketByKey("duplicate"));
assertNull(terms.getBucketByKey("common"));
assertNull(terms.getBucketByKey("separator2"));
assertNull(terms.getBucketByKey("separator4"));
assertNull(terms.getBucketByKey("separator6"));
assertNotNull(terms.getBucketByKey("even"));
}
}
}
public void testFieldAlias() throws IOException {
TextFieldType textFieldType = new TextFieldType();
textFieldType.setName("text");
textFieldType.setIndexAnalyzer(new NamedAnalyzer("my_analyzer", AnalyzerScope.GLOBAL, new StandardAnalyzer()));
IndexWriterConfig indexWriterConfig = newIndexWriterConfig();
indexWriterConfig.setMaxBufferedDocs(100);
indexWriterConfig.setRAMBufferSizeMB(100); // flush on open to have a single segment
try (Directory dir = newDirectory(); IndexWriter w = new IndexWriter(dir, indexWriterConfig)) {
indexDocuments(w, textFieldType);
SignificantTextAggregationBuilder agg = significantText("sig_text", "text")
.filterDuplicateText(true);
SignificantTextAggregationBuilder aliasAgg = significantText("sig_text", "text-alias")
.filterDuplicateText(true);
if (randomBoolean()) {
List<String> sourceFieldNames = Arrays.asList(new String [] {"json_only_field"});
agg.sourceFieldNames(sourceFieldNames);
aliasAgg.sourceFieldNames(sourceFieldNames);
}
try (IndexReader reader = DirectoryReader.open(w)) {
assertEquals("test expects a single segment", 1, reader.leaves().size());
IndexSearcher searcher = new IndexSearcher(reader);
SamplerAggregationBuilder samplerAgg = sampler("sampler").subAggregation(agg);
SamplerAggregationBuilder aliasSamplerAgg = sampler("sampler").subAggregation(aliasAgg);
Sampler sampler = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), samplerAgg, textFieldType);
Sampler aliasSampler = searchAndReduce(searcher, new TermQuery(new Term("text", "odd")), aliasSamplerAgg, textFieldType);
SignificantTerms terms = sampler.getAggregations().get("sig_text");
SignificantTerms aliasTerms = aliasSampler.getAggregations().get("sig_text");
assertFalse(terms.getBuckets().isEmpty());
assertEquals(terms, aliasTerms);
sampler = searchAndReduce(searcher, new TermQuery(new Term("text", "even")), samplerAgg, textFieldType);
aliasSampler = searchAndReduce(searcher, new TermQuery(new Term("text", "even")), aliasSamplerAgg, textFieldType);
terms = sampler.getAggregations().get("sig_text");
aliasTerms = aliasSampler.getAggregations().get("sig_text");
assertFalse(terms.getBuckets().isEmpty());
assertEquals(terms, aliasTerms);
}
}
}
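// When sourceFieldNames is randomly set above, significant_text reads its candidate
// text from the "json_only_field" entry of _source (written by indexDocuments below)
// instead of the mapped field name; the alias and concrete aggregations are still
// expected to agree, since both resolve to the same underlying text field.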
private void indexDocuments(IndexWriter writer, TextFieldType textFieldType) throws IOException {
for (int i = 0; i < 10; i++) {
Document doc = new Document();
StringBuilder text = new StringBuilder("common ");
if (i % 2 == 0) {
text.append("odd ");
} else {
text.append("even separator" + i + " duplicate duplicate duplicate duplicate duplicate duplicate ");
}
doc.add(new Field("text", text.toString(), textFieldType));
String json ="{ \"text\" : \"" + text.toString() + "\","+
" \"json_only_field\" : \"" + text.toString() + "\"" +
" }";
doc.add(new StoredField("_source", new BytesRef(json)));
writer.addDocument(doc);
}
}
/**
* Test documents with arrays of text
*/
@ -137,13 +204,13 @@ public class SignificantTextAggregatorTests extends AggregatorTestCase {
sigAgg.sourceFieldNames(Arrays.asList(new String [] {"title", "text"}));
try (IndexReader reader = DirectoryReader.open(w)) {
assertEquals("test expects a single segment", 1, reader.leaves().size());
IndexSearcher searcher = new IndexSearcher(reader);
searchAndReduce(searcher, new TermQuery(new Term("text", "foo")), sigAgg, textFieldType);
// No significant results to be found in this test - only checking we don't end up
// with the internal exception discovered in issue https://github.com/elastic/elasticsearch/issues/25029
}
}
}
}

View File

@ -809,12 +809,12 @@ public class TermsAggregatorTests extends AggregatorTestCase {
fieldType1.setHasDocValues(true);
MappedFieldType fieldType2 = new NumberFieldMapper.NumberFieldType(NumberFieldMapper.NumberType.LONG);
fieldType1.setName("another_long"); fieldType2.setName("another_long");
fieldType1.setHasDocValues(true); fieldType2.setHasDocValues(true);
MappedFieldType fieldType3 = new NumberFieldMapper.NumberFieldType(NumberFieldMapper.NumberType.DOUBLE);
fieldType1.setName("another_double"); fieldType3.setName("another_double");
fieldType1.setHasDocValues(true); fieldType3.setHasDocValues(true);
try (IndexReader indexReader = maybeWrapReaderEs(indexWriter.getReader())) {
IndexSearcher indexSearcher = newIndexSearcher(indexReader);
ValueType[] valueTypes = new ValueType[]{ValueType.STRING, ValueType.LONG, ValueType.DOUBLE};

View File

@ -18,12 +18,7 @@
*/
package org.elasticsearch.search.aggregations.metrics;
import java.util.Collection; import org.elasticsearch.action.index.IndexRequestBuilder;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.plugins.Plugin;
@ -36,6 +31,14 @@ import org.elasticsearch.search.aggregations.bucket.global.Global;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
import org.elasticsearch.search.aggregations.metrics.sum.Sum;
import org.hamcrest.core.IsNull;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
@ -61,6 +64,33 @@ public class SumIT extends AbstractNumericTestCase {
return Collections.singleton(MetricAggScriptPlugin.class);
}
@Override
public void setupSuiteScopeCluster() throws Exception {
super.setupSuiteScopeCluster();
// Create two indices and add the field 'route_length_miles' as an alias in
// one, and a concrete field in the other.
prepareCreate("old_index")
.addMapping("_doc",
"transit_mode", "type=keyword",
"distance", "type=double",
"route_length_miles", "type=alias,path=distance")
.get();
prepareCreate("new_index")
.addMapping("_doc",
"transit_mode", "type=keyword",
"route_length_miles", "type=double")
.get();
List<IndexRequestBuilder> builders = new ArrayList<>();
builders.add(client().prepareIndex("old_index", "_doc").setSource("transit_mode", "train", "distance", 42.0));
builders.add(client().prepareIndex("old_index", "_doc").setSource("transit_mode", "bus", "distance", 50.5));
builders.add(client().prepareIndex("new_index", "_doc").setSource("transit_mode", "train", "route_length_miles", 100.2));
indexRandom(true, builders);
ensureSearchable();
}
@Override
public void testEmptyAggregation() throws Exception {
@ -382,4 +412,54 @@ public class SumIT extends AbstractNumericTestCase {
assertThat(client().admin().indices().prepareStats("cache_test_idx").setRequestCache(true).get().getTotal().getRequestCache()
.getMissCount(), equalTo(1L));
}
public void testFieldAlias() {
SearchResponse response = client().prepareSearch("old_index", "new_index")
.addAggregation(sum("sum")
.field("route_length_miles"))
.execute().actionGet();
assertSearchResponse(response);
Sum sum = response.getAggregations().get("sum");
assertThat(sum, IsNull.notNullValue());
assertThat(sum.getName(), equalTo("sum"));
assertThat(sum.getValue(), equalTo(192.7));
}
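// Arithmetic behind the assertion: 42.0 and 50.5 come from old_index via the alias'
// path "distance", and 100.2 from the concrete field in new_index, giving
// 42.0 + 50.5 + 100.2 = 192.7.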
public void testFieldAliasInSubAggregation() {
SearchResponse response = client().prepareSearch("old_index", "new_index")
.addAggregation(terms("terms")
.field("transit_mode")
.subAggregation(sum("sum")
.field("route_length_miles")))
.execute().actionGet();
assertSearchResponse(response);
Terms terms = response.getAggregations().get("terms");
assertThat(terms, notNullValue());
assertThat(terms.getName(), equalTo("terms"));
List<? extends Terms.Bucket> buckets = terms.getBuckets();
assertThat(buckets.size(), equalTo(2));
Terms.Bucket bucket = buckets.get(0);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("train"));
assertThat(bucket.getDocCount(), equalTo(2L));
Sum sum = bucket.getAggregations().get("sum");
assertThat(sum, notNullValue());
assertThat(sum.getValue(), equalTo(142.2));
bucket = buckets.get(1);
assertThat(bucket, notNullValue());
assertThat(bucket.getKey(), equalTo("bus"));
assertThat(bucket.getDocCount(), equalTo(1L));
sum = bucket.getAggregations().get("sum");
assertThat(sum, notNullValue());
assertThat(sum.getValue(), equalTo(50.5));
}
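// Per-bucket arithmetic: "train" spans both indices, 42.0 (old_index) + 100.2
// (new_index) = 142.2 over two documents; "bus" is the single old_index document
// with 50.5.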
}

Some files were not shown because too many files have changed in this diff.