Replaced the `multi_field` type in favour of the multi fields option that can be set on any core field.

When upgrading to ES 1.0, existing mappings with a `multi_field` type automatically get replaced by a core field with the new `fields` option.

If a `multi_field` typed field doesn't have a main / default field, a default field is chosen for the multi fields syntax. The new main field's type
is taken from the first of the `multi_field`'s sub fields, or falls back to type `string` if no fields have been configured for the `multi_field` field. In both cases
the default field is not indexed (`index=no` is set on the default field).

If a `multi_field` typed field has a default field, that field will replace the `multi_field` typed field.
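As a sketch of the conversion (the field name `name` here is hypothetical, not taken from the commit itself), an upgrade would turn the old mapping into the new syntax roughly as follows:

```js
// Old `multi_field` mapping (pre 1.0):
{
  "name" : {
    "type" : "multi_field",
    "fields" : {
      "name" : {"type" : "string", "index" : "analyzed"},
      "raw" : {"type" : "string", "index" : "not_analyzed"}
    }
  }
}
// After upgrading, the equivalent core field with the `fields` option:
{
  "name" : {
    "type" : "string",
    "index" : "analyzed",
    "fields" : {
      "raw" : {"type" : "string", "index" : "not_analyzed"}
    }
  }
}
```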

Closes #4521
Martijn van Groningen 2013-12-28 02:05:31 +01:00
parent c386155a73
commit 943b62634c
54 changed files with 1195 additions and 587 deletions


@ -38,7 +38,7 @@ which means conflicts are *not* ignored.
The definition of conflict is really dependent on the type merged, but
in general, if a different core type is defined, it is considered as a
conflict. New mapping definitions can be added to object types, and core
type mapping can be upgraded to `multi_field` type.
type mapping can be upgraded by specifying multi fields on a core type.
[float]
[[put-mapping-multi-index]]


@ -6,3 +6,5 @@ include::mapping/date-format.asciidoc[]
include::mapping/conf-mappings.asciidoc[]
include::mapping/meta.asciidoc[]
include::mapping/multi-fields.asciidoc[]


@ -0,0 +1,120 @@
[[multi-fields]]
== Multi Fields
The `fields` option allows mapping several core type fields to a single
json source field. This can be useful if a single field needs to be
used in different ways, for example when a single field is to be used for both
free text search and sorting.
[source,js]
--------------------------------------------------
{
"tweet" : {
"properties" : {
"name" : {
"type" : "string",
"index" : "analyzed",
"fields" : {
"raw" : {"type" : "string", "index" : "not_analyzed"}
}
}
}
}
}
--------------------------------------------------
In the above example the field `name` gets processed twice. The first time it gets
processed as an analyzed string; this version is accessible under the field name
`name` and is the main field, in fact just like any other field. The second time
it gets processed as a not analyzed string and is accessible under the name `name.raw`.
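To sketch why this is useful (the query values here are hypothetical), a search request could query the analyzed `name` field for free text matching while sorting on the not analyzed `name.raw` field:

```js
{
  "query" : {
    "match" : {"name" : "smith"}
  },
  "sort" : [
    {"name.raw" : "asc"}
  ]
}
```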
[float]
=== Accessing Fields
The multi fields defined in `fields` are prefixed with the
name of the main field and can be accessed by their full path using the
navigation notation: `name.raw`, or using the typed navigation notation
`tweet.name.raw`. The `path` option allows control over how fields are accessed.
If the `path` option is set to `full`, then the full path of the main field
is prefixed, but if the `path` option is set to `just_name` the actual
multi field name without any prefix is used. The default value for
the `path` option is `full`.
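The naming rule above can be sketched as a small helper function (a sketch for illustration only, not part of Elasticsearch):

```js
// Sketch of how the `path` option resolves a multi field's lookup name.
function multiFieldPath(mainField, subField, path) {
  path = path || "full"; // `full` is the default
  if (path === "full") {
    // sub field is prefixed with the main field's name, e.g. "name.raw"
    return mainField + "." + subField;
  }
  if (path === "just_name") {
    // the multi field's own name is used without any prefix, e.g. "any_name"
    return subField;
  }
  throw new Error("unknown path option: " + path);
}
```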
The `just_name` setting, among other things, allows indexing content of multiple
fields under the same name. In the example below the content of both fields
`first_name` and `last_name` can be accessed by using `any_name` or `tweet.any_name`.
[source,js]
--------------------------------------------------
{
"tweet" : {
"properties": {
"first_name": {
"type": "string",
"index": "analyzed",
"path": "just_name",
"fields": {
"any_name": {"type": "string","index": "analyzed"}
}
},
"last_name": {
"type": "string",
"index": "analyzed",
"path": "just_name",
"fields": {
"any_name": {"type": "string","index": "analyzed"}
}
}
}
}
}
--------------------------------------------------
[float]
=== Include in All
The `include_in_all` setting is ignored on any field that is defined in
the `fields` option. Setting `include_in_all` only makes sense on
the main field, since the raw field value is copied to the `_all` field,
not the tokens.
[float]
=== Updating a field
In essence a field can't be updated. However, multi fields can be
added to existing fields. This allows, for example, having a different
`index_analyzer` configuration in addition to the `index_analyzer`
configuration already specified in the main field and other multi fields.
Note that a new multi field is only applied to documents that have been
added after the multi field was added; the new multi field
doesn't exist for already indexed documents.
Another important note is that new multi fields are merged into the
list of existing multi fields, so when adding new multi fields to a field,
previously added multi fields don't need to be specified.
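A minimal sketch of adding a multi field to an existing field via the put mapping API (the index name, type name, and field names here are hypothetical):

```js
// PUT /my_index/my_type/_mapping
{
  "my_type" : {
    "properties" : {
      "name" : {
        "type" : "string",
        "fields" : {
          "raw" : {"type" : "string", "index" : "not_analyzed"}
        }
      }
    }
  }
}
```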
[float]
=== Inherit settings from main field
Any settings defined on the main field are automatically inherited by all
multi fields and act as defaults for them.
The first example could also be defined as in the example below:
[source,js]
--------------------------------------------------
{
"tweet" : {
"properties" : {
"name" : {
"type" : "string",
"index" : "analyzed",
"fields" : {
"raw" : {"index" : "not_analyzed"}
}
}
}
}
}
--------------------------------------------------


@ -14,8 +14,6 @@ include::types/root-object-type.asciidoc[]
include::types/nested-type.asciidoc[]
include::types/multi-field-type.asciidoc[]
include::types/ip-type.asciidoc[]
include::types/geo-point-type.asciidoc[]


@ -245,12 +245,9 @@ example:
{
"tweet" : {
"properties" : {
"message" : {
"type" : "multi_field",
"name" : {
"type" : "string",
"fields" : {
"name": {
"type": "string"
},
"word_count": {
"type" : "token_count",
"store" : "yes",


@ -1,93 +0,0 @@
[[mapping-multi-field-type]]
=== Multi Field Type
The `multi_field` type allows to map several
<<mapping-core-types,core_types>> of the same
value. This can come very handy, for example, when wanting to map a
`string` type, once when it's `analyzed` and once when it's
`not_analyzed`. For example:
[source,js]
--------------------------------------------------
{
"tweet" : {
"properties" : {
"name" : {
"type" : "multi_field",
"fields" : {
"name" : {"type" : "string", "index" : "analyzed"},
"untouched" : {"type" : "string", "index" : "not_analyzed"}
}
}
}
}
}
--------------------------------------------------
The above example shows how the `name` field, which is of simple
`string` type, gets mapped twice, once with it being `analyzed` under
`name`, and once with it being `not_analyzed` under `untouched`.
[float]
==== Accessing Fields
With `multi_field` mapping, the field that has the same name as the
property is treated as if it was mapped without a multi field. That's
the "default" field. It can be accessed regularly, for example using
`name` or using typed navigation `tweet.name`.
The `path` attribute allows to control how non-default fields can be
accessed. If the `path` attribute is set to `full`, which is the default
setting, all non-default fields are prefixed with the name of the
property and can be accessed by their full path using the navigation
notation: `name.untouched`, or using the typed navigation notation
`tweet.name.untouched`. If the `path` attribute is set to `just_name`
the actual field name without a prefix is used. The `just_name` setting,
among other things, allows indexing content of multiple fields under the
same name. In the example below the content of both fields `first_name`
and `last_name` can be accessed by using `any_name` or `tweet.any_name`
[source,js]
--------------------------------------------------
{
"tweet" : {
"properties": {
"first_name": {
"type": "multi_field",
"path": "just_name",
"fields": {
"first_name": {"type": "string", "index": "analyzed"},
"any_name": {"type": "string","index": "analyzed"}
}
},
"last_name": {
"type": "multi_field",
"path": "just_name",
"fields": {
"last_name": {"type": "string", "index": "analyzed"},
"any_name": {"type": "string","index": "analyzed"}
}
}
}
}
}
--------------------------------------------------
[float]
==== Include in All
The `include_in_all` setting on the "default" field allows to control if
the value of the field should be included in the `_all` field. Note, the
value of the field is copied to `_all`, not the tokens. So, it only
makes sense to copy the field value once. Because of this, the
`include_in_all` setting on all non-default fields is automatically set
to `false` and can't be changed.
[float]
==== Merging
When updating mapping definition using the `put_mapping` API, a core
type mapping can be "upgraded" to a `multi_field` mapping. This means
that if the old mapping has a plain core type mapping, the updated
mapping for the same property can be a `multi_field` type, with the
default field being the one being replaced.


@ -133,8 +133,8 @@ when dynamic introduction of fields / objects happens.
For example, we might want to have all fields to be stored by default,
or all `string` fields to be stored, or have `string` fields to always
be indexed as `multi_field`, once analyzed and once not_analyzed. Here
is a simple example:
be indexed with the multi fields syntax, once analyzed and once not_analyzed.
Here is a simple example:
[source,js]
--------------------------------------------------
@ -145,9 +145,9 @@ is a simple example:
"template_1" : {
"match" : "multi*",
"mapping" : {
"type" : "multi_field",
"type" : "{dynamic_type}",
"index" : "analyzed",
"fields" : {
"{name}" : {"type": "{dynamic_type}", "index" : "analyzed"},
"org" : {"type": "{dynamic_type}", "index" : "not_analyzed"}
}
}
@ -168,7 +168,7 @@ is a simple example:
}
--------------------------------------------------
The above mapping will create a `multi_field` mapping for all field
The above mapping will create a field with multi fields for all field
names starting with multi, and will map all `string` types to be
`not_analyzed`.


@ -128,7 +128,7 @@ The following parameters are supported:
NOTE: Even though you are losing most of the features of the
completion suggest, you can opt in for the shortest form, which even
allows you to use inside of multi_field. But keep in mind, that you will
allows you to use it inside of multi fields. But keep in mind that you will
not be able to use several inputs, an output, payloads or weights.
[source,js]


@ -11,7 +11,10 @@
body:
test_type:
properties:
text:
text1:
type: string
analyzer: whitespace
text2:
type: string
analyzer: whitespace
@ -19,8 +22,10 @@
indices.get_mapping:
index: test_index
- match: {test_index.test_type.properties.text.type: string}
- match: {test_index.test_type.properties.text.analyzer: whitespace}
- match: {test_index.test_type.properties.text1.type: string}
- match: {test_index.test_type.properties.text1.analyzer: whitespace}
- match: {test_index.test_type.properties.text2.type: string}
- match: {test_index.test_type.properties.text2.analyzer: whitespace}
- do:
indices.put_mapping:
@ -29,19 +34,29 @@
body:
test_type:
properties:
text:
text1:
type: multi_field
fields:
text:
text1:
type: string
analyzer: whitespace
text_raw:
type: string
index: not_analyzed
text2:
type: string
analyzer: whitespace
fields:
text_raw:
type: string
index: not_analyzed
- do:
indices.get_mapping:
index: test_index
- match: {test_index.test_type.properties.text.type: multi_field}
- match: {test_index.test_type.properties.text.fields.text_raw.index: not_analyzed }
- match: {test_index.test_type.properties.text1.type: string}
- match: {test_index.test_type.properties.text1.fields.text_raw.index: not_analyzed}
- match: {test_index.test_type.properties.text2.type: string}
- match: {test_index.test_type.properties.text2.fields.text_raw.index: not_analyzed}


@ -572,13 +572,9 @@ public class DocumentMapper implements ToXContent {
return doc;
}
public void addFieldMappers(Collection<FieldMapper> fieldMappers) {
addFieldMappers(fieldMappers.toArray(new FieldMapper[fieldMappers.size()]));
}
public void addFieldMappers(FieldMapper... fieldMappers) {
public void addFieldMappers(Iterable<FieldMapper> fieldMappers) {
synchronized (mappersMutex) {
this.fieldMappers.addNewMappers(Arrays.asList(fieldMappers));
this.fieldMappers.addNewMappers(fieldMappers);
}
for (FieldMapperListener listener : fieldMapperListeners) {
listener.fieldMappers(fieldMappers);


@ -42,7 +42,6 @@ import org.elasticsearch.index.mapper.geo.GeoPointFieldMapper;
import org.elasticsearch.index.mapper.geo.GeoShapeFieldMapper;
import org.elasticsearch.index.mapper.internal.*;
import org.elasticsearch.index.mapper.ip.IpFieldMapper;
import org.elasticsearch.index.mapper.multifield.MultiFieldMapper;
import org.elasticsearch.index.mapper.object.ObjectMapper;
import org.elasticsearch.index.mapper.object.RootObjectMapper;
import org.elasticsearch.index.settings.IndexSettings;
@ -98,7 +97,7 @@ public class DocumentMapperParser extends AbstractIndexComponent {
.put(TokenCountFieldMapper.CONTENT_TYPE, new TokenCountFieldMapper.TypeParser())
.put(ObjectMapper.CONTENT_TYPE, new ObjectMapper.TypeParser())
.put(ObjectMapper.NESTED_CONTENT_TYPE, new ObjectMapper.TypeParser())
.put(MultiFieldMapper.CONTENT_TYPE, new MultiFieldMapper.TypeParser())
.put(TypeParsers.MULTI_FIELD_CONTENT_TYPE, TypeParsers.multiFieldConverterTypeParser)
.put(CompletionFieldMapper.CONTENT_TYPE, new CompletionFieldMapper.TypeParser())
.put(GeoPointFieldMapper.CONTENT_TYPE, new GeoPointFieldMapper.TypeParser());


@ -38,7 +38,7 @@ public abstract class FieldMapperListener {
public abstract void fieldMapper(FieldMapper fieldMapper);
public void fieldMappers(FieldMapper... fieldMappers) {
public void fieldMappers(Iterable<FieldMapper> fieldMappers) {
for (FieldMapper mapper : fieldMappers) {
fieldMapper(mapper);
}


@ -26,7 +26,6 @@ import org.elasticsearch.index.mapper.geo.GeoPointFieldMapper;
import org.elasticsearch.index.mapper.geo.GeoShapeFieldMapper;
import org.elasticsearch.index.mapper.internal.*;
import org.elasticsearch.index.mapper.ip.IpFieldMapper;
import org.elasticsearch.index.mapper.multifield.MultiFieldMapper;
import org.elasticsearch.index.mapper.object.ObjectMapper;
import org.elasticsearch.index.mapper.object.RootObjectMapper;
@ -103,10 +102,6 @@ public final class MapperBuilders {
return new AnalyzerMapper.Builder();
}
public static MultiFieldMapper.Builder multiField(String name) {
return new MultiFieldMapper.Builder(name);
}
public static RootObjectMapper.Builder rootObject(String name) {
return new RootObjectMapper.Builder(name);
}


@ -291,7 +291,7 @@ public class MapperService extends AbstractIndexComponent implements Iterable<Do
} else {
FieldMapperListener.Aggregator fieldMappersAgg = new FieldMapperListener.Aggregator();
mapper.traverse(fieldMappersAgg);
addFieldMappers(fieldMappersAgg.mappers.toArray(new FieldMapper[fieldMappersAgg.mappers.size()]));
addFieldMappers(fieldMappersAgg.mappers);
mapper.addFieldMapperListener(fieldMapperListener, false);
ObjectMapperListener.Aggregator objectMappersAgg = new ObjectMapperListener.Aggregator();
@ -328,9 +328,9 @@ public class MapperService extends AbstractIndexComponent implements Iterable<Do
}
}
private void addFieldMappers(FieldMapper[] fieldMappers) {
private void addFieldMappers(Iterable<FieldMapper> fieldMappers) {
synchronized (mappersMutex) {
this.fieldMappers.addNewMappers(Arrays.asList(fieldMappers));
this.fieldMappers.addNewMappers(fieldMappers);
}
}
@ -1045,11 +1045,11 @@ public class MapperService extends AbstractIndexComponent implements Iterable<Do
class InternalFieldMapperListener extends FieldMapperListener {
@Override
public void fieldMapper(FieldMapper fieldMapper) {
addFieldMappers(new FieldMapper[]{fieldMapper});
addFieldMappers(Arrays.asList(fieldMapper));
}
@Override
public void fieldMappers(FieldMapper... fieldMappers) {
public void fieldMappers(Iterable<FieldMapper> fieldMappers) {
addFieldMappers(fieldMappers);
}
}


@ -20,6 +20,8 @@
package org.elasticsearch.index.mapper.core;
import com.carrotsearch.hppc.ObjectOpenHashSet;
import com.carrotsearch.hppc.cursors.ObjectCursor;
import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
import com.google.common.base.Objects;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.document.Field;
@ -32,6 +34,7 @@ import org.apache.lucene.search.*;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.ElasticsearchIllegalArgumentException;
import org.elasticsearch.common.Nullable;
import org.elasticsearch.common.collect.ImmutableOpenMap;
import org.elasticsearch.common.lucene.BytesRefs;
import org.elasticsearch.common.lucene.Lucene;
import org.elasticsearch.common.lucene.search.RegexpFilter;
@ -48,6 +51,7 @@ import org.elasticsearch.index.codec.postingsformat.PostingsFormatService;
import org.elasticsearch.index.fielddata.FieldDataType;
import org.elasticsearch.index.fielddata.IndexFieldDataService;
import org.elasticsearch.index.mapper.*;
import org.elasticsearch.index.mapper.internal.AllFieldMapper;
import org.elasticsearch.index.query.QueryParseContext;
import org.elasticsearch.index.search.FieldDataTermsFilter;
import org.elasticsearch.index.similarity.SimilarityLookupService;
@ -56,6 +60,7 @@ import org.elasticsearch.index.similarity.SimilarityProvider;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;
import java.util.Map;
/**
@ -78,6 +83,7 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
}
public static final float BOOST = 1.0f;
public static final ContentPath.Type PATH_TYPE = ContentPath.Type.FULL;
}
public abstract static class Builder<T extends Builder, Y extends AbstractFieldMapper> extends Mapper.Builder<T, Y> {
@ -97,10 +103,12 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
protected Loading normsLoading;
@Nullable
protected Settings fieldDataSettings;
protected final MultiFields.Builder multiFieldsBuilder;
protected Builder(String name, FieldType fieldType) {
super(name);
this.fieldType = fieldType;
multiFieldsBuilder = new MultiFields.Builder();
}
public T index(boolean index) {
@ -216,6 +224,16 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
return builder;
}
public T multiFieldPathType(ContentPath.Type pathType) {
multiFieldsBuilder.pathType(pathType);
return builder;
}
public T addMultiField(Mapper.Builder mapperBuilder) {
multiFieldsBuilder.add(mapperBuilder);
return builder;
}
public Names buildNames(BuilderContext context) {
return new Names(name, buildIndexName(context), indexName == null ? name : indexName, buildFullName(context), context.path().sourcePath());
}
@ -248,11 +266,20 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
protected Loading normsLoading;
protected Settings customFieldDataSettings;
protected FieldDataType fieldDataType;
protected final MultiFields multiFields;
protected AbstractFieldMapper(Names names, float boost, FieldType fieldType, Boolean docValues, NamedAnalyzer indexAnalyzer,
NamedAnalyzer searchAnalyzer, PostingsFormatProvider postingsFormat,
DocValuesFormatProvider docValuesFormat, SimilarityProvider similarity,
Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
this(names, boost, fieldType, docValues, indexAnalyzer, searchAnalyzer, postingsFormat, docValuesFormat, similarity,
normsLoading, fieldDataSettings, indexSettings, MultiFields.empty());
}
protected AbstractFieldMapper(Names names, float boost, FieldType fieldType, Boolean docValues, NamedAnalyzer indexAnalyzer,
NamedAnalyzer searchAnalyzer, PostingsFormatProvider postingsFormat,
DocValuesFormatProvider docValuesFormat, SimilarityProvider similarity,
Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings, MultiFields multiFields) {
this.names = names;
this.boost = boost;
this.fieldType = fieldType;
@ -296,6 +323,7 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
} else {
this.docValues = FieldDataType.DOC_VALUES_FORMAT_VALUE.equals(fieldDataType.getFormat(indexSettings));
}
this.multiFields = multiFields;
}
@Nullable
@ -376,12 +404,17 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
} finally {
fields.clear();
}
multiFields.parse(this, context);
}
/** Parse the field value and populate <code>fields</code>. */
/**
* Parse the field value and populate <code>fields</code>.
*/
protected abstract void parseCreateField(ParseContext context, List<Field> fields) throws IOException;
/** Derived classes can override it to specify that boost value is set by derived classes. */
/**
* Derived classes can override it to specify that boost value is set by derived classes.
*/
protected boolean customBoost() {
return false;
}
@ -389,6 +422,7 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
@Override
public void traverse(FieldMapperListener fieldMapperListener) {
fieldMapperListener.fieldMapper(this);
multiFields.traverse(fieldMapperListener);
}
@Override
@ -561,6 +595,7 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
} else if (!this.similarity().equals(fieldMergeWith.similarity())) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different similarity");
}
multiFields.merge(mergeWith, mergeContext);
if (!mergeContext.mergeFlags().simulate()) {
// apply changeable values
@ -707,6 +742,7 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
} else if (includeDefaults) {
builder.field("fielddata", (Map) fieldDataType.getSettings().getAsMap());
}
multiFields.toXContent(builder, params);
}
protected static String indexOptionToString(IndexOptions indexOption) {
@ -761,7 +797,7 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
@Override
public void close() {
// nothing to do here, sub classes to override if needed
multiFields.close();
}
@Override
@ -783,4 +819,151 @@ public abstract class AbstractFieldMapper<T> implements FieldMapper<T> {
return normsLoading == null ? defaultLoading : normsLoading;
}
public static class MultiFields {
public static MultiFields empty() {
return new MultiFields(Defaults.PATH_TYPE, ImmutableOpenMap.<String, Mapper>of());
}
public static class Builder {
private final ImmutableOpenMap.Builder<String, Mapper.Builder> mapperBuilders = ImmutableOpenMap.builder();
private ContentPath.Type pathType = Defaults.PATH_TYPE;
public Builder pathType(ContentPath.Type pathType) {
this.pathType = pathType;
return this;
}
public Builder add(Mapper.Builder builder) {
mapperBuilders.put(builder.name(), builder);
return this;
}
@SuppressWarnings("unchecked")
public MultiFields build(AbstractFieldMapper.Builder mainFieldBuilder, BuilderContext context) {
if (pathType == Defaults.PATH_TYPE && mapperBuilders.isEmpty()) {
return empty();
} else if (mapperBuilders.isEmpty()) {
return new MultiFields(pathType, ImmutableOpenMap.<String, Mapper>of());
} else {
ContentPath.Type origPathType = context.path().pathType();
context.path().pathType(pathType);
context.path().add(mainFieldBuilder.name());
ImmutableOpenMap.Builder mapperBuilders = this.mapperBuilders;
for (ObjectObjectCursor<String, Mapper.Builder> cursor : this.mapperBuilders) {
String key = cursor.key;
Mapper.Builder value = cursor.value;
mapperBuilders.put(key, value.build(context));
}
context.path().remove();
context.path().pathType(origPathType);
ImmutableOpenMap.Builder<String, Mapper> mappers = mapperBuilders.cast();
return new MultiFields(pathType, mappers.build());
}
}
}
private final ContentPath.Type pathType;
private volatile ImmutableOpenMap<String, Mapper> mappers;
public MultiFields(ContentPath.Type pathType, ImmutableOpenMap<String, Mapper> mappers) {
this.pathType = pathType;
this.mappers = mappers;
// we disable the all in multi-field mappers
for (ObjectCursor<Mapper> cursor : mappers.values()) {
Mapper mapper = cursor.value;
if (mapper instanceof AllFieldMapper.IncludeInAll) {
((AllFieldMapper.IncludeInAll) mapper).unsetIncludeInAll();
}
}
}
public void parse(AbstractFieldMapper mainField, ParseContext context) throws IOException {
if (mappers.isEmpty()) {
return;
}
ContentPath.Type origPathType = context.path().pathType();
context.path().pathType(pathType);
context.path().add(mainField.name());
for (ObjectCursor<Mapper> cursor : mappers.values()) {
cursor.value.parse(context);
}
context.path().remove();
context.path().pathType(origPathType);
}
// No need for locking, because locking is taken care of in ObjectMapper#merge and DocumentMapper#merge
public void merge(Mapper mergeWith, MergeContext mergeContext) throws MergeMappingException {
AbstractFieldMapper mergeWithMultiField = (AbstractFieldMapper) mergeWith;
List<FieldMapper> newFieldMappers = null;
ImmutableOpenMap.Builder<String, Mapper> newMappersBuilder = null;
for (ObjectCursor<Mapper> cursor : mergeWithMultiField.multiFields.mappers.values()) {
Mapper mergeWithMapper = cursor.value;
Mapper mergeIntoMapper = mappers.get(mergeWithMapper.name());
if (mergeIntoMapper == null) {
// no mapping, simply add it if not simulating
if (!mergeContext.mergeFlags().simulate()) {
// we disable the all in multi-field mappers
if (mergeWithMapper instanceof AllFieldMapper.IncludeInAll) {
((AllFieldMapper.IncludeInAll) mergeWithMapper).unsetIncludeInAll();
}
if (newMappersBuilder == null) {
newMappersBuilder = ImmutableOpenMap.builder(mappers);
}
newMappersBuilder.put(mergeWithMapper.name(), mergeWithMapper);
if (mergeWithMapper instanceof AbstractFieldMapper) {
if (newFieldMappers == null) {
newFieldMappers = new ArrayList<FieldMapper>(2);
}
newFieldMappers.add((FieldMapper) mergeWithMapper);
}
}
} else {
mergeIntoMapper.merge(mergeWithMapper, mergeContext);
}
}
// first add all field mappers
if (newFieldMappers != null) {
mergeContext.docMapper().addFieldMappers(newFieldMappers);
}
// now publish mappers
if (newMappersBuilder != null) {
mappers = newMappersBuilder.build();
}
}
public void traverse(FieldMapperListener fieldMapperListener) {
for (ObjectCursor<Mapper> cursor : mappers.values()) {
cursor.value.traverse(fieldMapperListener);
}
}
public void close() {
for (ObjectCursor<Mapper> cursor : mappers.values()) {
cursor.value.close();
}
}
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
if (pathType != Defaults.PATH_TYPE) {
builder.field("path", pathType.name().toLowerCase(Locale.ROOT));
}
if (!mappers.isEmpty()) {
builder.startObject("fields");
for (ObjectCursor<Mapper> cursor : mappers.values()) {
cursor.value.toXContent(builder, params);
}
builder.endObject();
}
return builder;
}
}
}


@ -87,7 +87,8 @@ public class BinaryFieldMapper extends AbstractFieldMapper<BytesReference> {
@Override
public BinaryFieldMapper build(BuilderContext context) {
return new BinaryFieldMapper(buildNames(context), fieldType, compress, compressThreshold, postingsProvider, docValuesProvider);
return new BinaryFieldMapper(buildNames(context), fieldType, compress, compressThreshold, postingsProvider,
docValuesProvider, multiFieldsBuilder.build(this, context));
}
}
@ -120,8 +121,9 @@ public class BinaryFieldMapper extends AbstractFieldMapper<BytesReference> {
private long compressThreshold;
protected BinaryFieldMapper(Names names, FieldType fieldType, Boolean compress, long compressThreshold,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider) {
super(names, 1.0f, fieldType, null, null, null, postingsProvider, docValuesProvider, null, null, null, null);
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
MultiFields multiFields) {
super(names, 1.0f, fieldType, null, null, null, postingsProvider, docValuesProvider, null, null, null, null, multiFields);
this.compress = compress;
this.compressThreshold = compressThreshold;
}


@ -99,7 +99,8 @@ public class BooleanFieldMapper extends AbstractFieldMapper<Boolean> {
@Override
public BooleanFieldMapper build(BuilderContext context) {
return new BooleanFieldMapper(buildNames(context), boost, fieldType, nullValue, postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
return new BooleanFieldMapper(buildNames(context), boost, fieldType, nullValue, postingsProvider,
docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
}
}
@ -123,8 +124,8 @@ public class BooleanFieldMapper extends AbstractFieldMapper<Boolean> {
protected BooleanFieldMapper(Names names, float boost, FieldType fieldType, Boolean nullValue, PostingsFormatProvider postingsProvider,
DocValuesFormatProvider docValuesProvider, SimilarityProvider similarity, Loading normsLoading,
@Nullable Settings fieldDataSettings, Settings indexSettings) {
super(names, boost, fieldType, null, Lucene.KEYWORD_ANALYZER, Lucene.KEYWORD_ANALYZER, postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
@Nullable Settings fieldDataSettings, Settings indexSettings, MultiFields multiFields) {
super(names, boost, fieldType, null, Lucene.KEYWORD_ANALYZER, Lucene.KEYWORD_ANALYZER, postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
}


@ -92,7 +92,8 @@ public class ByteFieldMapper extends NumberFieldMapper<Byte> {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
ByteFieldMapper fieldMapper = new ByteFieldMapper(buildNames(context),
precisionStep, boost, fieldType, docValues, nullValue, ignoreMalformed(context),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings,
context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@ -121,11 +122,11 @@ public class ByteFieldMapper extends NumberFieldMapper<Byte> {
protected ByteFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
Byte nullValue, Explicit<Boolean> ignoreMalformed, PostingsFormatProvider postingsProvider,
DocValuesFormatProvider docValuesProvider, SimilarityProvider similarity, Loading normsLoading,
@Nullable Settings fieldDataSettings, Settings indexSettings) {
@Nullable Settings fieldDataSettings, Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues,
ignoreMalformed, new NamedAnalyzer("_byte/" + precisionStep, new NumericIntegerAnalyzer(precisionStep)),
new NamedAnalyzer("_byte/max", new NumericIntegerAnalyzer(Integer.MAX_VALUE)), postingsProvider,
docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
this.nullValueAsString = nullValue == null ? null : nullValue.toString();
}
@ -382,4 +383,4 @@ public class ByteFieldMapper extends NumberFieldMapper<Byte> {
return Byte.toString(number);
}
}
}
}


@ -127,7 +127,7 @@ public class CompletionFieldMapper extends AbstractFieldMapper<String> {
@Override
public CompletionFieldMapper build(Mapper.BuilderContext context) {
return new CompletionFieldMapper(buildNames(context), indexAnalyzer, searchAnalyzer, postingsProvider, similarity, payloads,
preserveSeparators, preservePositionIncrements, maxInputLength);
preserveSeparators, preservePositionIncrements, maxInputLength, multiFieldsBuilder.build(this, context));
}
}
@ -194,8 +194,8 @@ public class CompletionFieldMapper extends AbstractFieldMapper<String> {
private int maxInputLength;
public CompletionFieldMapper(Names names, NamedAnalyzer indexAnalyzer, NamedAnalyzer searchAnalyzer, PostingsFormatProvider postingsProvider, SimilarityProvider similarity, boolean payloads,
boolean preserveSeparators, boolean preservePositionIncrements, int maxInputLength) {
super(names, 1.0f, Defaults.FIELD_TYPE, null, indexAnalyzer, searchAnalyzer, postingsProvider, null, similarity, null, null, null);
boolean preserveSeparators, boolean preservePositionIncrements, int maxInputLength, MultiFields multiFields) {
super(names, 1.0f, Defaults.FIELD_TYPE, null, indexAnalyzer, searchAnalyzer, postingsProvider, null, similarity, null, null, null, multiFields);
analyzingSuggestLookupProvider = new AnalyzingCompletionLookupProvider(preserveSeparators, false, preservePositionIncrements, payloads);
this.completionPostingsFormatProvider = new CompletionPostingsFormatProvider("completion", postingsProvider, analyzingSuggestLookupProvider);
this.preserveSeparators = preserveSeparators;

View File

@@ -129,7 +129,8 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
}
DateFieldMapper fieldMapper = new DateFieldMapper(buildNames(context), dateTimeFormatter,
precisionStep, boost, fieldType, docValues, nullValue, timeUnit, roundCeil, ignoreMalformed(context),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings(),
multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -203,11 +204,11 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
protected DateFieldMapper(Names names, FormatDateTimeFormatter dateTimeFormatter, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
String nullValue, TimeUnit timeUnit, boolean roundCeil, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider, SimilarityProvider similarity,
Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, ignoreMalformed, new NamedAnalyzer("_date/" + precisionStep,
new NumericDateAnalyzer(precisionStep, dateTimeFormatter.parser())),
new NamedAnalyzer("_date/max", new NumericDateAnalyzer(Integer.MAX_VALUE, dateTimeFormatter.parser())),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.dateTimeFormatter = dateTimeFormatter;
this.nullValue = nullValue;
this.timeUnit = timeUnit;

View File

@@ -96,7 +96,7 @@ public class DoubleFieldMapper extends NumberFieldMapper<Double> {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
DoubleFieldMapper fieldMapper = new DoubleFieldMapper(buildNames(context),
precisionStep, boost, fieldType, docValues, nullValue, ignoreMalformed(context), postingsProvider, docValuesProvider,
similarity, normsLoading, fieldDataSettings, context.indexSettings());
similarity, normsLoading, fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -126,10 +126,11 @@ public class DoubleFieldMapper extends NumberFieldMapper<Double> {
protected DoubleFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
Double nullValue, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, ignoreMalformed,
NumericDoubleAnalyzer.buildNamedAnalyzer(precisionStep), NumericDoubleAnalyzer.buildNamedAnalyzer(Integer.MAX_VALUE),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
this.nullValueAsString = nullValue == null ? null : nullValue.toString();
}

View File

@@ -97,7 +97,7 @@ public class FloatFieldMapper extends NumberFieldMapper<Float> {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
FloatFieldMapper fieldMapper = new FloatFieldMapper(buildNames(context),
precisionStep, boost, fieldType, docValues, nullValue, ignoreMalformed(context), postingsProvider, docValuesProvider,
similarity, normsLoading, fieldDataSettings, context.indexSettings());
similarity, normsLoading, fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -126,10 +126,11 @@ public class FloatFieldMapper extends NumberFieldMapper<Float> {
protected FloatFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
Float nullValue, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, ignoreMalformed,
NumericFloatAnalyzer.buildNamedAnalyzer(precisionStep), NumericFloatAnalyzer.buildNamedAnalyzer(Integer.MAX_VALUE),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
this.nullValueAsString = nullValue == null ? null : nullValue.toString();
}

View File

@@ -92,7 +92,8 @@ public class IntegerFieldMapper extends NumberFieldMapper<Integer> {
public IntegerFieldMapper build(BuilderContext context) {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
IntegerFieldMapper fieldMapper = new IntegerFieldMapper(buildNames(context), precisionStep, boost, fieldType, docValues,
nullValue, ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
nullValue, ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading,
fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -122,10 +123,10 @@ public class IntegerFieldMapper extends NumberFieldMapper<Integer> {
Integer nullValue, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings) {
Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, ignoreMalformed,
NumericIntegerAnalyzer.buildNamedAnalyzer(precisionStep), NumericIntegerAnalyzer.buildNamedAnalyzer(Integer.MAX_VALUE),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
this.nullValueAsString = nullValue == null ? null : nullValue.toString();
}

View File

@@ -92,7 +92,8 @@ public class LongFieldMapper extends NumberFieldMapper<Long> {
public LongFieldMapper build(BuilderContext context) {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
LongFieldMapper fieldMapper = new LongFieldMapper(buildNames(context), precisionStep, boost, fieldType, docValues, nullValue,
ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading,
fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -121,10 +122,11 @@ public class LongFieldMapper extends NumberFieldMapper<Long> {
protected LongFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
Long nullValue, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, ignoreMalformed,
NumericLongAnalyzer.buildNamedAnalyzer(precisionStep), NumericLongAnalyzer.buildNamedAnalyzer(Integer.MAX_VALUE),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
this.nullValueAsString = nullValue == null ? null : nullValue.toString();
}

View File

@@ -148,9 +148,11 @@ public abstract class NumberFieldMapper<T extends Number> extends AbstractFieldM
Explicit<Boolean> ignoreMalformed, NamedAnalyzer indexAnalyzer,
NamedAnalyzer searchAnalyzer, PostingsFormatProvider postingsProvider,
DocValuesFormatProvider docValuesProvider, SimilarityProvider similarity,
Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings,
MultiFields multiFields) {
// LUCENE 4 UPGRADE: Since we can't do anything before the super call, we have to push the boost check down to subclasses
super(names, boost, fieldType, docValues, indexAnalyzer, searchAnalyzer, postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
super(names, boost, fieldType, docValues, indexAnalyzer, searchAnalyzer, postingsProvider, docValuesProvider,
similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
if (precisionStep <= 0 || precisionStep >= maxPrecisionStep()) {
this.precisionStep = Integer.MAX_VALUE;
} else {
@@ -173,6 +175,11 @@ public abstract class NumberFieldMapper<T extends Number> extends AbstractFieldM
}
}
@Override
public void unsetIncludeInAll() {
includeInAll = null;
}
protected abstract int maxPrecisionStep();
public int precisionStep() {

View File

@@ -93,7 +93,8 @@ public class ShortFieldMapper extends NumberFieldMapper<Short> {
public ShortFieldMapper build(BuilderContext context) {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
ShortFieldMapper fieldMapper = new ShortFieldMapper(buildNames(context), precisionStep, boost, fieldType, docValues, nullValue,
ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings,
context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -122,10 +123,11 @@ public class ShortFieldMapper extends NumberFieldMapper<Short> {
protected ShortFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
Short nullValue, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, ignoreMalformed, new NamedAnalyzer("_short/" + precisionStep,
new NumericIntegerAnalyzer(precisionStep)), new NamedAnalyzer("_short/max", new NumericIntegerAnalyzer(Integer.MAX_VALUE)),
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
this.nullValueAsString = nullValue == null ? null : nullValue.toString();
}

View File

@@ -50,6 +50,7 @@ import java.util.Map;
import static org.elasticsearch.index.mapper.MapperBuilders.stringField;
import static org.elasticsearch.index.mapper.core.TypeParsers.parseField;
import static org.elasticsearch.index.mapper.core.TypeParsers.parseMultiField;
/**
*
@@ -135,7 +136,8 @@ public class StringFieldMapper extends AbstractFieldMapper<String> implements Al
}
StringFieldMapper fieldMapper = new StringFieldMapper(buildNames(context),
boost, fieldType, docValues, nullValue, indexAnalyzer, searchAnalyzer, searchQuotedAnalyzer,
positionOffsetGap, ignoreAbove, postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings());
positionOffsetGap, ignoreAbove, postingsProvider, docValuesProvider, similarity, normsLoading,
fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -172,6 +174,8 @@ public class StringFieldMapper extends AbstractFieldMapper<String> implements Al
}
} else if (propName.equals("ignore_above")) {
builder.ignoreAbove(XContentMapValues.nodeIntegerValue(propNode, -1));
} else {
parseMultiField(builder, name, node, parserContext, propName, propNode);
}
}
return builder;
@@ -193,8 +197,9 @@ public class StringFieldMapper extends AbstractFieldMapper<String> implements Al
NamedAnalyzer searchQuotedAnalyzer, int positionOffsetGap, int ignoreAbove,
PostingsFormatProvider postingsFormat, DocValuesFormatProvider docValuesFormat,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings) {
super(names, boost, fieldType, docValues, indexAnalyzer, searchAnalyzer, postingsFormat, docValuesFormat, similarity, normsLoading, fieldDataSettings, indexSettings);
Settings indexSettings, MultiFields multiFields) {
super(names, boost, fieldType, docValues, indexAnalyzer, searchAnalyzer, postingsFormat, docValuesFormat,
similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
if (fieldType.tokenized() && fieldType.indexed() && hasDocValues()) {
throw new MapperParsingException("Field [" + names.fullName() + "] cannot be analyzed and have doc values");
}
@@ -228,6 +233,11 @@ public class StringFieldMapper extends AbstractFieldMapper<String> implements Al
}
}
@Override
public void unsetIncludeInAll() {
includeInAll = null;
}
@Override
public String value(Object value) {
if (value == null) {

View File

@@ -80,7 +80,7 @@ public class TokenCountFieldMapper extends IntegerFieldMapper {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
TokenCountFieldMapper fieldMapper = new TokenCountFieldMapper(buildNames(context), precisionStep, boost, fieldType, docValues, nullValue,
ignoreMalformed(context), postingsProvider, docValuesProvider, similarity, normsLoading, fieldDataSettings, context.indexSettings(),
analyzer);
analyzer, multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -116,9 +116,10 @@ public class TokenCountFieldMapper extends IntegerFieldMapper {
protected TokenCountFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues, Integer nullValue,
Explicit<Boolean> ignoreMalformed, PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, Settings fieldDataSettings, Settings indexSettings, NamedAnalyzer analyzer) {
SimilarityProvider similarity, Loading normsLoading, Settings fieldDataSettings, Settings indexSettings, NamedAnalyzer analyzer,
MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues, nullValue, ignoreMalformed, postingsProvider, docValuesProvider, similarity,
normsLoading, fieldDataSettings, indexSettings);
normsLoading, fieldDataSettings, indexSettings, multiFields);
this.analyzer = analyzer;
}
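`TokenCountFieldMapper` gains the same `multiFields` argument, which matters because `token_count` is typically used as a sub-field of an analyzed string. A sketch of such a mapping under the new syntax (the `name` and `word_count` field names are illustrative):

```js
{
  "tweet" : {
    "properties" : {
      "name" : {
        "type" : "string",
        "analyzer" : "standard",
        "fields" : {
          "word_count" : {
            "type" : "token_count",
            "analyzer" : "standard"
          }
        }
      }
    }
  }
}
```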

View File

@@ -33,6 +33,9 @@ import org.elasticsearch.index.mapper.FieldMapper.Loading;
import org.elasticsearch.index.mapper.Mapper;
import org.elasticsearch.index.mapper.MapperParsingException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import static org.elasticsearch.common.xcontent.support.XContentMapValues.*;
@@ -43,6 +46,91 @@ import static org.elasticsearch.index.mapper.FieldMapper.DOC_VALUES_FORMAT;
*/
public class TypeParsers {
public static final String MULTI_FIELD_CONTENT_TYPE = "multi_field";
public static final Mapper.TypeParser multiFieldConverterTypeParser = new Mapper.TypeParser() {
@Override
public Mapper.Builder<?, ?> parse(String name, Map<String, Object> node, ParserContext parserContext) throws MapperParsingException {
ContentPath.Type pathType = null;
AbstractFieldMapper.Builder mainFieldBuilder = null;
List<AbstractFieldMapper.Builder> fields = null;
String firstType = null;
for (Map.Entry<String, Object> entry : node.entrySet()) {
String fieldName = Strings.toUnderscoreCase(entry.getKey());
Object fieldNode = entry.getValue();
if (fieldName.equals("path")) {
pathType = parsePathType(name, fieldNode.toString());
} else if (fieldName.equals("fields")) {
Map<String, Object> fieldsNode = (Map<String, Object>) fieldNode;
for (Map.Entry<String, Object> entry1 : fieldsNode.entrySet()) {
String propName = entry1.getKey();
Map<String, Object> propNode = (Map<String, Object>) entry1.getValue();
String type;
Object typeNode = propNode.get("type");
if (typeNode != null) {
type = typeNode.toString();
if (firstType == null) {
firstType = type;
}
} else {
throw new MapperParsingException("No type specified for property [" + propName + "]");
}
Mapper.TypeParser typeParser = parserContext.typeParser(type);
if (typeParser == null) {
throw new MapperParsingException("No handler for type [" + type + "] declared on field [" + fieldName + "]");
}
if (propName.equals(name)) {
mainFieldBuilder = (AbstractFieldMapper.Builder) typeParser.parse(propName, propNode, parserContext);
} else {
if (fields == null) {
fields = new ArrayList<AbstractFieldMapper.Builder>(2);
}
fields.add((AbstractFieldMapper.Builder) typeParser.parse(propName, propNode, parserContext));
}
}
}
}
if (mainFieldBuilder == null) {
if (fields == null) {
// No fields at all were specified in multi_field, so let's return a non-indexed string field.
return new StringFieldMapper.Builder(name).index(false);
}
Mapper.TypeParser typeParser = parserContext.typeParser(firstType);
if (typeParser == null) {
// The first multi field's type is unknown
mainFieldBuilder = new StringFieldMapper.Builder(name).index(false);
} else {
Mapper.Builder substitute = typeParser.parse(name, Collections.<String, Object>emptyMap(), parserContext);
if (substitute instanceof AbstractFieldMapper.Builder) {
mainFieldBuilder = ((AbstractFieldMapper.Builder) substitute).index(false);
} else {
// The first multi field isn't a core field type
mainFieldBuilder = new StringFieldMapper.Builder(name).index(false);
}
}
}
if (fields != null && pathType != null) {
for (Mapper.Builder field : fields) {
mainFieldBuilder.addMultiField(field);
}
mainFieldBuilder.multiFieldPathType(pathType);
} else if (fields != null) {
for (Mapper.Builder field : fields) {
mainFieldBuilder.addMultiField(field);
}
} else if (pathType != null) {
mainFieldBuilder.multiFieldPathType(pathType);
}
return mainFieldBuilder;
}
};
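Per the fallback branch above, a legacy `multi_field` that has no sub-field named after the field itself gets a synthesized main field: the first sub-field's type is reused (string if that type is unknown or not a core field type) and indexing is disabled on it. A before/after sketch with illustrative field names:

```js
// before upgrade: no sub-field is named "name"
"name" : {
  "type" : "multi_field",
  "fields" : {
    "sorted" : { "type" : "string", "index" : "not_analyzed" }
  }
}

// after upgrade: a non-indexed main field is synthesized
"name" : {
  "type" : "string",
  "index" : "no",
  "fields" : {
    "sorted" : { "type" : "string", "index" : "not_analyzed" }
  }
}
```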
public static final String DOC_VALUES = "doc_values";
public static final String INDEX_OPTIONS_DOCS = "docs";
public static final String INDEX_OPTIONS_FREQS = "freqs";
@@ -62,6 +150,8 @@ public class TypeParsers {
builder.omitNorms(nodeBooleanValue(propNode));
} else if (propName.equals("similarity")) {
builder.similarity(parserContext.similarityLookupService().similarity(propNode.toString()));
} else {
parseMultiField(builder, name, numberNode, parserContext, propName, propNode);
}
}
}
@@ -146,6 +236,34 @@ public class TypeParsers {
}
}
public static void parseMultiField(AbstractFieldMapper.Builder builder, String name, Map<String, Object> node, Mapper.TypeParser.ParserContext parserContext, String propName, Object propNode) {
if (propName.equals("path")) {
builder.multiFieldPathType(parsePathType(name, propNode.toString()));
} else if (propName.equals("fields")) {
@SuppressWarnings("unchecked")
Map<String, Object> multiFieldsPropNodes = (Map<String, Object>) propNode;
for (Map.Entry<String, Object> multiFieldEntry : multiFieldsPropNodes.entrySet()) {
String multiFieldName = multiFieldEntry.getKey();
@SuppressWarnings("unchecked")
Map<String, Object> multiFieldNodes = (Map<String, Object>) multiFieldEntry.getValue();
String type;
Object typeNode = multiFieldNodes.get("type");
if (typeNode != null) {
type = typeNode.toString();
} else {
throw new MapperParsingException("No type specified for property [" + multiFieldName + "]");
}
Mapper.TypeParser typeParser = parserContext.typeParser(type);
if (typeParser == null) {
throw new MapperParsingException("No handler for type [" + type + "] declared on field [" + multiFieldName + "]");
}
builder.addMultiField(typeParser.parse(multiFieldName, multiFieldNodes, parserContext));
}
}
}
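Conversely, when a legacy `multi_field` does contain a sub-field named after the field itself, the converter earlier in this file promotes that sub-field to the main field and hands the remaining sub-fields to `addMultiField`, the same hook `parseMultiField` above uses for the new `fields` syntax. A before/after sketch (field names illustrative):

```js
// before upgrade: the "name" sub-field matches the field's own name
"name" : {
  "type" : "multi_field",
  "fields" : {
    "name" : { "type" : "string", "index" : "analyzed" },
    "sorted" : { "type" : "string", "index" : "not_analyzed" }
  }
}

// after upgrade: the matching sub-field becomes the main field
"name" : {
  "type" : "string",
  "index" : "analyzed",
  "fields" : {
    "sorted" : { "type" : "string", "index" : "not_analyzed" }
  }
}
```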
private static IndexOptions nodeIndexOptionValue(final Object propNode) {
final String value = propNode.toString();
if (INDEX_OPTIONS_OFFSETS.equalsIgnoreCase(value)) {

View File

@@ -130,7 +130,7 @@ public class GeoPointFieldMapper extends AbstractFieldMapper<GeoPoint> implement
this.builder = this;
}
public Builder pathType(ContentPath.Type pathType) {
public Builder multiFieldPathType(ContentPath.Type pathType) {
this.pathType = pathType;
return this;
}
@@ -209,7 +209,7 @@ public class GeoPointFieldMapper extends AbstractFieldMapper<GeoPoint> implement
String fieldName = Strings.toUnderscoreCase(entry.getKey());
Object fieldNode = entry.getValue();
if (fieldName.equals("path")) {
builder.pathType(parsePathType(name, fieldNode.toString()));
builder.multiFieldPathType(parsePathType(name, fieldNode.toString()));
} else if (fieldName.equals("lat_lon")) {
builder.enableLatLon(XContentMapValues.nodeBooleanValue(fieldNode));
} else if (fieldName.equals("geohash")) {

View File

@@ -152,7 +152,8 @@ public class GeoShapeFieldMapper extends AbstractFieldMapper<String> {
throw new ElasticsearchIllegalArgumentException("Unknown prefix tree type [" + tree + "]");
}
return new GeoShapeFieldMapper(names, prefixTree, strategyName, distanceErrorPct, fieldType, postingsProvider, docValuesProvider);
return new GeoShapeFieldMapper(names, prefixTree, strategyName, distanceErrorPct, fieldType, postingsProvider,
docValuesProvider, multiFieldsBuilder.build(this, context));
}
}
@@ -195,8 +196,9 @@ public class GeoShapeFieldMapper extends AbstractFieldMapper<String> {
private final TermQueryPrefixTreeStrategy termStrategy;
public GeoShapeFieldMapper(FieldMapper.Names names, SpatialPrefixTree tree, String defaultStrategyName, double distanceErrorPct,
FieldType fieldType, PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider) {
super(names, 1, fieldType, null, null, null, postingsProvider, docValuesProvider, null, null, null, null);
FieldType fieldType, PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
MultiFields multiFields) {
super(names, 1, fieldType, null, null, null, postingsProvider, docValuesProvider, null, null, null, null, multiFields);
this.recursiveStrategy = new RecursivePrefixTreeStrategy(tree, names.indexName());
this.recursiveStrategy.setDistErrPct(distanceErrorPct);
this.termStrategy = new TermQueryPrefixTreeStrategy(tree, names.indexName());

View File

@@ -63,6 +63,8 @@ public class AllFieldMapper extends AbstractFieldMapper<Void> implements Interna
void includeInAll(Boolean includeInAll);
void includeInAllIfNotSet(Boolean includeInAll);
void unsetIncludeInAll();
}
public static final String NAME = "_all";

View File

@@ -127,7 +127,7 @@ public class BoostFieldMapper extends NumberFieldMapper<Float> implements Intern
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider, @Nullable Settings fieldDataSettings, Settings indexSettings) {
super(new Names(name, indexName, indexName, name), precisionStep, boost, fieldType, docValues, Defaults.IGNORE_MALFORMED,
NumericFloatAnalyzer.buildNamedAnalyzer(precisionStep), NumericFloatAnalyzer.buildNamedAnalyzer(Integer.MAX_VALUE),
postingsProvider, docValuesProvider, null, null, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, null, null, fieldDataSettings, indexSettings, MultiFields.empty());
this.nullValue = nullValue;
}

View File

@@ -102,7 +102,8 @@ public class SizeFieldMapper extends IntegerFieldMapper implements RootMapper {
public SizeFieldMapper(EnabledAttributeMapper enabled, FieldType fieldType, PostingsFormatProvider postingsProvider,
DocValuesFormatProvider docValuesProvider, @Nullable Settings fieldDataSettings, Settings indexSettings) {
super(new Names(Defaults.NAME), Defaults.PRECISION_STEP, Defaults.BOOST, fieldType, null, Defaults.NULL_VALUE,
Defaults.IGNORE_MALFORMED, postingsProvider, docValuesProvider, null, null, fieldDataSettings, indexSettings);
Defaults.IGNORE_MALFORMED, postingsProvider, docValuesProvider, null, null, fieldDataSettings,
indexSettings, MultiFields.empty());
this.enabledState = enabled;
}

View File

@@ -127,7 +127,7 @@ public class TTLFieldMapper extends LongFieldMapper implements InternalMapper, R
@Nullable Settings fieldDataSettings, Settings indexSettings) {
super(new Names(Defaults.NAME, Defaults.NAME, Defaults.NAME, Defaults.NAME), Defaults.PRECISION_STEP,
Defaults.BOOST, fieldType, null, Defaults.NULL_VALUE, ignoreMalformed,
postingsProvider, docValuesProvider, null, null, fieldDataSettings, indexSettings);
postingsProvider, docValuesProvider, null, null, fieldDataSettings, indexSettings, MultiFields.empty());
this.enabledState = enabled;
this.defaultTTL = defaultTTL;
}

View File

@@ -148,7 +148,8 @@ public class TimestampFieldMapper extends DateFieldMapper implements InternalMap
super(new Names(Defaults.NAME, Defaults.NAME, Defaults.NAME, Defaults.NAME), dateTimeFormatter,
Defaults.PRECISION_STEP, Defaults.BOOST, fieldType, docValues,
Defaults.NULL_VALUE, TimeUnit.MILLISECONDS /*always milliseconds*/,
roundCeil, ignoreMalformed, postingsProvider, docValuesProvider, null, normsLoading, fieldDataSettings, indexSettings);
roundCeil, ignoreMalformed, postingsProvider, docValuesProvider, null, normsLoading, fieldDataSettings,
indexSettings, MultiFields.empty());
this.enabledState = enabledState;
this.path = path;
}

View File

@@ -123,7 +123,7 @@ public class IpFieldMapper extends NumberFieldMapper<Long> {
fieldType.setOmitNorms(fieldType.omitNorms() && boost == 1.0f);
IpFieldMapper fieldMapper = new IpFieldMapper(buildNames(context),
precisionStep, boost, fieldType, docValues, nullValue, ignoreMalformed(context), postingsProvider, docValuesProvider, similarity,
normsLoading, fieldDataSettings, context.indexSettings());
normsLoading, fieldDataSettings, context.indexSettings(), multiFieldsBuilder.build(this, context));
fieldMapper.includeInAll(includeInAll);
return fieldMapper;
}
@@ -150,11 +150,12 @@ public class IpFieldMapper extends NumberFieldMapper<Long> {
protected IpFieldMapper(Names names, int precisionStep, float boost, FieldType fieldType, Boolean docValues,
String nullValue, Explicit<Boolean> ignoreMalformed,
PostingsFormatProvider postingsProvider, DocValuesFormatProvider docValuesProvider,
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings, Settings indexSettings) {
SimilarityProvider similarity, Loading normsLoading, @Nullable Settings fieldDataSettings,
Settings indexSettings, MultiFields multiFields) {
super(names, precisionStep, boost, fieldType, docValues,
ignoreMalformed, new NamedAnalyzer("_ip/" + precisionStep, new NumericIpAnalyzer(precisionStep)),
new NamedAnalyzer("_ip/max", new NumericIpAnalyzer(Integer.MAX_VALUE)), postingsProvider, docValuesProvider,
similarity, normsLoading, fieldDataSettings, indexSettings);
similarity, normsLoading, fieldDataSettings, indexSettings, multiFields);
this.nullValue = nullValue;
}

View File

@@ -1,348 +0,0 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.mapper.multifield;
import com.google.common.collect.ImmutableMap;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.collect.MapBuilder;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.index.mapper.*;
import org.elasticsearch.index.mapper.core.AbstractFieldMapper;
import org.elasticsearch.index.mapper.internal.AllFieldMapper;
import java.io.IOException;
import java.util.*;
import static com.google.common.collect.Lists.newArrayList;
import static org.elasticsearch.common.collect.MapBuilder.newMapBuilder;
import static org.elasticsearch.index.mapper.MapperBuilders.multiField;
import static org.elasticsearch.index.mapper.core.TypeParsers.parsePathType;
/**
*
*/
public class MultiFieldMapper implements Mapper, AllFieldMapper.IncludeInAll {
public static final String CONTENT_TYPE = "multi_field";
public static class Defaults {
public static final ContentPath.Type PATH_TYPE = ContentPath.Type.FULL;
}
public static class Builder extends Mapper.Builder<Builder, MultiFieldMapper> {
private ContentPath.Type pathType = Defaults.PATH_TYPE;
private final List<Mapper.Builder> mappersBuilders = newArrayList();
private Mapper.Builder defaultMapperBuilder;
public Builder(String name) {
super(name);
this.builder = this;
}
public Builder pathType(ContentPath.Type pathType) {
this.pathType = pathType;
return this;
}
public Builder add(Mapper.Builder builder) {
if (builder.name().equals(name)) {
defaultMapperBuilder = builder;
} else {
mappersBuilders.add(builder);
}
return this;
}
@Override
public MultiFieldMapper build(BuilderContext context) {
ContentPath.Type origPathType = context.path().pathType();
context.path().pathType(pathType);
Mapper defaultMapper = null;
if (defaultMapperBuilder != null) {
defaultMapper = defaultMapperBuilder.build(context);
}
String origSourcePath = context.path().sourcePath(context.path().fullPathAsText(name));
context.path().add(name);
Map<String, Mapper> mappers = new HashMap<String, Mapper>();
for (Mapper.Builder builder : mappersBuilders) {
Mapper mapper = builder.build(context);
mappers.put(mapper.name(), mapper);
}
context.path().remove();
context.path().sourcePath(origSourcePath);
context.path().pathType(origPathType);
return new MultiFieldMapper(name, pathType, mappers, defaultMapper);
}
}
public static class TypeParser implements Mapper.TypeParser {
@Override
public Mapper.Builder parse(String name, Map<String, Object> node, ParserContext parserContext) throws MapperParsingException {
MultiFieldMapper.Builder builder = multiField(name);
for (Map.Entry<String, Object> entry : node.entrySet()) {
String fieldName = Strings.toUnderscoreCase(entry.getKey());
Object fieldNode = entry.getValue();
if (fieldName.equals("path")) {
builder.pathType(parsePathType(name, fieldNode.toString()));
} else if (fieldName.equals("fields")) {
Map<String, Object> fieldsNode = (Map<String, Object>) fieldNode;
for (Map.Entry<String, Object> entry1 : fieldsNode.entrySet()) {
String propName = entry1.getKey();
Map<String, Object> propNode = (Map<String, Object>) entry1.getValue();
String type;
Object typeNode = propNode.get("type");
if (typeNode != null) {
type = typeNode.toString();
} else {
throw new MapperParsingException("No type specified for property [" + propName + "]");
}
Mapper.TypeParser typeParser = parserContext.typeParser(type);
if (typeParser == null) {
throw new MapperParsingException("No handler for type [" + type + "] declared on field [" + fieldName + "]");
}
builder.add(typeParser.parse(propName, propNode, parserContext));
}
}
}
return builder;
}
}
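The `TypeParser` above consumes the legacy `multi_field` syntax: an optional `path` plus a `fields` object whose entries must each declare a `type`. A minimal sketch of a mapping it accepts (field names here are illustrative):

```json
{
  "name": {
    "type": "multi_field",
    "fields": {
      "name": { "type": "string", "index": "analyzed" },
      "untouched": { "type": "string", "index": "not_analyzed" }
    }
  }
}
```

The sub-field whose name equals the outer field (`name` here) becomes the default mapper via `Builder.add`; every other entry is collected in `mappersBuilders`.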
private final String name;
private final ContentPath.Type pathType;
private final Object mutex = new Object();
private volatile ImmutableMap<String, Mapper> mappers = ImmutableMap.of();
private volatile Mapper defaultMapper;
public MultiFieldMapper(String name, ContentPath.Type pathType, Mapper defaultMapper) {
this(name, pathType, new HashMap<String, Mapper>(), defaultMapper);
}
public MultiFieldMapper(String name, ContentPath.Type pathType, Map<String, Mapper> mappers, Mapper defaultMapper) {
this.name = name;
this.pathType = pathType;
this.mappers = ImmutableMap.copyOf(mappers);
this.defaultMapper = defaultMapper;
// disable include_in_all on the sub field mappers; only the default one can be included in _all
for (Mapper mapper : mappers.values()) {
if (mapper instanceof AllFieldMapper.IncludeInAll) {
((AllFieldMapper.IncludeInAll) mapper).includeInAll(false);
}
}
}
@Override
public String name() {
return this.name;
}
@Override
public void includeInAll(Boolean includeInAll) {
if (includeInAll != null && defaultMapper != null && (defaultMapper instanceof AllFieldMapper.IncludeInAll)) {
((AllFieldMapper.IncludeInAll) defaultMapper).includeInAll(includeInAll);
}
}
@Override
public void includeInAllIfNotSet(Boolean includeInAll) {
if (includeInAll != null && defaultMapper != null && (defaultMapper instanceof AllFieldMapper.IncludeInAll)) {
((AllFieldMapper.IncludeInAll) defaultMapper).includeInAllIfNotSet(includeInAll);
}
}
public ContentPath.Type pathType() {
return pathType;
}
public Mapper defaultMapper() {
return this.defaultMapper;
}
public ImmutableMap<String, Mapper> mappers() {
return this.mappers;
}
@Override
public void parse(ParseContext context) throws IOException {
ContentPath.Type origPathType = context.path().pathType();
context.path().pathType(pathType);
// do the default mapper without adding the path
if (defaultMapper != null) {
defaultMapper.parse(context);
}
context.path().add(name);
for (Mapper mapper : mappers.values()) {
mapper.parse(context);
}
context.path().remove();
context.path().pathType(origPathType);
}
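`parse` fans a single json value out to every sub mapper: the default mapper indexes it under the plain field name, and each remaining sub mapper indexes it under the dotted path pushed by `context.path().add(name)`. For a multi field `name` with a sub-field `indexed` (names taken from this commit's test mappings), a document such as:

```json
{ "name": "some name" }
```

produces Lucene fields `name` and `name.indexed`, both carrying the same value.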
@Override
public void merge(Mapper mergeWith, MergeContext mergeContext) throws MergeMappingException {
if (!(mergeWith instanceof MultiFieldMapper) && !(mergeWith instanceof AbstractFieldMapper)) {
mergeContext.addConflict("Can't merge a non multi_field / non simple mapping [" + mergeWith.name() + "] with a multi_field mapping [" + name() + "]");
return;
}
synchronized (mutex) {
if (mergeWith instanceof AbstractFieldMapper) {
// it's a single-field mapper being merged in; install it as the default mapper
if (defaultMapper == null) {
if (!mergeContext.mergeFlags().simulate()) {
mergeContext.docMapper().addFieldMappers((FieldMapper) mergeWith); // register the incoming mapper, not the (null) defaultMapper
defaultMapper = mergeWith; // only set & expose it after adding the field mapper
}
}
} else {
MultiFieldMapper mergeWithMultiField = (MultiFieldMapper) mergeWith;
List<FieldMapper> newFieldMappers = null;
MapBuilder<String, Mapper> newMappersBuilder = null;
Mapper newDefaultMapper = null;
// merge the default mapper
if (defaultMapper == null) {
if (mergeWithMultiField.defaultMapper != null) {
if (!mergeContext.mergeFlags().simulate()) {
if (newFieldMappers == null) {
newFieldMappers = new ArrayList<FieldMapper>();
}
newFieldMappers.add((FieldMapper) mergeWithMultiField.defaultMapper); // the incoming default mapper; defaultMapper is null in this branch
newDefaultMapper = mergeWithMultiField.defaultMapper;
}
}
} else {
if (mergeWithMultiField.defaultMapper != null) {
defaultMapper.merge(mergeWithMultiField.defaultMapper, mergeContext);
}
}
// merge all the other mappers
for (Mapper mergeWithMapper : mergeWithMultiField.mappers.values()) {
Mapper mergeIntoMapper = mappers.get(mergeWithMapper.name());
if (mergeIntoMapper == null) {
// no mapping, simply add it if not simulating
if (!mergeContext.mergeFlags().simulate()) {
// disable include_in_all on this sub mapper; only the default mapper is included in _all
if (mergeWithMapper instanceof AllFieldMapper.IncludeInAll) {
((AllFieldMapper.IncludeInAll) mergeWithMapper).includeInAll(false);
}
if (newMappersBuilder == null) {
newMappersBuilder = newMapBuilder(mappers);
}
newMappersBuilder.put(mergeWithMapper.name(), mergeWithMapper);
if (mergeWithMapper instanceof AbstractFieldMapper) {
if (newFieldMappers == null) {
newFieldMappers = new ArrayList<FieldMapper>();
}
newFieldMappers.add((FieldMapper) mergeWithMapper);
}
}
} else {
mergeIntoMapper.merge(mergeWithMapper, mergeContext);
}
}
// first add all field mappers
if (newFieldMappers != null && !newFieldMappers.isEmpty()) {
mergeContext.docMapper().addFieldMappers(newFieldMappers);
}
// now publish mappers
if (newDefaultMapper != null) {
defaultMapper = newDefaultMapper;
}
if (newMappersBuilder != null) {
mappers = newMappersBuilder.immutableMap();
}
}
}
}
@Override
public void close() {
if (defaultMapper != null) {
defaultMapper.close();
}
for (Mapper mapper : mappers.values()) {
mapper.close();
}
}
@Override
public void traverse(FieldMapperListener fieldMapperListener) {
if (defaultMapper != null) {
defaultMapper.traverse(fieldMapperListener);
}
for (Mapper mapper : mappers.values()) {
mapper.traverse(fieldMapperListener);
}
}
@Override
public void traverse(ObjectMapperListener objectMapperListener) {
}
@Override
public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
builder.startObject(name);
builder.field("type", CONTENT_TYPE);
if (pathType != Defaults.PATH_TYPE) {
builder.field("path", pathType.name().toLowerCase(Locale.ROOT));
}
builder.startObject("fields");
if (defaultMapper != null) {
defaultMapper.toXContent(builder, params);
}
if (mappers.size() <= 1) {
for (Mapper mapper : mappers.values()) {
mapper.toXContent(builder, params);
}
} else {
// sort the mappers (by name) if there is more than one mapping
TreeMap<String, Mapper> sortedMappers = new TreeMap<String, Mapper>(mappers);
for (Mapper mapper : sortedMappers.values()) {
mapper.toXContent(builder, params);
}
}
builder.endObject();
builder.endObject();
return builder;
}
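`toXContent` writes the default mapper first and then the sub mappers, sorted by name whenever there is more than one, so the serialized mapping is stable. A sketch of the shape it emits (sub-field names illustrative; `path` appears only when it differs from the default):

```json
{
  "name": {
    "type": "multi_field",
    "fields": {
      "name": { "type": "string" },
      "alpha": { "type": "string" },
      "beta": { "type": "string" }
    }
  }
}
```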
}


@ -38,7 +38,6 @@ import org.elasticsearch.index.mapper.ParseContext.Document;
import org.elasticsearch.index.mapper.internal.AllFieldMapper;
import org.elasticsearch.index.mapper.internal.TypeFieldMapper;
import org.elasticsearch.index.mapper.internal.UidFieldMapper;
import org.elasticsearch.index.mapper.multifield.MultiFieldMapper;
import java.io.IOException;
import java.util.*;
@ -242,8 +241,6 @@ public class ObjectMapper implements Mapper, AllFieldMapper.IncludeInAll {
// lets see if we can derive this...
if (propNode.get("properties") != null) {
type = ObjectMapper.CONTENT_TYPE;
} else if (propNode.get("fields") != null) {
type = MultiFieldMapper.CONTENT_TYPE;
} else if (propNode.size() == 1 && propNode.get("enabled") != null) {
// if there is a single property with the enabled flag on it, make it an object
// (usually, setting enabled to false to not index any type, including core values, which
@ -341,6 +338,17 @@ public class ObjectMapper implements Mapper, AllFieldMapper.IncludeInAll {
}
}
@Override
public void unsetIncludeInAll() {
includeInAll = null;
// when called from outside, apply this on all the inner mappers
for (ObjectObjectCursor<String, Mapper> cursor : mappers) {
if (cursor.value instanceof AllFieldMapper.IncludeInAll) {
((AllFieldMapper.IncludeInAll) cursor.value).unsetIncludeInAll();
}
}
}
public Nested nested() {
return this.nested;
}
@ -845,21 +853,7 @@ public class ObjectMapper implements Mapper, AllFieldMapper.IncludeInAll {
mergeWithMapper.traverse(newObjectMappers);
}
} else {
if ((mergeWithMapper instanceof MultiFieldMapper) && !(mergeIntoMapper instanceof MultiFieldMapper)) {
MultiFieldMapper mergeWithMultiField = (MultiFieldMapper) mergeWithMapper;
mergeWithMultiField.merge(mergeIntoMapper, mergeContext);
if (!mergeContext.mergeFlags().simulate()) {
mappersToPut.add(mergeWithMultiField);
// now, record mappers to traverse events for all mappers
// we don't just traverse mergeWithMultiField as we already have the default handler
for (Mapper mapper : mergeWithMultiField.mappers().values()) {
mapper.traverse(newFieldMappers);
mapper.traverse(newObjectMappers);
}
}
} else {
mergeIntoMapper.merge(mergeWithMapper, mergeContext);
}
mergeIntoMapper.merge(mergeWithMapper, mergeContext);
}
}
if (!newFieldMappers.mappers.isEmpty()) {


@ -24,15 +24,20 @@ import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.DocumentMapperParser;
import org.elasticsearch.index.mapper.FieldMapper;
import org.elasticsearch.index.mapper.MapperTestUtils;
import org.elasticsearch.index.mapper.ParseContext.Document;
import org.elasticsearch.index.mapper.core.DateFieldMapper;
import org.elasticsearch.index.mapper.core.LongFieldMapper;
import org.elasticsearch.index.mapper.core.StringFieldMapper;
import org.elasticsearch.index.mapper.core.TokenCountFieldMapper;
import org.elasticsearch.test.ElasticsearchTestCase;
import org.junit.Test;
import static org.elasticsearch.common.io.Streams.copyToBytesFromClasspath;
import static org.elasticsearch.common.io.Streams.copyToStringFromClasspath;
import static org.elasticsearch.index.mapper.MapperBuilders.*;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.*;
/**
*
@ -40,8 +45,18 @@ import static org.hamcrest.Matchers.equalTo;
public class MultiFieldTests extends ElasticsearchTestCase {
@Test
public void testMultiField() throws Exception {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/test-mapping.json");
public void testMultiField_multiFieldType() throws Exception {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/test-multi-field-type.json");
testMultiField(mapping);
}
@Test
public void testMultiField_multiFields() throws Exception {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/test-multi-fields.json");
testMultiField(mapping);
}
private void testMultiField(String mapping) throws Exception {
DocumentMapper docMapper = MapperTestUtils.newParser().parse(mapping);
BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/multifield/test-data.json"));
Document doc = docMapper.parse(json).rootDoc();
@ -70,6 +85,46 @@ public class MultiFieldTests extends ElasticsearchTestCase {
f = doc.getField("object1.multi1.string");
assertThat(f.name(), equalTo("object1.multi1.string"));
assertThat(f.stringValue(), equalTo("2010-01-01"));
assertThat(docMapper.mappers().fullName("name").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().stored(), equalTo(true));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().tokenized(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name.indexed").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper().fieldType().stored(), equalTo(false));
assertThat(docMapper.mappers().fullName("name.indexed").mapper().fieldType().tokenized(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper().fieldType().indexed(), equalTo(false));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper().fieldType().stored(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper().fieldType().tokenized(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.test1").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.test1").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name.test1").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.test1").mapper().fieldType().stored(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.test1").mapper().fieldType().tokenized(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.test1").mapper().fieldDataType().getLoading(), equalTo(FieldMapper.Loading.EAGER));
assertThat(docMapper.mappers().fullName("name.test2").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.test2").mapper(), instanceOf(TokenCountFieldMapper.class));
assertThat(docMapper.mappers().fullName("name.test2").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.test2").mapper().fieldType().stored(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.test2").mapper().fieldType().tokenized(), equalTo(false));
assertThat(((TokenCountFieldMapper) docMapper.mappers().fullName("name.test2").mapper()).analyzer(), equalTo("simple"));
assertThat(docMapper.mappers().fullName("object1.multi1").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("object1.multi1").mapper(), instanceOf(DateFieldMapper.class));
assertThat(docMapper.mappers().fullName("object1.multi1.string").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("object1.multi1.string").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("object1.multi1.string").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("object1.multi1.string").mapper().fieldType().tokenized(), equalTo(false));
}
@Test
@ -77,10 +132,9 @@ public class MultiFieldTests extends ElasticsearchTestCase {
DocumentMapperParser mapperParser = MapperTestUtils.newParser();
DocumentMapper builderDocMapper = doc("test", rootObject("person").add(
multiField("name")
.add(stringField("name").store(true))
.add(stringField("indexed").index(true).tokenized(true))
.add(stringField("not_indexed").index(false).store(true))
stringField("name").store(true)
.addMultiField(stringField("indexed").index(true).tokenized(true))
.addMultiField(stringField("not_indexed").index(false).store(true))
)).build(mapperParser);
builderDocMapper.refreshSource();
@ -102,6 +156,7 @@ public class MultiFieldTests extends ElasticsearchTestCase {
f = doc.getField("name.indexed");
assertThat(f.name(), equalTo("name.indexed"));
assertThat(f.stringValue(), equalTo("some name"));
assertThat(f.fieldType().tokenized(), equalTo(true));
assertThat(f.fieldType().stored(), equalTo(false));
assertThat(f.fieldType().indexed(), equalTo(true));
@ -111,4 +166,74 @@ public class MultiFieldTests extends ElasticsearchTestCase {
assertThat(f.fieldType().stored(), equalTo(true));
assertThat(f.fieldType().indexed(), equalTo(false));
}
@Test
public void testConvertMultiFieldNoDefaultField() throws Exception {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/test-multi-field-type-no-default-field.json");
DocumentMapper docMapper = MapperTestUtils.newParser().parse(mapping);
BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/multifield/test-data.json"));
Document doc = docMapper.parse(json).rootDoc();
assertNull(doc.getField("name"));
IndexableField f = doc.getField("name.indexed");
assertThat(f.name(), equalTo("name.indexed"));
assertThat(f.stringValue(), equalTo("some name"));
assertThat(f.fieldType().stored(), equalTo(false));
assertThat(f.fieldType().indexed(), equalTo(true));
f = doc.getField("name.not_indexed");
assertThat(f.name(), equalTo("name.not_indexed"));
assertThat(f.stringValue(), equalTo("some name"));
assertThat(f.fieldType().stored(), equalTo(true));
assertThat(f.fieldType().indexed(), equalTo(false));
assertThat(docMapper.mappers().fullName("name").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().indexed(), equalTo(false));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().stored(), equalTo(false));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().tokenized(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name.indexed").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper().fieldType().stored(), equalTo(false));
assertThat(docMapper.mappers().fullName("name.indexed").mapper().fieldType().tokenized(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), instanceOf(StringFieldMapper.class));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper().fieldType().indexed(), equalTo(false));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper().fieldType().stored(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper().fieldType().tokenized(), equalTo(true));
assertNull(doc.getField("age"));
f = doc.getField("age.not_stored");
assertThat(f.name(), equalTo("age.not_stored"));
assertThat(f.numericValue(), equalTo((Number) 28L));
assertThat(f.fieldType().stored(), equalTo(false));
assertThat(f.fieldType().indexed(), equalTo(true));
f = doc.getField("age.stored");
assertThat(f.name(), equalTo("age.stored"));
assertThat(f.numericValue(), equalTo((Number) 28L));
assertThat(f.fieldType().stored(), equalTo(true));
assertThat(f.fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("age").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("age").mapper(), instanceOf(LongFieldMapper.class));
assertThat(docMapper.mappers().fullName("age").mapper().fieldType().indexed(), equalTo(false));
assertThat(docMapper.mappers().fullName("age").mapper().fieldType().stored(), equalTo(false));
assertThat(docMapper.mappers().fullName("age").mapper().fieldType().tokenized(), equalTo(false));
assertThat(docMapper.mappers().fullName("age.not_stored").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("age.not_stored").mapper(), instanceOf(LongFieldMapper.class));
assertThat(docMapper.mappers().fullName("age.not_stored").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("age.not_stored").mapper().fieldType().stored(), equalTo(false));
assertThat(docMapper.mappers().fullName("age.not_stored").mapper().fieldType().tokenized(), equalTo(false));
assertThat(docMapper.mappers().fullName("age.stored").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("age.stored").mapper(), instanceOf(LongFieldMapper.class));
assertThat(docMapper.mappers().fullName("age.stored").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("age.stored").mapper().fieldType().stored(), equalTo(true));
assertThat(docMapper.mappers().fullName("age.stored").mapper().fieldType().tokenized(), equalTo(false));
}
}
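The conversion exercised by `testConvertMultiFieldNoDefaultField` follows the upgrade rule described in this commit: when a `multi_field` has no sub-field matching the outer name, a non-indexed default field is synthesized (`index=no`). A sketch of what the upgraded `name` field from that test looks like under the new `fields` syntax (not the exact serialized output):

```json
{
  "name": {
    "type": "string",
    "index": "no",
    "fields": {
      "indexed": { "type": "string", "index": "analyzed" },
      "not_indexed": { "type": "string", "index": "no", "store": "yes" }
    }
  }
}
```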


@ -0,0 +1,168 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.mapper.multifield;
import org.elasticsearch.action.admin.indices.mapping.get.GetMappingsResponse;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.cluster.metadata.MappingMetaData;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.support.XContentMapValues;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.test.ElasticsearchIntegrationTest;
import org.junit.Test;
import java.io.IOException;
import java.util.Map;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.*;
/**
*/
public class MultiFieldsIntegrationTests extends ElasticsearchIntegrationTest {
@Test
public void testMultiFields() throws Exception {
assertAcked(
client().admin().indices().prepareCreate("my-index")
.addMapping("my-type", createTypeSource())
);
GetMappingsResponse getMappingsResponse = client().admin().indices().prepareGetMappings("my-index").get();
MappingMetaData mappingMetaData = getMappingsResponse.mappings().get("my-index").get("my-type");
assertThat(mappingMetaData, not(nullValue()));
Map<String, Object> mappingSource = mappingMetaData.sourceAsMap();
assertThat(((Map) XContentMapValues.extractValue("properties.title.fields", mappingSource)).size(), equalTo(1));
client().prepareIndex("my-index", "my-type", "1")
.setSource("title", "Multi fields")
.setRefresh(true)
.get();
SearchResponse searchResponse = client().prepareSearch("my-index")
.setQuery(QueryBuilders.matchQuery("title", "multi"))
.get();
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
searchResponse = client().prepareSearch("my-index")
.setQuery(QueryBuilders.matchQuery("title.not_analyzed", "Multi fields"))
.get();
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
assertAcked(
client().admin().indices().preparePutMapping("my-index").setType("my-type")
.setSource(createPutMappingSource())
.setIgnoreConflicts(true) // If updated with multi-field type, we need to ignore failures.
);
getMappingsResponse = client().admin().indices().prepareGetMappings("my-index").get();
mappingMetaData = getMappingsResponse.mappings().get("my-index").get("my-type");
assertThat(mappingMetaData, not(nullValue()));
mappingSource = mappingMetaData.sourceAsMap();
assertThat(((Map) XContentMapValues.extractValue("properties.title", mappingSource)).size(), equalTo(2));
assertThat(((Map) XContentMapValues.extractValue("properties.title.fields", mappingSource)).size(), equalTo(2));
searchResponse = client().prepareSearch("my-index")
.setQuery(QueryBuilders.matchQuery("title.uncased", "multi"))
.get();
assertThat(searchResponse.getHits().totalHits(), equalTo(0l));
searchResponse = client().prepareSearch("my-index")
.setQuery(QueryBuilders.matchQuery("title.uncased", "Multi"))
.get();
assertThat(searchResponse.getHits().totalHits(), equalTo(0l));
client().prepareIndex("my-index", "my-type", "1")
.setSource("title", "Multi fields")
.setRefresh(true)
.get();
searchResponse = client().prepareSearch("my-index")
.setQuery(QueryBuilders.matchQuery("title.uncased", "Multi"))
.get();
assertThat(searchResponse.getHits().totalHits(), equalTo(1l));
}
private XContentBuilder createTypeSource() throws IOException {
if (randomBoolean()) {
return XContentFactory.jsonBuilder().startObject().startObject("my-type")
.startObject("properties")
.startObject("title")
.field("type", "string")
.startObject("fields")
.startObject("not_analyzed")
.field("type", "string")
.field("index", "not_analyzed")
.endObject()
.endObject()
.endObject()
.endObject()
.endObject().endObject();
} else {
return XContentFactory.jsonBuilder().startObject().startObject("my-type")
.startObject("properties")
.startObject("title")
.field("type", "multi_field")
.startObject("fields")
.startObject("title")
.field("type", "string")
.endObject()
.startObject("not_analyzed")
.field("type", "string")
.field("index", "not_analyzed")
.endObject()
.endObject()
.endObject()
.endObject()
.endObject().endObject();
}
}
private XContentBuilder createPutMappingSource() throws IOException {
if (randomBoolean()) {
return XContentFactory.jsonBuilder().startObject().startObject("my-type")
.startObject("properties")
.startObject("title")
.field("type", "string")
.startObject("fields")
.startObject("uncased")
.field("type", "string")
.field("analyzer", "whitespace")
.endObject()
.endObject()
.endObject()
.endObject()
.endObject().endObject();
} else {
return XContentFactory.jsonBuilder().startObject().startObject("my-type")
.startObject("properties")
.startObject("title")
.field("type", "multi_field")
.startObject("fields")
.startObject("uncased")
.field("type", "string")
.field("analyzer", "whitespace")
.endObject()
.endObject()
.endObject()
.endObject()
.endObject().endObject();
}
}
}


@ -102,6 +102,7 @@ public class JavaMultiFieldMergeTests extends ElasticsearchTestCase {
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping4.json");
DocumentMapper docMapper4 = parser.parse(mapping);
mergeResult = docMapper.merge(docMapper4, mergeFlags().simulate(true));
assertThat(Arrays.toString(mergeResult.conflicts()), mergeResult.hasConflicts(), equalTo(false));
@ -115,4 +116,84 @@ public class JavaMultiFieldMergeTests extends ElasticsearchTestCase {
assertThat(docMapper.mappers().fullName("name.not_indexed2").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed3").mapper(), notNullValue());
}
@Test
public void testUpgradeFromMultiFieldTypeToMultiFields() throws Exception {
String mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-mapping1.json");
DocumentMapperParser parser = MapperTestUtils.newParser();
DocumentMapper docMapper = parser.parse(mapping);
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed"), nullValue());
BytesReference json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-data.json"));
Document doc = docMapper.parse(json).rootDoc();
IndexableField f = doc.getField("name");
assertThat(f, notNullValue());
f = doc.getField("name.indexed");
assertThat(f, nullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade1.json");
DocumentMapper docMapper2 = parser.parse(mapping);
DocumentMapper.MergeResult mergeResult = docMapper.merge(docMapper2, mergeFlags().simulate(true));
assertThat(Arrays.toString(mergeResult.conflicts()), mergeResult.hasConflicts(), equalTo(false));
docMapper.merge(docMapper2, mergeFlags().simulate(false));
assertThat(docMapper.mappers().name("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed2"), nullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed3"), nullValue());
json = new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/test-data.json"));
doc = docMapper.parse(json).rootDoc();
f = doc.getField("name");
assertThat(f, notNullValue());
f = doc.getField("name.indexed");
assertThat(f, notNullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade2.json");
DocumentMapper docMapper3 = parser.parse(mapping);
mergeResult = docMapper.merge(docMapper3, mergeFlags().simulate(true));
assertThat(Arrays.toString(mergeResult.conflicts()), mergeResult.hasConflicts(), equalTo(false));
docMapper.merge(docMapper3, mergeFlags().simulate(false));
assertThat(docMapper.mappers().name("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed2").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed3"), nullValue());
mapping = copyToStringFromClasspath("/org/elasticsearch/index/mapper/multifield/merge/upgrade3.json");
DocumentMapper docMapper4 = parser.parse(mapping);
mergeResult = docMapper.merge(docMapper4, mergeFlags().simulate(true));
assertThat(Arrays.toString(mergeResult.conflicts()), mergeResult.hasConflicts(), equalTo(true));
assertThat(mergeResult.conflicts()[0], equalTo("mapper [name] has different index values"));
assertThat(mergeResult.conflicts()[1], equalTo("mapper [name] has different store values"));
mergeResult = docMapper.merge(docMapper4, mergeFlags().simulate(false));
assertThat(Arrays.toString(mergeResult.conflicts()), mergeResult.hasConflicts(), equalTo(true));
assertThat(docMapper.mappers().name("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(mergeResult.conflicts()[0], equalTo("mapper [name] has different index values"));
assertThat(mergeResult.conflicts()[1], equalTo("mapper [name] has different store values"));
// There are conflicts, but `name.not_indexed3` has been added, because that field has no conflicts
assertThat(docMapper.mappers().fullName("name").mapper().fieldType().indexed(), equalTo(true));
assertThat(docMapper.mappers().fullName("name.indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed2").mapper(), notNullValue());
assertThat(docMapper.mappers().fullName("name.not_indexed3").mapper(), notNullValue());
}
}


@ -1,22 +1,24 @@
{
person:{
properties:{
"person" :{
"properties" :{
"name":{
type:"multi_field",
"type" :"string",
"index" :"analyzed",
"store" :"yes",
"fields":{
"name":{
type:"string",
index:"analyzed",
store:"yes"
"type" :"string",
"index" :"analyzed",
"store" :"yes"
},
"indexed":{
type:"string",
index:"analyzed"
"type" :"string",
"index" :"analyzed"
},
"not_indexed":{
type:"string",
index:"no",
store:"yes"
"type" :"string",
"index" :"no",
"store" :"yes"
}
}
}


@@ -1,13 +1,15 @@
 {
-    person:{
-        properties:{
-            "name":{
-                type:"multi_field",
-                "fields":{
-                    "name":{
-                        type:"string",
-                        index:"analyzed",
-                        store:"yes"
+    "person" : {
+        "properties" :{
+            "name" : {
+                "type" : "string",
+                "index" : "analyzed",
+                "store" : "yes",
+                "fields": {
+                    "name" : {
+                        "type" : "string",
+                        "index" : "analyzed",
+                        "store" : "yes"
                     },
                     "indexed":{
                         type:"string",


@@ -2,7 +2,9 @@
     person:{
         properties:{
             "name":{
-                type:"multi_field",
+                type:"string",
+                index:"analyzed",
+                store:"yes",
                 "fields":{
                     "not_indexed3":{
                         type:"string",


@@ -0,0 +1,25 @@
+{
+    person:{
+        properties:{
+            "name":{
+                type:"multi_field",
+                "fields":{
+                    "name":{
+                        type:"string",
+                        index:"analyzed",
+                        store:"yes"
+                    },
+                    "indexed":{
+                        type:"string",
+                        index:"analyzed"
+                    },
+                    "not_indexed":{
+                        type:"string",
+                        index:"no",
+                        store:"yes"
+                    }
+                }
+            }
+        }
+    }
+}


@@ -0,0 +1,30 @@
+{
+    person:{
+        properties:{
+            "name":{
+                type:"multi_field",
+                "fields":{
+                    "name":{
+                        type:"string",
+                        index:"analyzed",
+                        store:"yes"
+                    },
+                    "indexed":{
+                        type:"string",
+                        index:"analyzed"
+                    },
+                    "not_indexed":{
+                        type:"string",
+                        index:"no",
+                        store:"yes"
+                    },
+                    "not_indexed2":{
+                        type:"string",
+                        index:"no",
+                        store:"yes"
+                    }
+                }
+            }
+        }
+    }
+}


@@ -0,0 +1,16 @@
+{
+    person:{
+        properties:{
+            "name":{
+                type:"multi_field",
+                "fields":{
+                    "not_indexed3":{
+                        type:"string",
+                        index:"no",
+                        store:"yes"
+                    }
+                }
+            }
+        }
+    }
+}


@@ -1,5 +1,6 @@
 {
     "_id":1,
+    "age":28,
     "name":"some name",
     "object1":{
         "multi1":"2010-01-01"


@@ -0,0 +1,32 @@
+{
+    "person": {
+        "properties": {
+            "name": {
+                "type": "multi_field",
+                "fields": {
+                    "indexed": {
+                        "type": "string",
+                        "index": "analyzed"
+                    },
+                    "not_indexed": {
+                        "type": "string",
+                        "index": "no",
+                        "store": "yes"
+                    }
+                }
+            },
+            "age": {
+                "type": "multi_field",
+                "fields": {
+                    "not_stored": {
+                        "type": "long"
+                    },
+                    "stored": {
+                        "type": "long",
+                        "store": "yes"
+                    }
+                }
+            }
+        }
+    }
+}


@@ -17,6 +17,20 @@
                         "type":"string",
                         "index":"no",
                         "store":"yes"
                     },
+                    "test1" : {
+                        "type":"string",
+                        "index":"analyzed",
+                        "store" : "yes",
+                        "fielddata" : {
+                            "loading" : "eager"
+                        }
+                    },
+                    "test2" : {
+                        "type" : "token_count",
+                        "store" : "yes",
+                        "index" : "not_analyzed",
+                        "analyzer" : "simple"
+                    }
                 }
             },


@@ -0,0 +1,50 @@
+{
+    "person": {
+        "properties": {
+            "name": {
+                "type": "string",
+                "index": "analyzed",
+                "store": "yes",
+                "fields": {
+                    "indexed": {
+                        "type": "string",
+                        "index": "analyzed",
+                        "store": "no"
+                    },
+                    "not_indexed": {
+                        "type": "string",
+                        "index": "no",
+                        "store": "yes"
+                    },
+                    "test1": {
+                        "type": "string",
+                        "index": "analyzed",
+                        "store": "yes",
+                        "fielddata": {
+                            "loading": "eager"
+                        }
+                    },
+                    "test2": {
+                        "type": "token_count",
+                        "index": "not_analyzed",
+                        "store": "yes",
+                        "analyzer": "simple"
+                    }
+                }
+            },
+            "object1": {
+                "properties": {
+                    "multi1": {
+                        "type": "date",
+                        "fields": {
+                            "string": {
+                                "type": "string",
+                                "index": "not_analyzed"
+                            }
+                        }
+                    }
+                }
+            }
+        }
+    }
+}


@@ -415,7 +415,7 @@ public class CompletionSuggestSearchTests extends ElasticsearchIntegrationTest {
     }
 
     @Test
-    public void testThatUpgradeToMultiFieldWorks() throws Exception {
+    public void testThatUpgradeToMultiFieldTypeWorks() throws Exception {
         Settings.Builder settingsBuilder = createDefaultSettings();
         final XContentBuilder mapping = jsonBuilder()
                 .startObject()
@@ -423,6 +423,7 @@ public class CompletionSuggestSearchTests extends ElasticsearchIntegrationTest {
                 .startObject("properties")
                 .startObject(FIELD)
                 .field("type", "string")
+                .field("path", "just_name") // The path can't be changed / upgraded
                 .endObject()
                 .endObject()
                 .endObject()
@@ -460,6 +461,51 @@ public class CompletionSuggestSearchTests extends ElasticsearchIntegrationTest {
         assertSuggestions(afterReindexingResponse, "suggs", "Foo Fighters");
     }
 
+    @Test
+    public void testThatUpgradeToMultiFieldsWorks() throws Exception {
+        Settings.Builder settingsBuilder = createDefaultSettings();
+        final XContentBuilder mapping = jsonBuilder()
+                .startObject()
+                .startObject(TYPE)
+                .startObject("properties")
+                .startObject(FIELD)
+                .field("type", "string")
+                .field("path", "just_name") // The path can't be changed / upgraded
+                .endObject()
+                .endObject()
+                .endObject()
+                .endObject();
+        client().admin().indices().prepareCreate(INDEX).addMapping(TYPE, mapping).setSettings(settingsBuilder).get();
+        ensureYellow();
+        client().prepareIndex(INDEX, TYPE, "1").setRefresh(true).setSource(jsonBuilder().startObject().field(FIELD, "Foo Fighters").endObject()).get();
+
+        PutMappingResponse putMappingResponse = client().admin().indices().preparePutMapping(INDEX).setType(TYPE).setSource(jsonBuilder().startObject()
+                .startObject(TYPE).startObject("properties")
+                .startObject(FIELD)
+                .field("type", "string")
+                .startObject("fields")
+                .startObject("suggest").field("type", "completion").field("index_analyzer", "simple").field("search_analyzer", "simple").endObject()
+                .endObject()
+                .endObject()
+                .endObject().endObject()
+                .endObject())
+                .get();
+        assertThat(putMappingResponse.isAcknowledged(), is(true));
+
+        SuggestResponse suggestResponse = client().prepareSuggest(INDEX).addSuggestion(
+                new CompletionSuggestionBuilder("suggs").field("suggest").text("f").size(10)
+        ).execute().actionGet();
+        assertSuggestions(suggestResponse, "suggs");
+
+        client().prepareIndex(INDEX, TYPE, "1").setRefresh(true).setSource(jsonBuilder().startObject().field(FIELD, "Foo Fighters").endObject()).get();
+        waitForRelocation(ClusterHealthStatus.GREEN);
+
+        SuggestResponse afterReindexingResponse = client().prepareSuggest(INDEX).addSuggestion(
+                new CompletionSuggestionBuilder("suggs").field("suggest").text("f").size(10)
+        ).execute().actionGet();
+        assertSuggestions(afterReindexingResponse, "suggs", "Foo Fighters");
+    }
+
     @Test
     public void testThatFuzzySuggesterWorks() throws Exception {
         createIndexAndMapping(completionMappingBuilder);


@@ -39,6 +39,7 @@ import org.elasticsearch.index.codec.postingsformat.Elasticsearch090PostingsForm
 import org.elasticsearch.index.codec.postingsformat.PostingsFormatProvider;
 import org.elasticsearch.index.codec.postingsformat.PreBuiltPostingsFormatProvider;
 import org.elasticsearch.index.mapper.FieldMapper.Names;
+import org.elasticsearch.index.mapper.core.AbstractFieldMapper;
 import org.elasticsearch.index.mapper.core.CompletionFieldMapper;
 import org.elasticsearch.search.suggest.SuggestUtils;
 import org.elasticsearch.search.suggest.completion.Completion090PostingsFormat.LookupFactory;
@@ -70,7 +71,7 @@ public class CompletionPostingsFormatTest extends ElasticsearchTestCase {
         LookupFactory load = currentProvider.load(input);
         PostingsFormatProvider format = new PreBuiltPostingsFormatProvider(new Elasticsearch090PostingsFormat());
         NamedAnalyzer analyzer = new NamedAnalyzer("foo", new StandardAnalyzer(TEST_VERSION_CURRENT));
-        Lookup lookup = load.getLookup(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE), new CompletionSuggestionContext(null));
+        Lookup lookup = load.getLookup(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE, AbstractFieldMapper.MultiFields.empty()), new CompletionSuggestionContext(null));
         List<LookupResult> result = lookup.lookup("ge", false, 10);
         assertThat(result.get(0).key.toString(), equalTo("Generator - Foo Fighters"));
         assertThat(result.get(0).payload.utf8ToString(), equalTo("id:10"));
@@ -89,7 +90,7 @@ public class CompletionPostingsFormatTest extends ElasticsearchTestCase {
         LookupFactory load = currentProvider.load(input);
         PostingsFormatProvider format = new PreBuiltPostingsFormatProvider(new Elasticsearch090PostingsFormat());
         NamedAnalyzer analyzer = new NamedAnalyzer("foo", new StandardAnalyzer(TEST_VERSION_CURRENT));
-        AnalyzingCompletionLookupProvider.AnalyzingSuggestHolder analyzingSuggestHolder = load.getAnalyzingSuggestHolder(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE));
+        AnalyzingCompletionLookupProvider.AnalyzingSuggestHolder analyzingSuggestHolder = load.getAnalyzingSuggestHolder(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE, AbstractFieldMapper.MultiFields.empty()));
         assertThat(analyzingSuggestHolder.sepLabel, is(AnalyzingCompletionLookupProviderV1.SEP_LABEL));
         assertThat(analyzingSuggestHolder.payloadSep, is(AnalyzingCompletionLookupProviderV1.PAYLOAD_SEP));
         assertThat(analyzingSuggestHolder.endByte, is(AnalyzingCompletionLookupProviderV1.END_BYTE));
@@ -107,7 +108,7 @@ public class CompletionPostingsFormatTest extends ElasticsearchTestCase {
         LookupFactory load = currentProvider.load(input);
         PostingsFormatProvider format = new PreBuiltPostingsFormatProvider(new Elasticsearch090PostingsFormat());
         NamedAnalyzer analyzer = new NamedAnalyzer("foo", new StandardAnalyzer(TEST_VERSION_CURRENT));
-        AnalyzingCompletionLookupProvider.AnalyzingSuggestHolder analyzingSuggestHolder = load.getAnalyzingSuggestHolder(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE));
+        AnalyzingCompletionLookupProvider.AnalyzingSuggestHolder analyzingSuggestHolder = load.getAnalyzingSuggestHolder(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE, AbstractFieldMapper.MultiFields.empty()));
         assertThat(analyzingSuggestHolder.sepLabel, is(XAnalyzingSuggester.SEP_LABEL));
         assertThat(analyzingSuggestHolder.payloadSep, is(XAnalyzingSuggester.PAYLOAD_SEP));
         assertThat(analyzingSuggestHolder.endByte, is(XAnalyzingSuggester.END_BYTE));
@@ -206,7 +207,7 @@ public class CompletionPostingsFormatTest extends ElasticsearchTestCase {
         NamedAnalyzer namedAnalzyer = new NamedAnalyzer("foo", new StandardAnalyzer(TEST_VERSION_CURRENT));
         final CompletionFieldMapper mapper = new CompletionFieldMapper(new Names("foo"), namedAnalzyer, namedAnalzyer, provider, null, usePayloads,
-                preserveSeparators, preservePositionIncrements, Integer.MAX_VALUE);
+                preserveSeparators, preservePositionIncrements, Integer.MAX_VALUE, AbstractFieldMapper.MultiFields.empty());
         Lookup buildAnalyzingLookup = buildAnalyzingLookup(mapper, titles, titles, weights);
         Field field = buildAnalyzingLookup.getClass().getDeclaredField("maxAnalyzedPathsForOneInput");
         field.setAccessible(true);
@@ -295,7 +296,7 @@ public class CompletionPostingsFormatTest extends ElasticsearchTestCase {
         LookupFactory load = provider.load(input);
         PostingsFormatProvider format = new PreBuiltPostingsFormatProvider(new Elasticsearch090PostingsFormat());
         NamedAnalyzer analyzer = new NamedAnalyzer("foo", new StandardAnalyzer(TEST_VERSION_CURRENT));
-        assertNull(load.getLookup(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE), new CompletionSuggestionContext(null)));
+        assertNull(load.getLookup(new CompletionFieldMapper(new Names("foo"), analyzer, analyzer, format, null, true, true, true, Integer.MAX_VALUE, AbstractFieldMapper.MultiFields.empty()), new CompletionSuggestionContext(null)));
         dir.close();
     }