Expose proximity boosting (#39385) (#40251)

Expose DistanceFeatureQuery for geo, date and date_nanos types

Closes #33382
Mayya Sharipova 2019-03-20 09:24:41 -04:00 committed by GitHub
parent 4c2a8638ca
commit 49a7c6e0e8
13 changed files with 783 additions and 5 deletions


@ -0,0 +1,177 @@
[[query-dsl-distance-feature-query]]
=== Distance Feature Query
The `distance_feature` query is a specialized query that only works
on <<date, `date`>>, <<date_nanos, `date_nanos`>> or <<geo-point,`geo_point`>>
fields. Its goal is to boost documents' scores based on proximity
to some given origin. For example, use this query if you want to
give more weight to documents with dates closer to a certain date,
or to documents with locations closer to a certain location.
This query is called a `distance_feature` query because it dynamically
calculates distances between the given origin and documents' field values,
and uses these distances as features to boost the documents' scores.
The `distance_feature` query is typically used on its own to find the nearest
neighbors to a given point, or put in a `should` clause of a
<<query-dsl-bool-query,`bool`>> query so that its score is added to the score
of the query.
Compared to using <<query-dsl-function-score-query,`function_score`>> or other
ways to modify the score, this query has the benefit of being able to
efficiently skip non-competitive hits when
<<search-uri-request,`track_total_hits`>> is not set to `true`.
==== Syntax of distance_feature query
The `distance_feature` query has the following syntax:
[source,js]
--------------------------------------------------
"distance_feature": {
"field": <field>,
"origin": <origin>,
"pivot": <pivot>,
"boost" : <boost>
}
--------------------------------------------------
// NOTCONSOLE
[horizontal]
`field`::
Required parameter. Defines the name of the field on which to calculate
distances. Must be a field of type `date`, `date_nanos` or `geo_point`,
and must be indexed (`"index": true`, which is the default) and have
<<doc-values, doc values>> (`"doc_values": true`, which is the default).
`origin`::
Required parameter. Defines a point of origin used for calculating
distances. Must be a date for date and date_nanos fields,
and a geo-point for geo_point fields. Date math (for example `now-1h`) is
supported for a date origin.
`pivot`::
Required parameter. Defines the distance from the origin at which the computed
score will be equal to half of the `boost` parameter. Must be
a `number+date unit` ("1h", "10d",...) for date and date_nanos fields,
and a `number + geo unit` ("1km", "12m",...) for geo fields.
`boost`::
Optional parameter with a default value of `1`. Defines the factor by which
to multiply the score. Must be a non-negative float number.
The `distance_feature` query computes a document's score as follows:
`score = boost * pivot / (pivot + distance)`
where `distance` is the absolute difference between the origin and
a document's field value.
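The formula can be checked with a small standalone sketch (illustrative code only, not part of Elasticsearch; the class and method names here are made up):

```java
// Sketch of the distance_feature scoring formula:
// score = boost * pivot / (pivot + distance)
public class DistanceFeatureScore {

    static double score(double boost, double pivot, double distance) {
        return boost * pivot / (pivot + distance);
    }

    public static void main(String[] args) {
        // At distance == pivot, the score is exactly half of boost.
        System.out.println(score(1.0, 7.0, 7.0)); // 0.5
        // At the origin (distance == 0), the score equals boost.
        System.out.println(score(1.0, 7.0, 0.0)); // 1.0
    }
}
```

Note that the score decays smoothly with distance but never reaches zero, so far-away documents still receive a small positive boost.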
==== Example using distance_feature query
Let's look at an example. We index several documents containing
information about sales items, such as name, production date,
and location.
[source,js]
--------------------------------------------------
PUT items
{
"mappings": {
"properties": {
"name": {
"type": "keyword"
},
"production_date": {
"type": "date"
},
"location": {
"type": "geo_point"
}
}
}
}
PUT items/_doc/1
{
"name" : "chocolate",
"production_date": "2018-02-01",
"location": [-71.34, 41.12]
}
PUT items/_doc/2
{
"name" : "chocolate",
"production_date": "2018-01-01",
"location": [-71.3, 41.15]
}
PUT items/_doc/3
{
"name" : "chocolate",
"production_date": "2017-12-01",
"location": [-71.3, 41.12]
}
POST items/_refresh
--------------------------------------------------
// CONSOLE
We search for all chocolate items, but we also want chocolates
that were produced recently (closer to the date `now`)
to be ranked higher.
[source,js]
--------------------------------------------------
GET items/_search
{
"query": {
"bool": {
"must": {
"match": {
"name": "chocolate"
}
},
"should": {
"distance_feature": {
"field": "production_date",
"pivot": "7d",
"origin": "now"
}
}
}
}
}
--------------------------------------------------
// CONSOLE
// TEST[continued]
Similarly, we can search for all chocolate items, but also want chocolates
that were produced locally (closer to our geo origin)
to come first in the result list.
[source,js]
--------------------------------------------------
GET items/_search
{
"query": {
"bool": {
"must": {
"match": {
"name": "chocolate"
}
},
"should": {
"distance_feature": {
"field": "location",
"pivot": "1000m",
"origin": [-71.3, 41.15]
}
}
}
}
}
--------------------------------------------------
// CONSOLE
// TEST[continued]
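To see why this works, we can rank the three indexed items by the `distance_feature` contribution alone. The sketch below uses a plain haversine distance (illustrative only; Elasticsearch's exact geo distance computation may differ slightly, and the names are made up):

```java
// Rank the indexed items by score = pivot / (pivot + distance) for
// pivot = "1000m" and origin [-71.3, 41.15] (lon, lat).
public class GeoProximityRanking {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Great-circle distance in meters between two (lat, lon) points.
    static double haversineMeters(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // distance_feature contribution with boost = 1 and pivot = 1000m
    static double score(double distanceMeters) {
        return 1000.0 / (1000.0 + distanceMeters);
    }

    public static void main(String[] args) {
        double d1 = haversineMeters(41.15, -71.3, 41.12, -71.34); // item 1
        double d2 = haversineMeters(41.15, -71.3, 41.15, -71.3);  // item 2
        double d3 = haversineMeters(41.15, -71.3, 41.12, -71.3);  // item 3
        // Item 2 sits exactly at the origin, so it gets the full boost.
        System.out.printf("item2=%.3f item3=%.3f item1=%.3f%n",
                score(d2), score(d3), score(d1));
    }
}
```

Item 2 is at the origin and so receives the highest contribution, followed by item 3 and then item 1.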


@ -28,6 +28,12 @@ the specified document.
A query that computes scores based on the values of numeric features and is
able to efficiently skip non-competitive hits.
<<query-dsl-distance-feature-query,`distance_feature` query>>::
A query that computes scores based on dynamically computed distances
between a given origin and documents' date, date_nanos or geo_point fields.
It is able to efficiently skip non-competitive hits.
<<query-dsl-wrapper-query,`wrapper` query>>::
A query that accepts other queries as json or yaml string.
@ -42,4 +48,6 @@ include::percolate-query.asciidoc[]
include::rank-feature-query.asciidoc[]
include::distance-feature-query.asciidoc[]
include::wrapper-query.asciidoc[]


@ -440,6 +440,7 @@ public final class ObjectParser<Value, Context> extends AbstractObjectParser<Val
OBJECT_OR_LONG(START_OBJECT, VALUE_NUMBER),
OBJECT_ARRAY_BOOLEAN_OR_STRING(START_OBJECT, START_ARRAY, VALUE_BOOLEAN, VALUE_STRING),
OBJECT_ARRAY_OR_STRING(START_OBJECT, START_ARRAY, VALUE_STRING),
OBJECT_ARRAY_STRING_OR_NUMBER(START_OBJECT, START_ARRAY, VALUE_STRING, VALUE_NUMBER),
VALUE(VALUE_BOOLEAN, VALUE_NULL, VALUE_EMBEDDED_OBJECT, VALUE_NUMBER, VALUE_STRING),
VALUE_OBJECT_ARRAY(VALUE_BOOLEAN, VALUE_NULL, VALUE_EMBEDDED_OBJECT, VALUE_NUMBER, VALUE_STRING, START_OBJECT, START_ARRAY),
VALUE_ARRAY(VALUE_BOOLEAN, VALUE_NULL, VALUE_NUMBER, VALUE_STRING, START_ARRAY);


@ -0,0 +1,87 @@
setup:
- skip:
version: " - 7.0.99"
reason: "Implemented in 7.1"
- do:
indices.create:
index: index1
body:
settings:
number_of_replicas: 0
mappings:
properties:
my_date:
type: date
my_date_nanos:
type: date_nanos
my_geo:
type: geo_point
- do:
bulk:
refresh: true
body:
- '{ "index" : { "_index" : "index1", "_id" : "1" } }'
- '{ "my_date": "2018-02-01T10:00:00Z", "my_date_nanos": "2018-02-01T00:00:00.223456789Z", "my_geo": [-71.34, 41.13] }'
- '{ "index" : { "_index" : "index1", "_id" : "2" } }'
- '{ "my_date": "2018-02-01T11:00:00Z", "my_date_nanos": "2018-02-01T00:00:00.123456789Z", "my_geo": [-71.34, 41.14] }'
- '{ "index" : { "_index" : "index1", "_id" : "3" } }'
- '{ "my_date": "2018-02-01T09:00:00Z", "my_date_nanos": "2018-02-01T00:00:00.323456789Z", "my_geo": [-71.34, 41.12] }'
---
"test distance_feature query on date type":
- do:
search:
rest_total_hits_as_int: true
index: index1
body:
query:
distance_feature:
field: my_date
pivot: 1h
origin: 2018-02-01T08:00:30Z
- length: { hits.hits: 3 }
- match: { hits.hits.0._id: "3" }
- match: { hits.hits.1._id: "1" }
- match: { hits.hits.2._id: "2" }
---
"test distance_feature query on date_nanos type":
- do:
search:
rest_total_hits_as_int: true
index: index1
body:
query:
distance_feature:
field: my_date_nanos
pivot: 100000000nanos
origin: 2018-02-01T00:00:00.323456789Z
- length: { hits.hits: 3 }
- match: { hits.hits.0._id: "3" }
- match: { hits.hits.1._id: "1" }
- match: { hits.hits.2._id: "2" }
---
"test distance_feature query on geo_point type":
- do:
search:
rest_total_hits_as_int: true
index: index1
body:
query:
distance_feature:
field: my_geo
pivot: 1km
origin: [-71.35, 41.12]
- length: { hits.hits: 3 }
- match: { hits.hits.0._id: "3" }
- match: { hits.hits.1._id: "1" }
- match: { hits.hits.2._id: "2" }
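The expected orderings in the tests above all follow from ranking by absolute distance to the origin. For the date-type test, this can be reproduced with a standalone sketch (illustrative names, not Elasticsearch code):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Rank document ids by |origin - my_date|; a smaller temporal distance
// means a higher distance_feature score.
public class DateProximityOrder {

    public static List<String> order(Instant origin, Map<String, Instant> docs) {
        List<String> ids = new ArrayList<>(docs.keySet());
        ids.sort(Comparator.comparingLong(
                id -> Math.abs(Duration.between(origin, docs.get(id)).toMillis())));
        return ids;
    }

    public static void main(String[] args) {
        Map<String, Instant> docs = new HashMap<>();
        docs.put("1", Instant.parse("2018-02-01T10:00:00Z"));
        docs.put("2", Instant.parse("2018-02-01T11:00:00Z"));
        docs.put("3", Instant.parse("2018-02-01T09:00:00Z"));
        // Same order the REST test asserts: 3, 1, 2
        System.out.println(order(Instant.parse("2018-02-01T08:00:30Z"), docs));
    }
}
```

Document 3 is 59.5 minutes from the origin, document 1 is 119.5 minutes away, and document 2 is 179.5 minutes away, giving the asserted order 3, 1, 2.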


@ -545,6 +545,27 @@ public class GeoUtils {
}
}
/**
* Parse a {@link GeoPoint} from a string. The string must have one of the following forms:
*
* <ul>
* <li>Latitude, Longitude form: <pre>&quot;<i>&lt;latitude&gt;</i>,<i>&lt;longitude&gt;</i>&quot;</pre></li>
* <li>Geohash form: <pre>&quot;<i>&lt;geohash&gt;</i>&quot;</pre></li>
* </ul>
*
* @param val a String to parse the value from
* @return new parsed {@link GeoPoint}
*/
public static GeoPoint parseFromString(String val) {
GeoPoint point = new GeoPoint();
boolean ignoreZValue = false;
if (val.contains(",")) {
return point.resetFromString(val, ignoreZValue);
} else {
return parseGeoHash(point, val, EffectivePoint.BOTTOM_LEFT);
}
}
/**
* Parse a precision that can be expressed as an integer or a distance measure like "1km", "10m".
*


@ -308,6 +308,10 @@ public final class DateFieldMapper extends FieldMapper {
return dateTimeFormatter;
}
public Resolution resolution() {
return resolution;
}
void setDateTimeFormatter(DateFormatter formatter) {
checkIfFrozen();
this.dateTimeFormatter = formatter;


@ -0,0 +1,231 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.query;
import org.apache.lucene.document.LatLonPoint;
import org.apache.lucene.document.LongPoint;
import org.apache.lucene.search.Query;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.ParsingException;
import org.elasticsearch.common.geo.GeoPoint;
import org.elasticsearch.common.geo.GeoUtils;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.lucene.search.Queries;
import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.common.xcontent.ConstructingObjectParser;
import org.elasticsearch.common.xcontent.ObjectParser;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.index.mapper.DateFieldMapper;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.GeoPointFieldMapper.GeoPointFieldType;
import org.elasticsearch.index.mapper.DateFieldMapper.DateFieldType;
import java.io.IOException;
import java.util.Objects;
import static org.elasticsearch.common.xcontent.ConstructingObjectParser.constructorArg;
/**
* A query to boost scores based on their proximity to the given origin
* for date, date_nanos and geo_point field types
*/
public class DistanceFeatureQueryBuilder extends AbstractQueryBuilder<DistanceFeatureQueryBuilder> {
public static final String NAME = "distance_feature";
private static final ParseField FIELD_FIELD = new ParseField("field");
private static final ParseField ORIGIN_FIELD = new ParseField("origin");
private static final ParseField PIVOT_FIELD = new ParseField("pivot");
private final String field;
private final Origin origin;
private final String pivot;
private static final ConstructingObjectParser<DistanceFeatureQueryBuilder, Void> PARSER = new ConstructingObjectParser<>(
"distance_feature", false,
args -> new DistanceFeatureQueryBuilder((String) args[0], (Origin) args[1], (String) args[2])
);
static {
PARSER.declareString(constructorArg(), FIELD_FIELD);
// origin: number or string for date and date_nanos fields; string, array, object for geo fields
PARSER.declareField(constructorArg(), DistanceFeatureQueryBuilder.Origin::originFromXContent,
ORIGIN_FIELD, ObjectParser.ValueType.OBJECT_ARRAY_STRING_OR_NUMBER);
PARSER.declareString(constructorArg(), PIVOT_FIELD);
declareStandardFields(PARSER);
}
public DistanceFeatureQueryBuilder(String field, Origin origin, String pivot) {
this.field = Objects.requireNonNull(field);
this.origin = Objects.requireNonNull(origin);
this.pivot = Objects.requireNonNull(pivot);
}
public static DistanceFeatureQueryBuilder fromXContent(XContentParser parser) {
return PARSER.apply(parser, null);
}
@Override
protected void doXContent(XContentBuilder builder, Params params) throws IOException {
builder.startObject(NAME);
builder.field(FIELD_FIELD.getPreferredName(), field);
builder.field(ORIGIN_FIELD.getPreferredName(), origin.origin);
builder.field(PIVOT_FIELD.getPreferredName(), pivot);
printBoostAndQueryName(builder);
builder.endObject();
}
public DistanceFeatureQueryBuilder(StreamInput in) throws IOException {
super(in);
field = in.readString();
origin = new Origin(in);
pivot = in.readString();
}
@Override
protected void doWriteTo(StreamOutput out) throws IOException {
out.writeString(field);
origin.writeTo(out);
out.writeString(pivot);
}
@Override
public String getWriteableName() {
return NAME;
}
@Override
protected Query doToQuery(QueryShardContext context) throws IOException {
MappedFieldType fieldType = context.fieldMapper(field);
if (fieldType == null) {
return Queries.newMatchNoDocsQuery("Can't run [" + NAME + "] query on unmapped fields!");
}
Object originObj = origin.origin();
if (fieldType instanceof DateFieldType) {
long originLong = ((DateFieldType) fieldType).parseToLong(originObj, true, null, null, context);
TimeValue pivotVal = TimeValue.parseTimeValue(pivot, DistanceFeatureQueryBuilder.class.getSimpleName() + ".pivot");
if (((DateFieldType) fieldType).resolution() == DateFieldMapper.Resolution.MILLISECONDS) {
return LongPoint.newDistanceFeatureQuery(field, boost, originLong, pivotVal.getMillis());
} else { // NANOSECONDS
return LongPoint.newDistanceFeatureQuery(field, boost, originLong, pivotVal.getNanos());
}
} else if (fieldType instanceof GeoPointFieldType) {
GeoPoint originGeoPoint;
if (originObj instanceof GeoPoint) {
originGeoPoint = (GeoPoint) originObj;
} else if (originObj instanceof String) {
originGeoPoint = GeoUtils.parseFromString((String) originObj);
} else {
throw new IllegalArgumentException("Illegal type [" + originObj.getClass() + "] for [origin]! " +
"Must be of type [geo_point] or [string] for geo_point fields!");
}
double pivotDouble = DistanceUnit.DEFAULT.parse(pivot, DistanceUnit.DEFAULT);
return LatLonPoint.newDistanceFeatureQuery(field, boost, originGeoPoint.lat(), originGeoPoint.lon(), pivotDouble);
}
throw new IllegalArgumentException("Illegal data type of [" + fieldType.typeName() + "]! " +
"[" + NAME + "] query can only be run on a date, date_nanos or geo_point field type!");
}
String fieldName() {
return field;
}
Origin origin() {
return origin;
}
String pivot() {
return pivot;
}
@Override
protected int doHashCode() {
return Objects.hash(field, origin, pivot);
}
@Override
protected boolean doEquals(DistanceFeatureQueryBuilder other) {
return this.field.equals(other.field) && Objects.equals(this.origin, other.origin) && this.pivot.equals(other.pivot);
}
public static class Origin {
private final Object origin;
public Origin(Long origin) {
this.origin = Objects.requireNonNull(origin);
}
public Origin(String origin) {
this.origin = Objects.requireNonNull(origin);
}
public Origin(GeoPoint origin) {
this.origin = Objects.requireNonNull(origin);
}
private static Origin originFromXContent(XContentParser parser) throws IOException {
if (parser.currentToken() == XContentParser.Token.VALUE_NUMBER) {
return new Origin(parser.longValue());
} else if(parser.currentToken() == XContentParser.Token.VALUE_STRING) {
return new Origin(parser.text());
} else if (parser.currentToken() == XContentParser.Token.START_OBJECT) {
return new Origin(GeoUtils.parseGeoPoint(parser));
} else if (parser.currentToken() == XContentParser.Token.START_ARRAY) {
return new Origin(GeoUtils.parseGeoPoint(parser));
} else {
throw new ParsingException(parser.getTokenLocation(),
"Illegal type while parsing [origin]! Must be [number] or [string] for date and date_nanos fields;" +
" or [string], [array], [object] for geo_point fields!");
}
}
private Origin(StreamInput in) throws IOException {
origin = in.readGenericValue();
}
private void writeTo(final StreamOutput out) throws IOException {
out.writeGenericValue(origin);
}
Object origin() {
return origin;
}
@Override
public final boolean equals(Object other) {
if ((other instanceof Origin) == false) return false;
Object otherOrigin = ((Origin) other).origin();
return this.origin().equals(otherOrigin);
}
@Override
public int hashCode() {
return Objects.hash(origin);
}
@Override
public String toString() {
return origin.toString();
}
}
}
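The resolution branch in `doToQuery` above converts the same pivot string into the unit the field indexes as longs: milliseconds for `date` fields and nanoseconds for `date_nanos` fields. A standalone sketch of that conversion, using `java.time.Duration` rather than Elasticsearch's `TimeValue` (illustrative names only):

```java
import java.time.Duration;

// The same pivot duration must be expressed in the field's own unit,
// because date fields index epoch millis and date_nanos fields index
// epoch nanos.
public class PivotResolution {
    enum Resolution { MILLISECONDS, NANOSECONDS }

    static long pivotInFieldUnits(Duration pivot, Resolution resolution) {
        return resolution == Resolution.MILLISECONDS ? pivot.toMillis() : pivot.toNanos();
    }

    public static void main(String[] args) {
        Duration oneHour = Duration.ofHours(1);
        System.out.println(pivotInFieldUnits(oneHour, Resolution.MILLISECONDS)); // 3600000
        System.out.println(pivotInFieldUnits(oneHour, Resolution.NANOSECONDS));  // 3600000000000
    }
}
```

Mixing the units up would make the pivot a million times too large or too small, which is why the builder checks the field's resolution before building the Lucene query.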


@ -36,6 +36,7 @@ import org.elasticsearch.index.query.BoostingQueryBuilder;
import org.elasticsearch.index.query.CommonTermsQueryBuilder;
import org.elasticsearch.index.query.ConstantScoreQueryBuilder;
import org.elasticsearch.index.query.DisMaxQueryBuilder;
import org.elasticsearch.index.query.DistanceFeatureQueryBuilder;
import org.elasticsearch.index.query.ExistsQueryBuilder;
import org.elasticsearch.index.query.FieldMaskingSpanQueryBuilder;
import org.elasticsearch.index.query.FuzzyQueryBuilder;
@ -823,6 +824,8 @@ public class SearchModule {
registerQuery(new QuerySpec<>(MatchNoneQueryBuilder.NAME, MatchNoneQueryBuilder::new, MatchNoneQueryBuilder::fromXContent));
registerQuery(new QuerySpec<>(TermsSetQueryBuilder.NAME, TermsSetQueryBuilder::new, TermsSetQueryBuilder::fromXContent));
registerQuery(new QuerySpec<>(IntervalQueryBuilder.NAME, IntervalQueryBuilder::new, IntervalQueryBuilder::fromXContent));
registerQuery(new QuerySpec<>(DistanceFeatureQueryBuilder.NAME, DistanceFeatureQueryBuilder::new,
DistanceFeatureQueryBuilder::fromXContent));
if (ShapesAvailability.JTS_AVAILABLE && ShapesAvailability.SPATIAL4J_AVAILABLE) {
registerQuery(new QuerySpec<>(GeoShapeQueryBuilder.NAME, GeoShapeQueryBuilder::new, GeoShapeQueryBuilder::fromXContent));


@ -0,0 +1,236 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.index.query;
import org.apache.lucene.document.LatLonPoint;
import org.apache.lucene.document.LongPoint;
import org.apache.lucene.search.Query;
import org.elasticsearch.common.geo.GeoPoint;
import org.elasticsearch.common.geo.GeoUtils;
import org.elasticsearch.common.lucene.search.Queries;
import org.elasticsearch.common.unit.DistanceUnit;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.index.mapper.DateFieldMapper;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.search.internal.SearchContext;
import org.elasticsearch.test.AbstractQueryTestCase;
import org.joda.time.DateTime;
import org.elasticsearch.index.query.DistanceFeatureQueryBuilder.Origin;
import org.elasticsearch.index.mapper.DateFieldMapper.DateFieldType;
import java.io.IOException;
import java.time.Instant;
import static org.hamcrest.Matchers.containsString;
public class DistanceFeatureQueryBuilderTests extends AbstractQueryTestCase<DistanceFeatureQueryBuilder> {
@Override
protected DistanceFeatureQueryBuilder doCreateTestQueryBuilder() {
String field = randomFrom(DATE_FIELD_NAME, DATE_NANOS_FIELD_NAME, GEO_POINT_FIELD_NAME);
Origin origin;
String pivot;
switch (field) {
case GEO_POINT_FIELD_NAME:
GeoPoint point = new GeoPoint(randomDouble(), randomDouble());
origin = randomBoolean() ? new Origin(point) : new Origin(point.geohash());
pivot = randomFrom(DistanceUnit.values()).toString(randomDouble());
break;
case DATE_FIELD_NAME:
long randomDateMills = randomLongBetween(0, 2_000_000_000_000L);
origin = randomBoolean() ? new Origin(randomDateMills) : new Origin(new DateTime(randomDateMills).toString());
pivot = randomTimeValue(1, 1000, "d", "h", "ms", "s", "m");
break;
default: // DATE_NANOS_FIELD_NAME
randomDateMills = randomLongBetween(0, 2_000_000_000_000L);
if (randomBoolean()) {
origin = new Origin(randomDateMills); // date_nanos fields accept longs as milliseconds since epoch
} else {
long randomNanos = randomLongBetween(0, 1_000_000L);
Instant randomDateNanos = Instant.ofEpochMilli(randomDateMills).plusNanos(randomNanos);
origin = new Origin(randomDateNanos.toString());
}
pivot = randomTimeValue(1, 100_000_000, "nanos");
break;
}
return new DistanceFeatureQueryBuilder(field, origin, pivot);
}
@Override
protected void doAssertLuceneQuery(DistanceFeatureQueryBuilder queryBuilder, Query query, SearchContext context) throws IOException {
String fieldName = expectedFieldName(queryBuilder.fieldName());
Object origin = queryBuilder.origin().origin();
String pivot = queryBuilder.pivot();
float boost = queryBuilder.boost;
final Query expectedQuery;
if (fieldName.equals(GEO_POINT_FIELD_NAME)) {
GeoPoint originGeoPoint = (origin instanceof GeoPoint) ? (GeoPoint) origin : GeoUtils.parseFromString((String) origin);
double pivotDouble = DistanceUnit.DEFAULT.parse(pivot, DistanceUnit.DEFAULT);
expectedQuery = LatLonPoint.newDistanceFeatureQuery(fieldName, boost, originGeoPoint.lat(), originGeoPoint.lon(), pivotDouble);
} else { // if (fieldName.equals(DATE_FIELD_NAME))
MapperService mapperService = context.getQueryShardContext().getMapperService();
DateFieldType fieldType = (DateFieldType) mapperService.fullName(fieldName);
long originLong = fieldType.parseToLong(origin, true, null, null, context.getQueryShardContext());
TimeValue pivotVal = TimeValue.parseTimeValue(pivot, DistanceFeatureQueryBuilder.class.getSimpleName() + ".pivot");
long pivotLong;
if (fieldType.resolution() == DateFieldMapper.Resolution.MILLISECONDS) {
pivotLong = pivotVal.getMillis();
} else { // NANOSECONDS
pivotLong = pivotVal.getNanos();
}
expectedQuery = LongPoint.newDistanceFeatureQuery(fieldName, boost, originLong, pivotLong);
}
assertEquals(expectedQuery, query);
}
public void testFromJsonDateFieldType() throws IOException {
// origin as string
String origin = "2018-01-01T13:10:30Z";
String pivot = "7d";
String json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ DATE_FIELD_NAME + "\",\n" +
" \"origin\": \"" + origin + "\",\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 1.0\n" +
" }\n" +
"}";
DistanceFeatureQueryBuilder parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
checkGeneratedJson(json, parsed);
assertEquals(json, origin, parsed.origin().origin());
assertEquals(json, pivot, parsed.pivot());
assertEquals(json, 1.0, parsed.boost(), 0.0001);
// origin as long
long originLong = 1514812230999L;
json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ DATE_FIELD_NAME + "\",\n" +
" \"origin\": " + originLong + ",\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 1.0\n" +
" }\n" +
"}";
parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
assertEquals(json, originLong, parsed.origin().origin());
}
public void testFromJsonDateNanosFieldType() throws IOException {
// origin as string
String origin = "2018-01-01T13:10:30.323456789Z";
String pivot = "100000000nanos";
String json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ DATE_NANOS_FIELD_NAME + "\",\n" +
" \"origin\": \"" + origin + "\",\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 1.0\n" +
" }\n" +
"}";
DistanceFeatureQueryBuilder parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
checkGeneratedJson(json, parsed);
assertEquals(json, origin, parsed.origin().origin());
assertEquals(json, pivot, parsed.pivot());
assertEquals(json, 1.0, parsed.boost(), 0.0001);
// origin as long
long originLong = 1514812230999L;
json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ DATE_NANOS_FIELD_NAME + "\",\n" +
" \"origin\": " + originLong + ",\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 1.0\n" +
" }\n" +
"}";
parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
assertEquals(json, originLong, parsed.origin().origin());
}
public void testFromJsonGeoFieldType() throws IOException {
final GeoPoint origin = new GeoPoint(41.12,-71.34);
final String pivot = "1km";
// origin as string
String json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ GEO_POINT_FIELD_NAME + "\",\n" +
" \"origin\": \"" + origin.toString() + "\",\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 2.0\n" +
" }\n" +
"}";
DistanceFeatureQueryBuilder parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
checkGeneratedJson(json, parsed);
assertEquals(json, origin.toString(), parsed.origin().origin());
assertEquals(json, pivot, parsed.pivot());
assertEquals(json, 2.0, parsed.boost(), 0.0001);
// origin as array
json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ GEO_POINT_FIELD_NAME + "\",\n" +
" \"origin\": [" + origin.lon() + ", " + origin.lat() + "],\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 2.0\n" +
" }\n" +
"}";
parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
assertEquals(json, origin, parsed.origin().origin());
// origin as object
json = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ GEO_POINT_FIELD_NAME + "\",\n" +
" \"origin\": {" + "\"lat\":"+ origin.lat() + ", \"lon\":"+ origin.lon() + "},\n" +
" \"pivot\" : \"" + pivot + "\",\n" +
" \"boost\" : 2.0\n" +
" }\n" +
"}";
parsed = (DistanceFeatureQueryBuilder) parseQuery(json);
assertEquals(json, origin, parsed.origin().origin());
}
public void testQueryMatchNoDocsQueryWithUnmappedField() throws IOException {
Query expectedQuery = Queries.newMatchNoDocsQuery(
"Can't run [" + DistanceFeatureQueryBuilder.NAME + "] query on unmapped fields!");
String queryString = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \"random_unmapped_field\",\n" +
" \"origin\": \"random_string\",\n" +
" \"pivot\" : \"random_string\"\n" +
" }\n" +
"}";
Query query = parseQuery(queryString).toQuery(createShardContext());
assertEquals(expectedQuery, query);
}
public void testQueryFailsWithWrongFieldType() {
String query = "{\n" +
" \"distance_feature\" : {\n" +
" \"field\": \""+ INT_FIELD_NAME + "\",\n" +
" \"origin\": 40,\n" +
" \"pivot\" : \"random_string\"\n" +
" }\n" +
"}";
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> parseQuery(query).toQuery(createShardContext()));
assertThat(e.getMessage(), containsString("query can only be run on a date, date_nanos or geo_point field type!"));
}
}


@ -84,7 +84,8 @@ public class TermsQueryBuilderTests extends AbstractQueryTestCase<TermsQueryBuil
choice.equals(GEO_POINT_ALIAS_FIELD_NAME) ||
choice.equals(GEO_SHAPE_FIELD_NAME) ||
choice.equals(INT_RANGE_FIELD_NAME) ||
choice.equals(DATE_RANGE_FIELD_NAME),
choice.equals(DATE_RANGE_FIELD_NAME) ||
choice.equals(DATE_NANOS_FIELD_NAME), // TODO: needs testing for date_nanos type
() -> getRandomFieldName());
Object[] values = new Object[randomInt(5)];
for (int i = 0; i < values.length; i++) {


@ -358,7 +358,8 @@ public class SearchModuleTests extends ESTestCase {
"terms_set",
"type",
"wildcard",
"wrapper"
"wrapper",
"distance_feature"
};
//add here deprecated queries to make sure we log a deprecation warnings when they are used


@ -106,6 +106,7 @@ public abstract class AbstractBuilderTestCase extends ESTestCase {
protected static final String INT_RANGE_FIELD_NAME = "mapped_int_range";
protected static final String DOUBLE_FIELD_NAME = "mapped_double";
protected static final String BOOLEAN_FIELD_NAME = "mapped_boolean";
protected static final String DATE_NANOS_FIELD_NAME = "mapped_date_nanos";
protected static final String DATE_FIELD_NAME = "mapped_date";
protected static final String DATE_ALIAS_FIELD_NAME = "mapped_date_alias";
protected static final String DATE_RANGE_FIELD_NAME = "mapped_date_range";
@ -114,11 +115,11 @@ public abstract class AbstractBuilderTestCase extends ESTestCase {
protected static final String GEO_POINT_ALIAS_FIELD_NAME = "mapped_geo_point_alias";
protected static final String GEO_SHAPE_FIELD_NAME = "mapped_geo_shape";
protected static final String[] MAPPED_FIELD_NAMES = new String[]{STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME,
INT_FIELD_NAME, INT_RANGE_FIELD_NAME, DOUBLE_FIELD_NAME, BOOLEAN_FIELD_NAME, DATE_FIELD_NAME,
INT_FIELD_NAME, INT_RANGE_FIELD_NAME, DOUBLE_FIELD_NAME, BOOLEAN_FIELD_NAME, DATE_NANOS_FIELD_NAME, DATE_FIELD_NAME,
DATE_RANGE_FIELD_NAME, OBJECT_FIELD_NAME, GEO_POINT_FIELD_NAME, GEO_POINT_ALIAS_FIELD_NAME,
GEO_SHAPE_FIELD_NAME};
protected static final String[] MAPPED_LEAF_FIELD_NAMES = new String[]{STRING_FIELD_NAME, STRING_ALIAS_FIELD_NAME,
INT_FIELD_NAME, INT_RANGE_FIELD_NAME, DOUBLE_FIELD_NAME, BOOLEAN_FIELD_NAME,
INT_FIELD_NAME, INT_RANGE_FIELD_NAME, DOUBLE_FIELD_NAME, BOOLEAN_FIELD_NAME, DATE_NANOS_FIELD_NAME,
DATE_FIELD_NAME, DATE_RANGE_FIELD_NAME, GEO_POINT_FIELD_NAME, GEO_POINT_ALIAS_FIELD_NAME};
private static final Map<String, String> ALIAS_TO_CONCRETE_FIELD_NAME = new HashMap<>();
@ -390,6 +391,7 @@ public abstract class AbstractBuilderTestCase extends ESTestCase {
INT_RANGE_FIELD_NAME, "type=integer_range",
DOUBLE_FIELD_NAME, "type=double",
BOOLEAN_FIELD_NAME, "type=boolean",
DATE_NANOS_FIELD_NAME, "type=date_nanos",
DATE_FIELD_NAME, "type=date",
DATE_ALIAS_FIELD_NAME, "type=alias,path=" + DATE_FIELD_NAME,
DATE_RANGE_FIELD_NAME, "type=date_range",


@ -58,6 +58,7 @@ import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
import java.io.IOException;
import java.time.Instant;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Deque;
@ -632,7 +633,7 @@ public abstract class AbstractQueryTestCase<QB extends AbstractQueryBuilder<QB>>
/**
* create a random value for either {@link AbstractQueryTestCase#BOOLEAN_FIELD_NAME}, {@link AbstractQueryTestCase#INT_FIELD_NAME},
* {@link AbstractQueryTestCase#DOUBLE_FIELD_NAME}, {@link AbstractQueryTestCase#STRING_FIELD_NAME} or
* {@link AbstractQueryTestCase#DATE_FIELD_NAME}, or a String value by default
* {@link AbstractQueryTestCase#DATE_FIELD_NAME} or {@link AbstractQueryTestCase#DATE_NANOS_FIELD_NAME} or a String value by default
*/
protected static Object getRandomValueForFieldName(String fieldName) {
Object value;
@ -659,6 +660,9 @@ public abstract class AbstractQueryTestCase<QB extends AbstractQueryBuilder<QB>>
case DATE_FIELD_NAME:
value = new DateTime(System.currentTimeMillis(), DateTimeZone.UTC).toString();
break;
case DATE_NANOS_FIELD_NAME:
value = Instant.now().toString();
break;
default:
value = randomAlphaOfLengthBetween(1, 10);
}
@ -711,6 +715,8 @@ public abstract class AbstractQueryTestCase<QB extends AbstractQueryBuilder<QB>>
return Fuzziness.build(1 + randomFloat() * 10);
case DATE_FIELD_NAME:
return Fuzziness.build(randomTimeValue());
case DATE_NANOS_FIELD_NAME:
return Fuzziness.build(randomTimeValue());
default:
if (randomBoolean()) {
return Fuzziness.fromEdits(randomIntBetween(0, 2));