Unsigned long 64 bits (#62892)
Introduce 64-bit unsigned long field type. This field type supports:
- indexing of integer values from [0, 18446744073709551615]
- precise queries (term, range)
- precise sort and terms aggregations
- other aggregations are based on conversion of long values to double and can be imprecise for large values

Backport for #60050
Closes #32434
This commit is contained in:
parent
a43f29cfc9
commit
54064a1eec
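The commit message notes that aggregations other than `terms` convert long values to double and can be imprecise for large values. A standalone sketch (not part of the commit) of why: a `double` has only a 53-bit mantissa, so distinct unsigned 64-bit values near 2^64 collapse to the same double.

```java
import java.math.BigInteger;

public class UnsignedLongDoublePrecision {
    public static void main(String[] args) {
        // 2^64 - 1, the max unsigned_long value, needs 64 significant bits;
        // an IEEE 754 double has a 53-bit mantissa, so conversion rounds.
        BigInteger maxUnsignedLong = new BigInteger("18446744073709551615");
        double asDouble = maxUnsignedLong.doubleValue();

        // The rounded double equals 2^64 exactly, not 2^64 - 1.
        if (asDouble != Math.pow(2, 64)) throw new AssertionError();

        // A distinct value 1000 below 2^64 collapses to the same double,
        // because the spacing between doubles in [2^63, 2^64) is 2048.
        BigInteger nearby = maxUnsignedLong.subtract(BigInteger.valueOf(1000));
        if (nearby.doubleValue() != asDouble) throw new AssertionError();

        System.out.println("double view of 2^64-1: " + asDouble);
    }
}
```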
@ -15,6 +15,7 @@ The following numeric types are supported:
`float`:: A single-precision 32-bit IEEE 754 floating point number, restricted to finite values.
`half_float`:: A half-precision 16-bit IEEE 754 floating point number, restricted to finite values.
`scaled_float`:: A floating point number that is backed by a `long`, scaled by a fixed `double` scaling factor.
`unsigned_long`:: An unsigned 64-bit integer with a minimum value of 0 and a maximum value of +2^64^-1+.

Below is an example of configuring a mapping with numeric fields:
@ -115,7 +116,7 @@ The following parameters are accepted by numeric types:
<<coerce,`coerce`>>::

Try to convert strings to numbers and truncate fractions for integers.
Accepts `true` (default) and `false`.
Accepts `true` (default) and `false`. Not applicable for `unsigned_long`.

<<mapping-boost,`boost`>>::
@ -169,3 +170,5 @@ The following parameters are accepted by numeric types:
sorting) will behave as if the document had a value of +2.3+. High values
of `scaling_factor` improve accuracy but also increase space requirements.
This parameter is required.

include::unsigned_long.asciidoc[]
@ -0,0 +1,116 @@
[role="xpack"]
[testenv="basic"]

[[unsigned-long]]
=== Unsigned long data type

Unsigned long is a numeric field type that represents an unsigned 64-bit
integer with a minimum value of 0 and a maximum value of +2^64^-1+
(from 0 to 18446744073709551615 inclusive).

[source,console]
--------------------------------------------------
PUT my_index
{
  "mappings": {
    "properties": {
      "my_counter": {
        "type": "unsigned_long"
      }
    }
  }
}
--------------------------------------------------
Unsigned long values can be indexed in numeric or string form,
representing integer values in the range [0, 18446744073709551615].
Values can't have a decimal part.

[source,console]
--------------------------------
POST /my_index/_bulk?refresh
{"index":{"_id":1}}
{"my_counter": 0}
{"index":{"_id":2}}
{"my_counter": 9223372036854775808}
{"index":{"_id":3}}
{"my_counter": 18446744073709551614}
{"index":{"_id":4}}
{"my_counter": 18446744073709551615}
--------------------------------
//TEST[continued]
Term queries accept any number in numeric or string form.

[source,console]
--------------------------------
GET /my_index/_search
{
  "query": {
    "term" : {
      "my_counter" : 18446744073709551615
    }
  }
}
--------------------------------
//TEST[continued]
Range query terms can contain values with decimal parts.
In this case {es} converts them to integer values:
`gte` and `gt` terms are rounded up to the nearest integer,
and `lt` and `lte` terms are rounded down to the nearest integer.

It is recommended to pass ranges as strings to ensure they are parsed
without any loss of precision.

[source,console]
--------------------------------
GET /my_index/_search
{
  "query": {
    "range" : {
      "my_counter" : {
        "gte" : "9223372036854775808.5",
        "lte" : "18446744073709551615"
      }
    }
  }
}
--------------------------------
//TEST[continued]
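The rounding rule above can be sketched in plain Java. This is an illustration of the documented behavior, not {es} code; the helper names `lowerBound` and `upperBound` are hypothetical.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class RangeBoundRounding {
    // Lower bounds (gte/gt) round up to the nearest integer.
    static BigDecimal lowerBound(String term) {
        return new BigDecimal(term).setScale(0, RoundingMode.CEILING);
    }

    // Upper bounds (lt/lte) round down to the nearest integer.
    static BigDecimal upperBound(String term) {
        return new BigDecimal(term).setScale(0, RoundingMode.FLOOR);
    }

    public static void main(String[] args) {
        // "gte": "9223372036854775808.5" behaves as gte 9223372036854775809
        if (!lowerBound("9223372036854775808.5").equals(new BigDecimal("9223372036854775809")))
            throw new AssertionError();
        // "lte": "9223372036854775808.5" behaves as lte 9223372036854775808
        if (!upperBound("9223372036854775808.5").equals(new BigDecimal("9223372036854775808")))
            throw new AssertionError();
    }
}
```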
For queries with sort on an `unsigned_long` field,
for a particular document {es} returns a sort value of type `long`
if the value of this document is within the range of `long` values,
or of type `BigInteger` if the value exceeds this range.

NOTE: REST clients need to be able to handle big integer values
in JSON to support this field type correctly.

[source,console]
--------------------------------
GET /my_index/_search
{
  "query": {
    "match_all" : {}
  },
  "sort" : {"my_counter" : "desc"}
}
--------------------------------
//TEST[continued]
Similarly to sort values, script values of an `unsigned_long` field
return a `Number` representing a `Long` or `BigInteger`.
The same `Long` or `BigInteger` values are used for `terms` aggregations.

==== Queries with mixed numeric types

Searches across mixed numeric types, one of which is `unsigned_long`, are
supported, except for queries with sort. Thus, a sort query across two indices
where the same field name has an `unsigned_long` type in one index
and a `long` type in another doesn't produce correct results and must
be avoided. If such sorting is needed, script-based sorting
can be used instead.

Aggregations across several numeric types, one of which is `unsigned_long`, are
supported. In this case, values are converted to the `double` type.
@ -29,6 +29,7 @@ import java.lang.invoke.CallSite;
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;
import java.math.BigInteger;
import java.time.ZonedDateTime;
import java.util.BitSet;
import java.util.Collections;

@ -734,6 +735,8 @@ public final class Def {
            return (float)value;
        } else if (value instanceof Double) {
            return (double)value;
        } else if (value instanceof BigInteger) {
            return ((BigInteger)value).doubleValue();
        } else {
            throw new ClassCastException("cannot implicitly cast " +
                "def [" + PainlessLookupUtility.typeToUnboxedType(value.getClass()).getCanonicalName() + "] to " +
@ -866,7 +869,8 @@ public final class Def {
            value instanceof Integer ||
            value instanceof Long ||
            value instanceof Float ||
            value instanceof Double
            value instanceof Double ||
            value instanceof BigInteger
        ) {
            return ((Number)value).doubleValue();
        } else {
@ -1004,7 +1008,9 @@ public final class Def {
        } else if (value instanceof Float) {
            return (double)(float)value;
        } else if (value instanceof Double) {
            return (Double)value;
            return (Double) value;
        } else if (value instanceof BigInteger) {
            return ((BigInteger)value).doubleValue();
        } else {
            throw new ClassCastException("cannot implicitly cast " +
                "def [" + PainlessLookupUtility.typeToUnboxedType(value.getClass()).getCanonicalName() + "] to " +
@ -1151,7 +1157,8 @@ public final class Def {
            value instanceof Integer ||
            value instanceof Long ||
            value instanceof Float ||
            value instanceof Double
            value instanceof Double ||
            value instanceof BigInteger
        ) {
            return ((Number)value).doubleValue();
        } else {
@ -166,6 +166,7 @@ public class DefCastTests extends ScriptTestCase {
        assertEquals((double)0, exec("def d = Long.valueOf(0); double b = d; b"));
        assertEquals((double)0, exec("def d = Float.valueOf(0); double b = d; b"));
        assertEquals((double)0, exec("def d = Double.valueOf(0); double b = d; b"));
        assertEquals((double)0, exec("def d = BigInteger.valueOf(0); double b = d; b"));
        expectScriptThrows(ClassCastException.class, () -> exec("def d = new ArrayList(); double b = d;"));
    }
@ -427,6 +427,7 @@ public final class SearchPhaseController {
        if (queryResults.isEmpty()) {
            throw new IllegalStateException(errorMsg);
        }
        validateMergeSortValueFormats(queryResults);
        final QuerySearchResult firstResult = queryResults.stream().findFirst().get().queryResult();
        final boolean hasSuggest = firstResult.suggest() != null;
        final boolean hasProfileResults = firstResult.hasProfileResults();
@ -486,6 +487,36 @@ public final class SearchPhaseController {
            performFinalReduce ? aggReduceContextBuilder.forFinalReduction() : aggReduceContextBuilder.forPartialReduction());
    }

    /**
     * Checks that query results from all shards have consistent unsigned_long format.
     * Sort queries on a field that has long type in one index, and unsigned_long in another index
     * don't work correctly. Throw an error if this kind of sorting is detected.
     * //TODO: instead of throwing error, find a way to sort long and unsigned_long together
     */
    private static void validateMergeSortValueFormats(Collection<? extends SearchPhaseResult> queryResults) {
        boolean[] ulFormats = null;
        boolean firstResult = true;
        for (SearchPhaseResult entry : queryResults) {
            DocValueFormat[] formats = entry.queryResult().sortValueFormats();
            if (formats == null) return;
            if (firstResult) {
                firstResult = false;
                ulFormats = new boolean[formats.length];
                for (int i = 0; i < formats.length; i++) {
                    ulFormats[i] = formats[i] == DocValueFormat.UNSIGNED_LONG_SHIFTED;
                }
            } else {
                for (int i = 0; i < formats.length; i++) {
                    // if the format is unsigned_long in one shard, and something different in another shard
                    if (ulFormats[i] ^ (formats[i] == DocValueFormat.UNSIGNED_LONG_SHIFTED)) {
                        throw new IllegalArgumentException("Can't do sort across indices, as a field has [unsigned_long] type " +
                            "in one index, and different type in another index!");
                    }
                }
            }
        }
    }

    /*
     * Returns the size of the requested top documents (from + size)
     */
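The shard-consistency check in `validateMergeSortValueFormats` hinges on the boolean XOR: for each sort position, either every shard uses the unsigned_long format or none does. A self-contained sketch of just that rule (the class and method names here are illustrative, not Elasticsearch code):

```java
public class SortFormatConsistencyCheck {
    // For each sort position i, shards must agree on whether the
    // format is unsigned_long; a ^ b is true exactly when they disagree.
    static void validate(boolean[][] shardIsUnsignedLong) {
        boolean[] first = shardIsUnsignedLong[0];
        for (boolean[] shard : shardIsUnsignedLong) {
            for (int i = 0; i < shard.length; i++) {
                if (first[i] ^ shard[i]) {
                    throw new IllegalArgumentException("mixed long/unsigned_long sort at position " + i);
                }
            }
        }
    }

    public static void main(String[] args) {
        // Consistent formats across two shards: no exception.
        validate(new boolean[][] { { true, false }, { true, false } });

        // One shard reports unsigned_long, the other does not: rejected.
        boolean threw = false;
        try {
            validate(new boolean[][] { { true }, { false } });
        } catch (IllegalArgumentException e) {
            threw = true;
        }
        if (!threw) throw new AssertionError();
    }
}
```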
|
|
@ -51,6 +51,7 @@ import java.io.FileNotFoundException;
|
|||
import java.io.FilterInputStream;
|
||||
import java.io.IOException;
|
||||
import java.io.InputStream;
|
||||
import java.math.BigInteger;
|
||||
import java.nio.file.AccessDeniedException;
|
||||
import java.nio.file.AtomicMoveNotSupportedException;
|
||||
import java.nio.file.DirectoryNotEmptyException;
|
||||
|
@ -329,6 +330,11 @@ public abstract class StreamInput extends InputStream {
|
|||
return null;
|
||||
}
|
||||
|
||||
public BigInteger readBigInteger() throws IOException {
|
||||
return new BigInteger(readString());
|
||||
}
|
||||
|
||||
|
||||
@Nullable
|
||||
public Text readOptionalText() throws IOException {
|
||||
int length = readInt();
|
||||
|
@ -741,6 +747,8 @@ public abstract class StreamInput extends InputStream {
            return readCollection(StreamInput::readGenericValue, LinkedHashSet::new, Collections.emptySet());
        case 25:
            return readCollection(StreamInput::readGenericValue, HashSet::new, Collections.emptySet());
        case 26:
            return readBigInteger();
        default:
            throw new IOException("Can't read unknown type [" + type + "]");
    }
@ -51,6 +51,7 @@ import java.io.EOFException;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.OutputStream;
import java.math.BigInteger;
import java.nio.file.AccessDeniedException;
import java.nio.file.AtomicMoveNotSupportedException;
import java.nio.file.DirectoryNotEmptyException;
@ -803,6 +804,11 @@ public abstract class StreamOutput extends OutputStream {
            }
            o.writeCollection((Set<?>) v, StreamOutput::writeGenericValue);
        });
        // TODO: improve serialization of BigInteger
        writers.put(BigInteger.class, (o, v) -> {
            o.writeByte((byte) 26);
            o.writeString(v.toString());
        });
        WRITERS = Collections.unmodifiableMap(writers);
    }
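The wire format the diff adds is simply a type byte (26) followed by the BigInteger's decimal string. A standalone sketch of that round trip, using `DataOutputStream`/`DataInputStream` as stand-ins for the real `StreamOutput`/`StreamInput` (which are not reproduced here):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.math.BigInteger;

public class BigIntegerStreamRoundTrip {
    static byte[] write(BigInteger v) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.writeByte(26);          // type marker, as in writeByte((byte) 26)
        out.writeUTF(v.toString()); // decimal string, as in writeString(v.toString())
        return bytes.toByteArray();
    }

    static BigInteger read(byte[] data) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
        if (in.readByte() != 26) throw new IOException("unexpected type");
        return new BigInteger(in.readUTF()); // mirrors readBigInteger()
    }

    public static void main(String[] args) throws IOException {
        BigInteger max = new BigInteger("18446744073709551615"); // 2^64 - 1
        if (!read(write(max)).equals(max)) throw new AssertionError();
    }
}
```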

@ -95,6 +95,7 @@ import org.elasticsearch.index.analysis.NamedAnalyzer;
import org.elasticsearch.index.fielddata.IndexFieldData;

import java.io.IOException;
import java.math.BigInteger;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.Arrays;

@ -369,6 +370,8 @@ public class Lucene {
            cFields[j] = in.readBoolean();
        } else if (type == 9) {
            cFields[j] = in.readBytesRef();
        } else if (type == 10) {
            cFields[j] = new BigInteger(in.readString());
        } else {
            throw new IOException("Can't match type [" + type + "]");
        }
@ -398,6 +401,8 @@ public class Lucene {
            return in.readBoolean();
        } else if (type == 9) {
            return in.readBytesRef();
        } else if (type == 10) {
            return new BigInteger(in.readString());
        } else {
            throw new IOException("Can't match type [" + type + "]");
        }
@ -517,6 +522,10 @@ public class Lucene {
        } else if (type == BytesRef.class) {
            out.writeByte((byte) 9);
            out.writeBytesRef((BytesRef) field);
        } else if (type == BigInteger.class) {
            // TODO: improve serialization of BigInteger
            out.writeByte((byte) 10);
            out.writeString(field.toString());
        } else {
            throw new IOException("Can't handle sort field value of type [" + type + "]");
        }
@ -37,6 +37,7 @@ import org.elasticsearch.index.mapper.DateFieldMapper;
import org.elasticsearch.search.aggregations.bucket.geogrid.GeoTileUtils;

import java.io.IOException;
import java.math.BigInteger;
import java.net.InetAddress;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;

@ -51,6 +52,8 @@ import java.util.function.LongSupplier;

/** A formatter for values as returned by the fielddata/doc-values APIs. */
public interface DocValueFormat extends NamedWriteable {
    long MASK_2_63 = 0x8000000000000000L;
    BigInteger BIGINTEGER_2_64_MINUS_ONE = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE); // 2^64 - 1

    /** Format a long value. This is used by terms and histogram aggregations
     * to format keys for fields that use longs as a doc value representation
@ -493,5 +496,66 @@ public interface DocValueFormat extends NamedWriteable {
        public int hashCode() {
            return Objects.hash(pattern);
        }
    }
};

/**
 * DocValues format for unsigned 64-bit long values,
 * that are stored as shifted signed 64-bit long values.
 */
DocValueFormat UNSIGNED_LONG_SHIFTED = new DocValueFormat() {

    @Override
    public String getWriteableName() {
        return "unsigned_long_shifted";
    }

    @Override
    public void writeTo(StreamOutput out) {
    }

    @Override
    public String toString() {
        return "unsigned_long_shifted";
    }

    /**
     * Formats the unsigned long to the shifted long format
     */
    @Override
    public long parseLong(String value, boolean roundUp, LongSupplier now) {
        long parsedValue = Long.parseUnsignedLong(value);
        // subtract 2^63 or 10000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000,
        // equivalent to flipping the first bit
        return parsedValue ^ MASK_2_63;
    }

    /**
     * Formats a raw docValue that is stored in the shifted long format to the unsigned long representation.
     */
    @Override
    public Object format(long value) {
        // add 2^63 or 10000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000,
        // equivalent to flipping the first bit
        long formattedValue = value ^ MASK_2_63;
        if (formattedValue >= 0) {
            return formattedValue;
        } else {
            return BigInteger.valueOf(formattedValue).and(BIGINTEGER_2_64_MINUS_ONE);
        }
    }

    /**
     * Double docValues of the unsigned_long field type are already in the formatted representation,
     * so we don't need to do anything here
     */
    @Override
    public Double format(double value) {
        return value;
    }

    @Override
    public double parseDouble(String value, boolean roundUp, LongSupplier now) {
        return Double.parseDouble(value);
    }
};
}
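The shifted encoding used by `UNSIGNED_LONG_SHIFTED` can be demonstrated in isolation: flipping the top bit (XOR with 2^63) maps unsigned 64-bit order onto signed `long` order, so Lucene's signed comparisons sort unsigned values correctly. A self-contained sketch (class and helper names are illustrative):

```java
import java.math.BigInteger;

public class ShiftedEncodingDemo {
    static final long MASK_2_63 = 0x8000000000000000L;
    static final BigInteger TWO_64_MINUS_ONE = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE);

    // unsigned decimal string -> shifted signed long (flip the top bit),
    // mirroring parseLong above
    static long encode(String unsignedValue) {
        return Long.parseUnsignedLong(unsignedValue) ^ MASK_2_63;
    }

    // shifted signed long -> unsigned value (Long when it fits, else BigInteger),
    // mirroring format(long) above
    static Object decode(long shifted) {
        long unshifted = shifted ^ MASK_2_63;
        return unshifted >= 0 ? (Object) unshifted : BigInteger.valueOf(unshifted).and(TWO_64_MINUS_ONE);
    }

    public static void main(String[] args) {
        // 0 encodes to Long.MIN_VALUE and 2^64 - 1 to Long.MAX_VALUE,
        // so signed comparison of encoded values matches unsigned order.
        if (encode("0") != Long.MIN_VALUE) throw new AssertionError();
        if (encode("18446744073709551615") != Long.MAX_VALUE) throw new AssertionError();
        if (!(encode("9223372036854775807") < encode("9223372036854775808"))) throw new AssertionError();

        // Round trip: values above Long.MAX_VALUE come back as BigInteger.
        if (!decode(encode("42")).equals(42L)) throw new AssertionError();
        if (!decode(encode("18446744073709551615")).equals(new BigInteger("18446744073709551615")))
            throw new AssertionError();
    }
}
```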

@ -778,6 +778,7 @@ public class SearchModule {
    registerValueFormat(DocValueFormat.IP.getWriteableName(), in -> DocValueFormat.IP);
    registerValueFormat(DocValueFormat.RAW.getWriteableName(), in -> DocValueFormat.RAW);
    registerValueFormat(DocValueFormat.BINARY.getWriteableName(), in -> DocValueFormat.BINARY);
    registerValueFormat(DocValueFormat.UNSIGNED_LONG_SHIFTED.getWriteableName(), in -> DocValueFormat.UNSIGNED_LONG_SHIFTED);
}

/**
@ -57,10 +57,13 @@ public class SearchSortValues implements ToXContentFragment, Writeable {
    this.rawSortValues = rawSortValues;
    this.formattedSortValues = Arrays.copyOf(rawSortValues, rawSortValues.length);
    for (int i = 0; i < rawSortValues.length; ++i) {
        // we currently format only BytesRef but we may want to change that in the future
        Object sortValue = rawSortValues[i];
        if (sortValue instanceof BytesRef) {
            this.formattedSortValues[i] = sortValueFormats[i].format((BytesRef) sortValue);
        } else if ((sortValue instanceof Long) && (sortValueFormats[i] == DocValueFormat.UNSIGNED_LONG_SHIFTED)) {
            this.formattedSortValues[i] = sortValueFormats[i].format((Long) sortValue);
        } else {
            this.formattedSortValues[i] = sortValue;
        }
    }
}
@ -146,7 +146,8 @@ public class DoubleTerms extends InternalMappedTerms<DoubleTerms, DoubleTerms.Bu
public InternalAggregation reduce(List<InternalAggregation> aggregations, ReduceContext reduceContext) {
    boolean promoteToDouble = false;
    for (InternalAggregation agg : aggregations) {
        if (agg instanceof LongTerms && ((LongTerms) agg).format == DocValueFormat.RAW) {
        if (agg instanceof LongTerms &&
            (((LongTerms) agg).format == DocValueFormat.RAW || ((LongTerms) agg).format == DocValueFormat.UNSIGNED_LONG_SHIFTED)) {
            /*
             * this terms agg mixes longs and doubles, we must promote longs to doubles to make the internal aggs
             * compatible
@ -67,12 +67,20 @@ public class LongTerms extends InternalMappedTerms<LongTerms, LongTerms.Bucket>

    @Override
    public Object getKey() {
        return term;
        if (format == DocValueFormat.UNSIGNED_LONG_SHIFTED) {
            return format.format(term);
        } else {
            return term;
        }
    }

    @Override
    public Number getKeyAsNumber() {
        return term;
        if (format == DocValueFormat.UNSIGNED_LONG_SHIFTED) {
            return (Number) format.format(term);
        } else {
            return term;
        }
    }

    @Override

@ -82,8 +90,12 @@ public class LongTerms extends InternalMappedTerms<LongTerms, LongTerms.Bucket>

    @Override
    protected final XContentBuilder keyToXContent(XContentBuilder builder) throws IOException {
        builder.field(CommonFields.KEY.getPreferredName(), term);
        if (format != DocValueFormat.RAW) {
        if (format == DocValueFormat.UNSIGNED_LONG_SHIFTED) {
            builder.field(CommonFields.KEY.getPreferredName(), format.format(term));
        } else {
            builder.field(CommonFields.KEY.getPreferredName(), term);
        }
        if (format != DocValueFormat.RAW && format != DocValueFormat.UNSIGNED_LONG_SHIFTED) {
            builder.field(CommonFields.KEY_AS_STRING.getPreferredName(), format.format(term).toString());
        }
        return builder;
@ -144,10 +156,31 @@ public class LongTerms extends InternalMappedTerms<LongTerms, LongTerms.Bucket>

    @Override
    public InternalAggregation reduce(List<InternalAggregation> aggregations, ReduceContext reduceContext) {
        boolean unsignedLongFormat = false;
        boolean rawFormat = false;
        for (InternalAggregation agg : aggregations) {
            if (agg instanceof DoubleTerms) {
                return agg.reduce(aggregations, reduceContext);
            }
            if (agg instanceof LongTerms) {
                if (((LongTerms) agg).format == DocValueFormat.RAW) {
                    rawFormat = true;
                } else if (((LongTerms) agg).format == DocValueFormat.UNSIGNED_LONG_SHIFTED) {
                    unsignedLongFormat = true;
                }
            }
        }
        if (rawFormat && unsignedLongFormat) { // if we have mixed formats, convert results to double format
            List<InternalAggregation> newAggs = new ArrayList<>(aggregations.size());
            for (InternalAggregation agg : aggregations) {
                if (agg instanceof LongTerms) {
                    DoubleTerms dTerms = LongTerms.convertLongTermsToDouble((LongTerms) agg, format);
                    newAggs.add(dTerms);
                } else {
                    newAggs.add(agg);
                }
            }
            return newAggs.get(0).reduce(newAggs, reduceContext);
        }
        return super.reduce(aggregations, reduceContext);
    }
@ -40,6 +40,7 @@ import org.elasticsearch.search.DocValueFormat;
import org.elasticsearch.search.sort.SortAndFormats;

import java.io.IOException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

@ -91,6 +92,7 @@ public class SearchAfterBuilder implements ToXContentObject, Writeable {
        if (values[i] instanceof Double) continue;
        if (values[i] instanceof Float) continue;
        if (values[i] instanceof Boolean) continue;
        if (values[i] instanceof BigInteger) continue;
        throw new IllegalArgumentException("Can't handle " + SEARCH_AFTER + " field value of type [" + values[i].getClass() + "]");
    }
    sortValues = new Object[values.length];
@ -181,7 +183,8 @@ public class SearchAfterBuilder implements ToXContentObject, Writeable {
        return Double.parseDouble(value.toString());

    case LONG:
        if (value instanceof Number) {
        // for unsigned_long field type we want to pass the search_after value through formatting
        if (value instanceof Number && format != DocValueFormat.UNSIGNED_LONG_SHIFTED) {
            return ((Number) value).longValue();
        }
        return format.parseLong(value.toString(), false,
@ -243,6 +246,10 @@ public class SearchAfterBuilder implements ToXContentObject, Writeable {
        values.add(parser.floatValue());
        break;

    case BIG_INTEGER:
        values.add(parser.text());
        break;

    default:
        throw new IllegalArgumentException("[search_after] does not accept numbers of type ["
            + parser.numberType() + "], got " + parser.text());
@ -43,7 +43,6 @@ import org.elasticsearch.test.ESTestCase;

import java.io.IOException;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Collections;

import static org.elasticsearch.search.searchafter.SearchAfterBuilder.extractSortType;
@ -59,7 +58,7 @@ public class SearchAfterBuilderTests extends ESTestCase {
    SearchAfterBuilder searchAfterBuilder = new SearchAfterBuilder();
    Object[] values = new Object[numSearchFrom];
    for (int i = 0; i < numSearchFrom; i++) {
        int branch = randomInt(9);
        int branch = randomInt(10);
        switch (branch) {
            case 0:
                values[i] = randomInt();
@ -91,6 +90,9 @@ public class SearchAfterBuilderTests extends ESTestCase {
            case 9:
                values[i] = null;
                break;
            case 10:
                values[i] = randomBigInteger();
                break;
        }
    }
    searchAfterBuilder.setSortValues(values);
@ -196,27 +198,12 @@ public class SearchAfterBuilderTests extends ESTestCase {

public void testFromXContentIllegalType() throws Exception {
    for (XContentType type : XContentType.values()) {
        // BIG_INTEGER
        XContentBuilder xContent = XContentFactory.contentBuilder(type);
        xContent.startObject()
            .startArray("search_after")
            .value(new BigInteger("9223372036854776000"))
            .endArray()
            .endObject();
        try (XContentParser parser = createParser(xContent)) {
            parser.nextToken();
            parser.nextToken();
            parser.nextToken();
            IllegalArgumentException exc = expectThrows(IllegalArgumentException.class, () -> SearchAfterBuilder.fromXContent(parser));
            assertThat(exc.getMessage(), containsString("BIG_INTEGER"));
        }

        // BIG_DECIMAL
        // ignore json and yaml, they parse floating point numbers as floats/doubles
        if (type == XContentType.JSON || type == XContentType.YAML) {
            continue;
        }
        xContent = XContentFactory.contentBuilder(type);
        XContentBuilder xContent = XContentFactory.contentBuilder(type);
        xContent.startObject()
            .startArray("search_after")
            .value(new BigDecimal("9223372036854776003.3"))
@ -121,6 +121,7 @@ import org.junit.rules.RuleChain;

import java.io.IOException;
import java.io.InputStream;
import java.math.BigInteger;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.nio.file.Path;
@ -730,6 +731,16 @@ public abstract class ESTestCase extends LuceneTestCase {
        return random().nextLong();
    }

    /**
     * Returns a random BigInteger uniformly distributed over the range 0 to (2^64 - 1) inclusive.
     * Currently BigIntegers are only used for the unsigned_long field type, where the max value is 2^64 - 1.
     * Modify this random generator if a wider range for BigIntegers is necessary.
     * @return a random BigInteger in the range [0; 2^64 - 1]
     */
    public static BigInteger randomBigInteger() {
        return new BigInteger(64, random());
    }
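`new BigInteger(64, random())` draws exactly 64 random bits, so the result is always in the unsigned_long range. A standalone check of that property (the class name and fixed seed are illustrative only):

```java
import java.math.BigInteger;
import java.util.Random;

public class RandomBigIntegerRange {
    public static void main(String[] args) {
        // 64 random bits give a non-negative value of at most 2^64 - 1,
        // which is exactly the unsigned_long value range.
        BigInteger max = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE);
        Random rng = new Random(42); // fixed seed, just for the demo
        for (int i = 0; i < 1000; i++) {
            BigInteger value = new BigInteger(64, rng);
            if (value.signum() < 0 || value.compareTo(max) > 0) throw new AssertionError();
        }
    }
}
```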

    /** A random integer from 0..max (inclusive). */
    public static int randomInt(int max) {
        return RandomizedTest.randomInt(max);
@ -0,0 +1,23 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

evaluationDependsOn(xpackModule('core'))

apply plugin: 'elasticsearch.esplugin'

esplugin {
    name 'unsigned-long'
    description 'Module for the unsigned long field type'
    classname 'org.elasticsearch.xpack.unsignedlong.UnsignedLongMapperPlugin'
    extendedPlugins = ['x-pack-core', 'lang-painless']
}
archivesBaseName = 'x-pack-unsigned-long'

dependencies {
    compileOnly project(':modules:lang-painless:spi')
    compileOnly project(path: xpackModule('core'), configuration: 'default')
    testImplementation project(path: xpackModule('core'), configuration: 'testArtifacts')
}
@ -0,0 +1,51 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.elasticsearch.painless.spi.PainlessExtension;
import org.elasticsearch.painless.spi.Whitelist;
import org.elasticsearch.painless.spi.WhitelistLoader;
import org.elasticsearch.script.AggregationScript;
import org.elasticsearch.script.BucketAggregationSelectorScript;
import org.elasticsearch.script.FieldScript;
import org.elasticsearch.script.FilterScript;
import org.elasticsearch.script.NumberSortScript;
import org.elasticsearch.script.ScoreScript;
import org.elasticsearch.script.ScriptContext;
import org.elasticsearch.script.StringSortScript;

import java.util.List;
import java.util.Map;

import static java.util.Collections.singletonList;

public class DocValuesWhitelistExtension implements PainlessExtension {

    private static final Whitelist WHITELIST = WhitelistLoader.loadFromResourceFiles(DocValuesWhitelistExtension.class, "whitelist.txt");

    @Override
    public Map<ScriptContext<?>, List<Whitelist>> getContextWhitelists() {
        List<Whitelist> whitelist = singletonList(WHITELIST);
        Map<ScriptContext<?>, List<Whitelist>> contexts = org.elasticsearch.common.collect.Map.of(
            FieldScript.CONTEXT,
            whitelist,
            ScoreScript.CONTEXT,
            whitelist,
            FilterScript.CONTEXT,
            whitelist,
            AggregationScript.CONTEXT,
            whitelist,
            NumberSortScript.CONTEXT,
            whitelist,
            StringSortScript.CONTEXT,
            whitelist,
            BucketAggregationSelectorScript.CONTEXT,
            whitelist
        );
        return contexts;
    }
}
@ -0,0 +1,513 @@
|
|||
/*
|
||||
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
|
||||
* or more contributor license agreements. Licensed under the Elastic License;
|
||||
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.core.exc.InputCoercionException;
import org.apache.lucene.document.LongPoint;
import org.apache.lucene.document.SortedNumericDocValuesField;
import org.apache.lucene.search.IndexOrDocValuesQuery;
import org.apache.lucene.search.IndexSortSortedNumericDocValuesRangeQuery;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.common.Explicit;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.index.fielddata.IndexFieldData;
import org.elasticsearch.index.fielddata.IndexNumericFieldData;
import org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData;
import org.elasticsearch.index.mapper.FieldMapper;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.mapper.NumberFieldMapper;
import org.elasticsearch.index.mapper.ParametrizedFieldMapper;
import org.elasticsearch.index.mapper.ParseContext;
import org.elasticsearch.index.mapper.SimpleMappedFieldType;
import org.elasticsearch.index.mapper.SourceValueFetcher;
import org.elasticsearch.index.mapper.TextSearchInfo;
import org.elasticsearch.index.mapper.ValueFetcher;
import org.elasticsearch.index.query.QueryShardContext;
import org.elasticsearch.search.DocValueFormat;
import org.elasticsearch.search.lookup.SearchLookup;

import java.io.IOException;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.time.ZoneId;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.function.Supplier;

public class UnsignedLongFieldMapper extends ParametrizedFieldMapper {
    public static final String CONTENT_TYPE = "unsigned_long";

    private static final long MASK_2_63 = 0x8000000000000000L;
    static final BigInteger BIGINTEGER_2_64_MINUS_ONE = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE); // 2^64 - 1
    private static final BigDecimal BIGDECIMAL_2_64_MINUS_ONE = new BigDecimal(BIGINTEGER_2_64_MINUS_ONE);

    private static UnsignedLongFieldMapper toType(FieldMapper in) {
        return (UnsignedLongFieldMapper) in;
    }

    public static class Builder extends ParametrizedFieldMapper.Builder {
        private final Parameter<Boolean> indexed = Parameter.indexParam(m -> toType(m).indexed, true);
        private final Parameter<Boolean> hasDocValues = Parameter.docValuesParam(m -> toType(m).hasDocValues, true);
        private final Parameter<Boolean> stored = Parameter.storeParam(m -> toType(m).stored, false);
        private final Parameter<Explicit<Boolean>> ignoreMalformed;
        private final Parameter<String> nullValue;
        private final Parameter<Map<String, String>> meta = Parameter.metaParam();

        public Builder(String name, Settings settings) {
            this(name, IGNORE_MALFORMED_SETTING.get(settings));
        }

        private Builder(String name, boolean ignoreMalformedByDefault) {
            super(name);
            this.ignoreMalformed = Parameter.explicitBoolParam(
                "ignore_malformed",
                true,
                m -> toType(m).ignoreMalformed,
                ignoreMalformedByDefault
            );
            this.nullValue = new Parameter<>(
                "null_value",
                false,
                () -> null,
                (n, c, o) -> parseNullValueAsString(o),
                m -> toType(m).nullValue
            ).acceptsNull();
        }

        private String parseNullValueAsString(Object o) {
            if (o == null) return null;
            try {
                parseUnsignedLong(o); // confirm that null_value is a proper unsigned_long
                return (o instanceof BytesRef) ? ((BytesRef) o).utf8ToString() : o.toString();
            } catch (Exception e) {
                throw new MapperParsingException("Error parsing [null_value] on field [" + name() + "]: " + e.getMessage(), e);
            }
        }

        Builder nullValue(String nullValue) {
            this.nullValue.setValue(nullValue);
            return this;
        }

        @Override
        protected List<Parameter<?>> getParameters() {
            return Arrays.asList(indexed, hasDocValues, stored, ignoreMalformed, nullValue, meta);
        }

        @Override
        public UnsignedLongFieldMapper build(BuilderContext context) {
            UnsignedLongFieldType fieldType = new UnsignedLongFieldType(
                buildFullName(context),
                indexed.getValue(),
                stored.getValue(),
                hasDocValues.getValue(),
                meta.getValue()
            );
            return new UnsignedLongFieldMapper(name, fieldType, multiFieldsBuilder.build(this, context), copyTo.build(), this);
        }
    }

    public static final TypeParser PARSER = new TypeParser((n, c) -> new Builder(n, c.getSettings()));

    public static final class UnsignedLongFieldType extends SimpleMappedFieldType {

        public UnsignedLongFieldType(String name, boolean indexed, boolean isStored, boolean hasDocValues, Map<String, String> meta) {
            super(name, indexed, isStored, hasDocValues, TextSearchInfo.SIMPLE_MATCH_ONLY, meta);
        }

        public UnsignedLongFieldType(String name) {
            this(name, true, false, true, Collections.emptyMap());
        }

        @Override
        public String typeName() {
            return CONTENT_TYPE;
        }

        @Override
        public Query termQuery(Object value, QueryShardContext context) {
            failIfNotIndexed();
            Long longValue = parseTerm(value);
            if (longValue == null) {
                return new MatchNoDocsQuery();
            }
            return LongPoint.newExactQuery(name(), unsignedToSortableSignedLong(longValue));
        }

        @Override
        public Query termsQuery(List<?> values, QueryShardContext context) {
            failIfNotIndexed();
            long[] lvalues = new long[values.size()];
            int upTo = 0;
            for (int i = 0; i < values.size(); i++) {
                Object value = values.get(i);
                Long longValue = parseTerm(value);
                if (longValue != null) {
                    lvalues[upTo++] = unsignedToSortableSignedLong(longValue);
                }
            }
            if (upTo == 0) {
                return new MatchNoDocsQuery();
            }
            if (upTo != lvalues.length) {
                lvalues = Arrays.copyOf(lvalues, upTo);
            }
            return LongPoint.newSetQuery(name(), lvalues);
        }

        @Override
        public Query rangeQuery(Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, QueryShardContext context) {
            failIfNotIndexed();
            long l = Long.MIN_VALUE;
            long u = Long.MAX_VALUE;
            if (lowerTerm != null) {
                Long lt = parseLowerRangeTerm(lowerTerm, includeLower);
                if (lt == null) return new MatchNoDocsQuery();
                l = unsignedToSortableSignedLong(lt);
            }
            if (upperTerm != null) {
                Long ut = parseUpperRangeTerm(upperTerm, includeUpper);
                if (ut == null) return new MatchNoDocsQuery();
                u = unsignedToSortableSignedLong(ut);
            }
            if (l > u) return new MatchNoDocsQuery();

            Query query = LongPoint.newRangeQuery(name(), l, u);
            if (hasDocValues()) {
                Query dvQuery = SortedNumericDocValuesField.newSlowRangeQuery(name(), l, u);
                query = new IndexOrDocValuesQuery(query, dvQuery);
                if (context.indexSortedOnField(name())) {
                    query = new IndexSortSortedNumericDocValuesRangeQuery(name(), l, u, query);
                }
            }
            return query;
        }

        @Override
        public IndexFieldData.Builder fielddataBuilder(String fullyQualifiedIndexName, Supplier<SearchLookup> searchLookup) {
            failIfNoDocValues();
            return (cache, breakerService, mapperService) -> {
                final IndexNumericFieldData signedLongValues = new SortedNumericIndexFieldData.Builder(
                    name(),
                    IndexNumericFieldData.NumericType.LONG
                ).build(cache, breakerService, mapperService);
                return new UnsignedLongIndexFieldData(signedLongValues);
            };
        }

        @Override
        public Object valueForDisplay(Object value) {
            if (value == null) {
                return null;
            }
            return value;
        }

        @Override
        public DocValueFormat docValueFormat(String format, ZoneId timeZone) {
            if (timeZone != null) {
                throw new IllegalArgumentException(
                    "Field [" + name() + "] of type [" + typeName() + "] does not support custom time zones"
                );
            }
            return DocValueFormat.UNSIGNED_LONG_SHIFTED;
        }

        @Override
        public Function<byte[], Number> pointReaderIfPossible() {
            if (isSearchable()) {
                return (value) -> LongPoint.decodeDimension(value, 0);
            }
            return null;
        }

        /**
         * Parses a value to an unsigned long for a term query.
         * @param value the value to parse
         * @return the parsed value, if it represents an unsigned long in the range [0, 18446744073709551615];
         *         null, if it represents some other number;
         *         throws an exception if the value is a wrongly formatted number
         */
        protected static Long parseTerm(Object value) {
            if (value instanceof Number) {
                if ((value instanceof Long) || (value instanceof Integer) || (value instanceof Short) || (value instanceof Byte)) {
                    long lv = ((Number) value).longValue();
                    if (lv >= 0) {
                        return lv;
                    }
                } else if (value instanceof BigInteger) {
                    BigInteger bigIntegerValue = (BigInteger) value;
                    if (bigIntegerValue.compareTo(BigInteger.ZERO) >= 0 && bigIntegerValue.compareTo(BIGINTEGER_2_64_MINUS_ONE) <= 0) {
                        return bigIntegerValue.longValue();
                    }
                }
            } else {
                String stringValue = (value instanceof BytesRef) ? ((BytesRef) value).utf8ToString() : value.toString();
                try {
                    return Long.parseUnsignedLong(stringValue);
                } catch (NumberFormatException e) {
                    // try again in case the number was negative or contained a decimal
                    Double.parseDouble(stringValue); // throws an exception if it is an improper number
                }
            }
            return null; // any other number: decimal or beyond the range of unsigned long
        }

        /**
         * Parses a lower term for a range query.
         * @param value the value to parse
         * @param include whether the value should be included
         * @return the value parsed to a long, adjusted for the include parameter:
         *         0, if the value is less than 0;
         *         the value truncated to a long, if it is in the range [0, 18446744073709551615];
         *         null, if the value is higher than the maximum allowed value for unsigned long;
         *         throws an exception if the value is a wrongly formatted number
         */
        protected static Long parseLowerRangeTerm(Object value, boolean include) {
            if ((value instanceof Long) || (value instanceof Integer) || (value instanceof Short) || (value instanceof Byte)) {
                long longValue = ((Number) value).longValue();
                if (longValue < 0) return 0L; // limit lowerTerm to min value for unsigned long: 0
                if (include == false) { // start from the next value
                    // for unsigned long, the next value for Long.MAX_VALUE is -9223372036854775808L
                    longValue = longValue == Long.MAX_VALUE ? Long.MIN_VALUE : ++longValue;
                }
                return longValue;
            }
            String stringValue = (value instanceof BytesRef) ? ((BytesRef) value).utf8ToString() : value.toString();
            final BigDecimal bigDecimalValue = new BigDecimal(stringValue); // throws an exception if it is an improper number
            if (bigDecimalValue.compareTo(BigDecimal.ZERO) <= 0) {
                return 0L; // for values <= 0, set lowerTerm to 0
            }
            int c = bigDecimalValue.compareTo(BIGDECIMAL_2_64_MINUS_ONE);
            if (c > 0 || (c == 0 && include == false)) {
                return null; // lowerTerm is beyond maximum value
            }
            long longValue = bigDecimalValue.longValue();
            boolean hasDecimal = (bigDecimalValue.scale() > 0 && bigDecimalValue.stripTrailingZeros().scale() > 0);
            if (include == false || hasDecimal) {
                ++longValue;
            }
            return longValue;
        }

        /**
         * Parses an upper term for a range query.
         * @param value the value to parse
         * @param include whether the value should be included
         * @return the value parsed to a long, adjusted for the include parameter:
         *         null, if the value is less than 0, as it is lower than the minimum allowed value for unsigned long;
         *         the value truncated to a long, if it is in the range [0, 18446744073709551615];
         *         -1 (the unsigned long 18446744073709551615) for values greater than 18446744073709551615;
         *         throws an exception if the value is a wrongly formatted number
         */
        protected static Long parseUpperRangeTerm(Object value, boolean include) {
            if ((value instanceof Long) || (value instanceof Integer) || (value instanceof Short) || (value instanceof Byte)) {
                long longValue = ((Number) value).longValue();
                if ((longValue < 0) || (longValue == 0 && include == false)) return null; // upperTerm is below minimum
                longValue = include ? longValue : --longValue;
                return longValue;
            }
            String stringValue = (value instanceof BytesRef) ? ((BytesRef) value).utf8ToString() : value.toString();
            final BigDecimal bigDecimalValue = new BigDecimal(stringValue); // throws an exception if it is an improper number
            int c = bigDecimalValue.compareTo(BigDecimal.ZERO);
            if (c < 0 || (c == 0 && include == false)) {
                return null; // upperTerm is below minimum
            }
            if (bigDecimalValue.compareTo(BIGDECIMAL_2_64_MINUS_ONE) > 0) {
                return -1L; // limit upperTerm to max value for unsigned long: 18446744073709551615
            }
            long longValue = bigDecimalValue.longValue();
            boolean hasDecimal = (bigDecimalValue.scale() > 0 && bigDecimalValue.stripTrailingZeros().scale() > 0);
            if (include == false && hasDecimal == false) {
                --longValue;
            }
            return longValue;
        }
    }

    private final boolean indexed;
    private final boolean hasDocValues;
    private final boolean stored;
    private final Explicit<Boolean> ignoreMalformed;
    private final boolean ignoreMalformedByDefault;
    private final String nullValue;
    private final Long nullValueIndexed; // null value to use for indexing, represented as shifted to signed long range
    private final Number nullValueFormatted; // null value to use in place of a {@code null} value in the document source

    private UnsignedLongFieldMapper(
        String simpleName,
        MappedFieldType mappedFieldType,
        MultiFields multiFields,
        CopyTo copyTo,
        Builder builder
    ) {
        super(simpleName, mappedFieldType, multiFields, copyTo);
        this.indexed = builder.indexed.getValue();
        this.hasDocValues = builder.hasDocValues.getValue();
        this.stored = builder.stored.getValue();
        this.ignoreMalformed = builder.ignoreMalformed.getValue();
        this.ignoreMalformedByDefault = builder.ignoreMalformed.getDefaultValue().value();
        this.nullValue = builder.nullValue.getValue();
        if (nullValue == null) {
            this.nullValueIndexed = null;
            this.nullValueFormatted = null;
        } else {
            long parsed = parseUnsignedLong(nullValue);
            this.nullValueIndexed = unsignedToSortableSignedLong(parsed);
            this.nullValueFormatted = parsed >= 0 ? parsed : BigInteger.valueOf(parsed).and(BIGINTEGER_2_64_MINUS_ONE);
        }
    }

    boolean ignoreMalformed() {
        return ignoreMalformed.value();
    }

    @Override
    public UnsignedLongFieldType fieldType() {
        return (UnsignedLongFieldType) super.fieldType();
    }

    @Override
    protected String contentType() {
        return CONTENT_TYPE;
    }

    @Override
    protected UnsignedLongFieldMapper clone() {
        return (UnsignedLongFieldMapper) super.clone();
    }

    @Override
    protected void parseCreateField(ParseContext context) throws IOException {
        XContentParser parser = context.parser();
        Long numericValue;
        if (context.externalValueSet()) {
            numericValue = parseUnsignedLong(context.externalValue());
        } else if (parser.currentToken() == XContentParser.Token.VALUE_NULL) {
            numericValue = null;
        } else if (parser.currentToken() == XContentParser.Token.VALUE_STRING && parser.textLength() == 0) {
            numericValue = null;
        } else {
            try {
                if (parser.currentToken() == XContentParser.Token.VALUE_NUMBER) {
                    numericValue = parseUnsignedLong(parser.numberValue());
                } else {
                    numericValue = parseUnsignedLong(parser.text());
                }
            } catch (InputCoercionException | IllegalArgumentException | JsonParseException e) {
                if (ignoreMalformed.value() && parser.currentToken().isValue()) {
                    context.addIgnoredField(mappedFieldType.name());
                    return;
                } else {
                    throw e;
                }
            }
        }
        if (numericValue == null) {
            numericValue = nullValueIndexed;
            if (numericValue == null) return;
        } else {
            numericValue = unsignedToSortableSignedLong(numericValue);
        }

        context.doc()
            .addAll(NumberFieldMapper.NumberType.LONG.createFields(fieldType().name(), numericValue, indexed, hasDocValues, stored));
        if (hasDocValues == false && (stored || indexed)) {
            createFieldNamesField(context);
        }
    }

    @Override
    public ValueFetcher valueFetcher(MapperService mapperService, SearchLookup searchLookup, String format) {
        if (format != null) {
            throw new IllegalArgumentException("Field [" + name() + "] of type [" + typeName() + "] doesn't support formats.");
        }

        return new SourceValueFetcher(name(), mapperService, parsesArrayValue(), nullValueFormatted) {
            @Override
            protected Object parseSourceValue(Object value) {
                if (value.equals("")) {
                    return nullValueFormatted;
                }
                long ulValue = parseUnsignedLong(value);
                if (ulValue >= 0) {
                    return ulValue;
                } else {
                    return BigInteger.valueOf(ulValue).and(BIGINTEGER_2_64_MINUS_ONE);
                }
            }
        };
    }

    @Override
    public ParametrizedFieldMapper.Builder getMergeBuilder() {
        return new Builder(simpleName(), ignoreMalformedByDefault).init(this);
    }

    /**
     * Parses an object to an unsigned long.
     * @param value must represent an unsigned long in the range [0, 18446744073709551615], or an exception is thrown
     */
    private static long parseUnsignedLong(Object value) {
        if (value instanceof Number) {
            if ((value instanceof Long) || (value instanceof Integer) || (value instanceof Short) || (value instanceof Byte)) {
                long lv = ((Number) value).longValue();
                if (lv < 0) {
                    throw new IllegalArgumentException("Value [" + lv + "] is out of range for unsigned long.");
                }
                return lv;
            } else if (value instanceof BigInteger) {
                BigInteger bigIntegerValue = (BigInteger) value;
                if (bigIntegerValue.compareTo(BIGINTEGER_2_64_MINUS_ONE) > 0 || bigIntegerValue.compareTo(BigInteger.ZERO) < 0) {
                    throw new IllegalArgumentException("Value [" + bigIntegerValue + "] is out of range for unsigned long");
                }
                return bigIntegerValue.longValue();
            }
            // throw an exception for all other numeric types with decimal parts
            throw new IllegalArgumentException("For input string: [" + value.toString() + "].");
        } else {
            String stringValue = (value instanceof BytesRef) ? ((BytesRef) value).utf8ToString() : value.toString();
            try {
                return Long.parseUnsignedLong(stringValue);
            } catch (NumberFormatException e) {
                throw new IllegalArgumentException("For input string: \"" + stringValue + "\"");
            }
        }
    }

    /**
     * Converts an unsigned long to a signed long by subtracting 2^63 from it.
     * @param value unsigned long value in the range [0, 2^64-1]; values greater than 2^63-1 are represented as negative
     * @return signed long value in the range [-2^63, 2^63-1]
     */
    private static long unsignedToSortableSignedLong(long value) {
        // subtracting 2^63, or 10000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000,
        // is equivalent to flipping the first bit
        return value ^ MASK_2_63;
    }

    /**
     * Converts a signed long to an unsigned long by adding 2^63 to it.
     * @param value signed long value in the range [-2^63, 2^63-1]
     * @return unsigned long value in the range [0, 2^64-1]; values greater than 2^63-1 are represented as negative
     */
    protected static long sortableSignedLongToUnsigned(long value) {
        // adding 2^63, or 10000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000,
        // is equivalent to flipping the first bit
        return value ^ MASK_2_63;
    }

}
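The XOR-with-2^63 trick used by the two helpers above is what lets the mapper store unsigned values in Lucene's signed long points and doc values while preserving unsigned order. A standalone sketch (hypothetical class, not part of the plugin) mirroring the conversion and checking that it is order-preserving and its own inverse:

```java
// Hypothetical demo of the shift used by unsignedToSortableSignedLong /
// sortableSignedLongToUnsigned: flipping the sign bit maps unsigned order
// onto signed order, and applying it twice is the identity.
public class UnsignedLongEncodingDemo {

    private static final long MASK_2_63 = 0x8000000000000000L;

    // Unsigned value (stored in a long's bit pattern) -> sortable signed long.
    static long toSortable(long unsigned) {
        return unsigned ^ MASK_2_63;
    }

    // The transform is its own inverse.
    static long fromSortable(long sortable) {
        return sortable ^ MASK_2_63;
    }

    public static void main(String[] args) {
        // unsigned 0 (the smallest value) maps to Long.MIN_VALUE
        if (toSortable(0L) != Long.MIN_VALUE) throw new AssertionError();
        // unsigned 2^64-1 (all bits set, i.e. -1L as a bit pattern) maps to Long.MAX_VALUE
        if (toSortable(-1L) != Long.MAX_VALUE) throw new AssertionError();
        // order preserving: unsigned 2^63-1 sorts before unsigned 2^63
        if (!(toSortable(Long.MAX_VALUE) < toSortable(Long.MIN_VALUE))) throw new AssertionError();
        // round trip
        if (fromSortable(toSortable(123L)) != 123L) throw new AssertionError();
        System.out.println("ok");
    }
}
```

Because the shift is a single XOR, encoding and decoding are both branch-free and cheap enough to apply per value at index and query time.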

@@ -0,0 +1,51 @@

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.apache.lucene.index.LeafReaderContext;
import org.elasticsearch.index.fielddata.IndexNumericFieldData;
import org.elasticsearch.index.fielddata.LeafNumericFieldData;
import org.elasticsearch.search.aggregations.support.ValuesSourceType;

public class UnsignedLongIndexFieldData extends IndexNumericFieldData {
    private final IndexNumericFieldData signedLongIFD;

    UnsignedLongIndexFieldData(IndexNumericFieldData signedLongFieldData) {
        this.signedLongIFD = signedLongFieldData;
    }

    @Override
    public String getFieldName() {
        return signedLongIFD.getFieldName();
    }

    @Override
    public ValuesSourceType getValuesSourceType() {
        return signedLongIFD.getValuesSourceType();
    }

    @Override
    public LeafNumericFieldData load(LeafReaderContext context) {
        return new UnsignedLongLeafFieldData(signedLongIFD.load(context));
    }

    @Override
    public LeafNumericFieldData loadDirect(LeafReaderContext context) throws Exception {
        return new UnsignedLongLeafFieldData(signedLongIFD.loadDirect(context));
    }

    @Override
    protected boolean sortRequiresCustomComparator() {
        return true;
    }

    @Override
    public NumericType getNumericType() {
        return NumericType.LONG;
    }

}

@@ -0,0 +1,100 @@

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.apache.lucene.index.DocValues;
import org.apache.lucene.index.NumericDocValues;
import org.apache.lucene.index.SortedNumericDocValues;
import org.elasticsearch.index.fielddata.FieldData;
import org.elasticsearch.index.fielddata.LeafNumericFieldData;
import org.elasticsearch.index.fielddata.NumericDoubleValues;
import org.elasticsearch.index.fielddata.ScriptDocValues;
import org.elasticsearch.index.fielddata.SortedBinaryDocValues;
import org.elasticsearch.index.fielddata.SortedNumericDoubleValues;

import java.io.IOException;

import static org.elasticsearch.xpack.unsignedlong.UnsignedLongFieldMapper.sortableSignedLongToUnsigned;

public class UnsignedLongLeafFieldData implements LeafNumericFieldData {
    private final LeafNumericFieldData signedLongFD;

    UnsignedLongLeafFieldData(LeafNumericFieldData signedLongFD) {
        this.signedLongFD = signedLongFD;
    }

    @Override
    public SortedNumericDocValues getLongValues() {
        return signedLongFD.getLongValues();
    }

    @Override
    public SortedNumericDoubleValues getDoubleValues() {
        final SortedNumericDocValues values = signedLongFD.getLongValues();
        final NumericDocValues singleValues = DocValues.unwrapSingleton(values);
        if (singleValues != null) {
            return FieldData.singleton(new NumericDoubleValues() {
                @Override
                public boolean advanceExact(int doc) throws IOException {
                    return singleValues.advanceExact(doc);
                }

                @Override
                public double doubleValue() throws IOException {
                    return convertUnsignedLongToDouble(singleValues.longValue());
                }
            });
        } else {
            return new SortedNumericDoubleValues() {

                @Override
                public boolean advanceExact(int target) throws IOException {
                    return values.advanceExact(target);
                }

                @Override
                public double nextValue() throws IOException {
                    return convertUnsignedLongToDouble(values.nextValue());
                }

                @Override
                public int docValueCount() {
                    return values.docValueCount();
                }
            };
        }
    }

    @Override
    public ScriptDocValues<?> getScriptValues() {
        return new UnsignedLongScriptDocValues(getLongValues());
    }

    @Override
    public SortedBinaryDocValues getBytesValues() {
        return FieldData.toString(getDoubleValues());
    }

    @Override
    public long ramBytesUsed() {
        return signedLongFD.ramBytesUsed();
    }

    @Override
    public void close() {
        signedLongFD.close();
    }

    private static double convertUnsignedLongToDouble(long value) {
        if (value < 0L) {
            return sortableSignedLongToUnsigned(value); // add 2^63
        } else {
            // add 2^63 as a double to make sure there is no overflow and the final result is positive
            return 0x1.0p63 + value;
        }
    }
}
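`getDoubleValues` runs every stored (shifted) value through `convertUnsignedLongToDouble`, which is why aggregations built on doubles can be imprecise for values above 2^53, as the commit message warns. A small sketch (hypothetical class mirroring the conversion above; not part of the plugin) showing both branches and the rounding near 2^64:

```java
// Hypothetical demo mirroring UnsignedLongLeafFieldData.convertUnsignedLongToDouble.
// The input is the *shifted* stored long: negative longs are the lower half
// of the unsigned range, non-negative longs the upper half.
public class UnsignedDoubleDemo {

    private static final long MASK_2_63 = 0x8000000000000000L;

    static double convert(long shifted) {
        if (shifted < 0L) {
            // lower half: undo the shift in exact long arithmetic
            return shifted ^ MASK_2_63;
        } else {
            // upper half: add 2^63 as a double so there is no long overflow
            return 0x1.0p63 + shifted;
        }
    }

    public static void main(String[] args) {
        // shifted Long.MIN_VALUE represents unsigned 0
        System.out.println(convert(Long.MIN_VALUE)); // 0.0
        // shifted Long.MAX_VALUE represents 2^64-1, which rounds up to 2^64 as a double
        System.out.println(convert(Long.MAX_VALUE) == 0x1.0p64); // true
    }
}
```

The upper branch is where precision is lost: a double has a 53-bit significand, so 2^64-1 and 2^64 are the same double, which is exactly the imprecision the field documentation describes for large values.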

@@ -0,0 +1,24 @@

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.elasticsearch.index.mapper.Mapper;
import org.elasticsearch.plugins.MapperPlugin;
import org.elasticsearch.plugins.Plugin;

import java.util.Map;

import static java.util.Collections.singletonMap;

public class UnsignedLongMapperPlugin extends Plugin implements MapperPlugin {

    @Override
    public Map<String, Mapper.TypeParser> getMappers() {
        return singletonMap(UnsignedLongFieldMapper.CONTENT_TYPE, UnsignedLongFieldMapper.PARSER);
    }

}

@@ -0,0 +1,67 @@

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.apache.lucene.index.SortedNumericDocValues;
import org.apache.lucene.util.ArrayUtil;
import org.elasticsearch.index.fielddata.ScriptDocValues;
import org.elasticsearch.search.DocValueFormat;

import java.io.IOException;

public class UnsignedLongScriptDocValues extends ScriptDocValues<Number> {
    private final SortedNumericDocValues in;
    private long[] values = new long[0];
    private int count;

    /**
     * Standard constructor.
     */
    public UnsignedLongScriptDocValues(SortedNumericDocValues in) {
        this.in = in;
    }

    @Override
    public void setNextDocId(int docId) throws IOException {
        if (in.advanceExact(docId)) {
            resize(in.docValueCount());
            for (int i = 0; i < count; i++) {
                values[i] = in.nextValue();
            }
        } else {
            resize(0);
        }
    }

    /**
     * Set the {@link #size()} and ensure that the {@link #values} array can
     * store at least that many entries.
     */
    protected void resize(int newSize) {
        count = newSize;
        values = ArrayUtil.grow(values, count);
    }

    public Number getValue() {
        return get(0);
    }

    @Override
    public Number get(int index) {
        if (count == 0) {
            throw new IllegalStateException(
                "A document doesn't have a value for a field! Use doc[<field>].size()==0 to check if a document is missing a field!"
            );
        }
        return (Number) DocValueFormat.UNSIGNED_LONG_SHIFTED.format(values[index]);
    }

    @Override
    public int size() {
        return count;
    }
}
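`get` runs each shifted doc value through `DocValueFormat.UNSIGNED_LONG_SHIFTED` so scripts see the unsigned number: a `Long` when it fits in a non-negative long, and otherwise a `BigInteger`, matching how `nullValueFormatted` is computed in the mapper. A sketch of that formatting step under the same shift convention (hypothetical class; the real formatting lives in `DocValueFormat`):

```java
import java.math.BigInteger;

// Hypothetical demo of formatting a shifted stored long back to the unsigned
// number, under the same convention as the mapper's nullValueFormatted logic.
public class ShiftedFormatDemo {

    private static final long MASK_2_63 = 0x8000000000000000L;
    private static final BigInteger TWO_64_MINUS_ONE = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE);

    // Undo the storage shift, then widen to BigInteger only when the
    // unsigned value does not fit in a non-negative long.
    static Number format(long shifted) {
        long unsigned = shifted ^ MASK_2_63; // same flip as sortableSignedLongToUnsigned
        return unsigned >= 0 ? Long.valueOf(unsigned) : BigInteger.valueOf(unsigned).and(TWO_64_MINUS_ONE);
    }

    public static void main(String[] args) {
        System.out.println(format(Long.MIN_VALUE)); // stored form of unsigned 0 -> 0
        System.out.println(format(Long.MAX_VALUE)); // stored form of 2^64-1 -> 18446744073709551615
    }
}
```

Masking the sign-extended `BigInteger` with 2^64-1 reinterprets the negative long's bit pattern as the intended unsigned value, so no precision is lost on the way out, unlike the double conversion used for aggregations.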

@@ -0,0 +1 @@
org.elasticsearch.xpack.unsignedlong.DocValuesWhitelistExtension

@@ -0,0 +1,10 @@

#
# Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
# or more contributor license agreements. Licensed under the Elastic License;
# you may not use this file except in compliance with the Elastic License.
#

class org.elasticsearch.xpack.unsignedlong.UnsignedLongScriptDocValues {
  Number get(int)
  Number getValue()
}

@@ -0,0 +1,350 @@

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.apache.lucene.index.DocValuesType;
import org.apache.lucene.index.IndexableField;
import org.elasticsearch.Version;
import org.elasticsearch.cluster.metadata.IndexMetadata;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.mapper.ContentPath;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.Mapper;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.mapper.MapperTestCase;
import org.elasticsearch.index.mapper.ParsedDocument;
import org.elasticsearch.index.mapper.SourceToParse;
import org.elasticsearch.index.termvectors.TermVectorsService;
import org.elasticsearch.plugins.Plugin;

import java.io.IOException;
import java.math.BigInteger;
import java.util.Collection;
import java.util.Collections;

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.xpack.unsignedlong.UnsignedLongFieldMapper.BIGINTEGER_2_64_MINUS_ONE;
import static org.hamcrest.Matchers.containsString;

public class UnsignedLongFieldMapperTests extends MapperTestCase {

    @Override
    protected Collection<? extends Plugin> getPlugins() {
        return Collections.singletonList(new UnsignedLongMapperPlugin());
    }

    @Override
    protected void minimalMapping(XContentBuilder b) throws IOException {
        b.field("type", "unsigned_long");
    }

    @Override
    protected void writeFieldValue(XContentBuilder builder) throws IOException {
        builder.value(123);
    }

    @Override
    protected void registerParameters(ParameterChecker checker) throws IOException {
        checker.registerConflictCheck("doc_values", b -> b.field("doc_values", false));
        checker.registerConflictCheck("index", b -> b.field("index", false));
        checker.registerConflictCheck("store", b -> b.field("store", true));
        checker.registerConflictCheck("null_value", b -> b.field("null_value", 1));
        checker.registerUpdateCheck(
            b -> b.field("ignore_malformed", true),
            m -> assertTrue(((UnsignedLongFieldMapper) m).ignoreMalformed())
        );
    }

    public void testDefaults() throws Exception {
        XContentBuilder mapping = fieldMapping(b -> b.field("type", "unsigned_long"));
        DocumentMapper mapper = createDocumentMapper(mapping);
        assertEquals(Strings.toString(mapping), mapper.mappingSource().toString());

        // test indexing of values as string
        {
            ParsedDocument doc = mapper.parse(
                new SourceToParse(
                    "test",
                    "_doc",
                    "1",
                    BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", "18446744073709551615").endObject()),
                    XContentType.JSON
                )
            );
            IndexableField[] fields = doc.rootDoc().getFields("field");
            assertEquals(2, fields.length);
            IndexableField pointField = fields[0];
            assertEquals(1, pointField.fieldType().pointIndexDimensionCount());
            assertFalse(pointField.fieldType().stored());
            assertEquals(9223372036854775807L, pointField.numericValue().longValue());
            IndexableField dvField = fields[1];
            assertEquals(DocValuesType.SORTED_NUMERIC, dvField.fieldType().docValuesType());
            assertEquals(9223372036854775807L, dvField.numericValue().longValue());
            assertFalse(dvField.fieldType().stored());
        }

        // test indexing values as integer numbers
        {
            ParsedDocument doc = mapper.parse(
                new SourceToParse(
                    "test",
                    "_doc",
                    "2",
                    BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", 9223372036854775807L).endObject()),
                    XContentType.JSON
                )
            );
            IndexableField[] fields = doc.rootDoc().getFields("field");
            assertEquals(2, fields.length);
            IndexableField pointField = fields[0];
            assertEquals(-1L, pointField.numericValue().longValue());
            IndexableField dvField = fields[1];
            assertEquals(-1L, dvField.numericValue().longValue());
        }

        // test that indexing values as number with decimal is not allowed
        {
            ThrowingRunnable runnable = () -> mapper.parse(
                new SourceToParse(
                    "test",
                    "_doc",
                    "3",
                    BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", 10.5).endObject()),
                    XContentType.JSON
                )
            );
            MapperParsingException e = expectThrows(MapperParsingException.class, runnable);
|
||||
assertThat(e.getCause().getMessage(), containsString("For input string: [10.5]"));
|
||||
}
|
||||
}
|
||||
|
||||
public void testNotIndexed() throws Exception {
|
||||
DocumentMapper mapper = createDocumentMapper(fieldMapping(b -> b.field("type", "unsigned_long").field("index", false)));
|
||||
|
||||
ParsedDocument doc = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", "18446744073709551615").endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
IndexableField[] fields = doc.rootDoc().getFields("field");
|
||||
assertEquals(1, fields.length);
|
||||
IndexableField dvField = fields[0];
|
||||
assertEquals(DocValuesType.SORTED_NUMERIC, dvField.fieldType().docValuesType());
|
||||
assertEquals(9223372036854775807L, dvField.numericValue().longValue());
|
||||
}
|
||||
|
||||
public void testNoDocValues() throws Exception {
|
||||
DocumentMapper mapper = createDocumentMapper(fieldMapping(b -> b.field("type", "unsigned_long").field("doc_values", false)));
|
||||
|
||||
ParsedDocument doc = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", "18446744073709551615").endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
IndexableField[] fields = doc.rootDoc().getFields("field");
|
||||
assertEquals(1, fields.length);
|
||||
IndexableField pointField = fields[0];
|
||||
assertEquals(1, pointField.fieldType().pointIndexDimensionCount());
|
||||
assertEquals(9223372036854775807L, pointField.numericValue().longValue());
|
||||
}
|
||||
|
||||
public void testStore() throws Exception {
|
||||
DocumentMapper mapper = createDocumentMapper(fieldMapping(b -> b.field("type", "unsigned_long").field("store", true)));
|
||||
|
||||
ParsedDocument doc = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(XContentFactory.jsonBuilder().startObject().field("field", "18446744073709551615").endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
IndexableField[] fields = doc.rootDoc().getFields("field");
|
||||
assertEquals(3, fields.length);
|
||||
IndexableField pointField = fields[0];
|
||||
assertEquals(1, pointField.fieldType().pointIndexDimensionCount());
|
||||
assertEquals(9223372036854775807L, pointField.numericValue().longValue());
|
||||
IndexableField dvField = fields[1];
|
||||
assertEquals(DocValuesType.SORTED_NUMERIC, dvField.fieldType().docValuesType());
|
||||
assertEquals(9223372036854775807L, dvField.numericValue().longValue());
|
||||
IndexableField storedField = fields[2];
|
||||
assertTrue(storedField.fieldType().stored());
|
||||
assertEquals(9223372036854775807L, storedField.numericValue().longValue());
|
||||
}
|
||||
|
||||
public void testCoerceMappingParameterIsIllegal() {
|
||||
MapperParsingException e = expectThrows(
|
||||
MapperParsingException.class,
|
||||
() -> createMapperService(fieldMapping(b -> b.field("type", "unsigned_long").field("coerce", false)))
|
||||
);
|
||||
assertThat(
|
||||
e.getMessage(),
|
||||
containsString("Failed to parse mapping [_doc]: unknown parameter [coerce] on mapper [field] of type [unsigned_long]")
|
||||
);
|
||||
}
|
||||
|
||||
public void testNullValue() throws IOException {
|
||||
// test that if null value is not defined, field is not indexed
|
||||
{
|
||||
DocumentMapper mapper = createDocumentMapper(fieldMapping(this::minimalMapping));
|
||||
ParsedDocument doc = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(XContentFactory.jsonBuilder().startObject().nullField("field").endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
assertArrayEquals(new IndexableField[0], doc.rootDoc().getFields("field"));
|
||||
}
|
||||
|
||||
// test that if null value is defined, it is used
|
||||
{
|
||||
DocumentMapper mapper = createDocumentMapper(
|
||||
fieldMapping(b -> b.field("type", "unsigned_long").field("null_value", "18446744073709551615"))
|
||||
);
|
||||
ParsedDocument doc = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(XContentFactory.jsonBuilder().startObject().nullField("field").endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
IndexableField[] fields = doc.rootDoc().getFields("field");
|
||||
assertEquals(2, fields.length);
|
||||
IndexableField pointField = fields[0];
|
||||
assertEquals(9223372036854775807L, pointField.numericValue().longValue());
|
||||
IndexableField dvField = fields[1];
|
||||
assertEquals(9223372036854775807L, dvField.numericValue().longValue());
|
||||
}
|
||||
}
|
||||
|
||||
public void testIgnoreMalformed() throws Exception {
|
||||
// test ignore_malformed is false by default
|
||||
{
|
||||
DocumentMapper mapper = createDocumentMapper(fieldMapping(this::minimalMapping));
|
||||
Object malformedValue1 = "a";
|
||||
ThrowingRunnable runnable = () -> mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(jsonBuilder().startObject().field("field", malformedValue1).endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
MapperParsingException e = expectThrows(MapperParsingException.class, runnable);
|
||||
assertThat(e.getCause().getMessage(), containsString("For input string: \"a\""));
|
||||
|
||||
Object malformedValue2 = Boolean.FALSE;
|
||||
runnable = () -> mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(jsonBuilder().startObject().field("field", malformedValue2).endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
e = expectThrows(MapperParsingException.class, runnable);
|
||||
assertThat(e.getCause().getMessage(), containsString("For input string: \"false\""));
|
||||
}
|
||||
|
||||
// test ignore_malformed when set to true ignored malformed documents
|
||||
{
|
||||
DocumentMapper mapper = createDocumentMapper(
|
||||
fieldMapping(b -> b.field("type", "unsigned_long").field("ignore_malformed", true))
|
||||
);
|
||||
Object malformedValue1 = "a";
|
||||
ParsedDocument doc = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(jsonBuilder().startObject().field("field", malformedValue1).endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
IndexableField[] fields = doc.rootDoc().getFields("field");
|
||||
assertEquals(0, fields.length);
|
||||
assertArrayEquals(new String[] { "field" }, TermVectorsService.getValues(doc.rootDoc().getFields("_ignored")));
|
||||
|
||||
Object malformedValue2 = Boolean.FALSE;
|
||||
ParsedDocument doc2 = mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(jsonBuilder().startObject().field("field", malformedValue2).endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
IndexableField[] fields2 = doc2.rootDoc().getFields("field");
|
||||
assertEquals(0, fields2.length);
|
||||
assertArrayEquals(new String[] { "field" }, TermVectorsService.getValues(doc2.rootDoc().getFields("_ignored")));
|
||||
}
|
||||
}
|
||||
|
||||
public void testIndexingOutOfRangeValues() throws Exception {
|
||||
DocumentMapper mapper = createDocumentMapper(fieldMapping(this::minimalMapping));
|
||||
for (Object outOfRangeValue : new Object[] { "-1", -1L, "18446744073709551616", new BigInteger("18446744073709551616") }) {
|
||||
ThrowingRunnable runnable = () -> mapper.parse(
|
||||
new SourceToParse(
|
||||
"test",
|
||||
"_doc",
|
||||
"1",
|
||||
BytesReference.bytes(jsonBuilder().startObject().field("field", outOfRangeValue).endObject()),
|
||||
XContentType.JSON
|
||||
)
|
||||
);
|
||||
expectThrows(MapperParsingException.class, runnable);
|
||||
}
|
||||
}
|
||||
|
||||
public void testFetchSourceValue() throws IOException {
|
||||
Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT.id).build();
|
||||
Mapper.BuilderContext context = new Mapper.BuilderContext(settings, new ContentPath());
|
||||
|
||||
UnsignedLongFieldMapper mapper = new UnsignedLongFieldMapper.Builder("field", settings).build(context);
|
||||
assertEquals(org.elasticsearch.common.collect.List.of(0L), fetchSourceValue(mapper, 0L));
|
||||
assertEquals(org.elasticsearch.common.collect.List.of(9223372036854775807L), fetchSourceValue(mapper, 9223372036854775807L));
|
||||
assertEquals(org.elasticsearch.common.collect.List.of(BIGINTEGER_2_64_MINUS_ONE), fetchSourceValue(mapper, "18446744073709551615"));
|
||||
assertEquals(org.elasticsearch.common.collect.List.of(), fetchSourceValue(mapper, ""));
|
||||
|
||||
UnsignedLongFieldMapper nullValueMapper = new UnsignedLongFieldMapper.Builder("field", settings).nullValue("18446744073709551615")
|
||||
.build(context);
|
||||
assertEquals(org.elasticsearch.common.collect.List.of(BIGINTEGER_2_64_MINUS_ONE), fetchSourceValue(nullValueMapper, ""));
|
||||
}
|
||||
|
||||
public void testExistsQueryDocValuesDisabled() throws IOException {
|
||||
MapperService mapperService = createMapperService(fieldMapping(b -> {
|
||||
minimalMapping(b);
|
||||
b.field("doc_values", false);
|
||||
}));
|
||||
assertExistsQuery(mapperService);
|
||||
assertParseMinimalWarnings();
|
||||
}
|
||||
|
||||
}
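The paired assertions in `testDefaults` above (the string `"18446744073709551615"` is indexed as the long `9223372036854775807L`, while the long `9223372036854775807L` is indexed as `-1L`) reflect a sign-bit-flipped encoding: shifting every unsigned 64-bit value by 2^63 makes unsigned order coincide with ordinary signed-long order, so Lucene's existing long points and doc values can be reused. A minimal standalone sketch of that conversion (the class and method names here are illustrative, not the mapper's API):

```java
import java.math.BigInteger;

public class UnsignedLongEncodingSketch {
    // Flipping the sign bit maps unsigned order onto signed-long order:
    // 0 -> Long.MIN_VALUE, 2^63 -> 0L, 2^64 - 1 -> Long.MAX_VALUE.
    static long encode(BigInteger unsigned) {
        return unsigned.longValue() ^ Long.MIN_VALUE;
    }

    // Inverse mapping: recover the unsigned value from its stored signed form.
    static BigInteger decode(long stored) {
        return new BigInteger(Long.toUnsignedString(stored ^ Long.MIN_VALUE));
    }
}
```

Under this encoding the test's expected values line up: `encode(2^64-1)` is `Long.MAX_VALUE` and `encode(2^63-1)` is `-1L`, matching the point and doc-values assertions above.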
@ -0,0 +1,152 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

package org.elasticsearch.xpack.unsignedlong;

import org.apache.lucene.document.LongPoint;
import org.apache.lucene.search.MatchNoDocsQuery;
import org.elasticsearch.index.mapper.FieldTypeTestCase;
import org.elasticsearch.xpack.unsignedlong.UnsignedLongFieldMapper.UnsignedLongFieldType;

import java.util.Arrays;

import static org.elasticsearch.xpack.unsignedlong.UnsignedLongFieldMapper.UnsignedLongFieldType.parseTerm;
import static org.elasticsearch.xpack.unsignedlong.UnsignedLongFieldMapper.UnsignedLongFieldType.parseLowerRangeTerm;
import static org.elasticsearch.xpack.unsignedlong.UnsignedLongFieldMapper.UnsignedLongFieldType.parseUpperRangeTerm;

public class UnsignedLongFieldTypeTests extends FieldTypeTestCase {

    public void testTermQuery() {
        UnsignedLongFieldType ft = new UnsignedLongFieldType("my_unsigned_long");

        assertEquals(LongPoint.newExactQuery("my_unsigned_long", -9223372036854775808L), ft.termQuery(0, null));
        assertEquals(LongPoint.newExactQuery("my_unsigned_long", 0L), ft.termQuery("9223372036854775808", null));
        assertEquals(LongPoint.newExactQuery("my_unsigned_long", 9223372036854775807L), ft.termQuery("18446744073709551615", null));

        assertEquals(new MatchNoDocsQuery(), ft.termQuery(-1L, null));
        assertEquals(new MatchNoDocsQuery(), ft.termQuery(10.5, null));
        assertEquals(new MatchNoDocsQuery(), ft.termQuery("18446744073709551616", null));

        expectThrows(NumberFormatException.class, () -> ft.termQuery("18incorrectnumber", null));
    }

    public void testTermsQuery() {
        UnsignedLongFieldType ft = new UnsignedLongFieldType("my_unsigned_long");

        assertEquals(
            LongPoint.newSetQuery("my_unsigned_long", -9223372036854775808L, 0L, 9223372036854775807L),
            ft.termsQuery(Arrays.asList("0", "9223372036854775808", "18446744073709551615"), null)
        );

        assertEquals(new MatchNoDocsQuery(), ft.termsQuery(Arrays.asList(-9223372036854775808L, -1L), null));
        assertEquals(new MatchNoDocsQuery(), ft.termsQuery(Arrays.asList("-0.5", "3.14", "18446744073709551616"), null));

        expectThrows(NumberFormatException.class, () -> ft.termsQuery(Arrays.asList("18incorrectnumber"), null));
    }

    public void testRangeQuery() {
        UnsignedLongFieldType ft = new UnsignedLongFieldType("my_unsigned_long", true, false, false, null);

        assertEquals(
            LongPoint.newRangeQuery("my_unsigned_long", -9223372036854775808L, -9223372036854775808L),
            ft.rangeQuery(-1L, 0L, true, true, null)
        );
        assertEquals(
            LongPoint.newRangeQuery("my_unsigned_long", -9223372036854775808L, -9223372036854775808L),
            ft.rangeQuery(0.0, 0.5, true, true, null)
        );
        assertEquals(
            LongPoint.newRangeQuery("my_unsigned_long", 0, 0),
            ft.rangeQuery("9223372036854775807", "9223372036854775808", false, true, null)
        );
        assertEquals(
            LongPoint.newRangeQuery("my_unsigned_long", -9223372036854775808L, 9223372036854775806L),
            ft.rangeQuery(null, "18446744073709551614.5", true, true, null)
        );
        assertEquals(
            LongPoint.newRangeQuery("my_unsigned_long", 9223372036854775807L, 9223372036854775807L),
            ft.rangeQuery("18446744073709551615", "18446744073709551616", true, true, null)
        );

        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(-1f, -0.5f, true, true, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(-1L, 0L, true, false, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(9223372036854775807L, 9223372036854775806L, true, true, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery("18446744073709551616", "18446744073709551616", true, true, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery("18446744073709551615", "18446744073709551616", false, true, null));
        assertEquals(new MatchNoDocsQuery(), ft.rangeQuery(9223372036854775807L, 9223372036854775806L, true, true, null));

        expectThrows(NumberFormatException.class, () -> ft.rangeQuery("18incorrectnumber", "18incorrectnumber", true, true, null));
    }

    public void testParseTermForTermQuery() {
        // values that represent proper unsigned long numbers
        assertEquals(0L, parseTerm("0").longValue());
        assertEquals(0L, parseTerm(0).longValue());
        assertEquals(9223372036854775807L, parseTerm(9223372036854775807L).longValue());
        assertEquals(-1L, parseTerm("18446744073709551615").longValue());

        // values that represent numbers but are not unsigned longs, or are outside the range [0, 18446744073709551615]
        assertEquals(null, parseTerm("-9223372036854775808.05"));
        assertEquals(null, parseTerm(-9223372036854775808L));
        assertEquals(null, parseTerm(0.0));
        assertEquals(null, parseTerm(0.5));
        assertEquals(null, parseTerm("18446744073709551616"));

        // wrongly formatted numbers
        expectThrows(NumberFormatException.class, () -> parseTerm("18incorrectnumber"));
    }

    public void testParseLowerTermForRangeQuery() {
        // values that are lower than the min for lowerTerm are converted to 0
        assertEquals(0L, parseLowerRangeTerm(-9223372036854775808L, true).longValue());
        assertEquals(0L, parseLowerRangeTerm("-9223372036854775808", true).longValue());
        assertEquals(0L, parseLowerRangeTerm("-1", true).longValue());
        assertEquals(0L, parseLowerRangeTerm("-0.5", true).longValue());

        assertEquals(0L, parseLowerRangeTerm(0L, true).longValue());
        assertEquals(0L, parseLowerRangeTerm("0", true).longValue());
        assertEquals(0L, parseLowerRangeTerm("0.0", true).longValue());
        assertEquals(1L, parseLowerRangeTerm("0.5", true).longValue());
        assertEquals(9223372036854775807L, parseLowerRangeTerm(9223372036854775806L, false).longValue());
        assertEquals(9223372036854775807L, parseLowerRangeTerm(9223372036854775807L, true).longValue());
        assertEquals(-9223372036854775808L, parseLowerRangeTerm(9223372036854775807L, false).longValue());
        assertEquals(-1L, parseLowerRangeTerm("18446744073709551614", false).longValue());
        assertEquals(-1L, parseLowerRangeTerm("18446744073709551614.1", true).longValue());
        assertEquals(-1L, parseLowerRangeTerm("18446744073709551615", true).longValue());

        // values that are higher than the max for lowerTerm don't return results
        assertEquals(null, parseLowerRangeTerm("18446744073709551615", false));
        assertEquals(null, parseLowerRangeTerm("18446744073709551616", true));

        // wrongly formatted numbers
        expectThrows(NumberFormatException.class, () -> parseLowerRangeTerm("18incorrectnumber", true));
    }

    public void testParseUpperTermForRangeQuery() {
        // values that are lower than the min for upperTerm don't return results
        assertEquals(null, parseUpperRangeTerm(-9223372036854775808L, true));
        assertEquals(null, parseUpperRangeTerm("-1", true));
        assertEquals(null, parseUpperRangeTerm("-0.5", true));
        assertEquals(null, parseUpperRangeTerm(0L, false));

        assertEquals(0L, parseUpperRangeTerm(0L, true).longValue());
        assertEquals(0L, parseUpperRangeTerm("0", true).longValue());
        assertEquals(0L, parseUpperRangeTerm("0.0", true).longValue());
        assertEquals(0L, parseUpperRangeTerm("0.5", true).longValue());
        assertEquals(9223372036854775806L, parseUpperRangeTerm(9223372036854775807L, false).longValue());
        assertEquals(9223372036854775807L, parseUpperRangeTerm(9223372036854775807L, true).longValue());
        assertEquals(-2L, parseUpperRangeTerm("18446744073709551614.5", true).longValue());
        assertEquals(-2L, parseUpperRangeTerm("18446744073709551615", false).longValue());
        assertEquals(-1L, parseUpperRangeTerm("18446744073709551615", true).longValue());

        // values that are higher than the max for upperTerm are converted to 18446744073709551615, or -1 in signed representation
        assertEquals(-1L, parseUpperRangeTerm("18446744073709551615.8", true).longValue());
        assertEquals(-1L, parseUpperRangeTerm("18446744073709551616", true).longValue());

        // wrongly formatted numbers
        expectThrows(NumberFormatException.class, () -> parseUpperRangeTerm("18incorrectnumber", true));
    }
}
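The lower-bound normalization exercised in `testParseLowerTermForRangeQuery` above — fractional bounds round up, exclusive integer bounds move up by one, negative bounds clamp to 0, and bounds above 2^64-1 yield `null` (no results) — can be sketched as follows. This is an illustrative reimplementation consistent with those assertions, not the mapper's code; the real `parseLowerRangeTerm` also returns the bound as an unsigned-as-signed `Long` (e.g. `-1L` for 18446744073709551615) rather than a `BigInteger`:

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.math.RoundingMode;

public class LowerRangeBoundSketch {
    static final BigInteger UNSIGNED_MAX = new BigInteger("18446744073709551615"); // 2^64 - 1

    // Normalize a lower range bound into [0, 2^64 - 1]:
    //  - fractional bounds round up (">= 0.5" starts at 1),
    //  - exclusive integer bounds move up by one,
    //  - negative bounds clamp to 0,
    //  - bounds above 2^64 - 1 return null (the range matches nothing).
    static BigInteger lowerBound(BigDecimal term, boolean include) {
        boolean integral = term.stripTrailingZeros().scale() <= 0;
        BigInteger v = term.setScale(0, RoundingMode.CEILING).toBigInteger();
        if (include == false && integral) {
            v = v.add(BigInteger.ONE);
        }
        if (v.signum() < 0) {
            return BigInteger.ZERO;
        }
        return v.compareTo(UNSIGNED_MAX) > 0 ? null : v;
    }
}
```

The upper bound is handled symmetrically in the tests above: it rounds down, exclusive integer bounds move down by one, and values above the maximum clamp to 18446744073709551615.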
@ -0,0 +1,297 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
package org.elasticsearch.xpack.unsignedlong;

import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchPhaseExecutionException;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.bucket.range.Range;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
import org.elasticsearch.search.aggregations.metrics.Sum;
import org.elasticsearch.search.sort.SortOrder;
import org.elasticsearch.test.ESIntegTestCase;

import java.io.IOException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.search.aggregations.AggregationBuilders.histogram;
import static org.elasticsearch.search.aggregations.AggregationBuilders.range;
import static org.elasticsearch.search.aggregations.AggregationBuilders.sum;
import static org.elasticsearch.search.aggregations.AggregationBuilders.terms;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertSearchResponse;
import static org.hamcrest.Matchers.containsString;

@ESIntegTestCase.SuiteScopeTestCase
public class UnsignedLongTests extends ESIntegTestCase {
    final int numDocs = 10;
    final Number[] values = {
        0L,
        0L,
        100L,
        9223372036854775807L,
        new BigInteger("9223372036854775808"),
        new BigInteger("10446744073709551613"),
        new BigInteger("18446744073709551614"),
        new BigInteger("18446744073709551614"),
        new BigInteger("18446744073709551615"),
        new BigInteger("18446744073709551615") };

    @Override
    protected Collection<Class<? extends Plugin>> nodePlugins() {
        return Collections.singleton(UnsignedLongMapperPlugin.class);
    }

    @Override
    public void setupSuiteScopeCluster() throws Exception {
        Settings.Builder settings = Settings.builder().put(indexSettings()).put("number_of_shards", 1);
        prepareCreate("idx").addMapping("_doc", "ul_field", "type=unsigned_long").setSettings(settings).get();
        List<IndexRequestBuilder> builders = new ArrayList<>();
        for (int i = 0; i < numDocs; i++) {
            builders.add(
                client().prepareIndex("idx", "_doc").setSource(jsonBuilder().startObject().field("ul_field", values[i]).endObject())
            );
        }
        indexRandom(true, builders);
        ensureSearchable();
    }

    public void testSort() {
        // asc sort
        {
            SearchResponse response = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.ASC)
                .get();
            assertSearchResponse(response);
            SearchHit[] hits = response.getHits().getHits();
            assertEquals(hits.length, numDocs);
            int i = 0;
            for (SearchHit hit : hits) {
                assertEquals(values[i++], hit.getSortValues()[0]);
            }
        }
        // desc sort
        {
            SearchResponse response = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.DESC)
                .get();
            assertSearchResponse(response);
            SearchHit[] hits = response.getHits().getHits();
            assertEquals(hits.length, numDocs);
            int i = numDocs - 1;
            for (SearchHit hit : hits) {
                assertEquals(values[i--], hit.getSortValues()[0]);
            }
        }
        // asc sort with search_after as Long
        {
            SearchResponse response = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.ASC)
                .searchAfter(new Long[] { 100L })
                .get();
            assertSearchResponse(response);
            SearchHit[] hits = response.getHits().getHits();
            assertEquals(hits.length, 7);
            int i = 3;
            for (SearchHit hit : hits) {
                assertEquals(values[i++], hit.getSortValues()[0]);
            }
        }
        // asc sort with search_after as BigInteger
        {
            SearchResponse response = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.ASC)
                .searchAfter(new BigInteger[] { new BigInteger("18446744073709551614") })
                .get();
            assertSearchResponse(response);
            SearchHit[] hits = response.getHits().getHits();
            assertEquals(hits.length, 2);
            int i = 8;
            for (SearchHit hit : hits) {
                assertEquals(values[i++], hit.getSortValues()[0]);
            }
        }
        // asc sort with search_after as BigInteger in String format
        {
            SearchResponse response = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.ASC)
                .searchAfter(new String[] { "18446744073709551614" })
                .get();
            assertSearchResponse(response);
            SearchHit[] hits = response.getHits().getHits();
            assertEquals(hits.length, 2);
            int i = 8;
            for (SearchHit hit : hits) {
                assertEquals(values[i++], hit.getSortValues()[0]);
            }
        }
        // asc sort with search_after of negative value should fail
        {
            SearchRequestBuilder srb = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.ASC)
                .searchAfter(new Long[] { -1L });
            ElasticsearchException exception = expectThrows(ElasticsearchException.class, () -> srb.get());
            assertThat(exception.getCause().getMessage(), containsString("Failed to parse search_after value"));
        }
        // asc sort with search_after of value >= 2^64 should fail
        {
            SearchRequestBuilder srb = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.ASC)
                .searchAfter(new BigInteger[] { new BigInteger("18446744073709551616") });
            ElasticsearchException exception = expectThrows(ElasticsearchException.class, () -> srb.get());
            assertThat(exception.getCause().getMessage(), containsString("Failed to parse search_after value"));
        }
        // desc sort with search_after as BigInteger
        {
            SearchResponse response = client().prepareSearch("idx")
                .setQuery(QueryBuilders.matchAllQuery())
                .setSize(numDocs)
                .addSort("ul_field", SortOrder.DESC)
                .searchAfter(new BigInteger[] { new BigInteger("18446744073709551615") })
                .get();
            assertSearchResponse(response);
            SearchHit[] hits = response.getHits().getHits();
            assertEquals(hits.length, 8);
            int i = 7;
            for (SearchHit hit : hits) {
                assertEquals(values[i--], hit.getSortValues()[0]);
            }
        }
    }

    public void testAggs() {
        // terms agg
        {
            SearchResponse response = client().prepareSearch("idx").setSize(0).addAggregation(terms("ul_terms").field("ul_field")).get();
            assertSearchResponse(response);
            Terms terms = response.getAggregations().get("ul_terms");

            long[] expectedBucketDocCounts = { 2, 2, 2, 1, 1, 1, 1 };
            Object[] expectedBucketKeys = {
                0L,
                new BigInteger("18446744073709551614"),
                new BigInteger("18446744073709551615"),
                100L,
                9223372036854775807L,
                new BigInteger("9223372036854775808"),
                new BigInteger("10446744073709551613") };
            int i = 0;
            for (Terms.Bucket bucket : terms.getBuckets()) {
                assertEquals(expectedBucketDocCounts[i], bucket.getDocCount());
                assertEquals(expectedBucketKeys[i], bucket.getKey());
                i++;
            }
        }

        // histogram agg
        {
            SearchResponse response = client().prepareSearch("idx")
                .setSize(0)
                .addAggregation(histogram("ul_histo").field("ul_field").interval(9.223372036854776E18).minDocCount(0))
                .get();
            assertSearchResponse(response);
            Histogram histo = response.getAggregations().get("ul_histo");

            long[] expectedBucketDocCounts = { 3, 3, 4 };
            double[] expectedBucketKeys = { 0, 9.223372036854776E18, 1.8446744073709552E19 };
            int i = 0;
            for (Histogram.Bucket bucket : histo.getBuckets()) {
                assertEquals(expectedBucketDocCounts[i], bucket.getDocCount());
                assertEquals(expectedBucketKeys[i], bucket.getKey());
                i++;
            }
        }

        // range agg
        {
            SearchResponse response = client().prepareSearch("idx")
                .setSize(0)
                .addAggregation(
                    range("ul_range").field("ul_field")
                        .addUnboundedTo(9.223372036854776E18)
                        .addRange(9.223372036854776E18, 1.8446744073709552E19)
                        .addUnboundedFrom(1.8446744073709552E19)
                )
                .get();
            assertSearchResponse(response);
            Range range = response.getAggregations().get("ul_range");

            long[] expectedBucketDocCounts = { 3, 3, 4 };
            String[] expectedBucketKeys = {
                "*-9.223372036854776E18",
                "9.223372036854776E18-1.8446744073709552E19",
                "1.8446744073709552E19-*" };
            int i = 0;
            for (Range.Bucket bucket : range.getBuckets()) {
                assertEquals(expectedBucketDocCounts[i], bucket.getDocCount());
                assertEquals(expectedBucketKeys[i], bucket.getKey());
                i++;
            }
        }

        // sum agg
        {
            SearchResponse response = client().prepareSearch("idx").setSize(0).addAggregation(sum("ul_sum").field("ul_field")).get();
            assertSearchResponse(response);
            Sum sum = response.getAggregations().get("ul_sum");
            double expectedSum = Arrays.stream(values).mapToDouble(Number::doubleValue).sum();
            assertEquals(expectedSum, sum.getValue(), 0.001);
        }
    }

    public void testSortDifferentFormatsShouldFail() throws IOException, InterruptedException {
        Settings.Builder settings = Settings.builder().put(indexSettings()).put("number_of_shards", 1);
        prepareCreate("idx2").addMapping("_doc", "ul_field", "type=long").setSettings(settings).get();
        List<IndexRequestBuilder> builders = new ArrayList<>();
        for (int i = 0; i < 4; i++) {
            builders.add(
                client().prepareIndex("idx2", "_doc").setSource(jsonBuilder().startObject().field("ul_field", values[i]).endObject())
            );
        }
        indexRandom(true, builders);
        ensureSearchable();

        Exception exception = expectThrows(
            SearchPhaseExecutionException.class,
            () -> client().prepareSearch()
                .setIndices("idx", "idx2")
                .setQuery(QueryBuilders.matchAllQuery())
                .addSort("ul_field", SortOrder.ASC)
                .get()
        );
        assertEquals(
            exception.getCause().getMessage(),
            "Can't do sort across indices, as a field has [unsigned_long] type in one index, and different type in another index!"
        );
    }
}
|
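The sort expectations in the test above hinge on how Java carries unsigned 64-bit values inside a signed `long`. This standalone sketch (not part of the commit; plain `java.lang` APIs only) shows the bit-level representation and why unsigned comparison restores the expected order:

```java
public class UnsignedLongSketch {
    public static void main(String[] args) {
        // 18446744073709551615 (2^64 - 1) does not fit in a signed long;
        // Long.parseUnsignedLong stores it in the same 64 bits as -1L.
        long max = Long.parseUnsignedLong("18446744073709551615");
        System.out.println(max);                        // prints -1 as a signed long
        System.out.println(Long.toUnsignedString(max)); // prints 18446744073709551615

        // A signed comparison would order 2^63 before 0; compareUnsigned
        // gives the ordering the sort tests expect.
        long half = Long.parseUnsignedLong("9223372036854775808"); // 2^63
        System.out.println(Long.compareUnsigned(0L, half) < 0);    // true
    }
}
```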
@ -0,0 +1,245 @@
setup:

  - skip:
      version: " - 7.9.99"
      reason: "unsigned_long was added in 7.10"

  - do:
      indices.create:
        index: test1
        body:
          mappings:
            properties:
              ul:
                type: unsigned_long

  - do:
      bulk:
        index: test1
        refresh: true
        body: |
          { "index": {"_id" : "1"} }
          { "ul": 0 }
          { "index": {"_id" : "2"} }
          { "ul": 9223372036854775807 }
          { "index": {"_id" : "3"} }
          { "ul": 9223372036854775808 }
          { "index": {"_id" : "4"} }
          { "ul": 18446744073709551614 }
          { "index": {"_id" : "5"} }
          { "ul": 18446744073709551615 }

---
"Exist query":

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            exists:
              field: ul

  - match: { "hits.total.value": 5 }

---
"Term query":

  - do:
      search:
        index: test1
        body:
          query:
            term:
              ul: 0
  - match: { "hits.total.value": 1 }
  - match: { hits.hits.0._id: "1" }

  - do:
      search:
        index: test1
        body:
          query:
            term:
              ul: 18446744073709551615
  - match: { "hits.total.value": 1 }
  - match: { hits.hits.0._id: "5" }

  - do:
      search:
        index: test1
        body:
          query:
            term:
              ul: 18446744073709551616
  - match: { "hits.total.value": 0 }

---
"Terms query":

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            terms:
              ul: [0, 9223372036854775808, 18446744073709551615]

  - match: { "hits.total.value": 3 }

---
"Range query":

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            range:
              ul:
                gte: 0
  - match: { "hits.total.value": 5 }

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            range:
              ul:
                gte: 0.5
  - match: { "hits.total.value": 4 }

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            range:
              ul:
                lte: 18446744073709551615
  - match: { "hits.total.value": 5 }

  - do:
      search:
        index: test1
        body:
          query:
            range:
              ul:
                lte: "18446744073709551614.5" # this must be string, as number gets converted to double with loss of precision
  - match: { "hits.total.value": 4 }

---
"Sort":

  - do:
      search:
        index: test1
        body:
          sort: [ { ul: asc } ]

  - match: { "hits.total.value": 5 }
  - match: { hits.hits.0._id: "1" }
  - match: { hits.hits.0.sort: [0] }
  - match: { hits.hits.1._id: "2" }
  - match: { hits.hits.1.sort: [9223372036854775807] }
  - match: { hits.hits.2._id: "3" }
  - match: { hits.hits.2.sort: [9223372036854775808] }
  - match: { hits.hits.3._id: "4" }
  - match: { hits.hits.3.sort: [18446744073709551614] }
  - match: { hits.hits.4._id: "5" }
  - match: { hits.hits.4.sort: [18446744073709551615] }

  - do:
      search:
        index: test1
        body:
          sort: [ { ul: asc } ]
          search_after: [9223372036854775808]

  - length: { hits.hits: 2 }
  - match: { hits.hits.0._id: "4" }
  - match: { hits.hits.0.sort: [18446744073709551614] }
  - match: { hits.hits.1._id: "5" }
  - match: { hits.hits.1.sort: [18446744073709551615] }

  - do:
      search:
        index: test1
        body:
          sort: [ { ul: desc } ]

  - match: { "hits.total.value": 5 }
  - match: { hits.hits.0._id: "5" }
  - match: { hits.hits.0.sort: [18446744073709551615] }
  - match: { hits.hits.1._id: "4" }
  - match: { hits.hits.1.sort: [18446744073709551614] }
  - match: { hits.hits.2._id: "3" }
  - match: { hits.hits.2.sort: [9223372036854775808] }
  - match: { hits.hits.3._id: "2" }
  - match: { hits.hits.3.sort: [9223372036854775807] }
  - match: { hits.hits.4._id: "1" }
  - match: { hits.hits.4.sort: [0] }

---
"Aggs":

  - do:
      search:
        index: test1
        body:
          size: 0
          aggs:
            ul_terms:
              terms:
                field: ul
  - length: { aggregations.ul_terms.buckets: 5 }
  - match: { aggregations.ul_terms.buckets.0.key: 0 }
  - match: { aggregations.ul_terms.buckets.1.key: 9223372036854775807 }
  - match: { aggregations.ul_terms.buckets.2.key: 9223372036854775808 }
  - match: { aggregations.ul_terms.buckets.3.key: 18446744073709551614 }
  - match: { aggregations.ul_terms.buckets.4.key: 18446744073709551615 }

  - do:
      search:
        index: test1
        body:
          size: 0
          aggs:
            ul_histogram:
              histogram:
                field: ul
                interval: 9223372036854775807
  - length: { aggregations.ul_histogram.buckets: 3 }
  - match: { aggregations.ul_histogram.buckets.0.key: 0.0 }
  - match: { aggregations.ul_histogram.buckets.0.doc_count: 1 }
  - match: { aggregations.ul_histogram.buckets.1.key: 9.223372036854776E18 }
  - match: { aggregations.ul_histogram.buckets.1.doc_count: 2 }
  - match: { aggregations.ul_histogram.buckets.2.key: 1.8446744073709552E19 }
  - match: { aggregations.ul_histogram.buckets.2.doc_count: 2 }

  - do:
      search:
        index: test1
        body:
          size: 0
          aggs:
            ul_range:
              range:
                field: ul
                ranges: [
                  { "from": null, "to": 9223372036854775807 },
                  { "from": 9223372036854775807, "to": 18446744073709551614 },
                  { "from": 18446744073709551614 }
                ]
  - length: { aggregations.ul_range.buckets: 3 }
  - match: { aggregations.ul_range.buckets.0.doc_count: 1 }
  - match: { aggregations.ul_range.buckets.1.doc_count: 2 }
  - match: { aggregations.ul_range.buckets.2.doc_count: 2 }
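The range test passes `"18446744073709551614.5"` as a string because a numeric JSON value would be parsed as a `double` first and lose precision. A standalone sketch (not from the commit) of that loss near 2^64:

```java
import java.math.BigInteger;

public class PrecisionSketch {
    public static void main(String[] args) {
        // A double has a 53-bit mantissa, so near 2^64 adjacent representable
        // doubles are 4096 apart: these two distinct unsigned values both
        // round to the same double, 2^64 = 1.8446744073709552E19.
        double a = new BigInteger("18446744073709551614").doubleValue();
        double b = new BigInteger("18446744073709551615").doubleValue();
        System.out.println(a == b); // true: the bound 18446744073709551614.5
                                    // would be indistinguishable as a double
    }
}
```

The same rounding is why the histogram and terms aggregation keys above appear as doubles like `1.8446744073709552E19`.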
@ -0,0 +1,80 @@
---
"Null value":
  - skip:
      version: " - 7.9.99"
      reason: "unsigned_long was added in 7.10"

  - do:
      indices.create:
        index: test1
        body:
          mappings:
            properties:
              ul:
                type: unsigned_long
                null_value: 17446744073709551615

  - do:
      bulk:
        index: test1
        refresh: true
        body: |
          { "index": {"_id" : "1"} }
          { "ul": 0 }
          { "index": {"_id" : "2_null"} }
          { "ul": null }
          { "index": {"_id" : "3_null"} }
          { "ul": "" }
          { "index": {"_id" : "4"} }
          { "ul": 18446744073709551614 }
          { "index": {"_id" : "5_missing"} }
          {}

  # term query
  - do:
      search:
        index: test1
        body:
          query:
            term:
              ul: 17446744073709551615
  - match: { "hits.total.value": 2 }
  - match: { hits.hits.0._id: "2_null" }
  - match: { hits.hits.1._id: "3_null" }

  # asc sort
  - do:
      search:
        index: test1
        body:
          sort: { ul: { order: asc, missing: "_last" } }
  - match: { "hits.total.value": 5 }
  - match: { hits.hits.0._id: "1" }
  - match: { hits.hits.0.sort: [0] }
  - match: { hits.hits.1._id: "2_null" }
  - match: { hits.hits.1.sort: [17446744073709551615] }
  - match: { hits.hits.2._id: "3_null" }
  - match: { hits.hits.2.sort: [17446744073709551615] }
  - match: { hits.hits.3._id: "4" }
  - match: { hits.hits.3.sort: [18446744073709551614] }
  - match: { hits.hits.4._id: "5_missing" }
  - match: { hits.hits.4.sort: [18446744073709551615] }

  # desc sort
  - do:
      search:
        index: test1
        body:
          sort: { ul: { order: desc, missing: "_first" } }
  - match: { "hits.total.value": 5 }
  - match: { hits.hits.0._id: "5_missing" }
  - match: { hits.hits.0.sort: [18446744073709551615] }
  - match: { hits.hits.1._id: "4" }
  - match: { hits.hits.1.sort: [18446744073709551614] }
  - match: { hits.hits.2._id: "2_null" }
  - match: { hits.hits.2.sort: [17446744073709551615] }
  - match: { hits.hits.3._id: "3_null" }
  - match: { hits.hits.3.sort: [17446744073709551615] }
  - match: { hits.hits.4._id: "1" }
  - match: { hits.hits.4.sort: [0] }
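The test above expects both an explicit `null` and an empty string to pick up the configured `null_value`, while a field absent from the source stays missing. A hypothetical helper (illustration only; the real parsing lives in the unsigned_long field mapper, which is not shown in this hunk) mirroring that substitution:

```java
import java.math.BigInteger;

public class NullValueSketch {
    // Assumed stand-in for the mapping's null_value parameter above.
    static final BigInteger NULL_VALUE = new BigInteger("17446744073709551615");

    // null and "" fall back to the configured null_value; anything else
    // is parsed as an unsigned integer string.
    static BigInteger parseOrNullValue(Object source) {
        if (source == null || "".equals(source)) {
            return NULL_VALUE;
        }
        return new BigInteger(source.toString());
    }

    public static void main(String[] args) {
        System.out.println(parseOrNullValue(null)); // 17446744073709551615
        System.out.println(parseOrNullValue(""));   // 17446744073709551615
        System.out.println(parseOrNullValue("0"));  // 0
    }
}
```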
@ -0,0 +1,72 @@
---
"Multi keyword and unsigned_long fields":
  - skip:
      version: " - 7.9.99"
      reason: "unsigned_long was added in 7.10"

  - do:
      indices.create:
        index: test1
        body:
          mappings:
            properties:
              counter:
                type: keyword
                fields:
                  ul:
                    type: unsigned_long

  - do:
      bulk:
        index: test1
        refresh: true
        body: |
          { "index": {"_id" : "1"} }
          { "counter": 0 }
          { "index": {"_id" : "2"} }
          { "counter": 9223372036854775808 }
          { "index": {"_id" : "3"} }
          { "counter": "9223372036854775808" }
          { "index": {"_id" : "4"} }
          { "counter": 18446744073709551614 }
          { "index": {"_id" : "5"} }
          { "counter": 18446744073709551615 }

  # term query
  - do:
      search:
        index: test1
        body:
          query:
            term:
              counter.ul: 9223372036854775808
  - match: { "hits.total.value": 2 }
  - match: { hits.hits.0._id: "2" }
  - match: { hits.hits.1._id: "3" }

  # asc sort by keyword
  - do:
      search:
        index: test1
        body:
          sort: { counter: { order: asc } }
  - match: { "hits.total.value": 5 }
  - match: { hits.hits.0._id: "1" }
  - match: { hits.hits.1._id: "4" }
  - match: { hits.hits.2._id: "5" }
  - match: { hits.hits.3._id: "2" }
  - match: { hits.hits.4._id: "3" }

  # asc sort by unsigned long
  - do:
      search:
        index: test1
        body:
          sort: { counter.ul: { order: asc } }
  - match: { "hits.total.value": 5 }
  - match: { hits.hits.0._id: "1" }
  - match: { hits.hits.1._id: "2" }
  - match: { hits.hits.2._id: "3" }
  - match: { hits.hits.3._id: "4" }
  - match: { hits.hits.4._id: "5" }
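The multi-field test sorts the same values two ways: the `keyword` sub-field orders documents 1, 4, 5, 2, 3 (lexicographically, "18..." sorts before "92..."), while the `unsigned_long` sub-field orders them 1 through 5 numerically. A standalone sketch (not from the commit) reproducing both orderings:

```java
import java.util.Arrays;

public class OrderingSketch {
    public static void main(String[] args) {
        String[] counters = { "0", "9223372036854775808", "18446744073709551614", "18446744073709551615" };

        // keyword-style: plain lexicographic string order
        String[] asKeyword = counters.clone();
        Arrays.sort(asKeyword);
        System.out.println(Arrays.toString(asKeyword));
        // [0, 18446744073709551614, 18446744073709551615, 9223372036854775808]

        // unsigned_long-style: numeric order on the unsigned values
        String[] asUnsigned = counters.clone();
        Arrays.sort(asUnsigned, (x, y) -> Long.compareUnsigned(Long.parseUnsignedLong(x), Long.parseUnsignedLong(y)));
        System.out.println(Arrays.toString(asUnsigned));
        // [0, 9223372036854775808, 18446744073709551614, 18446744073709551615]
    }
}
```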
@ -0,0 +1,139 @@
setup:
  - skip:
      version: " - 7.9.99"
      reason: "unsigned_long was added in 7.10"

  - do:
      indices.create:
        index: test_longs
        body:
          mappings:
            properties:
              my_counter:
                type: long

  - do:
      indices.create:
        index: test_unsigned_longs
        body:
          mappings:
            properties:
              my_counter:
                type: unsigned_long

  - do:
      bulk:
        index: test_longs
        refresh: true
        body: |
          { "index": {"_id" : "1"} }
          { "my_counter": 0 }
          { "index": {"_id" : "2"} }
          { "my_counter": 1000000 }
          { "index": {"_id" : "3"} }
          { "my_counter": 9223372036854775807 }

  - do:
      bulk:
        index: test_unsigned_longs
        refresh: true
        body: |
          { "index": {"_id" : "1"} }
          { "my_counter": 0 }
          { "index": {"_id" : "2"} }
          { "my_counter": 1000000 }
          { "index": {"_id" : "3"} }
          { "my_counter": 9223372036854775807 }
          { "index": {"_id" : "4"} }
          { "my_counter": 18446744073709551615 }

---
"Querying of different numeric types is supported":
  - do:
      search:
        index: test*
        body:
          size: 0
          query:
            range:
              my_counter:
                gte: 0
  - match: { "hits.total.value": 7 }

  - do:
      search:
        index: test*
        body:
          size: 0
          query:
            range:
              my_counter:
                gt: 0
                lt: 9223372036854775807
  - match: { "hits.total.value": 2 }

  - do:
      search:
        index: test*
        body:
          size: 0
          query:
            range:
              my_counter:
                gte: 9223372036854775807
  - match: { "hits.total.value": 3 }

---
"Aggregation of different numeric types is supported":
  # sum agg
  - do:
      search:
        index: test*
        body:
          size: 0
          aggs:
            my_counter_sum:
              sum:
                field: my_counter
  - match: { aggregations.my_counter_sum.value: 3.68934881474211E19 }

  # histogram agg
  - do:
      search:
        index: test*
        body:
          size: 0
          aggs:
            my_counter_histo:
              histogram:
                field: my_counter
                interval: 9223372036854775807
  - length: { aggregations.my_counter_histo.buckets: 3 }
  - match: { aggregations.my_counter_histo.buckets.0.key: 0.0 }
  - match: { aggregations.my_counter_histo.buckets.0.doc_count: 4 }
  - match: { aggregations.my_counter_histo.buckets.1.key: 9.223372036854776E18 }
  - match: { aggregations.my_counter_histo.buckets.1.doc_count: 2 }
  - match: { aggregations.my_counter_histo.buckets.2.key: 1.8446744073709552E19 }
  - match: { aggregations.my_counter_histo.buckets.2.doc_count: 1 }

  # terms agg bucket values are converted to double
  - do:
      search:
        index: test*
        body:
          size: 0
          aggs:
            my_counter_terms:
              terms:
                field: my_counter
  - length: { aggregations.my_counter_terms.buckets: 4 }
  - match: { aggregations.my_counter_terms.buckets.0.key: 0.0 }
  - match: { aggregations.my_counter_terms.buckets.0.doc_count: 2 }
  - match: { aggregations.my_counter_terms.buckets.1.key: 1000000.0 }
  - match: { aggregations.my_counter_terms.buckets.1.doc_count: 2 }
  - match: { aggregations.my_counter_terms.buckets.2.key: 9.223372036854776E18 }
  - match: { aggregations.my_counter_terms.buckets.2.doc_count: 2 }
  - match: { aggregations.my_counter_terms.buckets.3.key: 1.8446744073709552E19 }
  - match: { aggregations.my_counter_terms.buckets.3.doc_count: 1 }
@ -0,0 +1,110 @@
setup:

  - skip:
      version: " - 7.9.99"
      reason: "unsigned_long was added in 7.10"

  - do:
      indices.create:
        index: test1
        body:
          mappings:
            properties:
              ul:
                type: unsigned_long

  - do:
      bulk:
        index: test1
        refresh: true
        body: |
          { "index": {"_id" : "1"} }
          { "ul": 0 }
          { "index": {"_id" : "2"} }
          { "ul": 9223372036854775807 }
          { "index": {"_id" : "3"} }
          { "ul": 9223372036854775808 }
          { "index": {"_id" : "4"} }
          { "ul": 18446744073709551614 }
          { "index": {"_id" : "5"} }
          { "ul": 18446744073709551615 }

---
"Scripted fields values return BigInteger or Long":
  - do:
      search:
        index: test1
        body:
          sort: [ { ul: desc } ]
          script_fields:
            scripted_ul:
              script:
                source: "doc['ul'].value"

  - match: { hits.hits.0.fields.scripted_ul.0: 18446744073709551615 }
  - match: { hits.hits.1.fields.scripted_ul.0: 18446744073709551614 }
  - match: { hits.hits.2.fields.scripted_ul.0: 9223372036854775808 }
  - match: { hits.hits.3.fields.scripted_ul.0: 9223372036854775807 }
  - match: { hits.hits.4.fields.scripted_ul.0: 0 }

---
"Scripted sort values":
  - do:
      search:
        index: test1
        body:
          sort:
            _script:
              order: desc
              type: number
              script:
                source: "doc['ul'].value"

  - match: { hits.hits.0.sort: [1.8446744073709552E19] }
  - match: { hits.hits.1.sort: [1.8446744073709552E19] }
  - match: { hits.hits.2.sort: [9.223372036854776E18] }
  - match: { hits.hits.3.sort: [9.223372036854776E18] }
  - match: { hits.hits.4.sort: [0.0] }

---
"Script query":
  - do:
      search:
        index: test1
        body:
          query:
            bool:
              filter:
                script:
                  script:
                    source: "doc['ul'].value.doubleValue() > 10E18"
  - match: { hits.total.value: 2 }
  - match: { hits.hits.0._id: "4" }
  - match: { hits.hits.1._id: "5" }

  - do:
      search:
        index: test1
        body:
          size: 0
          query:
            bool:
              filter:
                script:
                  script:
                    source: "doc['ul'].size() > 0"
  - match: { hits.total.value: 5 }

---
"script_score query":
  - do:
      search:
        index: test1
        body:
          query:
            script_score:
              query: { match_all: {} }
              script:
                source: "doc['ul'].value"

  - match: { hits.total.value: 5 }
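As the first test's name says, scripts see values that fit in a signed long as `Long` and larger ones as `BigInteger`. A hypothetical conversion helper (illustration only, not the field mapper's actual code) from the raw 64 stored bits to such a `Number`:

```java
import java.math.BigInteger;

public class ScriptValueSketch {
    // Unsigned doc values occupy a long's 64 bits; a negative raw value
    // means the unsigned value exceeds Long.MAX_VALUE and needs BigInteger.
    static Number toNumber(long rawBits) {
        return rawBits >= 0
            ? (Number) Long.valueOf(rawBits)
            : new BigInteger(Long.toUnsignedString(rawBits));
    }

    public static void main(String[] args) {
        System.out.println(toNumber(0L));  // 0 (a Long)
        System.out.println(toNumber(-1L)); // 18446744073709551615 (a BigInteger)
    }
}
```

The "Scripted sort values" expectations then follow from converting these numbers to doubles, which collapses neighboring large values as noted earlier.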