rename _shard -> _index and also rename classes and variables
closes #4584
This commit is contained in:
parent 611dd0a396
commit 9f54e9782d
@@ -2,7 +2,7 @@

== Text scoring in scripts

-Text features, such as term or document frequency for a specific term can be accessed in scripts (see <<modules-scripting, scripting documentation>> ) with the `_shard` variable. This can be useful if, for example, you want to implement your own scoring model using for example a script inside a <<query-dsl-function-score-query,function score query>>.
+Text features, such as term or document frequency for a specific term can be accessed in scripts (see <<modules-scripting, scripting documentation>> ) with the `_index` variable. This can be useful if, for example, you want to implement your own scoring model using for example a script inside a <<query-dsl-function-score-query,function score query>>.

Statistics over the document collection are computed *per shard*, not per
index.
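For illustration only (not part of this commit), a minimal sketch of such a scoring script; the field name `body` and the term `quick` are placeholders:

[source,mvel]
---------------------------------------------------------
// naive tf/df weighting for a single term; field and term names are placeholders
term = _index['body']['quick'];
score = 0;
if (term.df() > 0) {
    score = term.tf() * _index['body'].docCount() / term.df();
}
return score;
---------------------------------------------------------
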
@@ -35,15 +35,15 @@ depending on the shard the current document resides in.

[float]
=== Shard statistics:

-`_shard.numDocs()`::
+`_index.numDocs()`::

Number of documents in shard.

-`_shard.maxDoc()`::
+`_index.maxDoc()`::

Maximal document number in shard.

-`_shard.numDeletedDocs()`::
+`_index.numDeletedDocs()`::

Number of deleted documents in shard.
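For illustration only (not part of this commit), a short sketch combining these shard statistics, for example to estimate the fraction of deleted documents on the current shard (`maxDoc()` also counts deleted documents):

[source,mvel]
---------------------------------------------------------
// shard-level statistics; results can differ from shard to shard
if (_index.maxDoc() == 0) { return 0; }
return 1.0 * _index.numDeletedDocs() / _index.maxDoc();
---------------------------------------------------------
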
@@ -52,18 +52,18 @@ depending on the shard the current document resides in.
=== Field statistics:

Field statistics can be accessed with a subscript operator like this:
-`_shard['FIELD']`.
+`_index['FIELD']`.


-`_shard['FIELD'].docCount()`::
+`_index['FIELD'].docCount()`::

Number of documents containing the field `FIELD`. Does not take deleted documents into account.

-`_shard['FIELD'].sumttf()`::
+`_index['FIELD'].sumttf()`::

Sum of `ttf` over all terms that appear in field `FIELD` in all documents.

-`_shard['FIELD'].sumdf()`::
+`_index['FIELD'].sumdf()`::

The sum of `df` s over all terms that appear in field `FIELD` in all
documents.
@@ -71,30 +71,30 @@ Field statistics can be accessed with a subscript operator like this:

Field statistics are computed per shard and therefore these numbers can vary
depending on the shard the current document resides in.
-The number of terms in a field cannot be accessed using the `_shard` variable. See <<mapping-core-types, word count mapping type>> on how to do that.
+The number of terms in a field cannot be accessed using the `_index` variable. See <<mapping-core-types, word count mapping type>> on how to do that.
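For illustration only (not part of this commit), a sketch that uses these field statistics to compute the average number of term occurrences per document containing the field; the field name `body` is a placeholder:

[source,mvel]
---------------------------------------------------------
// average total term frequency per document containing the field (per shard)
field = _index['body'];
if (field.docCount() == 0) { return 0; }
return 1.0 * field.sumttf() / field.docCount();
---------------------------------------------------------
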

[float]
=== Term statistics:

Term statistics for a field can be accessed with a subscript operator like
-this: `_shard['FIELD']['TERM']`. This will never return null, even if term or field does not exist.
-If you do not need the term frequency, call `_shard['FIELD'].get('TERM', 0)`
+this: `_index['FIELD']['TERM']`. This will never return null, even if term or field does not exist.
+If you do not need the term frequency, call `_index['FIELD'].get('TERM', 0)`
to avoid unnecessary initialization of the frequencies. The flag will only take
effect if you set the `index_options` to `docs` (see <<mapping-core-types, mapping documentation>>).


-`_shard['FIELD']['TERM'].df()`::
+`_index['FIELD']['TERM'].df()`::

`df` of term `TERM` in field `FIELD`. Will be returned, even if the term
is not present in the current document.

-`_shard['FIELD']['TERM'].ttf()`::
+`_index['FIELD']['TERM'].ttf()`::

The sum of term frequencies of term `TERM` in field `FIELD` over all
documents. Will be returned, even if the term is not present in the
current document.

-`_shard['FIELD']['TERM'].tf()`::
+`_index['FIELD']['TERM'].tf()`::

`tf` of term `TERM` in field `FIELD`. Will be 0 if the term is not present
in the current document.
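For illustration only (not part of this commit), a sketch that reads document frequency without initializing term frequencies, following the `get('TERM', 0)` advice above; field and term names are placeholders:

[source,mvel]
---------------------------------------------------------
// flag 0: do not initialize frequencies; df and ttf remain available
term = _index['body'].get('quick', 0);
return term.df();
---------------------------------------------------------
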
@@ -104,7 +104,7 @@ effect if you set the `index_options` to `docs` (see <<mapping-core-types, mapp
=== Term positions, offsets and payloads:

If you need information on the positions of terms in a field, call
-`_shard['FIELD'].get('TERM', flag)` where flag can be
+`_index['FIELD'].get('TERM', flag)` where flag can be

[horizontal]
`_POSITIONS`:: if you need the positions of the term
@@ -119,7 +119,7 @@ example, the following will return an object holding the positions and payloads,
as well as all statistics:

-`_shard['FIELD'].get('TERM', _POSITIONS | _PAYLOADS)`
+`_index['FIELD'].get('TERM', _POSITIONS | _PAYLOADS)`

Positions can be accessed with an iterator that returns an object
@@ -164,7 +164,7 @@ Example: sums up all payloads for the term `foo`.

[source,mvel]
---------------------------------------------------------
-termInfo = _shard['my_field'].get('foo',_PAYLOADS);
+termInfo = _index['my_field'].get('foo',_PAYLOADS);
score = 0;
for (pos : termInfo) {
    score = score + pos.payloadAsInt(0);
@@ -176,8 +176,8 @@ return score;
[float]
=== Term vectors:

-The `_shard` variable can only be used to gather statistics for single terms. If you want to use information on all terms in a field, you must store the term vectors (set `term_vector` in the mapping as described in the <<mapping-core-types,mapping documentation>>). To access them, call
-`_shard.getTermVectors()` to get a
+The `_index` variable can only be used to gather statistics for single terms. If you want to use information on all terms in a field, you must store the term vectors (set `term_vector` in the mapping as described in the <<mapping-core-types,mapping documentation>>). To access them, call
+`_index.getTermVectors()` to get a
https://lucene.apache.org/core/4_0_0/core/org/apache/lucene/index/Fields.html[Fields]
instance. This object can then be used as described in https://lucene.apache.org/core/4_0_0/core/org/apache/lucene/index/Fields.html[lucene doc] to iterate over fields and then for each field iterate over each term in the field.
The method will return null if the term vectors were not stored.
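For illustration only (not part of this commit), a sketch that guards against missing term vectors before using them; the field name `body` is a placeholder, and `terms(...)`/`size()` are assumed from the Lucene 4 `Fields`/`Terms` API:

[source,mvel]
---------------------------------------------------------
// getTermVectors() returns null if no term vectors were stored for this document
fields = _index.getTermVectors();
if (fields == null) { return 0; }
terms = fields.terms('body');
if (terms == null) { return 0; }
// number of unique terms in the field's term vector (may be -1 if unknown)
return terms.size();
---------------------------------------------------------
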
@@ -19,7 +19,7 @@

package org.elasticsearch.script;

-import org.elasticsearch.search.lookup.ShardTermsLookup;
+import org.elasticsearch.search.lookup.IndexLookup;

import org.apache.lucene.index.AtomicReaderContext;
import org.apache.lucene.search.Scorer;
@@ -93,8 +93,8 @@ public abstract class AbstractSearchScript extends AbstractExecutableScript impl
    /**
     * Allows to access statistics on terms and fields.
     */
-    protected final ShardTermsLookup shardTerms() {
-        return lookup.shardTerms();
+    protected final IndexLookup indexLookup() {
+        return lookup.indexLookup();
    }

    /**
@@ -30,8 +30,8 @@ import java.util.Iterator;
 * */
public class CachedPositionIterator extends PositionIterator {

-    public CachedPositionIterator(ScriptTerm termInfo) {
-        super(termInfo);
+    public CachedPositionIterator(IndexFieldTerm indexFieldTerm) {
+        super(indexFieldTerm);
    }

    // all payloads of the term in the current document in one bytes array.
@@ -30,14 +30,14 @@ import java.util.Map;
/**
 * Script interface to all information regarding a field.
 * */
-public class ScriptTerms extends MinimalMap<String, ScriptTerm> {
+public class IndexField extends MinimalMap<String, IndexFieldTerm> {

    /*
     * TermsInfo Objects that represent the Terms are stored in this map when
     * requested. Information such as frequency, doc frequency and positions
     * information can be retrieved from the TermInfo objects in this map.
     */
-    private final Map<String, ScriptTerm> terms = new HashMap<String, ScriptTerm>();
+    private final Map<String, IndexFieldTerm> terms = new HashMap<String, IndexFieldTerm>();

    // the name of this field
    private final String fieldName;
@@ -46,7 +46,7 @@ public class ScriptTerms extends MinimalMap<String, ScriptTerm> {
     * This holds the current reader. We need it to populate the field
     * statistics. We just delegate all requests there
     */
-    private ShardTermsLookup shardTermsLookup;
+    private IndexLookup indexLookup;

    /*
     * General field statistics such as number of documents containing the
@@ -58,7 +58,7 @@ public class ScriptTerms extends MinimalMap<String, ScriptTerm> {
     * Update posting lists in all TermInfo objects
     */
    void setReader(AtomicReader reader) {
-        for (ScriptTerm ti : terms.values()) {
+        for (IndexFieldTerm ti : terms.values()) {
            ti.setNextReader(reader);
        }
    }
@@ -68,15 +68,15 @@ public class ScriptTerms extends MinimalMap<String, ScriptTerm> {
     * statistics of this field. Information on specific terms in this field can
     * be accessed by calling get(String term).
     */
-    public ScriptTerms(String fieldName, ShardTermsLookup shardTermsLookup) throws IOException {
+    public IndexField(String fieldName, IndexLookup indexLookup) throws IOException {

        assert fieldName != null;
        this.fieldName = fieldName;

-        assert shardTermsLookup != null;
-        this.shardTermsLookup = shardTermsLookup;
+        assert indexLookup != null;
+        this.indexLookup = indexLookup;

-        fieldStats = shardTermsLookup.getIndexSearcher().collectionStatistics(fieldName);
+        fieldStats = this.indexLookup.getIndexSearcher().collectionStatistics(fieldName);
    }

    /* get number of documents containing the field */
@@ -107,29 +107,29 @@ public class ScriptTerms extends MinimalMap<String, ScriptTerm> {
     * advance which terms are requested, we could provide an array which the
     * user could then iterate over.
     */
-    public ScriptTerm get(Object key, int flags) {
+    public IndexFieldTerm get(Object key, int flags) {
        String termString = (String) key;
-        ScriptTerm termInfo = terms.get(termString);
+        IndexFieldTerm indexFieldTerm = terms.get(termString);
        // see if we initialized already...
-        if (termInfo == null) {
-            termInfo = new ScriptTerm(termString, fieldName, shardTermsLookup, flags);
-            terms.put(termString, termInfo);
+        if (indexFieldTerm == null) {
+            indexFieldTerm = new IndexFieldTerm(termString, fieldName, indexLookup, flags);
+            terms.put(termString, indexFieldTerm);
        }
-        termInfo.validateFlags(flags);
-        return termInfo;
+        indexFieldTerm.validateFlags(flags);
+        return indexFieldTerm;
    }

    /*
     * Returns a TermInfo object that can be used to access information on
     * specific terms. flags can be set as described in TermInfo.
     */
-    public ScriptTerm get(Object key) {
+    public IndexFieldTerm get(Object key) {
        // per default, do not initialize any positions info
-        return get(key, ShardTermsLookup.FLAG_FREQUENCIES);
+        return get(key, IndexLookup.FLAG_FREQUENCIES);
    }

    public void setDocIdInTerms(int docId) {
-        for (ScriptTerm ti : terms.values()) {
+        for (IndexFieldTerm ti : terms.values()) {
            ti.setNextDoc(docId);
        }
    }
@@ -30,7 +30,7 @@ import java.util.Iterator;
/**
 * Holds all information on a particular term in a field.
 * */
-public class ScriptTerm implements Iterable<TermPosition> {
+public class IndexFieldTerm implements Iterable<TermPosition> {

    // The posting list for this term. Is null if the term or field does not
    // exist. Can be DocsEnum or DocsAndPositionsEnum.
@@ -90,16 +90,16 @@ public class ScriptTerm implements Iterable<TermPosition> {
    }

    private boolean shouldRetrieveFrequenciesOnly() {
-        return (flags & ~ShardTermsLookup.FLAG_FREQUENCIES) == 0;
+        return (flags & ~IndexLookup.FLAG_FREQUENCIES) == 0;
    }

    private int getLuceneFrequencyFlag(int flags) {
-        return (flags & ShardTermsLookup.FLAG_FREQUENCIES) > 0 ? DocsEnum.FLAG_FREQS : DocsEnum.FLAG_NONE;
+        return (flags & IndexLookup.FLAG_FREQUENCIES) > 0 ? DocsEnum.FLAG_FREQS : DocsEnum.FLAG_NONE;
    }

    private int getLucenePositionsFlags(int flags) {
-        int lucenePositionsFlags = (flags & ShardTermsLookup.FLAG_PAYLOADS) > 0 ? DocsAndPositionsEnum.FLAG_PAYLOADS : 0x0;
-        lucenePositionsFlags |= (flags & ShardTermsLookup.FLAG_OFFSETS) > 0 ? DocsAndPositionsEnum.FLAG_OFFSETS : 0x0;
+        int lucenePositionsFlags = (flags & IndexLookup.FLAG_PAYLOADS) > 0 ? DocsAndPositionsEnum.FLAG_PAYLOADS : 0x0;
+        lucenePositionsFlags |= (flags & IndexLookup.FLAG_OFFSETS) > 0 ? DocsAndPositionsEnum.FLAG_OFFSETS : 0x0;
        return lucenePositionsFlags;
    }

@@ -162,19 +162,19 @@ public class ScriptTerm implements Iterable<TermPosition> {
            }
            iterator.nextDoc();
        } catch (IOException e) {
-            throw new ElasticSearchException("While trying to initialize term positions in ScriptTerm.setNextDoc() ", e);
+            throw new ElasticSearchException("While trying to initialize term positions in IndexFieldTerm.setNextDoc() ", e);
        }
    }

-    public ScriptTerm(String term, String fieldName, ShardTermsLookup shardTermsLookup, int flags) {
+    public IndexFieldTerm(String term, String fieldName, IndexLookup indexLookup, int flags) {
        assert fieldName != null;
        this.fieldName = fieldName;
        assert term != null;
        this.term = term;
-        assert shardTermsLookup != null;
+        assert indexLookup != null;
        identifier = new Term(fieldName, (String) term);
        this.flags = flags;
-        boolean doRecord = ((flags & ShardTermsLookup.FLAG_CACHE) > 0);
+        boolean doRecord = ((flags & IndexLookup.FLAG_CACHE) > 0);
        if (withPositions()) {
            if (!doRecord) {
                iterator = new PositionIterator(this);
@@ -184,11 +184,11 @@ public class ScriptTerm implements Iterable<TermPosition> {
        } else {
            iterator = new PositionIterator(this);
        }
-        setNextReader(shardTermsLookup.getReader());
-        setNextDoc(shardTermsLookup.getDocId());
+        setNextReader(indexLookup.getReader());
+        setNextDoc(indexLookup.getDocId());
        try {
-            termStats = shardTermsLookup.getIndexSearcher().termStatistics(identifier,
-                    TermContext.build(shardTermsLookup.getReaderContext(), identifier));
+            termStats = indexLookup.getIndexSearcher().termStatistics(identifier,
+                    TermContext.build(indexLookup.getReaderContext(), identifier));
        } catch (IOException e) {
            throw new ElasticSearchException("Cannot get term statistics: ", e);
        }
@@ -199,15 +199,15 @@ public class ScriptTerm implements Iterable<TermPosition> {
    }

    protected boolean shouldRetrievePositions() {
-        return (flags & ShardTermsLookup.FLAG_POSITIONS) > 0;
+        return (flags & IndexLookup.FLAG_POSITIONS) > 0;
    }

    protected boolean shouldRetrieveOffsets() {
-        return (flags & ShardTermsLookup.FLAG_OFFSETS) > 0;
+        return (flags & IndexLookup.FLAG_OFFSETS) > 0;
    }

    protected boolean shouldRetrievePayloads() {
-        return (flags & ShardTermsLookup.FLAG_PAYLOADS) > 0;
+        return (flags & IndexLookup.FLAG_PAYLOADS) > 0;
    }

    public int tf() throws IOException {
@@ -241,24 +241,24 @@ public class ScriptTerm implements Iterable<TermPosition> {
    }

    private String getCallStatement(String calledFlags) {
-        return "_shard['" + this.fieldName + "'].get('" + this.term + "', " + calledFlags + ")";
+        return "_index['" + this.fieldName + "'].get('" + this.term + "', " + calledFlags + ")";
    }

    private String getFlagsString(int flags2) {
        String flagsString = null;
-        if ((flags2 & ShardTermsLookup.FLAG_FREQUENCIES) != 0) {
+        if ((flags2 & IndexLookup.FLAG_FREQUENCIES) != 0) {
            flagsString = anddToFlagsString(flagsString, "_FREQUENCIES");
        }
-        if ((flags2 & ShardTermsLookup.FLAG_POSITIONS) != 0) {
+        if ((flags2 & IndexLookup.FLAG_POSITIONS) != 0) {
            flagsString = anddToFlagsString(flagsString, "_POSITIONS");
        }
-        if ((flags2 & ShardTermsLookup.FLAG_OFFSETS) != 0) {
+        if ((flags2 & IndexLookup.FLAG_OFFSETS) != 0) {
            flagsString = anddToFlagsString(flagsString, "_OFFSETS");
        }
-        if ((flags2 & ShardTermsLookup.FLAG_PAYLOADS) != 0) {
+        if ((flags2 & IndexLookup.FLAG_PAYLOADS) != 0) {
            flagsString = anddToFlagsString(flagsString, "_PAYLOADS");
        }
-        if ((flags2 & ShardTermsLookup.FLAG_CACHE) != 0) {
+        if ((flags2 & IndexLookup.FLAG_CACHE) != 0) {
            flagsString = anddToFlagsString(flagsString, "_CACHE");
        }
        return flagsString;
@@ -29,36 +29,36 @@ import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

-public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
+public class IndexLookup extends MinimalMap<String, IndexField> {

    /**
-     * Flag to pass to {@link ScriptTerms#get(String, flags)} if you require
-     * offsets in the returned {@link ScriptTerm}.
+     * Flag to pass to {@link IndexField#get(String, flags)} if you require
+     * offsets in the returned {@link IndexFieldTerm}.
     */
    public static final int FLAG_OFFSETS = 2;

    /**
-     * Flag to pass to {@link ScriptTerms#get(String, flags)} if you require
-     * payloads in the returned {@link ScriptTerm}.
+     * Flag to pass to {@link IndexField#get(String, flags)} if you require
+     * payloads in the returned {@link IndexFieldTerm}.
     */
    public static final int FLAG_PAYLOADS = 4;

    /**
-     * Flag to pass to {@link ScriptTerms#get(String, flags)} if you require
-     * frequencies in the returned {@link ScriptTerm}. Frequencies might be
+     * Flag to pass to {@link IndexField#get(String, flags)} if you require
+     * frequencies in the returned {@link IndexFieldTerm}. Frequencies might be
     * returned anyway for some lucene codecs even if this flag is not set.
     */
    public static final int FLAG_FREQUENCIES = 8;

    /**
-     * Flag to pass to {@link ScriptTerms#get(String, flags)} if you require
-     * positions in the returned {@link ScriptTerm}.
+     * Flag to pass to {@link IndexField#get(String, flags)} if you require
+     * positions in the returned {@link IndexFieldTerm}.
     */
    public static final int FLAG_POSITIONS = 16;

    /**
-     * Flag to pass to {@link ScriptTerms#get(String, flags)} if you require
-     * positions in the returned {@link ScriptTerm}.
+     * Flag to pass to {@link IndexField#get(String, flags)} if you require
+     * positions in the returned {@link IndexFieldTerm}.
     */
    public static final int FLAG_CACHE = 32;

@@ -82,7 +82,7 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
    // stores the objects that are used in the script. we maintain this map
    // because we do not want to re-initialize the objects each time a field is
    // accessed
-    private final Map<String, ScriptTerms> scriptTermsPerField = new HashMap<String, ScriptTerms>();
+    private final Map<String, IndexField> indexFields = new HashMap<String, IndexField>();

    // number of documents per shard. cached here because the computation is
    // expensive
@@ -116,12 +116,12 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
        return numDeletedDocs;
    }

-    public ShardTermsLookup(Builder<String, Object> builder) {
-        builder.put("_FREQUENCIES", ShardTermsLookup.FLAG_FREQUENCIES);
-        builder.put("_POSITIONS", ShardTermsLookup.FLAG_POSITIONS);
-        builder.put("_OFFSETS", ShardTermsLookup.FLAG_OFFSETS);
-        builder.put("_PAYLOADS", ShardTermsLookup.FLAG_PAYLOADS);
-        builder.put("_CACHE", ShardTermsLookup.FLAG_CACHE);
+    public IndexLookup(Builder<String, Object> builder) {
+        builder.put("_FREQUENCIES", IndexLookup.FLAG_FREQUENCIES);
+        builder.put("_POSITIONS", IndexLookup.FLAG_POSITIONS);
+        builder.put("_OFFSETS", IndexLookup.FLAG_OFFSETS);
+        builder.put("_PAYLOADS", IndexLookup.FLAG_PAYLOADS);
+        builder.put("_CACHE", IndexLookup.FLAG_CACHE);
    }

    public void setNextReader(AtomicReaderContext context) {
@@ -138,7 +138,7 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
            indexReaderContext = context.parent;
        } else {
            // parent reader may only be set once. TODO we could also call
-            // scriptFields.clear() here instead of assertion just to be on
+            // indexFields.clear() here instead of assertion just to be on
            // the safe side
            assert (parentReader == context.parent.reader());
        }
@@ -151,7 +151,7 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
    }

    protected void setReaderInFields() {
-        for (ScriptTerms stat : scriptTermsPerField.values()) {
+        for (IndexField stat : indexFields.values()) {
            stat.setReader(reader);
        }
    }
@@ -163,7 +163,7 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
        }
        // We assume that docs are processed in ascending order of id. If this
        // is not the case, we would have to re initialize all posting lists in
-        // ScriptTerm. TODO: Instead of assert we could also call
+        // IndexFieldTerm. TODO: Instead of assert we could also call
        // setReaderInFields(); here?
        if (this.docId > docId) {
            // This might happen if the same SearchLookup is used in different
@@ -171,15 +171,15 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
            // In this case we do not want to re initialize posting list etc.
            // because we do not even know if term and field statistics will be
            // needed in this new phase.
-            // Therefore we just remove all ScriptFields.
-            scriptTermsPerField.clear();
+            // Therefore we just remove all IndexFieldTerms.
+            indexFields.clear();
        }
        this.docId = docId;
        setNextDocIdInFields();
    }

    protected void setNextDocIdInFields() {
-        for (ScriptTerms stat : scriptTermsPerField.values()) {
+        for (IndexField stat : indexFields.values()) {
            stat.setDocIdInTerms(this.docId);
        }
    }
@@ -190,18 +190,18 @@ public class ShardTermsLookup extends MinimalMap<String, ScriptTerms> {
     * user could then iterate over.
     */
    @Override
-    public ScriptTerms get(Object key) {
+    public IndexField get(Object key) {
        String stringField = (String) key;
-        ScriptTerms scriptField = scriptTermsPerField.get(key);
-        if (scriptField == null) {
+        IndexField indexField = indexFields.get(key);
+        if (indexField == null) {
            try {
-                scriptField = new ScriptTerms(stringField, this);
-                scriptTermsPerField.put(stringField, scriptField);
+                indexField = new IndexField(stringField, this);
+                indexFields.put(stringField, indexField);
            } catch (IOException e) {
                throw new ElasticSearchException(e.getMessage());
            }
        }
-        return scriptField;
+        return indexField;
    }

    /*
@@ -32,7 +32,7 @@ public class PositionIterator implements Iterator<TermPosition> {

    private boolean resetted = false;

-    protected ScriptTerm scriptTerm;
+    protected IndexFieldTerm indexFieldTerm;

    protected int freq = -1;

@@ -43,13 +43,13 @@ public class PositionIterator implements Iterator<TermPosition> {

    private DocsAndPositionsEnum docsAndPos;

-    public PositionIterator(ScriptTerm termInfo) {
-        this.scriptTerm = termInfo;
+    public PositionIterator(IndexFieldTerm indexFieldTerm) {
+        this.indexFieldTerm = indexFieldTerm;
    }

    @Override
    public void remove() {
-        throw new UnsupportedOperationException("Cannot remove anything from TermPositions iterator.");
+        throw new UnsupportedOperationException("Cannot remove anything from TermPosition iterator.");
    }

    @Override
@@ -75,9 +75,9 @@ public class PositionIterator implements Iterator<TermPosition> {
    public void nextDoc() throws IOException {
        resetted = false;
        currentPos = 0;
-        freq = scriptTerm.tf();
-        if (scriptTerm.docsEnum instanceof DocsAndPositionsEnum) {
-            docsAndPos = (DocsAndPositionsEnum) scriptTerm.docsEnum;
+        freq = indexFieldTerm.tf();
+        if (indexFieldTerm.docsEnum instanceof DocsAndPositionsEnum) {
+            docsAndPos = (DocsAndPositionsEnum) indexFieldTerm.docsEnum;
        } else {
            docsAndPos = EMPTY;
        }
@@ -37,7 +37,7 @@ public class SearchLookup {

    final FieldsLookup fieldsLookup;

-    final ShardTermsLookup shardTermsLookup;
+    final IndexLookup indexLookup;

    final ImmutableMap<String, Object> asMap;

@@ -46,13 +46,13 @@ public class SearchLookup {
        docMap = new DocLookup(mapperService, fieldDataService, types);
        sourceLookup = new SourceLookup();
        fieldsLookup = new FieldsLookup(mapperService, types);
-        shardTermsLookup = new ShardTermsLookup(builder);
+        indexLookup = new IndexLookup(builder);

        builder.put("doc", docMap);
        builder.put("_doc", docMap);
        builder.put("_source", sourceLookup);
        builder.put("_fields", fieldsLookup);
-        builder.put("_shard", shardTermsLookup);
+        builder.put("_index", indexLookup);
        asMap = builder.build();
    }

@@ -64,8 +64,8 @@ public class SearchLookup {
        return this.sourceLookup;
    }

-    public ShardTermsLookup shardTerms() {
-        return this.shardTermsLookup;
+    public IndexLookup indexLookup() {
+        return this.indexLookup;
    }

    public FieldsLookup fields() {
@@ -84,13 +84,13 @@ public class SearchLookup {
        docMap.setNextReader(context);
        sourceLookup.setNextReader(context);
        fieldsLookup.setNextReader(context);
-        shardTermsLookup.setNextReader(context);
+        indexLookup.setNextReader(context);
    }

    public void setNextDocId(int docId) {
        docMap.setNextDocId(docId);
        sourceLookup.setNextDocId(docId);
        fieldsLookup.setNextDocId(docId);
-        shardTermsLookup.setNextDocId(docId);
+        indexLookup.setNextDocId(docId);
    }
}
@@ -4,8 +4,8 @@ import org.elasticsearch.common.Nullable;
import org.elasticsearch.script.AbstractSearchScript;
import org.elasticsearch.script.ExecutableScript;
import org.elasticsearch.script.NativeScriptFactory;
-import org.elasticsearch.search.lookup.ScriptTerm;
-import org.elasticsearch.search.lookup.ScriptTerms;
+import org.elasticsearch.search.lookup.IndexFieldTerm;
+import org.elasticsearch.search.lookup.IndexField;

import java.io.IOException;
import java.util.ArrayList;
@@ -38,12 +38,12 @@ public class NativeNaiveTFIDFScoreScript extends AbstractSearchScript {
    @Override
    public Object run() {
        float score = 0;
-        ScriptTerms scriptTerms = shardTerms().get(field);
+        IndexField indexField = indexLookup().get(field);
        for (int i = 0; i < terms.length; i++) {
-            ScriptTerm scriptTerm = scriptTerms.get(terms[i]);
+            IndexFieldTerm indexFieldTerm = indexField.get(terms[i]);
            try {
-                if (scriptTerm.tf() != 0) {
-                    score += scriptTerm.tf() * scriptTerms.docCount() / scriptTerm.df();
+                if (indexFieldTerm.tf() != 0) {
+                    score += indexFieldTerm.tf() * indexField.docCount() / indexFieldTerm.df();
                }
            } catch (IOException e) {
                throw new RuntimeException();
@@ -1,8 +1,8 @@
package org.elasticsearch.benchmark.scripts.score.script;

-import org.elasticsearch.search.lookup.ScriptTerm;
-import org.elasticsearch.search.lookup.ScriptTerms;
-import org.elasticsearch.search.lookup.ShardTermsLookup;
+import org.elasticsearch.search.lookup.IndexFieldTerm;
+import org.elasticsearch.search.lookup.IndexField;
+import org.elasticsearch.search.lookup.IndexLookup;
import org.elasticsearch.search.lookup.TermPosition;

import org.elasticsearch.common.Nullable;
@@ -40,10 +40,10 @@ public class NativePayloadSumNoRecordScoreScript extends AbstractSearchScript {
    @Override
    public Object run() {
        float score = 0;
-        ScriptTerms scriptTerms = shardTerms().get(field);
+        IndexField indexField = indexLookup().get(field);
        for (int i = 0; i < terms.length; i++) {
-            ScriptTerm scriptTerm = scriptTerms.get(terms[i], ShardTermsLookup.FLAG_PAYLOADS);
-            for (TermPosition pos : scriptTerm) {
+            IndexFieldTerm indexFieldTerm = indexField.get(terms[i], IndexLookup.FLAG_PAYLOADS);
+            for (TermPosition pos : indexFieldTerm) {
                score += pos.payloadAsFloat(0);
            }
        }
@@ -1,8 +1,8 @@
package org.elasticsearch.benchmark.scripts.score.script;

-import org.elasticsearch.search.lookup.ScriptTerm;
-import org.elasticsearch.search.lookup.ScriptTerms;
-import org.elasticsearch.search.lookup.ShardTermsLookup;
+import org.elasticsearch.search.lookup.IndexFieldTerm;
+import org.elasticsearch.search.lookup.IndexField;
+import org.elasticsearch.search.lookup.IndexLookup;
import org.elasticsearch.search.lookup.TermPosition;

import org.elasticsearch.common.Nullable;
@@ -40,10 +40,10 @@ public class NativePayloadSumScoreScript extends AbstractSearchScript {
    @Override
    public Object run() {
        float score = 0;
-        ScriptTerms scriptTerms = shardTerms().get(field);
+        IndexField indexField = indexLookup().get(field);
        for (int i = 0; i < terms.length; i++) {
-            ScriptTerm scriptTerm = scriptTerms.get(terms[i], ShardTermsLookup.FLAG_PAYLOADS | ShardTermsLookup.FLAG_CACHE);
-            for (TermPosition pos : scriptTerm) {
+            IndexFieldTerm indexFieldTerm = indexField.get(terms[i], IndexLookup.FLAG_PAYLOADS | IndexLookup.FLAG_CACHE);
+            for (TermPosition pos : indexFieldTerm) {
                score += pos.payloadAsFloat(0);
            }
        }
@@ -42,7 +42,7 @@ import java.util.concurrent.ExecutionException;

import static org.hamcrest.Matchers.equalTo;

-public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
+public class IndexLookupTests extends ElasticsearchIntegrationTest {

    String includeAllFlag = "_FREQUENCIES | _OFFSETS | _PAYLOADS | _POSITIONS | _CACHE";
    String includeAllWithoutRecordFlag = "_FREQUENCIES | _OFFSETS | _PAYLOADS | _POSITIONS ";
@@ -149,9 +149,9 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        initTestData();

        // check term frequencies for 'a'
-        String scriptFieldScript = "termInfo = _shard['int_payload_field']['c']; termInfo.tf()";
+        String scriptFieldScript = "term = _index['int_payload_field']['c']; term.tf()";
        scriptFieldScript = "1";
-        String scoreScript = "termInfo = _shard['int_payload_field']['b']; termInfo.tf()";
+        String scoreScript = "term = _index['int_payload_field']['b']; term.tf()";
        Map<String, Object> expectedResultsField = new HashMap<String, Object>();
        expectedResultsField.put("1", 1);
        expectedResultsField.put("2", 1);
@@ -172,19 +172,19 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {

        // should throw an exception, we cannot call with different flags twice
        // if the flags of the second call were not included in the first call.
-        String script = "termInfo = _shard['int_payload_field']['b']; return _shard['int_payload_field'].get('b', _POSITIONS).tf();";
+        String script = "term = _index['int_payload_field']['b']; return _index['int_payload_field'].get('b', _POSITIONS).tf();";
        try {
            client().prepareSearch("test").setQuery(QueryBuilders.matchAllQuery()).addScriptField("tvtest", script).execute().actionGet();
        } catch (SearchPhaseExecutionException e) {
            assertThat(
                    e.getDetailedMessage()
                            .indexOf(
-                                    "You must call get with all required flags! Instead of _shard['int_payload_field'].get('b', _FREQUENCIES) and _shard['int_payload_field'].get('b', _POSITIONS) call _shard['int_payload_field'].get('b', _FREQUENCIES | _POSITIONS) once]; "),
+                                    "You must call get with all required flags! Instead of _index['int_payload_field'].get('b', _FREQUENCIES) and _index['int_payload_field'].get('b', _POSITIONS) call _index['int_payload_field'].get('b', _FREQUENCIES | _POSITIONS) once]; "),
                    Matchers.greaterThan(-1));
        }

        // Should not throw an exception this way round
-        script = "termInfo = _shard['int_payload_field'].get('b', _POSITIONS | _FREQUENCIES);return _shard['int_payload_field']['b'].tf();";
+        script = "term = _index['int_payload_field'].get('b', _POSITIONS | _FREQUENCIES);return _index['int_payload_field']['b'].tf();";
        client().prepareSearch("test").setQuery(QueryBuilders.matchAllQuery()).addScriptField("tvtest", script).execute().actionGet();
    }

@@ -203,8 +203,8 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {

        initTestData();

-        String script = "termInfo = _shard['float_payload_field'].get('b'," + includeAllFlag
-                + "); payloadSum=0; for (pos : termInfo) {payloadSum = pos.payloadAsInt(0);} return payloadSum;";
+        String script = "term = _index['float_payload_field'].get('b'," + includeAllFlag
+                + "); payloadSum=0; for (pos : term) {payloadSum = pos.payloadAsInt(0);} return payloadSum;";

        // non existing field: sum should be 0
        HashMap<String, Object> zeroArray = new HashMap<String, Object>();
@@ -213,8 +213,8 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        zeroArray.put("3", 0);
        checkValueInEachDoc(script, zeroArray, 3);

-        script = "termInfo = _shard['int_payload_field'].get('b'," + includeAllFlag
-                + "); payloadSum=0; for (pos : termInfo) {payloadSum = payloadSum + pos.payloadAsInt(0);} return payloadSum;";
+        script = "term = _index['int_payload_field'].get('b'," + includeAllFlag
+                + "); payloadSum=0; for (pos : term) {payloadSum = payloadSum + pos.payloadAsInt(0);} return payloadSum;";

        // existing field: sums should be as here:
        zeroArray.put("1", 5);
@@ -248,7 +248,7 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        script = createPositionsArrayScriptIterateTwice("b", includeAllWithoutRecordFlag, "payloadAsInt(-1)");
        checkExceptions(script);

-        // no record and get TermInfoObject twice and iterate: should fail
+        // no record and get termObject twice and iterate: should fail
        script = createPositionsArrayScriptGetInfoObjectTwice("b", includeAllWithoutRecordFlag, "position");
        checkExceptions(script);
        script = createPositionsArrayScriptGetInfoObjectTwice("b", includeAllWithoutRecordFlag, "startOffset");
@@ -261,27 +261,27 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
    }

    private String createPositionsArrayScriptGetInfoObjectTwice(String term, String flags, String what) {
-        String script = "termInfo = _shard['int_payload_field'].get('" + term + "'," + flags
-                + "); array=[]; for (pos : termInfo) {array.add(pos." + what + ")} ;_shard['int_payload_field'].get('" + term + "',"
-                + flags + "); array=[]; for (pos : termInfo) {array.add(pos." + what + ")}";
+        String script = "term = _index['int_payload_field'].get('" + term + "'," + flags
+                + "); array=[]; for (pos : term) {array.add(pos." + what + ")} ;_index['int_payload_field'].get('" + term + "',"
+                + flags + "); array=[]; for (pos : term) {array.add(pos." + what + ")}";
        return script;
    }

    private String createPositionsArrayScriptIterateTwice(String term, String flags, String what) {
-        String script = "termInfo = _shard['int_payload_field'].get('" + term + "'," + flags
-                + "); array=[]; for (pos : termInfo) {array.add(pos." + what + ")} array=[]; for (pos : termInfo) {array.add(pos." + what
+        String script = "term = _index['int_payload_field'].get('" + term + "'," + flags
+                + "); array=[]; for (pos : term) {array.add(pos." + what + ")} array=[]; for (pos : term) {array.add(pos." + what
                + ")} return array;";
        return script;
    }

    private String createPositionsArrayScript(String field, String term, String flags, String what) {
-        String script = "termInfo = _shard['" + field + "'].get('" + term + "'," + flags
-                + "); array=[]; for (pos : termInfo) {array.add(pos." + what + ")} return array;";
+        String script = "term = _index['" + field + "'].get('" + term + "'," + flags
+                + "); array=[]; for (pos : term) {array.add(pos." + what + ")} return array;";
        return script;
    }

    private String createPositionsArrayScriptDefaultGet(String field, String term, String what) {
-        String script = "termInfo = _shard['" + field + "']['" + term + "']; array=[]; for (pos : termInfo) {array.add(pos." + what
+        String script = "term = _index['" + field + "']['" + term + "']; array=[]; for (pos : term) {array.add(pos." + what
                + ")} return array;";
        return script;
    }
@@ -427,36 +427,36 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
                client().prepareIndex("test", "type1", "6").setSource("int_payload_field", "c|1"));

        // get the number of all docs
-        String script = "_shard.numDocs()";
+        String script = "_index.numDocs()";
        checkValueInEachDoc(6, script, 6);

        // get the number of docs with field float_payload_field
-        script = "_shard['float_payload_field'].docCount()";
+        script = "_index['float_payload_field'].docCount()";
        checkValueInEachDoc(3, script, 6);

        // corner case: what if the field does not exist?
-        script = "_shard['non_existent_field'].docCount()";
+        script = "_index['non_existent_field'].docCount()";
        checkValueInEachDoc(0, script, 6);

        // get the number of all tokens in all docs
-        script = "_shard['float_payload_field'].sumttf()";
+        script = "_index['float_payload_field'].sumttf()";
        checkValueInEachDoc(9, script, 6);

        // corner case get the number of all tokens in all docs for non existent
        // field
-        script = "_shard['non_existent_field'].sumttf()";
+        script = "_index['non_existent_field'].sumttf()";
        checkValueInEachDoc(0, script, 6);

        // get the sum of doc freqs in all docs
-        script = "_shard['float_payload_field'].sumdf()";
+        script = "_index['float_payload_field'].sumdf()";
        checkValueInEachDoc(5, script, 6);

        // get the sum of doc freqs in all docs for non existent field
-        script = "_shard['non_existent_field'].sumdf()";
+        script = "_index['non_existent_field'].sumdf()";
        checkValueInEachDoc(0, script, 6);

        // check term frequencies for 'a'
-        script = "termInfo = _shard['float_payload_field']['a']; if (termInfo != null) {termInfo.tf()}";
+        script = "term = _index['float_payload_field']['a']; if (term != null) {term.tf()}";
        Map<String, Object> expectedResults = new HashMap<String, Object>();
        expectedResults.put("1", 2);
        expectedResults.put("2", 0);
@@ -468,7 +468,7 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        expectedResults.clear();

        // check doc frequencies for 'c'
-        script = "termInfo = _shard['float_payload_field']['c']; if (termInfo != null) {termInfo.df()}";
+        script = "term = _index['float_payload_field']['c']; if (term != null) {term.df()}";
        expectedResults.put("1", 1l);
        expectedResults.put("2", 1l);
        expectedResults.put("3", 1l);
@@ -479,7 +479,7 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        expectedResults.clear();

        // check doc frequencies for term that does not exist
-        script = "termInfo = _shard['float_payload_field']['non_existent_term']; if (termInfo != null) {termInfo.df()}";
+        script = "term = _index['float_payload_field']['non_existent_term']; if (term != null) {term.df()}";
        expectedResults.put("1", 0l);
        expectedResults.put("2", 0l);
        expectedResults.put("3", 0l);
@@ -490,7 +490,7 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        expectedResults.clear();

        // check doc frequencies for term that does not exist
-        script = "termInfo = _shard['non_existent_field']['non_existent_term']; if (termInfo != null) {termInfo.tf()}";
+        script = "term = _index['non_existent_field']['non_existent_term']; if (term != null) {term.tf()}";
        expectedResults.put("1", 0);
        expectedResults.put("2", 0);
        expectedResults.put("3", 0);
@@ -501,7 +501,7 @@ public class ShardLookupInScriptTests extends ElasticsearchIntegrationTest {
        expectedResults.clear();

        // check total term frequencies for 'a'
-        script = "termInfo = _shard['float_payload_field']['a']; if (termInfo != null) {termInfo.ttf()}";
+        script = "term = _index['float_payload_field']['a']; if (term != null) {term.ttf()}";
        expectedResults.put("1", 4l);
        expectedResults.put("2", 4l);
        expectedResults.put("3", 4l);