Merge branch 'master' into ccr

* master:
  Security: revert to old way of merging automata (#32254)
  Networking: Fix test leaking buffer (#32296)
  Undo a debugging change that snuck in during the field aliases merge.
  Painless: Update More Methods to New Naming Scheme (#32305)
  [TEST] Fix assumeFalse -> assumeTrue in SSLReloadIntegTests
  Ingest: Support integer and long hex values in convert (#32213)
  Introduce fips_mode setting and associated checks (#32326)
  Add V_6_3_3 version constant
  [DOCS] Removed extraneous callout number.
  Rest HL client: Add put license action (#32214)
  Add ERR to ranking evaluation documentation (#32314)
  Introduce Application Privileges with support for Kibana RBAC (#32309)
  Build: Shadow x-pack:protocol into x-pack:plugin:core (#32240)
  [Kerberos] Add Kerberos authentication support (#32263)
  [ML] Extract persistent task methods from MlMetadata (#32319)
  Add Restore Snapshot High Level REST API
  Register ERR metric with NamedXContentRegistry (#32320)
  fixes broken build for third-party-tests (#32315)
  Allow Integ Tests to run in a FIPS-140 JVM (#31989)
  [DOCS] Rollup Caps API incorrectly mentions GET Jobs API (#32280)
  awaitsfix testRandomClusterStateUpdates
  [TEST] add version skip to weighted_avg tests
  Consistent encoder names (#29492)
  Add WeightedAvg metric aggregation (#31037)
  Switch monitoring to new style Requests (#32255)
  Rename ranking evaluation `quality_level` to `metric_score` (#32168)
  Fix a test bug around nested aggregations and field aliases. (#32287)
  Add new permission for JDK11 to load JAAS libraries (#32132)
  Silence SSL reload test that fails on JDK 11
  [test] package pre-install java check (#32259)
  specify subdirs of lib, bin, modules in package (#32253)
  Switch x-pack:core to new style Requests (#32252)
  awaitsfix SSLConfigurationReloaderTests
  Painless: Clean up add methods in PainlessLookup (#32258)
  Fail shard if IndexShard#storeStats runs into an IOException (#32241)
  AwaitsFix RecoveryIT#testHistoryUUIDIsGenerated
  Remove unnecessary warning supressions (#32250)
  CCE when re-throwing "shard not available" exception in TransportShardMultiGetAction (#32185)
  Add new fields to monitoring template for Beats state (#32085)
Nhat Nguyen 2018-07-24 21:33:50 -04:00
commit ab4deefbe0
455 changed files with 16899 additions and 2841 deletions


@@ -305,6 +305,39 @@ the `qa` subdirectory functions just like the top level `qa` subdirectory. The
Elasticsearch process. The `transport-client` subdirectory contains extensions
to Elasticsearch's standard transport client to work properly with x-pack.
### Gradle Build
We use Gradle to build Elasticsearch because it is flexible enough to not only
build and package Elasticsearch, but also orchestrate all of the ways that we
have to test Elasticsearch.
#### Configurations
Gradle organizes dependencies and build artifacts into "configurations" and
allows you to use these configurations arbitrarily. Here are some of the most
common configurations in our build and how we use them (a hypothetical sketch
follows the list):
<dl>
<dt>`compile`</dt><dd>Code that is on the classpath at both compile and
runtime. If the [`shadow`][shadow-plugin] plugin is applied to the project then
this code is bundled into the jar produced by the project.</dd>
<dt>`runtime`</dt><dd>Code that is not on the classpath at compile time but is
on the classpath at runtime. We mostly use this configuration to make sure that
we do not accidentally compile against dependencies of our dependencies, also
known as "transitive" dependencies.</dd>
<dt>`compileOnly`</dt><dd>Code that is on the classpath at compile time but that
should not be shipped with the project because it is "provided" by the runtime
somehow. Elasticsearch plugins use this configuration to include dependencies
that are bundled with Elasticsearch's server.</dd>
<dt>`shadow`</dt><dd>Only available in projects with the shadow plugin. Code
that is on the classpath at both compile and runtime but is *not* bundled into
the jar produced by the project. If you depend on a project with the `shadow`
plugin then you need to depend on this configuration because it will bring
along all of the dependencies you need at runtime.</dd>
<dt>`testCompile`</dt><dd>Code that is on the classpath for compiling tests
that are part of this project but not production code. The canonical example
of this is `junit`.</dd>
</dl>
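For illustration, a module's `build.gradle` might combine these configurations
along these lines. This is a hypothetical sketch: the `com.example`
coordinates are invented and are not dependencies of our build.

```gradle
dependencies {
    // compile and runtime classpath; bundled into the jar if `shadow` is applied
    compile 'com.example:some-library:1.0'
    // runtime classpath only, so we can't accidentally compile against it
    runtime 'com.example:some-driver:1.0'
    // provided by the Elasticsearch server at runtime, so never shipped
    compileOnly 'org.elasticsearch:elasticsearch:7.0.0-alpha1'
    // on the compile and runtime classpath but *not* bundled into our jar;
    // depending on it brings along its runtime dependencies
    shadow 'com.example:shaded-protocol:1.0'
    // test classpath only; the canonical example
    testCompile 'junit:junit:4.12'
}
```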
Contributing as part of a class
-------------------------------
@@ -337,3 +370,4 @@ repeating in this section because it has come up in this context.
[eclipse]: http://www.eclipse.org/community/eclipse_newsletter/2017/june/
[intellij]: https://blog.jetbrains.com/idea/2017/07/intellij-idea-2017-2-is-here-smart-sleek-and-snappy/
[shadow-plugin]: https://github.com/johnrengelman/shadow


@@ -516,6 +516,31 @@ allprojects {
tasks.eclipse.dependsOn(cleanEclipse, copyEclipseSettings)
}
allprojects {
  /*
   * IntelliJ and Eclipse don't know about the shadow plugin so when we're
   * in "IntelliJ mode" or "Eclipse mode" add "runtime" dependencies
   * everywhere we see a "shadow" dependency, which will cause them to
   * reference shadowed projects directly rather than rely on the shadowing
   * to include them. This is the correct thing for it to do because it
   * doesn't run the jar shadowing at all. This isn't needed for the project
   * itself because the IDE configuration is done by SourceSets but it *is*
   * needed for projects that depend on the project doing the shadowing.
   * Without this they won't properly depend on the shadowed project.
   */
  if (isEclipse || isIdea) {
    configurations.all { Configuration configuration ->
      dependencies.all { Dependency dep ->
        if (dep instanceof ProjectDependency) {
          if (dep.getTargetConfiguration() == 'shadow') {
            configuration.dependencies.add(project.dependencies.project(path: dep.dependencyProject.path, configuration: 'runtime'))
          }
        }
      }
    }
  }
}
// we need to add the same --debug-jvm option as
// the real RunTask has, so we can pass it through
class Run extends DefaultTask {


@@ -131,6 +131,9 @@ class BuildPlugin implements Plugin<Project> {
runtimeJavaVersionEnum = JavaVersion.toVersion(findJavaSpecificationVersion(project, runtimeJavaHome))
}
String inFipsJvmScript = 'print(java.security.Security.getProviders()[0].name.toLowerCase().contains("fips"));'
boolean inFipsJvm = Boolean.parseBoolean(runJavascript(project, runtimeJavaHome, inFipsJvmScript))
// Build debugging info
println '======================================='
println 'Elasticsearch Build Hamster says Hello!'
@@ -202,6 +205,7 @@ class BuildPlugin implements Plugin<Project> {
project.rootProject.ext.buildChecksDone = true
project.rootProject.ext.minimumCompilerVersion = minimumCompilerVersion
project.rootProject.ext.minimumRuntimeVersion = minimumRuntimeVersion
project.rootProject.ext.inFipsJvm = inFipsJvm
}
project.targetCompatibility = project.rootProject.ext.minimumRuntimeVersion
@@ -213,6 +217,7 @@ class BuildPlugin implements Plugin<Project> {
project.ext.compilerJavaVersion = project.rootProject.ext.compilerJavaVersion
project.ext.runtimeJavaVersion = project.rootProject.ext.runtimeJavaVersion
project.ext.javaVersions = project.rootProject.ext.javaVersions
project.ext.inFipsJvm = project.rootProject.ext.inFipsJvm
}
private static String findCompilerJavaHome() {
@@ -386,6 +391,9 @@ class BuildPlugin implements Plugin<Project> {
project.configurations.compile.dependencies.all(disableTransitiveDeps)
project.configurations.testCompile.dependencies.all(disableTransitiveDeps)
project.configurations.compileOnly.dependencies.all(disableTransitiveDeps)
project.plugins.withType(ShadowPlugin).whenPluginAdded {
project.configurations.shadow.dependencies.all(disableTransitiveDeps)
}
}
/** Adds repositories used by ES dependencies */
@@ -770,7 +778,11 @@ class BuildPlugin implements Plugin<Project> {
systemProperty property.getKey(), property.getValue()
}
}
// Set the system keystore/truststore password if we're running tests in a FIPS-140 JVM
if (project.inFipsJvm) {
systemProperty 'javax.net.ssl.trustStorePassword', 'password'
systemProperty 'javax.net.ssl.keyStorePassword', 'password'
}
boolean assertionsEnabled = Boolean.parseBoolean(System.getProperty('tests.asserts', 'true'))
enableSystemAssertions assertionsEnabled
enableAssertions assertionsEnabled
@@ -873,11 +885,20 @@ class BuildPlugin implements Plugin<Project> {
project.dependencyLicenses.dependencies = project.configurations.runtime.fileCollection {
it.group.startsWith('org.elasticsearch') == false
} - project.configurations.compileOnly
project.plugins.withType(ShadowPlugin).whenPluginAdded {
project.dependencyLicenses.dependencies += project.configurations.shadow.fileCollection {
it.group.startsWith('org.elasticsearch') == false
}
}
}
private static configureDependenciesInfo(Project project) {
Task deps = project.tasks.create("dependenciesInfo", DependenciesInfoTask.class)
deps.runtimeConfiguration = project.configurations.runtime
project.plugins.withType(ShadowPlugin).whenPluginAdded {
deps.runtimeConfiguration = project.configurations.create('infoDeps')
deps.runtimeConfiguration.extendsFrom(project.configurations.runtime, project.configurations.shadow)
}
deps.compileOnlyConfiguration = project.configurations.compileOnly
project.afterEvaluate {
deps.mappings = project.dependencyLicenses.mappings


@@ -48,18 +48,6 @@ public class PluginBuildPlugin extends BuildPlugin {
@Override
public void apply(Project project) {
super.apply(project)
project.plugins.withType(ShadowPlugin).whenPluginAdded {
/*
* We've not tested these plugins together and we're fairly sure
* they aren't going to work properly as is *and* we're not really
* sure *why* you'd want to shade stuff in plugins. So we throw an
* exception here to make you come and read this comment. If you
* have a need for shadow while building plugins then know that you
* are probably going to have to fight with gradle for a while....
*/
throw new InvalidUserDataException('elasticsearch.esplugin is not '
+ 'compatible with com.github.johnrengelman.shadow');
}
configureDependencies(project)
// this afterEvaluate must happen before the afterEvaluate added by integTest creation,
// so that the file name resolution for installing the plugin will be setup
@@ -153,8 +141,13 @@ public class PluginBuildPlugin extends BuildPlugin {
include(buildProperties.descriptorOutput.name)
}
from pluginMetadata // metadata (eg custom security policy)
from project.jar // this plugin's jar
from project.configurations.runtime - project.configurations.compileOnly // the dep jars
/*
* If the plugin is using the shadow plugin then we need to bundle
* "shadow" things rather than the default jar and dependencies so
* we don't hit jar hell.
*/
from { project.plugins.hasPlugin(ShadowPlugin) ? project.shadowJar : project.jar }
from { project.plugins.hasPlugin(ShadowPlugin) ? project.configurations.shadow : project.configurations.runtime - project.configurations.compileOnly }
// extra files for the plugin to go into the zip
from('src/main/packaging') // TODO: move all config/bin/_size/etc into packaging
from('src/main') {


@@ -0,0 +1,66 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.client;
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.protocol.xpack.license.PutLicenseRequest;
import org.elasticsearch.protocol.xpack.license.PutLicenseResponse;
import java.io.IOException;
import static java.util.Collections.emptySet;
/**
* A wrapper for the {@link RestHighLevelClient} that provides methods for
* accessing the Elastic Licensing APIs.
* <p>
* See the <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/licensing-apis.html">
* X-Pack Licensing APIs on elastic.co</a> for more information.
*/
public class LicenseClient {
private final RestHighLevelClient restHighLevelClient;
LicenseClient(RestHighLevelClient restHighLevelClient) {
this.restHighLevelClient = restHighLevelClient;
}
/**
* Updates license for the cluster.
* @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
* @return the response
* @throws IOException in case there is a problem sending the request or parsing back the response
*/
public PutLicenseResponse putLicense(PutLicenseRequest request, RequestOptions options) throws IOException {
return restHighLevelClient.performRequestAndParseEntity(request, RequestConverters::putLicense, options,
PutLicenseResponse::fromXContent, emptySet());
}
/**
* Asynchronously updates license for the cluster.
* @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
* @param listener the listener to be notified upon request completion
*/
public void putLicenseAsync(PutLicenseRequest request, RequestOptions options, ActionListener<PutLicenseResponse> listener) {
restHighLevelClient.performRequestAsyncAndParseEntity(request, RequestConverters::putLicense, options,
PutLicenseResponse::fromXContent, listener, emptySet());
}
}


@@ -40,6 +40,7 @@ import org.elasticsearch.action.admin.cluster.settings.ClusterGetSettingsRequest
import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest;
import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsRequest;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotRequest;
import org.elasticsearch.action.admin.cluster.storedscripts.DeleteStoredScriptRequest;
import org.elasticsearch.action.admin.cluster.storedscripts.GetStoredScriptRequest;
import org.elasticsearch.action.admin.cluster.snapshots.delete.DeleteSnapshotRequest;
@@ -108,6 +109,7 @@ import org.elasticsearch.index.rankeval.RankEvalRequest;
import org.elasticsearch.protocol.xpack.XPackInfoRequest;
import org.elasticsearch.protocol.xpack.watcher.PutWatchRequest;
import org.elasticsearch.protocol.xpack.XPackUsageRequest;
import org.elasticsearch.protocol.xpack.license.PutLicenseRequest;
import org.elasticsearch.rest.action.search.RestSearchAction;
import org.elasticsearch.script.mustache.MultiSearchTemplateRequest;
import org.elasticsearch.script.mustache.SearchTemplateRequest;
@@ -980,6 +982,20 @@ final class RequestConverters {
return request;
}
static Request restoreSnapshot(RestoreSnapshotRequest restoreSnapshotRequest) throws IOException {
String endpoint = new EndpointBuilder().addPathPartAsIs("_snapshot")
.addPathPart(restoreSnapshotRequest.repository())
.addPathPart(restoreSnapshotRequest.snapshot())
.addPathPartAsIs("_restore")
.build();
Request request = new Request(HttpPost.METHOD_NAME, endpoint);
Params parameters = new Params(request);
parameters.withMasterTimeout(restoreSnapshotRequest.masterNodeTimeout());
parameters.withWaitForCompletion(restoreSnapshotRequest.waitForCompletion());
request.setEntity(createEntity(restoreSnapshotRequest, REQUEST_BODY_CONTENT_TYPE));
return request;
}
static Request deleteSnapshot(DeleteSnapshotRequest deleteSnapshotRequest) {
String endpoint = new EndpointBuilder().addPathPartAsIs("_snapshot")
.addPathPart(deleteSnapshotRequest.repository())
@@ -1124,6 +1140,18 @@ final class RequestConverters {
return request;
}
static Request putLicense(PutLicenseRequest putLicenseRequest) {
Request request = new Request(HttpPut.METHOD_NAME, "/_xpack/license");
Params parameters = new Params(request);
parameters.withTimeout(putLicenseRequest.timeout());
parameters.withMasterTimeout(putLicenseRequest.masterNodeTimeout());
if (putLicenseRequest.isAcknowledge()) {
parameters.putParam("acknowledge", "true");
}
request.setJsonEntity(putLicenseRequest.getLicenseDefinition());
return request;
}
private static HttpEntity createEntity(ToXContent toXContent, XContentType xContentType) throws IOException {
BytesRef source = XContentHelper.toXContent(toXContent, xContentType, false).toBytesRef();
return new ByteArrayEntity(source.bytes, source.offset, source.length, createContentType(xContentType));


@@ -30,6 +30,8 @@ import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyReposito
import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse;
import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusRequest;
import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusResponse;
import org.elasticsearch.action.admin.cluster.snapshots.delete.DeleteSnapshotRequest;
@@ -252,6 +254,36 @@ public final class SnapshotClient {
SnapshotsStatusResponse::fromXContent, listener, emptySet());
}
/**
* Restores a snapshot.
* See <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html"> Snapshot and Restore
* API on elastic.co</a>
*
* @param restoreSnapshotRequest the request
* @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
* @return the response
* @throws IOException in case there is a problem sending the request or parsing back the response
*/
public RestoreSnapshotResponse restore(RestoreSnapshotRequest restoreSnapshotRequest, RequestOptions options) throws IOException {
return restHighLevelClient.performRequestAndParseEntity(restoreSnapshotRequest, RequestConverters::restoreSnapshot, options,
RestoreSnapshotResponse::fromXContent, emptySet());
}
/**
* Asynchronously restores a snapshot.
* See <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html"> Snapshot and Restore
* API on elastic.co</a>
*
* @param restoreSnapshotRequest the request
* @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
* @param listener the listener to be notified upon request completion
*/
public void restoreAsync(RestoreSnapshotRequest restoreSnapshotRequest, RequestOptions options,
ActionListener<RestoreSnapshotResponse> listener) {
restHighLevelClient.performRequestAsyncAndParseEntity(restoreSnapshotRequest, RequestConverters::restoreSnapshot, options,
RestoreSnapshotResponse::fromXContent, listener, emptySet());
}
/**
* Deletes a snapshot.
* See <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html"> Snapshot and Restore


@@ -42,10 +42,12 @@ public final class XPackClient {
private final RestHighLevelClient restHighLevelClient;
private final WatcherClient watcherClient;
private final LicenseClient licenseClient;
XPackClient(RestHighLevelClient restHighLevelClient) {
this.restHighLevelClient = restHighLevelClient;
this.watcherClient = new WatcherClient(restHighLevelClient);
this.licenseClient = new LicenseClient(restHighLevelClient);
}
public WatcherClient watcher() {
@@ -100,4 +102,15 @@ public final class XPackClient {
restHighLevelClient.performRequestAsyncAndParseEntity(request, RequestConverters::xpackUsage, options,
XPackUsageResponse::fromXContent, listener, emptySet());
}
/**
* A wrapper for the {@link RestHighLevelClient} that provides methods for
* accessing the Elastic Licensing APIs.
* <p>
* See the <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/licensing-apis.html">
* X-Pack APIs on elastic.co</a> for more information.
*/
public LicenseClient license() {
return licenseClient;
}
}


@@ -22,7 +22,11 @@ package org.elasticsearch.client;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.index.query.MatchAllQueryBuilder;
import org.elasticsearch.index.rankeval.DiscountedCumulativeGain;
import org.elasticsearch.index.rankeval.EvalQueryQuality;
import org.elasticsearch.index.rankeval.EvaluationMetric;
import org.elasticsearch.index.rankeval.ExpectedReciprocalRank;
import org.elasticsearch.index.rankeval.MeanReciprocalRank;
import org.elasticsearch.index.rankeval.PrecisionAtK;
import org.elasticsearch.index.rankeval.RankEvalRequest;
import org.elasticsearch.index.rankeval.RankEvalResponse;
@@ -35,8 +39,10 @@ import org.junit.Before;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Supplier;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@@ -64,15 +70,7 @@ public class RankEvalIT extends ESRestHighLevelClientTestCase {
* calculation where all unlabeled documents are treated as not relevant.
*/
public void testRankEvalRequest() throws IOException {
SearchSourceBuilder testQuery = new SearchSourceBuilder();
testQuery.query(new MatchAllQueryBuilder());
List<RatedDocument> amsterdamRatedDocs = createRelevant("index" , "amsterdam1", "amsterdam2", "amsterdam3", "amsterdam4");
amsterdamRatedDocs.addAll(createRelevant("index2", "amsterdam0"));
RatedRequest amsterdamRequest = new RatedRequest("amsterdam_query", amsterdamRatedDocs, testQuery);
RatedRequest berlinRequest = new RatedRequest("berlin_query", createRelevant("index", "berlin"), testQuery);
List<RatedRequest> specifications = new ArrayList<>();
specifications.add(amsterdamRequest);
specifications.add(berlinRequest);
List<RatedRequest> specifications = createTestEvaluationSpec();
PrecisionAtK metric = new PrecisionAtK(1, false, 10);
RankEvalSpec spec = new RankEvalSpec(specifications, metric);
@@ -80,7 +78,7 @@ public class RankEvalIT extends ESRestHighLevelClientTestCase {
RankEvalResponse response = execute(rankEvalRequest, highLevelClient()::rankEval, highLevelClient()::rankEvalAsync);
// the expected Prec@ for the first query is 5/7 and the expected Prec@ for the second is 1/7, divided by 2 to get the average
double expectedPrecision = (1.0 / 7.0 + 5.0 / 7.0) / 2.0;
assertEquals(expectedPrecision, response.getEvaluationResult(), Double.MIN_VALUE);
assertEquals(expectedPrecision, response.getMetricScore(), Double.MIN_VALUE);
Map<String, EvalQueryQuality> partialResults = response.getPartialResults();
assertEquals(2, partialResults.size());
EvalQueryQuality amsterdamQueryQuality = partialResults.get("amsterdam_query");
@@ -114,6 +112,38 @@ public class RankEvalIT extends ESRestHighLevelClientTestCase {
response = execute(rankEvalRequest, highLevelClient()::rankEval, highLevelClient()::rankEvalAsync);
}
private static List<RatedRequest> createTestEvaluationSpec() {
SearchSourceBuilder testQuery = new SearchSourceBuilder();
testQuery.query(new MatchAllQueryBuilder());
List<RatedDocument> amsterdamRatedDocs = createRelevant("index" , "amsterdam1", "amsterdam2", "amsterdam3", "amsterdam4");
amsterdamRatedDocs.addAll(createRelevant("index2", "amsterdam0"));
RatedRequest amsterdamRequest = new RatedRequest("amsterdam_query", amsterdamRatedDocs, testQuery);
RatedRequest berlinRequest = new RatedRequest("berlin_query", createRelevant("index", "berlin"), testQuery);
List<RatedRequest> specifications = new ArrayList<>();
specifications.add(amsterdamRequest);
specifications.add(berlinRequest);
return specifications;
}
/**
* Test case checks that the default metrics are registered and usable
*/
public void testMetrics() throws IOException {
List<RatedRequest> specifications = createTestEvaluationSpec();
List<Supplier<EvaluationMetric>> metrics = Arrays.asList(PrecisionAtK::new, MeanReciprocalRank::new, DiscountedCumulativeGain::new,
() -> new ExpectedReciprocalRank(1));
double expectedScores[] = new double[] {0.4285714285714286, 0.75, 1.6408962261063627, 0.4407738095238095};
int i = 0;
for (Supplier<EvaluationMetric> metricSupplier : metrics) {
RankEvalSpec spec = new RankEvalSpec(specifications, metricSupplier.get());
RankEvalRequest rankEvalRequest = new RankEvalRequest(spec, new String[] { "index", "index2" });
RankEvalResponse response = execute(rankEvalRequest, highLevelClient()::rankEval, highLevelClient()::rankEvalAsync);
assertEquals(expectedScores[i], response.getMetricScore(), Double.MIN_VALUE);
i++;
}
}
private static List<RatedDocument> createRelevant(String indexName, String... docs) {
return Stream.of(docs).map(s -> new RatedDocument(indexName, s, 1)).collect(Collectors.toList());
}


@@ -41,6 +41,7 @@ import org.elasticsearch.action.admin.cluster.settings.ClusterUpdateSettingsRequ
import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.delete.DeleteSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsRequest;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusRequest;
import org.elasticsearch.action.admin.cluster.storedscripts.DeleteStoredScriptRequest;
import org.elasticsearch.action.admin.cluster.storedscripts.GetStoredScriptRequest;
@@ -2198,6 +2199,31 @@ public class RequestConvertersTests extends ESTestCase {
assertThat(request.getEntity(), is(nullValue()));
}
public void testRestoreSnapshot() throws IOException {
Map<String, String> expectedParams = new HashMap<>();
String repository = randomIndicesNames(1, 1)[0];
String snapshot = "snapshot-" + randomAlphaOfLengthBetween(2, 5).toLowerCase(Locale.ROOT);
String endpoint = String.format(Locale.ROOT, "/_snapshot/%s/%s/_restore", repository, snapshot);
RestoreSnapshotRequest restoreSnapshotRequest = new RestoreSnapshotRequest(repository, snapshot);
setRandomMasterTimeout(restoreSnapshotRequest, expectedParams);
if (randomBoolean()) {
restoreSnapshotRequest.waitForCompletion(true);
expectedParams.put("wait_for_completion", "true");
}
if (randomBoolean()) {
String timeout = randomTimeValue();
restoreSnapshotRequest.masterNodeTimeout(timeout);
expectedParams.put("master_timeout", timeout);
}
Request request = RequestConverters.restoreSnapshot(restoreSnapshotRequest);
assertThat(endpoint, equalTo(request.getEndpoint()));
assertThat(HttpPost.METHOD_NAME, equalTo(request.getMethod()));
assertThat(expectedParams, equalTo(request.getParameters()));
assertToXContentBody(restoreSnapshotRequest, request.getEntity());
}
public void testDeleteSnapshot() {
Map<String, String> expectedParams = new HashMap<>();
String repository = randomIndicesNames(1, 1)[0];


@@ -20,6 +20,7 @@
package org.elasticsearch.client;
import com.fasterxml.jackson.core.JsonParseException;
import org.apache.http.HttpEntity;
import org.apache.http.HttpHost;
import org.apache.http.HttpResponse;
@@ -60,6 +61,7 @@ import org.elasticsearch.common.xcontent.cbor.CborXContent;
import org.elasticsearch.common.xcontent.smile.SmileXContent;
import org.elasticsearch.index.rankeval.DiscountedCumulativeGain;
import org.elasticsearch.index.rankeval.EvaluationMetric;
import org.elasticsearch.index.rankeval.ExpectedReciprocalRank;
import org.elasticsearch.index.rankeval.MeanReciprocalRank;
import org.elasticsearch.index.rankeval.MetricDetail;
import org.elasticsearch.index.rankeval.PrecisionAtK;
@@ -616,7 +618,7 @@ public class RestHighLevelClientTests extends ESTestCase {
public void testProvidedNamedXContents() {
List<NamedXContentRegistry.Entry> namedXContents = RestHighLevelClient.getProvidedNamedXContents();
assertEquals(8, namedXContents.size());
assertEquals(10, namedXContents.size());
Map<Class<?>, Integer> categories = new HashMap<>();
List<String> names = new ArrayList<>();
for (NamedXContentRegistry.Entry namedXContent : namedXContents) {
@@ -630,14 +632,16 @@ public class RestHighLevelClientTests extends ESTestCase {
assertEquals(Integer.valueOf(2), categories.get(Aggregation.class));
assertTrue(names.contains(ChildrenAggregationBuilder.NAME));
assertTrue(names.contains(MatrixStatsAggregationBuilder.NAME));
assertEquals(Integer.valueOf(3), categories.get(EvaluationMetric.class));
assertEquals(Integer.valueOf(4), categories.get(EvaluationMetric.class));
assertTrue(names.contains(PrecisionAtK.NAME));
assertTrue(names.contains(DiscountedCumulativeGain.NAME));
assertTrue(names.contains(MeanReciprocalRank.NAME));
assertTrue(names.contains(ExpectedReciprocalRank.NAME));
assertEquals(Integer.valueOf(3), categories.get(MetricDetail.class));
assertEquals(Integer.valueOf(4), categories.get(MetricDetail.class));
assertTrue(names.contains(PrecisionAtK.NAME));
assertTrue(names.contains(MeanReciprocalRank.NAME));
assertTrue(names.contains(DiscountedCumulativeGain.NAME));
assertTrue(names.contains(ExpectedReciprocalRank.NAME));
}
public void testApiNamingConventions() throws Exception {
@@ -661,7 +665,6 @@
"reindex_rethrottle",
"render_search_template",
"scripts_painless_execute",
"snapshot.restore",
"tasks.get", "tasks.get",
"termvectors", "termvectors",
"update_by_query" "update_by_query"


@@ -28,6 +28,8 @@ import org.elasticsearch.action.admin.cluster.repositories.put.PutRepositoryRequ
import org.elasticsearch.action.admin.cluster.repositories.put.PutRepositoryResponse;
import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryRequest;
import org.elasticsearch.action.admin.cluster.repositories.verify.VerifyRepositoryResponse;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusRequest;
import org.elasticsearch.action.admin.cluster.snapshots.status.SnapshotsStatusResponse;
import org.elasticsearch.common.settings.Settings;
@@ -40,12 +42,15 @@ import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsResponse
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.repositories.fs.FsRepository;
import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.snapshots.RestoreInfo;
import java.io.IOException;
import java.util.Collections;
import java.util.stream.Collectors;
import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThan;
import static org.hamcrest.Matchers.is;
public class SnapshotIT extends ESRestHighLevelClientTestCase {
@@ -205,6 +210,42 @@ public class SnapshotIT extends ESRestHighLevelClientTestCase {
assertThat(response.getSnapshots().get(0).getIndices().containsKey(testIndex), is(true));
}
public void testRestoreSnapshot() throws IOException {
String testRepository = "test";
String testSnapshot = "snapshot_1";
String testIndex = "test_index";
String restoredIndex = testIndex + "_restored";
PutRepositoryResponse putRepositoryResponse = createTestRepository(testRepository, FsRepository.TYPE, "{\"location\": \".\"}");
assertTrue(putRepositoryResponse.isAcknowledged());
createIndex(testIndex, Settings.EMPTY);
assertTrue("index [" + testIndex + "] should have been created", indexExists(testIndex));
CreateSnapshotRequest createSnapshotRequest = new CreateSnapshotRequest(testRepository, testSnapshot);
createSnapshotRequest.indices(testIndex);
createSnapshotRequest.waitForCompletion(true);
CreateSnapshotResponse createSnapshotResponse = createTestSnapshot(createSnapshotRequest);
assertEquals(RestStatus.OK, createSnapshotResponse.status());
deleteIndex(testIndex);
assertFalse("index [" + testIndex + "] should have been deleted", indexExists(testIndex));
RestoreSnapshotRequest request = new RestoreSnapshotRequest(testRepository, testSnapshot);
request.waitForCompletion(true);
request.renamePattern(testIndex);
request.renameReplacement(restoredIndex);
RestoreSnapshotResponse response = execute(request, highLevelClient().snapshot()::restore,
highLevelClient().snapshot()::restoreAsync);
RestoreInfo restoreInfo = response.getRestoreInfo();
assertThat(restoreInfo.name(), equalTo(testSnapshot));
assertThat(restoreInfo.indices(), equalTo(Collections.singletonList(restoredIndex)));
assertThat(restoreInfo.successfulShards(), greaterThan(0));
assertThat(restoreInfo.failedShards(), equalTo(0));
}
public void testDeleteSnapshot() throws IOException {
String repository = "test_repository";
String snapshot = "test_snapshot";


@@ -0,0 +1,106 @@
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.elasticsearch.client.documentation;
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.LatchedActionListener;
import org.elasticsearch.client.ESRestHighLevelClientTestCase;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.protocol.xpack.license.LicensesStatus;
import org.elasticsearch.protocol.xpack.license.PutLicenseRequest;
import org.elasticsearch.protocol.xpack.license.PutLicenseResponse;
import java.util.Map;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import static org.hamcrest.Matchers.hasSize;
import static org.hamcrest.Matchers.not;
import static org.hamcrest.Matchers.startsWith;
/**
* Documentation for Licensing APIs in the high level java client.
* Code wrapped in {@code tag} and {@code end} tags is included in the docs.
*/
public class LicensingDocumentationIT extends ESRestHighLevelClientTestCase {
public void testPutLicense() throws Exception {
RestHighLevelClient client = highLevelClient();
String license = "{\"license\": {\"uid\":\"893361dc-9749-4997-93cb-802e3d7fa4a8\",\"type\":\"gold\"," +
"\"issue_date_in_millis\":1411948800000,\"expiry_date_in_millis\":1914278399999,\"max_nodes\":1,\"issued_to\":\"issued_to\"," +
"\"issuer\":\"issuer\",\"signature\":\"AAAAAgAAAA3U8+YmnvwC+CWsV/mRAAABmC9ZN0hjZDBGYnVyRXpCOW5Bb3FjZDAxOWpSbTVoMVZwUzRxVk1PSm" +
"kxakxZdW5IMlhlTHNoN1N2MXMvRFk4d3JTZEx3R3RRZ0pzU3lobWJKZnQvSEFva0ppTHBkWkprZWZSQi9iNmRQNkw1SlpLN0lDalZCS095MXRGN1lIZlpYcVVTTn" +
"FrcTE2dzhJZmZrdFQrN3JQeGwxb0U0MXZ0dDJHSERiZTVLOHNzSDByWnpoZEphZHBEZjUrTVBxRENNSXNsWWJjZllaODdzVmEzUjNiWktNWGM5TUhQV2plaUo4Q1" +
"JOUml4MXNuL0pSOEhQaVB2azhmUk9QVzhFeTFoM1Q0RnJXSG53MWk2K055c28zSmRnVkF1b2JSQkFLV2VXUmVHNDZ2R3o2VE1qbVNQS2lxOHN5bUErZlNIWkZSVm" +
"ZIWEtaSU9wTTJENDVvT1NCYklacUYyK2FwRW9xa0t6dldMbmMzSGtQc3FWOTgzZ3ZUcXMvQkt2RUZwMFJnZzlvL2d2bDRWUzh6UG5pdENGWFRreXNKNkE9PQAAAQ" +
"Be8GfzDm6T537Iuuvjetb3xK5dvg0K5NQapv+rczWcQFxgCuzbF8plkgetP1aAGZP4uRESDQPMlOCsx4d0UqqAm9f7GbBQ3l93P+PogInPFeEH9NvOmaAQovmxVM" +
"9SE6DsDqlX4cXSO+bgWpXPTd2LmpoQc1fXd6BZ8GeuyYpVHVKp9hVU0tAYjw6HzYOE7+zuO1oJYOxElqy66AnIfkvHrvni+flym3tE7tDTgsDRaz7W3iBhaqiSnt" +
"EqabEkvHdPHQdSR99XGaEvnHO1paK01/35iZF6OXHsF7CCj+558GRXiVxzueOe7TsGSSt8g7YjZwV9bRCyU7oB4B/nidgI\"}}";
{
//tag::put-license-execute
PutLicenseRequest request = new PutLicenseRequest();
request.setLicenseDefinition(license); // <1>
request.setAcknowledge(false); // <2>
PutLicenseResponse response = client.xpack().license().putLicense(request, RequestOptions.DEFAULT);
//end::put-license-execute
//tag::put-license-response
LicensesStatus status = response.status(); // <1>
assertEquals(status, LicensesStatus.VALID); // <2>
boolean acknowledged = response.isAcknowledged(); // <3>
String acknowledgeHeader = response.acknowledgeHeader(); // <4>
Map<String, String[]> acknowledgeMessages = response.acknowledgeMessages(); // <5>
//end::put-license-response
assertFalse(acknowledged); // Should fail because we are trying to downgrade from platinum trial to gold
assertThat(acknowledgeHeader, startsWith("This license update requires acknowledgement."));
assertThat(acknowledgeMessages.keySet(), not(hasSize(0)));
}
{
PutLicenseRequest request = new PutLicenseRequest();
// tag::put-license-execute-listener
ActionListener<PutLicenseResponse> listener = new ActionListener<PutLicenseResponse>() {
@Override
public void onResponse(PutLicenseResponse indexResponse) {
// <1>
}
@Override
public void onFailure(Exception e) {
// <2>
}
};
// end::put-license-execute-listener
// Replace the empty listener with a blocking listener for the test
final CountDownLatch latch = new CountDownLatch(1);
listener = new LatchedActionListener<>(listener, latch);
// tag::put-license-execute-async
client.xpack().license().putLicenseAsync(
request, RequestOptions.DEFAULT, listener); // <1>
// end::put-license-execute-async
assertTrue(latch.await(30L, TimeUnit.SECONDS));
}
}
}


@@ -1136,14 +1136,14 @@ public class SearchDocumentationIT extends ESRestHighLevelClientTestCase {
// end::rank-eval-execute
// tag::rank-eval-response
double evaluationResult = response.getEvaluationResult(); // <1>
double evaluationResult = response.getMetricScore(); // <1>
assertEquals(1.0 / 3.0, evaluationResult, 0.0);
Map<String, EvalQueryQuality> partialResults =
response.getPartialResults();
EvalQueryQuality evalQuality =
partialResults.get("kimchy_query"); // <2>
assertEquals("kimchy_query", evalQuality.getId());
double qualityLevel = evalQuality.getQualityLevel(); // <3>
double qualityLevel = evalQuality.metricScore(); // <3>
assertEquals(1.0 / 3.0, qualityLevel, 0.0);
List<RatedSearchHit> hitsAndRatings = evalQuality.getHitsAndRatings();
RatedSearchHit ratedSearchHit = hitsAndRatings.get(2);


@@ -33,6 +33,8 @@ import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotReq
import org.elasticsearch.action.admin.cluster.snapshots.create.CreateSnapshotResponse;
import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsRequest;
import org.elasticsearch.action.admin.cluster.snapshots.get.GetSnapshotsResponse;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotRequest;
import org.elasticsearch.action.admin.cluster.snapshots.restore.RestoreSnapshotResponse;
import org.elasticsearch.action.admin.indices.create.CreateIndexRequest;
import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.action.admin.cluster.snapshots.delete.DeleteSnapshotRequest;
@@ -53,12 +55,15 @@ import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.repositories.fs.FsRepository;
import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.snapshots.RestoreInfo;
import org.elasticsearch.snapshots.SnapshotId;
import org.elasticsearch.snapshots.SnapshotInfo;
import org.elasticsearch.snapshots.SnapshotShardFailure;
import org.elasticsearch.snapshots.SnapshotState;
import java.io.IOException;
import java.util.Collections;
import java.util.EnumSet;
import java.util.HashMap;
import java.util.List;
import java.util.Locale;
@@ -263,6 +268,107 @@ public class SnapshotClientDocumentationIT extends ESRestHighLevelClientTestCase
}
}
public void testRestoreSnapshot() throws IOException {
RestHighLevelClient client = highLevelClient();
createTestRepositories();
createTestIndex();
createTestSnapshots();
// tag::restore-snapshot-request
RestoreSnapshotRequest request = new RestoreSnapshotRequest(repositoryName, snapshotName);
// end::restore-snapshot-request
// we need to restore as a different index name
// tag::restore-snapshot-request-masterTimeout
request.masterNodeTimeout(TimeValue.timeValueMinutes(1)); // <1>
request.masterNodeTimeout("1m"); // <2>
// end::restore-snapshot-request-masterTimeout
// tag::restore-snapshot-request-waitForCompletion
request.waitForCompletion(true); // <1>
// end::restore-snapshot-request-waitForCompletion
// tag::restore-snapshot-request-partial
request.partial(false); // <1>
// end::restore-snapshot-request-partial
// tag::restore-snapshot-request-include-global-state
request.includeGlobalState(false); // <1>
// end::restore-snapshot-request-include-global-state
// tag::restore-snapshot-request-include-aliases
request.includeAliases(false); // <1>
// end::restore-snapshot-request-include-aliases
// tag::restore-snapshot-request-indices
request.indices("test_index");
// end::restore-snapshot-request-indices
String restoredIndexName = "restored_index";
// tag::restore-snapshot-request-rename
request.renamePattern("test_(.+)"); // <1>
request.renameReplacement("restored_$1"); // <2>
// end::restore-snapshot-request-rename
// tag::restore-snapshot-request-index-settings
request.indexSettings( // <1>
Settings.builder()
.put("index.number_of_replicas", 0)
.build());
request.ignoreIndexSettings("index.refresh_interval", "index.search.idle.after"); // <2>
request.indicesOptions(new IndicesOptions( // <3>
EnumSet.of(IndicesOptions.Option.IGNORE_UNAVAILABLE),
EnumSet.of(IndicesOptions.WildcardStates.OPEN)));
// end::restore-snapshot-request-index-settings
// tag::restore-snapshot-execute
RestoreSnapshotResponse response = client.snapshot().restore(request, RequestOptions.DEFAULT);
// end::restore-snapshot-execute
// tag::restore-snapshot-response
RestoreInfo restoreInfo = response.getRestoreInfo();
List<String> indices = restoreInfo.indices(); // <1>
// end::restore-snapshot-response
assertEquals(Collections.singletonList(restoredIndexName), indices);
assertEquals(0, restoreInfo.failedShards());
assertTrue(restoreInfo.successfulShards() > 0);
}
public void testRestoreSnapshotAsync() throws InterruptedException {
RestHighLevelClient client = highLevelClient();
{
RestoreSnapshotRequest request = new RestoreSnapshotRequest();
// tag::restore-snapshot-execute-listener
ActionListener<RestoreSnapshotResponse> listener =
new ActionListener<RestoreSnapshotResponse>() {
@Override
public void onResponse(RestoreSnapshotResponse restoreSnapshotResponse) {
// <1>
}
@Override
public void onFailure(Exception e) {
// <2>
}
};
// end::restore-snapshot-execute-listener
// Replace the empty listener with a blocking listener for the test
final CountDownLatch latch = new CountDownLatch(1);
listener = new LatchedActionListener<>(listener, latch);
// tag::restore-snapshot-execute-async
client.snapshot().restoreAsync(request, RequestOptions.DEFAULT, listener); // <1>
// end::restore-snapshot-execute-async
assertTrue(latch.await(30L, TimeUnit.SECONDS));
}
}
public void testSnapshotDeleteRepository() throws IOException {
RestHighLevelClient client = highLevelClient();


@@ -21,6 +21,7 @@ package org.elasticsearch.transport.client;
import io.netty.util.ThreadDeathWatcher;
import io.netty.util.concurrent.GlobalEventExecutor;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.SuppressForbidden;
import org.elasticsearch.common.network.NetworkModule;


@ -125,27 +125,18 @@ Closure commonPackageConfig(String type, boolean oss) {
fileMode 0644 fileMode 0644
} }
into('lib') { into('lib') {
with copySpec {
with libFiles(oss)
// we need to specify every intermediate directory so we iterate through the parents; duplicate calls with the same part are fine
eachFile { FileCopyDetails fcp ->
String[] segments = fcp.relativePath.segments
for (int i = segments.length - 2; i > 0 && segments[i] != 'lib'; --i) {
directory('/' + segments[0..i].join('/'), 0755)
}
fcp.mode = 0644
}
}
}
into('modules') {
with copySpec {
with modulesFiles(oss)
// we need to specify every intermediate directory so we iterate through the parents; duplicate calls with the same part are fine
}
// we need to specify every intermediate directory in these paths so the package managers know they are explicitly
// intended to manage them; otherwise they may be left behind on uninstallation. duplicate calls of the same
// directory are fine
eachFile { FileCopyDetails fcp -> eachFile { FileCopyDetails fcp ->
String[] segments = fcp.relativePath.segments String[] segments = fcp.relativePath.segments
for (int i = segments.length - 2; i > 0 && segments[i] != 'modules'; --i) { for (int i = segments.length - 2; i > 2; --i) {
directory('/' + segments[0..i].join('/'), 0755) directory('/' + segments[0..i].join('/'), 0755)
}
if (segments[-2] == 'bin') { if (segments[-2] == 'bin') {
fcp.mode = 0755 fcp.mode = 0755
} else { } else {
@ -154,7 +145,6 @@ Closure commonPackageConfig(String type, boolean oss) {
} }
} }
} }
}
// license files // license files
if (type == 'deb') { if (type == 'deb') {
@@ -333,12 +323,6 @@ Closure commonRpmConfig(boolean oss) {
    // without this the rpm will have parent dirs of any files we copy in, eg /etc/elasticsearch
    addParentDirs false
-
-   // Declare the folders so that the RPM package manager removes
-   // them when upgrading or removing the package
-   directory('/usr/share/elasticsearch/bin', 0755)
-   directory('/usr/share/elasticsearch/lib', 0755)
-   directory('/usr/share/elasticsearch/modules', 0755)
  }
}

View File

@@ -379,9 +379,9 @@ buildRestTests.setups['exams'] = '''
        refresh: true
        body: |
          {"index":{}}
-         {"grade": 100}
+         {"grade": 100, "weight": 2}
          {"index":{}}
-         {"grade": 50}'''
+         {"grade": 50, "weight": 3}'''
buildRestTests.setups['stored_example_script'] = '''
# Simple script to load a field. Not really a good example, but a simple one.

View File

@@ -0,0 +1,66 @@
[[java-rest-high-put-license]]
=== Update License
[[java-rest-high-put-license-execution]]
==== Execution
The license can be added or updated using the `putLicense()` method:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/LicensingDocumentationIT.java[put-license-execute]
--------------------------------------------------
<1> Set the categories of information to retrieve. The default is to
return no information, which is useful for checking if {xpack} is installed
but not much else.
<2> A JSON document containing the license information.
[[java-rest-high-put-license-response]]
==== Response
The returned `PutLicenseResponse` contains the `LicensesStatus`,
`acknowledged` flag and possible acknowledge messages. The acknowledge messages
are present if you previously had a license with more features than the one you
are trying to update and you didn't set the `acknowledge` flag to `true`. In this case
you need to display the messages to the end user and, if they agree, resubmit the
license with the `acknowledge` flag set to `true`. Please note that the request will
still return a 200 return code even if it requires an acknowledgement. So, it is
necessary to check the `acknowledged` flag.
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/LicensingDocumentationIT.java[put-license-response]
--------------------------------------------------
<1> The status of the license
<2> Make sure that the license is valid.
<3> Check the acknowledge flag.
<4> It should be true if the license is acknowledged.
<5> Otherwise we can see the acknowledge messages in `acknowledgeHeader()` and check
component-specific messages in `acknowledgeMessages()`.
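Putting the callouts together, a response-handling sketch might look like the
following. This is an illustration rather than the tagged snippet above: the
method names (`status()`, `isAcknowledged()`, `acknowledgeHeader()`,
`acknowledgeMessages()`) follow the callouts in this document, and the exact
signatures should be checked against the client.
["source","java"]
--------------------------------------------------
boolean handlePutLicenseResponse(PutLicenseResponse response) {
    if (response.status() == LicensesStatus.VALID && response.isAcknowledged()) {
        return true; // the license was installed
    }
    // Show the general header and the component-specific messages to the
    // end user, then resubmit with the acknowledge flag set to true.
    System.err.println(response.acknowledgeHeader());
    response.acknowledgeMessages().forEach((component, messages) -> {
        for (String message : messages) {
            System.err.println(component + ": " + message);
        }
    });
    return false;
}
--------------------------------------------------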
[[java-rest-high-put-license-async]]
==== Asynchronous Execution
This request can be executed asynchronously:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/LicensingDocumentationIT.java[put-license-execute-async]
--------------------------------------------------
<1> The `PutLicenseRequest` to execute and the `ActionListener` to use when
the execution completes
The asynchronous method does not block and returns immediately. Once it is
completed the `ActionListener` is called back using the `onResponse` method
if the execution successfully completed or using the `onFailure` method if
it failed.
A typical listener for `PutLicenseResponse` looks like:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/LicensingDocumentationIT.java[put-license-execute-listener]
--------------------------------------------------
<1> Called when the execution is successfully completed. The response is
provided as an argument
<2> Called in case of failure. The raised exception is provided as an argument

View File

@@ -0,0 +1,144 @@
[[java-rest-high-snapshot-restore-snapshot]]
=== Restore Snapshot API
The Restore Snapshot API allows you to restore a snapshot.
[[java-rest-high-snapshot-restore-snapshot-request]]
==== Restore Snapshot Request
A `RestoreSnapshotRequest`:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request]
--------------------------------------------------
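For example, a minimal request could be built like this (the repository and
snapshot names here are illustrative, not taken from the snippet above):
["source","java"]
--------------------------------------------------
// restore the snapshot "snapshot_1" from the repository "test_repo"
RestoreSnapshotRequest request = new RestoreSnapshotRequest("test_repo", "snapshot_1");
--------------------------------------------------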
==== Limiting Indices to Restore
By default all indices are restored. With the `indices` property you can
provide a list of indices that should be restored:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-indices]
--------------------------------------------------
<1> Request that Elasticsearch only restores "test_index".
==== Renaming Indices
You can rename indices using regular expressions when restoring a snapshot:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-rename]
--------------------------------------------------
<1> A regular expression matching the indices that should be renamed.
<2> A replacement pattern that references the group from the regular
expression as `$1`. "test_index" from the snapshot is restored as
"restored_index" in this example.
==== Index Settings and Options
You can also customize index settings and options when restoring:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-index-settings]
--------------------------------------------------
<1> Use `#indexSettings()` to set any specific index setting for the indices
that are restored.
<2> Use `#ignoreIndexSettings()` to provide index settings that should be
ignored from the original indices.
<3> Set `IndicesOptions.Option.IGNORE_UNAVAILABLE` in `#indicesOptions()` to
have the restore succeed even if indices are missing in the snapshot.
==== Further Arguments
The following arguments can optionally be provided:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-masterTimeout]
--------------------------------------------------
<1> Timeout to connect to the master node as a `TimeValue`
<2> Timeout to connect to the master node as a `String`
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-waitForCompletion]
--------------------------------------------------
<1> Boolean indicating whether to wait until the snapshot has been restored.
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-partial]
--------------------------------------------------
<1> Boolean indicating whether the entire snapshot should succeed although one
or more indices participating in the snapshot don't have all primary
shards available.
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-include-global-state]
--------------------------------------------------
<1> Boolean indicating whether restored templates that don't currently exist
in the cluster are added and existing templates with the same name are
replaced by the restored templates. The restored persistent settings are
added to the existing persistent settings.
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-request-include-aliases]
--------------------------------------------------
<1> Boolean to control whether aliases should be restored. Set to `false` to
prevent aliases from being restored together with associated indices.
[[java-rest-high-snapshot-restore-snapshot-sync]]
==== Synchronous Execution
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-execute]
--------------------------------------------------
[[java-rest-high-snapshot-restore-snapshot-async]]
==== Asynchronous Execution
The asynchronous execution of a restore snapshot request requires both the
`RestoreSnapshotRequest` instance and an `ActionListener` instance to be
passed to the asynchronous method:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-execute-async]
--------------------------------------------------
<1> The `RestoreSnapshotRequest` to execute and the `ActionListener`
to use when the execution completes
The asynchronous method does not block and returns immediately. Once it is
completed the `ActionListener` is called back using the `onResponse` method
if the execution successfully completed or using the `onFailure` method if
it failed.
A typical listener for `RestoreSnapshotResponse` looks like:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-execute-listener]
--------------------------------------------------
<1> Called when the execution is successfully completed. The response is
provided as an argument.
<2> Called in case of a failure. The raised exception is provided as an argument.
[[java-rest-high-snapshot-restore-snapshot-response]]
==== Restore Snapshot Response
The returned `RestoreSnapshotResponse` lets you retrieve information about the
executed operation as follows:
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/SnapshotClientDocumentationIT.java[restore-snapshot-response]
--------------------------------------------------
<1> The `RestoreInfo` contains details about the restored snapshot like the indices or
the number of successfully restored and failed shards.

View File

@@ -186,3 +186,12 @@ The Java High Level REST Client supports the following Scripts APIs:
include::script/get_script.asciidoc[]
include::script/delete_script.asciidoc[]
+
+== Licensing APIs
+
+The Java High Level REST Client supports the following Licensing APIs:
+
+* <<java-rest-high-put-license>>
+
+include::licensing/put-license.asciidoc[]

View File

@@ -17,15 +17,15 @@ The `phonetic` token filter takes the following settings:
`encoder`::

    Which phonetic encoder to use. Accepts `metaphone` (default),
-   `doublemetaphone`, `soundex`, `refinedsoundex`, `caverphone1`,
+   `double_metaphone`, `soundex`, `refined_soundex`, `caverphone1`,
    `caverphone2`, `cologne`, `nysiis`, `koelnerphonetik`, `haasephonetik`,
-   `beidermorse`, `daitch_mokotoff`.
+   `beider_morse`, `daitch_mokotoff`.

`replace`::

    Whether or not the original token should be replaced by the phonetic
    token. Accepts `true` (default) and `false`. Not supported by
-   `beidermorse` encoding.
+   `beider_morse` encoding.

[source,js]
--------------------------------------------------

View File

@@ -13,6 +13,8 @@ bucket aggregations (some bucket aggregations enable you to sort the returned bu
include::metrics/avg-aggregation.asciidoc[]
+
+include::metrics/weighted-avg-aggregation.asciidoc[]
include::metrics/cardinality-aggregation.asciidoc[]
include::metrics/extendedstats-aggregation.asciidoc[]

View File

@@ -0,0 +1,202 @@
[[search-aggregations-metrics-weight-avg-aggregation]]
=== Weighted Avg Aggregation
A `single-value` metrics aggregation that computes the weighted average of numeric values that are extracted from the aggregated documents.
These values can be extracted from specific numeric fields in the documents, or provided by a script.
When calculating a regular average, each datapoint has an equal "weight": it contributes equally to the final value. Weighted averages,
on the other hand, weight each datapoint differently. The amount that each datapoint contributes to the final value is extracted from the
document, or provided by a script.
As a formula, a weighted average is `∑(value * weight) / ∑(weight)`.
A regular average can be thought of as a weighted average where every value has an implicit weight of `1`.
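As a quick, plain-Java illustration of the formula (not part of the
Elasticsearch API; the numbers mirror the `exams` example further below):
["source","java"]
--------------------------------------------------
double[] values  = {100, 50}; // grades
double[] weights = {2, 3};    // per-document weights

double weightedSum = 0;
double weightSum = 0;
for (int i = 0; i < values.length; i++) {
    weightedSum += values[i] * weights[i];
    weightSum += weights[i];
}
// ((100 * 2) + (50 * 3)) / (2 + 3) == 70.0
double weightedAvg = weightedSum / weightSum;
--------------------------------------------------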
.`weighted_avg` Parameters
|===
|Parameter Name |Description |Required |Default Value
|`value` | The configuration for the field or script that provides the values |Required |
|`weight` | The configuration for the field or script that provides the weights |Required |
|`format` | The numeric response formatter |Optional |
|`value_type` | A hint about the values for pure scripts or unmapped fields |Optional |
|===
The `value` and `weight` objects have per-field specific configuration:
.`value` Parameters
|===
|Parameter Name |Description |Required |Default Value
|`field` | The field that values should be extracted from |Required |
|`missing` | A value to use if the field is missing entirely |Optional |
|`script` | A script which provides the values for the document. This is mutually exclusive with `field` |Optional |
|===
.`weight` Parameters
|===
|Parameter Name |Description |Required |Default Value
|`field` | The field that weights should be extracted from |Required |
|`missing` | A weight to use if the field is missing entirely |Optional |
|`script` | A script which provides the weights for the document. This is mutually exclusive with `field` |Optional |
|===
==== Examples
If our documents have a `"grade"` field that holds a 0-100 numeric score, and a `"weight"` field which holds an arbitrary numeric weight,
we can calculate the weighted average using:
[source,js]
--------------------------------------------------
POST /exams/_search
{
"size": 0,
"aggs" : {
"weighted_grade": {
"weighted_avg": {
"value": {
"field": "grade"
},
"weight": {
"field": "weight"
}
}
}
}
}
--------------------------------------------------
// CONSOLE
// TEST[setup:exams]
Which yields a response like:
[source,js]
--------------------------------------------------
{
...
"aggregations": {
"weighted_grade": {
"value": 70.0
}
}
}
--------------------------------------------------
// TESTRESPONSE[s/\.\.\./"took": $body.took,"timed_out": false,"_shards": $body._shards,"hits": $body.hits,/]
While multiple values-per-field are allowed, only one weight is allowed. If the aggregation encounters
a document that has more than one weight (e.g. the weight field is a multi-valued field) it will throw an exception.
If you have this situation, you will need to specify a `script` for the weight field, and use the script
to combine the multiple values into a single value to be used.
This single weight will be applied independently to each value extracted from the `value` field.
This example shows how a single document with multiple values will be averaged with a single weight:
[source,js]
--------------------------------------------------
POST /exams/_doc?refresh
{
"grade": [1, 2, 3],
"weight": 2
}
POST /exams/_search
{
"size": 0,
"aggs" : {
"weighted_grade": {
"weighted_avg": {
"value": {
"field": "grade"
},
"weight": {
"field": "weight"
}
}
}
}
}
--------------------------------------------------
// CONSOLE
// TEST
The three values (`1`, `2`, and `3`) will be included as independent values, all with the weight of `2`:
[source,js]
--------------------------------------------------
{
...
"aggregations": {
"weighted_grade": {
"value": 2.0
}
}
}
--------------------------------------------------
// TESTRESPONSE[s/\.\.\./"took": $body.took,"timed_out": false,"_shards": $body._shards,"hits": $body.hits,/]
The aggregation returns `2.0` as the result, which matches what we would expect when calculating by hand:
`((1*2) + (2*2) + (3*2)) / (2+2+2) == 2`
==== Script
Both the value and the weight can be derived from a script, instead of a field. As a simple example, the following
will add one to the grade and weight in the document using a script:
[source,js]
--------------------------------------------------
POST /exams/_search
{
"size": 0,
"aggs" : {
"weighted_grade": {
"weighted_avg": {
"value": {
"script": "doc.grade.value + 1"
},
"weight": {
"script": "doc.weight.value + 1"
}
}
}
}
}
--------------------------------------------------
// CONSOLE
// TEST[setup:exams]
==== Missing values
The `missing` parameter defines how documents that are missing a value should be treated.
The default behavior is different for `value` and `weight`:
By default, if the `value` field is missing the document is ignored and the aggregation moves on to the next document.
If the `weight` field is missing, it is assumed to have a weight of `1` (like a normal average).
Both of these defaults can be overridden with the `missing` parameter:
[source,js]
--------------------------------------------------
POST /exams/_search
{
"size": 0,
"aggs" : {
"weighted_grade": {
"weighted_avg": {
"value": {
"field": "grade",
"missing": 2
},
"weight": {
"field": "weight",
"missing": 3
}
}
}
}
}
--------------------------------------------------
// CONSOLE
// TEST[setup:exams]

View File

@@ -259,6 +259,56 @@ in the query. Defaults to 10.
|`normalize` | If set to `true`, this metric will calculate the https://en.wikipedia.org/wiki/Discounted_cumulative_gain#Normalized_DCG[Normalized DCG].
|=======================================================================
[float]
==== Expected Reciprocal Rank (ERR)
Expected Reciprocal Rank (ERR) is an extension of the classical reciprocal rank for the graded relevance case
(Olivier Chapelle, Donald Metzler, Ya Zhang, and Pierre Grinspan. 2009. http://olivier.chapelle.cc/pub/err.pdf[Expected reciprocal rank for graded relevance].)
It is based on the assumption of a cascade model of search, in which a user scans through ranked search
results in order and stops at the first document that satisfies the information need. For this reason, it
is a good metric for question answering and navigation queries, but less so for survey oriented information
needs where the user is interested in finding many relevant documents in the top k results.
The metric models the expectation of the reciprocal of the position at which a user stops reading through
the result list. This means that a relevant document in a top ranking position will contribute much to the
overall score. However, the same document will contribute much less to the score if it appears in a lower rank,
even more so if there are some relevant (but maybe less relevant) documents preceding it.
In this way, the ERR metric discounts documents which are shown after very relevant documents. This introduces
a notion of dependency in the ordering of relevant documents that e.g. Precision or DCG don't account for.
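For reference, the paper linked above defines ERR as follows (a sketch added
here for orientation, not quoted from this documentation page; `g_i` is the
relevance grade of the document at rank `i` and `g_max` is the
`maximum_relevance` parameter):

\[
\mathrm{ERR} = \sum_{r=1}^{n} \frac{1}{r}\, R_r \prod_{i=1}^{r-1} \left(1 - R_i\right),
\qquad R_i = \frac{2^{g_i} - 1}{2^{g_{\max}}}
\]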
[source,js]
--------------------------------
GET /twitter/_rank_eval
{
"requests": [
{
"id": "JFK query",
"request": { "query": { "match_all": {}}},
"ratings": []
}],
"metric": {
"expected_reciprocal_rank": {
"maximum_relevance" : 3,
"k" : 20
}
}
}
--------------------------------
// CONSOLE
// TEST[setup:twitter]
The `expected_reciprocal_rank` metric takes the following parameters:
[cols="<,<",options="header",]
|=======================================================================
|Parameter |Description
| `maximum_relevance` | Mandatory parameter. The highest relevance grade used in the user-supplied
relevance judgments.
|`k` | sets the maximum number of documents retrieved per query. This value will act in place of the usual `size` parameter
in the query. Defaults to 10.
|=======================================================================
[float]
=== Response format

@@ -270,10 +320,10 @@ that shows potential errors of individual queries. The response has the followin
--------------------------------
{
    "rank_eval": {
-       "quality_level": 0.4, <1>
+       "metric_score": 0.4, <1>
        "details": {
            "my_query_id1": { <2>
-               "quality_level": 0.6, <3>
+               "metric_score": 0.6, <3>
                "unrated_docs": [ <4>
                {
                    "_index": "my_index",
@@ -308,7 +358,7 @@ that shows potential errors of individual queries. The response has the followin
<1> the overall evaluation quality calculated by the defined metric
<2> the `details` section contains one entry for every query in the original `requests` section, keyed by the search request id
-<3> the `quality_level` in the `details` section shows the contribution of this query to the global quality score
+<3> the `metric_score` in the `details` section shows the contribution of this query to the global quality metric score
<4> the `unrated_docs` section contains an `_index` and `_id` entry for each document in the search result for this
query that didn't have a ratings value. This can be used to ask the user to supply ratings for these documents
<5> the `hits` section shows a grouping of the search results with their supplied rating

View File

@@ -137,7 +137,6 @@ public class ChannelFactoryTests extends ESTestCase {
            super(rawChannelFactory);
        }

-       @SuppressWarnings("unchecked")
        @Override
        public NioSocketChannel createChannel(NioSelector selector, SocketChannel channel) throws IOException {
            NioSocketChannel nioSocketChannel = new NioSocketChannel(channel);

View File

@@ -120,7 +120,6 @@ public class EventHandlerTests extends ESTestCase {
        verify(channelFactory, times(2)).acceptNioChannel(same(serverContext), same(selectorSupplier));
    }

-   @SuppressWarnings("unchecked")
    public void testHandleAcceptCallsServerAcceptCallback() throws IOException {
        NioSocketChannel childChannel = new NioSocketChannel(mock(SocketChannel.class));
        SocketChannelContext childContext = mock(SocketChannelContext.class);

View File

@@ -275,7 +275,6 @@ public class SocketChannelContextTests extends ESTestCase {
        }
    }

-   @SuppressWarnings("unchecked")
    public void testCloseClosesChannelBuffer() throws IOException {
        try (SocketChannel realChannel = SocketChannel.open()) {
            when(channel.getRawChannel()).thenReturn(realChannel);

View File

@@ -26,7 +26,7 @@ import org.elasticsearch.search.MultiValueMode;
import org.elasticsearch.search.aggregations.AggregationBuilder;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.AggregatorFactory;
-import org.elasticsearch.search.aggregations.support.MultiValuesSourceAggregationBuilder;
+import org.elasticsearch.search.aggregations.support.ArrayValuesSourceAggregationBuilder;
import org.elasticsearch.search.aggregations.support.ValueType;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSource.Numeric;
@@ -38,7 +38,7 @@ import java.io.IOException;
import java.util.Map;

public class MatrixStatsAggregationBuilder
-   extends MultiValuesSourceAggregationBuilder.LeafOnly<ValuesSource.Numeric, MatrixStatsAggregationBuilder> {
+   extends ArrayValuesSourceAggregationBuilder.LeafOnly<ValuesSource.Numeric, MatrixStatsAggregationBuilder> {
    public static final String NAME = "matrix_stats";

    private MultiValueMode multiValueMode = MultiValueMode.AVG;

View File

@@ -30,7 +30,7 @@ import org.elasticsearch.search.aggregations.LeafBucketCollector;
import org.elasticsearch.search.aggregations.LeafBucketCollectorBase;
import org.elasticsearch.search.aggregations.metrics.MetricsAggregator;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
-import org.elasticsearch.search.aggregations.support.MultiValuesSource.NumericMultiValuesSource;
+import org.elasticsearch.search.aggregations.support.ArrayValuesSource.NumericArrayValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.internal.SearchContext;
@@ -43,7 +43,7 @@ import java.util.Map;
 **/
final class MatrixStatsAggregator extends MetricsAggregator {
    /** Multiple ValuesSource with field names */
-   private final NumericMultiValuesSource valuesSources;
+   private final NumericArrayValuesSource valuesSources;

    /** array of descriptive stats, per shard, needed to compute the correlation */
    ObjectArray<RunningStats> stats;
@@ -53,7 +53,7 @@ final class MatrixStatsAggregator extends MetricsAggregator {
                          Map<String,Object> metaData) throws IOException {
        super(name, context, parent, pipelineAggregators, metaData);
        if (valuesSources != null && !valuesSources.isEmpty()) {
-           this.valuesSources = new NumericMultiValuesSource(valuesSources, multiValueMode);
+           this.valuesSources = new NumericArrayValuesSource(valuesSources, multiValueMode);
            stats = context.bigArrays().newObjectArray(1);
        } else {
            this.valuesSources = null;

View File

@@ -23,7 +23,7 @@ import org.elasticsearch.search.aggregations.Aggregator;
import org.elasticsearch.search.aggregations.AggregatorFactories;
import org.elasticsearch.search.aggregations.AggregatorFactory;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
-import org.elasticsearch.search.aggregations.support.MultiValuesSourceAggregatorFactory;
+import org.elasticsearch.search.aggregations.support.ArrayValuesSourceAggregatorFactory;
import org.elasticsearch.search.aggregations.support.ValuesSource;
import org.elasticsearch.search.aggregations.support.ValuesSourceConfig;
import org.elasticsearch.search.internal.SearchContext;
@@ -33,7 +33,7 @@ import java.util.List;
import java.util.Map;

final class MatrixStatsAggregatorFactory
-   extends MultiValuesSourceAggregatorFactory<ValuesSource.Numeric, MatrixStatsAggregatorFactory> {
+   extends ArrayValuesSourceAggregatorFactory<ValuesSource.Numeric, MatrixStatsAggregatorFactory> {

    private final MultiValueMode multiValueMode;

View File

@@ -21,14 +21,14 @@ package org.elasticsearch.search.aggregations.matrix.stats;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.search.MultiValueMode;
-import org.elasticsearch.search.aggregations.support.MultiValuesSourceParser.NumericValuesSourceParser;
+import org.elasticsearch.search.aggregations.support.ArrayValuesSourceParser.NumericValuesSourceParser;
import org.elasticsearch.search.aggregations.support.ValueType;
import org.elasticsearch.search.aggregations.support.ValuesSourceType;

import java.io.IOException;
import java.util.Map;

-import static org.elasticsearch.search.aggregations.support.MultiValuesSourceAggregationBuilder.MULTIVALUE_MODE_FIELD;
+import static org.elasticsearch.search.aggregations.support.ArrayValuesSourceAggregationBuilder.MULTIVALUE_MODE_FIELD;

public class MatrixStatsParser extends NumericValuesSourceParser {

View File

@@ -28,13 +28,13 @@ import java.util.Map;
/**
 * Class to encapsulate a set of ValuesSource objects labeled by field name
 */
-public abstract class MultiValuesSource <VS extends ValuesSource> {
+public abstract class ArrayValuesSource<VS extends ValuesSource> {
    protected MultiValueMode multiValueMode;
    protected String[] names;
    protected VS[] values;

-   public static class NumericMultiValuesSource extends MultiValuesSource<ValuesSource.Numeric> {
-       public NumericMultiValuesSource(Map<String, ValuesSource.Numeric> valuesSources, MultiValueMode multiValueMode) {
+   public static class NumericArrayValuesSource extends ArrayValuesSource<ValuesSource.Numeric> {
+       public NumericArrayValuesSource(Map<String, ValuesSource.Numeric> valuesSources, MultiValueMode multiValueMode) {
            super(valuesSources, multiValueMode);
            if (valuesSources != null) {
                this.values = valuesSources.values().toArray(new ValuesSource.Numeric[0]);
@@ -51,8 +51,8 @@ public abstract class MultiValuesSource <VS extends ValuesSource> {
        }
    }

-   public static class BytesMultiValuesSource extends MultiValuesSource<ValuesSource.Bytes> {
-       public BytesMultiValuesSource(Map<String, ValuesSource.Bytes> valuesSources, MultiValueMode multiValueMode) {
+   public static class BytesArrayValuesSource extends ArrayValuesSource<ValuesSource.Bytes> {
+       public BytesArrayValuesSource(Map<String, ValuesSource.Bytes> valuesSources, MultiValueMode multiValueMode) {
            super(valuesSources, multiValueMode);
            this.values = valuesSources.values().toArray(new ValuesSource.Bytes[0]);
        }
@@ -62,14 +62,14 @@ public abstract class MultiValuesSource <VS extends ValuesSource> {
        }
    }

-   public static class GeoPointValuesSource extends MultiValuesSource<ValuesSource.GeoPoint> {
+   public static class GeoPointValuesSource extends ArrayValuesSource<ValuesSource.GeoPoint> {
        public GeoPointValuesSource(Map<String, ValuesSource.GeoPoint> valuesSources, MultiValueMode multiValueMode) {
            super(valuesSources, multiValueMode);
            this.values = valuesSources.values().toArray(new ValuesSource.GeoPoint[0]);
        }
    }

-   private MultiValuesSource(Map<String, ?> valuesSources, MultiValueMode multiValueMode) {
+   private ArrayValuesSource(Map<String, ?> valuesSources, MultiValueMode multiValueMode) {
        if (valuesSources != null) {
            this.names = valuesSources.keySet().toArray(new String[0]);
        }

View File

@@ -44,13 +44,13 @@ import java.util.List;
import java.util.Map;
import java.util.Objects;

-public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSource, AB extends MultiValuesSourceAggregationBuilder<VS, AB>>
+public abstract class ArrayValuesSourceAggregationBuilder<VS extends ValuesSource, AB extends ArrayValuesSourceAggregationBuilder<VS, AB>>
    extends AbstractAggregationBuilder<AB> {

    public static final ParseField MULTIVALUE_MODE_FIELD = new ParseField("mode");

-   public abstract static class LeafOnly<VS extends ValuesSource, AB extends MultiValuesSourceAggregationBuilder<VS, AB>>
-       extends MultiValuesSourceAggregationBuilder<VS, AB> {
+   public abstract static class LeafOnly<VS extends ValuesSource, AB extends ArrayValuesSourceAggregationBuilder<VS, AB>>
+       extends ArrayValuesSourceAggregationBuilder<VS, AB> {

        protected LeafOnly(String name, ValuesSourceType valuesSourceType, ValueType targetValueType) {
            super(name, valuesSourceType, targetValueType);
@@ -94,7 +94,7 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
    private Object missing = null;
    private Map<String, Object> missingMap = Collections.emptyMap();

-   protected MultiValuesSourceAggregationBuilder(String name, ValuesSourceType valuesSourceType, ValueType targetValueType) {
+   protected ArrayValuesSourceAggregationBuilder(String name, ValuesSourceType valuesSourceType, ValueType targetValueType) {
        super(name);
        if (valuesSourceType == null) {
            throw new IllegalArgumentException("[valuesSourceType] must not be null: [" + name + "]");
@@ -103,7 +103,7 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
        this.targetValueType = targetValueType;
    }

-   protected MultiValuesSourceAggregationBuilder(MultiValuesSourceAggregationBuilder<VS, AB> clone,
+   protected ArrayValuesSourceAggregationBuilder(ArrayValuesSourceAggregationBuilder<VS, AB> clone,
                                                  Builder factoriesBuilder, Map<String, Object> metaData) {
        super(clone, factoriesBuilder, metaData);
        this.valuesSourceType = clone.valuesSourceType;
@@ -115,7 +115,7 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
        this.missing = clone.missing;
    }

-   protected MultiValuesSourceAggregationBuilder(StreamInput in, ValuesSourceType valuesSourceType, ValueType targetValueType)
+   protected ArrayValuesSourceAggregationBuilder(StreamInput in, ValuesSourceType valuesSourceType, ValueType targetValueType)
        throws IOException {
        super(in);
        assert false == serializeTargetValueType() : "Wrong read constructor called for subclass that provides its targetValueType";
@@ -124,7 +124,7 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
        read(in);
    }

-   protected MultiValuesSourceAggregationBuilder(StreamInput in, ValuesSourceType valuesSourceType) throws IOException {
+   protected ArrayValuesSourceAggregationBuilder(StreamInput in, ValuesSourceType valuesSourceType) throws IOException {
        super(in);
        assert serializeTargetValueType() : "Wrong read constructor called for subclass that serializes its targetValueType";
        this.valuesSourceType = valuesSourceType;
@@ -239,10 +239,10 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
    }

    @Override
-   protected final MultiValuesSourceAggregatorFactory<VS, ?> doBuild(SearchContext context, AggregatorFactory<?> parent,
+   protected final ArrayValuesSourceAggregatorFactory<VS, ?> doBuild(SearchContext context, AggregatorFactory<?> parent,
            AggregatorFactories.Builder subFactoriesBuilder) throws IOException {
        Map<String, ValuesSourceConfig<VS>> configs = resolveConfig(context);
-       MultiValuesSourceAggregatorFactory<VS, ?> factory = innerBuild(context, configs, parent, subFactoriesBuilder);
+       ArrayValuesSourceAggregatorFactory<VS, ?> factory = innerBuild(context, configs, parent, subFactoriesBuilder);
        return factory;
    }
@@ -255,8 +255,9 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
        return configs;
    }

-   protected abstract MultiValuesSourceAggregatorFactory<VS, ?> innerBuild(SearchContext context,
-       Map<String, ValuesSourceConfig<VS>> configs, AggregatorFactory<?> parent,
-       AggregatorFactories.Builder subFactoriesBuilder) throws IOException;
+   protected abstract ArrayValuesSourceAggregatorFactory<VS, ?> innerBuild(SearchContext context,
+                                                                           Map<String, ValuesSourceConfig<VS>> configs,
+                                                                           AggregatorFactory<?> parent,
+                                                                           AggregatorFactories.Builder subFactoriesBuilder) throws IOException;

    public ValuesSourceConfig<VS> config(SearchContext context, String field, Script script) {
@@ -362,7 +363,7 @@ public abstract class MultiValuesSourceAggregationBuilder<VS extends ValuesSourc
    @Override
    protected final boolean doEquals(Object obj) {
-       MultiValuesSourceAggregationBuilder<?, ?> other = (MultiValuesSourceAggregationBuilder<?, ?>) obj;
+       ArrayValuesSourceAggregationBuilder<?, ?> other = (ArrayValuesSourceAggregationBuilder<?, ?>) obj;
        if (!Objects.equals(fields, other.fields))
            return false;
        if (!Objects.equals(format, other.format))

View File

@@ -30,13 +30,14 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;

-public abstract class MultiValuesSourceAggregatorFactory<VS extends ValuesSource, AF extends MultiValuesSourceAggregatorFactory<VS, AF>>
+public abstract class ArrayValuesSourceAggregatorFactory<VS extends ValuesSource, AF extends ArrayValuesSourceAggregatorFactory<VS, AF>>
    extends AggregatorFactory<AF> {

    protected Map<String, ValuesSourceConfig<VS>> configs;

-   public MultiValuesSourceAggregatorFactory(String name, Map<String, ValuesSourceConfig<VS>> configs,
-       SearchContext context, AggregatorFactory<?> parent, AggregatorFactories.Builder subFactoriesBuilder,
-       Map<String, Object> metaData) throws IOException {
+   public ArrayValuesSourceAggregatorFactory(String name, Map<String, ValuesSourceConfig<VS>> configs,
+                                             SearchContext context, AggregatorFactory<?> parent,
+                                             AggregatorFactories.Builder subFactoriesBuilder,
+                                             Map<String, Object> metaData) throws IOException {
        super(name, context, parent, subFactoriesBuilder, metaData);
        this.configs = configs;
@@ -63,6 +64,7 @@ public abstract class MultiValuesSourceAggregatorFactory<VS extends ValuesSource
            Map<String, Object> metaData) throws IOException;

    protected abstract Aggregator doCreateInternal(Map<String, VS> valuesSources, Aggregator parent, boolean collectsFromSingleBucket,
-                                                  List<PipelineAggregator> pipelineAggregators, Map<String, Object> metaData) throws IOException;
+                                                  List<PipelineAggregator> pipelineAggregators,
+                                                  Map<String, Object> metaData) throws IOException;
}

View File

@@ -33,30 +33,30 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;

-public abstract class MultiValuesSourceParser<VS extends ValuesSource> implements Aggregator.Parser {
+public abstract class ArrayValuesSourceParser<VS extends ValuesSource> implements Aggregator.Parser {

-   public abstract static class AnyValuesSourceParser extends MultiValuesSourceParser<ValuesSource> {
+   public abstract static class AnyValuesSourceParser extends ArrayValuesSourceParser<ValuesSource> {

        protected AnyValuesSourceParser(boolean formattable) {
            super(formattable, ValuesSourceType.ANY, null);
        }
    }

-   public abstract static class NumericValuesSourceParser extends MultiValuesSourceParser<ValuesSource.Numeric> {
+   public abstract static class NumericValuesSourceParser extends ArrayValuesSourceParser<ValuesSource.Numeric> {

        protected NumericValuesSourceParser(boolean formattable) {
            super(formattable, ValuesSourceType.NUMERIC, ValueType.NUMERIC);
        }
    }

-   public abstract static class BytesValuesSourceParser extends MultiValuesSourceParser<ValuesSource.Bytes> {
+   public abstract static class BytesValuesSourceParser extends ArrayValuesSourceParser<ValuesSource.Bytes> {

        protected BytesValuesSourceParser(boolean formattable) {
            super(formattable, ValuesSourceType.BYTES, ValueType.STRING);
        }
    }

-   public abstract static class GeoPointValuesSourceParser extends MultiValuesSourceParser<ValuesSource.GeoPoint> {
+   public abstract static class GeoPointValuesSourceParser extends ArrayValuesSourceParser<ValuesSource.GeoPoint> {

        protected GeoPointValuesSourceParser(boolean formattable) {
            super(formattable, ValuesSourceType.GEOPOINT, ValueType.GEOPOINT);
@@ -67,14 +67,14 @@ public abstract class MultiValuesSourceParser<VS extends ValuesSource> implement
    private ValuesSourceType valuesSourceType = null;
    private ValueType targetValueType = null;

-   private MultiValuesSourceParser(boolean formattable, ValuesSourceType valuesSourceType, ValueType targetValueType) {
+   private ArrayValuesSourceParser(boolean formattable, ValuesSourceType valuesSourceType, ValueType targetValueType) {
        this.valuesSourceType = valuesSourceType;
        this.targetValueType = targetValueType;
        this.formattable = formattable;
    }

    @Override
-   public final MultiValuesSourceAggregationBuilder<VS, ?> parse(String aggregationName, XContentParser parser)
+   public final ArrayValuesSourceAggregationBuilder<VS, ?> parse(String aggregationName, XContentParser parser)
        throws IOException {

        List<String> fields = null;
@@ -140,7 +140,7 @@ public abstract class MultiValuesSourceParser<VS extends ValuesSource> implement
            }
        }

-       MultiValuesSourceAggregationBuilder<VS, ?> factory = createFactory(aggregationName, this.valuesSourceType, this.targetValueType,
+       ArrayValuesSourceAggregationBuilder<VS, ?> factory = createFactory(aggregationName, this.valuesSourceType, this.targetValueType,
            otherOptions);
        if (fields != null) {
            factory.fields(fields);
@@ -182,7 +182,7 @@ public abstract class MultiValuesSourceParser<VS extends ValuesSource> implement
    /**
     * Creates a {@link ValuesSourceAggregationBuilder} from the information
     * gathered by the subclass. Options parsed in
-    * {@link MultiValuesSourceParser} itself will be added to the factory
+    * {@link ArrayValuesSourceParser} itself will be added to the factory
     * after it has been returned by this method.
     *
     * @param aggregationName
@@ -197,11 +197,13 @@ public abstract class MultiValuesSourceParser<VS extends ValuesSource> implement
     * method
     * @return the created factory
     */
-   protected abstract MultiValuesSourceAggregationBuilder<VS, ?> createFactory(String aggregationName, ValuesSourceType valuesSourceType,
-       ValueType targetValueType, Map<ParseField, Object> otherOptions);
+   protected abstract ArrayValuesSourceAggregationBuilder<VS, ?> createFactory(String aggregationName,
+                                                                               ValuesSourceType valuesSourceType,
+                                                                               ValueType targetValueType,
+                                                                               Map<ParseField, Object> otherOptions);

    /**
-    * Allows subclasses of {@link MultiValuesSourceParser} to parse extra
+    * Allows subclasses of {@link ArrayValuesSourceParser} to parse extra
     * parameters and store them in a {@link Map} which will later be passed to
     * {@link #createFactory(String, ValuesSourceType, ValueType, Map)}.
     *

View File

@@ -43,7 +43,6 @@ class MultiPassStats {
        this.fieldBKey = fieldBName;
    }

-   @SuppressWarnings("unchecked")
    void computeStats(final List<Double> fieldA, final List<Double> fieldB) {
        // set count
        count = fieldA.size();

View File

@@ -42,7 +42,11 @@ public final class ConvertProcessor extends AbstractProcessor {
        @Override
        public Object convert(Object value) {
            try {
-               return Integer.parseInt(value.toString());
+               String strValue = value.toString();
+               if (strValue.startsWith("0x") || strValue.startsWith("-0x")) {
+                   return Integer.decode(strValue);
+               }
+               return Integer.parseInt(strValue);
            } catch(NumberFormatException e) {
                throw new IllegalArgumentException("unable to convert [" + value + "] to integer", e);
            }
@@ -52,7 +56,11 @@ public final class ConvertProcessor extends AbstractProcessor {
        @Override
        public Object convert(Object value) {
            try {
-               return Long.parseLong(value.toString());
+               String strValue = value.toString();
+               if (strValue.startsWith("0x") || strValue.startsWith("-0x")) {
+                   return Long.decode(strValue);
+               }
+               return Long.parseLong(strValue);
            } catch(NumberFormatException e) {
                throw new IllegalArgumentException("unable to convert [" + value + "] to long", e);
            }
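As a standalone sketch of the rule this diff introduces (plain JDK behavior;
the class name is hypothetical, not part of the change):

// Hex strings prefixed with 0x or -0x go through Integer.decode,
// everything else through Integer.parseInt, so "010" stays decimal 10
// rather than being read as octal 8.
public class ConvertSketch {
    static int convert(String strValue) {
        if (strValue.startsWith("0x") || strValue.startsWith("-0x")) {
            return Integer.decode(strValue); // "0x400" -> 1024, "-0x10" -> -16
        }
        return Integer.parseInt(strValue);   // "010" -> 10
    }

    public static void main(String[] args) {
        System.out.println(convert("0x400")); // 1024
        System.out.println(convert("-0x10")); // -16
        System.out.println(convert("010"));   // 10
    }
}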

View File

@@ -49,6 +49,33 @@ public class ConvertProcessorTests extends ESTestCase {
        assertThat(ingestDocument.getFieldValue(fieldName, Integer.class), equalTo(randomInt));
    }
public void testConvertIntHex() throws Exception {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
int randomInt = randomInt();
String intString = randomInt < 0 ? "-0x" + Integer.toHexString(-randomInt) : "0x" + Integer.toHexString(randomInt);
String fieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, intString);
Processor processor = new ConvertProcessor(randomAlphaOfLength(10), fieldName, fieldName, Type.INTEGER, false);
processor.execute(ingestDocument);
assertThat(ingestDocument.getFieldValue(fieldName, Integer.class), equalTo(randomInt));
}
public void testConvertIntLeadingZero() throws Exception {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
String fieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, "010");
Processor processor = new ConvertProcessor(randomAlphaOfLength(10), fieldName, fieldName, Type.INTEGER, false);
processor.execute(ingestDocument);
assertThat(ingestDocument.getFieldValue(fieldName, Integer.class), equalTo(10));
}
public void testConvertIntHexError() {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
String value = "0x" + randomAlphaOfLengthBetween(1, 10);
String fieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, value);
Processor processor = new ConvertProcessor(randomAlphaOfLength(10), fieldName, fieldName, Type.INTEGER, false);
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> processor.execute(ingestDocument));
assertThat(e.getMessage(), equalTo("unable to convert [" + value + "] to integer"));
}
    public void testConvertIntList() throws Exception {
        IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
        int numItems = randomIntBetween(1, 10);
@@ -92,6 +119,33 @@ public class ConvertProcessorTests extends ESTestCase {
        assertThat(ingestDocument.getFieldValue(fieldName, Long.class), equalTo(randomLong));
    }
public void testConvertLongHex() throws Exception {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
long randomLong = randomLong();
String longString = randomLong < 0 ? "-0x" + Long.toHexString(-randomLong) : "0x" + Long.toHexString(randomLong);
String fieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, longString);
Processor processor = new ConvertProcessor(randomAlphaOfLength(10), fieldName, fieldName, Type.LONG, false);
processor.execute(ingestDocument);
assertThat(ingestDocument.getFieldValue(fieldName, Long.class), equalTo(randomLong));
}
public void testConvertLongLeadingZero() throws Exception {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
String fieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, "010");
Processor processor = new ConvertProcessor(randomAlphaOfLength(10), fieldName, fieldName, Type.LONG, false);
processor.execute(ingestDocument);
assertThat(ingestDocument.getFieldValue(fieldName, Long.class), equalTo(10L));
}
public void testConvertLongHexError() {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
String value = "0x" + randomAlphaOfLengthBetween(1, 10);
String fieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, value);
Processor processor = new ConvertProcessor(randomAlphaOfLength(10), fieldName, fieldName, Type.LONG, false);
IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> processor.execute(ingestDocument));
assertThat(e.getMessage(), equalTo("unable to convert [" + value + "] to long"));
}
    public void testConvertLongList() throws Exception {
        IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random());
        int numItems = randomIntBetween(1, 10);

View File

@@ -146,7 +146,6 @@ public class JsonProcessorTests extends ESTestCase {
        assertThat(exception.getMessage(), equalTo("field [field] not present as part of path [field]"));
    }

-   @SuppressWarnings("unchecked")
    public void testAddToRoot() throws Exception {
        String processorTag = randomAlphaOfLength(3);
        String randomTargetField = randomAlphaOfLength(2);

View File

@ -30,30 +30,30 @@ import java.util.Map;
 public final class PainlessLookup {
 
     public Collection<PainlessClass> getStructs() {
-        return javaClassesToPainlessStructs.values();
+        return classesToPainlessClasses.values();
     }
 
-    private final Map<String, Class<?>> painlessTypesToJavaClasses;
-    private final Map<Class<?>, PainlessClass> javaClassesToPainlessStructs;
+    private final Map<String, Class<?>> canonicalClassNamesToClasses;
+    private final Map<Class<?>, PainlessClass> classesToPainlessClasses;
 
-    PainlessLookup(Map<String, Class<?>> painlessTypesToJavaClasses, Map<Class<?>, PainlessClass> javaClassesToPainlessStructs) {
-        this.painlessTypesToJavaClasses = Collections.unmodifiableMap(painlessTypesToJavaClasses);
-        this.javaClassesToPainlessStructs = Collections.unmodifiableMap(javaClassesToPainlessStructs);
+    PainlessLookup(Map<String, Class<?>> canonicalClassNamesToClasses, Map<Class<?>, PainlessClass> classesToPainlessClasses) {
+        this.canonicalClassNamesToClasses = Collections.unmodifiableMap(canonicalClassNamesToClasses);
+        this.classesToPainlessClasses = Collections.unmodifiableMap(classesToPainlessClasses);
     }
 
     public Class<?> getClassFromBinaryName(String painlessType) {
-        return painlessTypesToJavaClasses.get(painlessType.replace('$', '.'));
+        return canonicalClassNamesToClasses.get(painlessType.replace('$', '.'));
     }
 
     public boolean isSimplePainlessType(String painlessType) {
-        return painlessTypesToJavaClasses.containsKey(painlessType);
+        return canonicalClassNamesToClasses.containsKey(painlessType);
     }
 
     public PainlessClass getPainlessStructFromJavaClass(Class<?> clazz) {
-        return javaClassesToPainlessStructs.get(clazz);
+        return classesToPainlessClasses.get(clazz);
     }
 
     public Class<?> getJavaClassFromPainlessType(String painlessType) {
-        return PainlessLookupUtility.canonicalTypeNameToType(painlessType, painlessTypesToJavaClasses);
+        return PainlessLookupUtility.canonicalTypeNameToType(painlessType, canonicalClassNamesToClasses);
     }
 }
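The renames above align the lookup with Java's own terminology: the map keys are canonical class names, not "painless types". For intuition, a small stand-alone illustration (plain Java, not Painless internals) of the binary-vs-canonical distinction that getClassFromBinaryName bridges by replacing '$':

    // A nested type: the binary name uses '$', the canonical name uses '.'
    String binary = java.util.Map.Entry.class.getName();             // "java.util.Map$Entry"
    String canonical = java.util.Map.Entry.class.getCanonicalName(); // "java.util.Map.Entry"
    assert canonical.equals(binary.replace('$', '.'));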

View File

@ -36,7 +36,9 @@ import java.util.Objects;
  * classes to be represented. The set of available classes will always be a subset of the available types.
  *
  * Under ambiguous circumstances most variable names are prefixed with asm, java, or painless. If the variable value is the same for asm,
- * java, and painless, no prefix is used.
+ * java, and painless, no prefix is used. Target is used as a prefix to represent if a constructor, method, or field is being
+ * called/accessed on that specific class. Parameter is often a postfix used to represent if a type is used as a parameter to a
+ * constructor, method, or field.
  *
  * <ul>
  *     <li> - javaClassName (String) - the fully qualified java class name where '$' tokens represent inner classes excluding
@ -150,8 +152,8 @@ public final class PainlessLookupUtility {
         String canonicalTypeName = type.getCanonicalName();
 
-        if (canonicalTypeName.startsWith(def.class.getName())) {
-            canonicalTypeName = canonicalTypeName.replace(def.class.getName(), DEF_TYPE_NAME);
+        if (canonicalTypeName.startsWith(def.class.getCanonicalName())) {
+            canonicalTypeName = canonicalTypeName.replace(def.class.getCanonicalName(), DEF_CLASS_NAME);
         }
 
         return canonicalTypeName;
@ -351,7 +353,7 @@ public final class PainlessLookupUtility {
     /**
      * The def type name as specified in the source for a script.
      */
-    public static final String DEF_TYPE_NAME = "def";
+    public static final String DEF_CLASS_NAME = "def";
 
     /**
      * The method name for all constructors.

View File

@ -148,7 +148,7 @@ class java.lang.Character {
   int MAX_RADIX
   char MAX_SURROGATE
   char MAX_VALUE
-  char MIN_CODE_POINT
+  int MIN_CODE_POINT
   char MIN_HIGH_SURROGATE
   char MIN_LOW_SURROGATE
   int MIN_RADIX
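The one-character whitelist fix above matches the JDK: Character.MIN_CODE_POINT is declared as an int constant (U+0000), unlike the surrogate bounds around it, which really are char values:

    int minCodePoint = Character.MIN_CODE_POINT;   // 0x000000 — an int in the JDK
    char minHigh = Character.MIN_HIGH_SURROGATE;   // '\uD800' — genuinely a char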

View File

@ -26,7 +26,7 @@ import java.util.Map;
 public class InitializerTests extends ScriptTestCase {
 
-    @SuppressWarnings({"unchecked", "rawtypes"})
+    @SuppressWarnings({"rawtypes"})
     public void testArrayInitializers() {
         int[] ints = (int[])exec("new int[] {}");
@ -59,7 +59,7 @@ public class InitializerTests extends ScriptTestCase {
         assertEquals("aaaaaa", objects[3]);
     }
 
-    @SuppressWarnings({"unchecked", "rawtypes"})
+    @SuppressWarnings({"rawtypes"})
     public void testListInitializers() {
         List list = (List)exec("[]");
@ -91,7 +91,7 @@ public class InitializerTests extends ScriptTestCase {
         assertEquals("aaaaaa", list.get(3));
     }
 
-    @SuppressWarnings({"unchecked", "rawtypes"})
+    @SuppressWarnings({"rawtypes"})
     public void testMapInitializers() {
         Map map = (Map)exec("[:]");

View File

@ -126,8 +126,6 @@ public class DiscountedCumulativeGain implements EvaluationMetric {
     @Override
     public EvalQueryQuality evaluate(String taskId, SearchHit[] hits,
             List<RatedDocument> ratedDocs) {
-        List<Integer> allRatings = ratedDocs.stream().mapToInt(RatedDocument::getRating).boxed()
-                .collect(Collectors.toList());
         List<RatedSearchHit> ratedHits = joinHitsWithRatings(hits, ratedDocs);
         List<Integer> ratingsInSearchHits = new ArrayList<>(ratedHits.size());
         int unratedResults = 0;
@ -144,6 +142,8 @@ public class DiscountedCumulativeGain implements EvaluationMetric {
         double idcg = 0;
 
         if (normalize) {
+            List<Integer> allRatings = ratedDocs.stream().mapToInt(RatedDocument::getRating).boxed()
+                    .collect(Collectors.toList());
             Collections.sort(allRatings, Comparator.nullsLast(Collections.reverseOrder()));
             idcg = computeDCG(allRatings.subList(0, Math.min(ratingsInSearchHits.size(), allRatings.size())));
             if (idcg != 0) {
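The move is a small optimization: allRatings feeds only the ideal ordering behind idcg, so it is now computed lazily inside the normalize branch. For reference, the gain/discount form this class appears to implement (it reproduces the expected values asserted in the tests further down in this diff):

    DCG@k  = \sum_{i=1}^{k} \frac{2^{rel_i} - 1}{\log_2(i + 1)}
    nDCG@k = DCG@k / IDCG@k    (IDCG@k: DCG over the ratings sorted descending)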

View File

@ -41,19 +41,19 @@ import java.util.Objects;
 public class EvalQueryQuality implements ToXContentFragment, Writeable {
 
     private final String queryId;
-    private final double evaluationResult;
+    private final double metricScore;
     private MetricDetail optionalMetricDetails;
     private final List<RatedSearchHit> ratedHits;
 
-    public EvalQueryQuality(String id, double evaluationResult) {
+    public EvalQueryQuality(String id, double metricScore) {
         this.queryId = id;
-        this.evaluationResult = evaluationResult;
+        this.metricScore = metricScore;
         this.ratedHits = new ArrayList<>();
     }
 
     public EvalQueryQuality(StreamInput in) throws IOException {
         this.queryId = in.readString();
-        this.evaluationResult = in.readDouble();
+        this.metricScore = in.readDouble();
         this.ratedHits = in.readList(RatedSearchHit::new);
         this.optionalMetricDetails = in.readOptionalNamedWriteable(MetricDetail.class);
     }
@ -61,7 +61,7 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
     // only used for parsing internally
     private EvalQueryQuality(String queryId, ParsedEvalQueryQuality builder) {
         this.queryId = queryId;
-        this.evaluationResult = builder.evaluationResult;
+        this.metricScore = builder.evaluationResult;
         this.optionalMetricDetails = builder.optionalMetricDetails;
         this.ratedHits = builder.ratedHits;
     }
@ -69,7 +69,7 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
     @Override
     public void writeTo(StreamOutput out) throws IOException {
         out.writeString(queryId);
-        out.writeDouble(evaluationResult);
+        out.writeDouble(metricScore);
         out.writeList(ratedHits);
         out.writeOptionalNamedWriteable(this.optionalMetricDetails);
     }
@ -78,8 +78,8 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
         return queryId;
     }
 
-    public double getQualityLevel() {
-        return evaluationResult;
+    public double metricScore() {
+        return metricScore;
     }
 
     public void setMetricDetails(MetricDetail breakdown) {
@ -101,7 +101,7 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
     @Override
     public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
         builder.startObject(queryId);
-        builder.field(QUALITY_LEVEL_FIELD.getPreferredName(), this.evaluationResult);
+        builder.field(METRIC_SCORE_FIELD.getPreferredName(), this.metricScore);
         builder.startArray(UNRATED_DOCS_FIELD.getPreferredName());
         for (DocumentKey key : EvaluationMetric.filterUnratedDocuments(ratedHits)) {
             builder.startObject();
@ -122,7 +122,7 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
         return builder;
     }
 
-    private static final ParseField QUALITY_LEVEL_FIELD = new ParseField("quality_level");
+    static final ParseField METRIC_SCORE_FIELD = new ParseField("metric_score");
     private static final ParseField UNRATED_DOCS_FIELD = new ParseField("unrated_docs");
     private static final ParseField HITS_FIELD = new ParseField("hits");
     private static final ParseField METRIC_DETAILS_FIELD = new ParseField("metric_details");
@ -136,7 +136,7 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
     }
 
     static {
-        PARSER.declareDouble((obj, value) -> obj.evaluationResult = value, QUALITY_LEVEL_FIELD);
+        PARSER.declareDouble((obj, value) -> obj.evaluationResult = value, METRIC_SCORE_FIELD);
         PARSER.declareObject((obj, value) -> obj.optionalMetricDetails = value, (p, c) -> parseMetricDetail(p),
                 METRIC_DETAILS_FIELD);
         PARSER.declareObjectArray((obj, list) -> obj.ratedHits = list, (p, c) -> RatedSearchHit.parse(p), HITS_FIELD);
@ -164,13 +164,13 @@ public class EvalQueryQuality implements ToXContentFragment, Writeable {
         }
         EvalQueryQuality other = (EvalQueryQuality) obj;
         return Objects.equals(queryId, other.queryId) &&
-                Objects.equals(evaluationResult, other.evaluationResult) &&
+                Objects.equals(metricScore, other.metricScore) &&
                 Objects.equals(ratedHits, other.ratedHits) &&
                 Objects.equals(optionalMetricDetails, other.optionalMetricDetails);
     }
 
     @Override
     public final int hashCode() {
-        return Objects.hash(queryId, evaluationResult, ratedHits, optionalMetricDetails);
+        return Objects.hash(queryId, metricScore, ratedHits, optionalMetricDetails);
     }
 }

View File

@ -39,23 +39,22 @@ import java.util.stream.Collectors;
 public interface EvaluationMetric extends ToXContentObject, NamedWriteable {
 
     /**
-     * Returns a single metric representing the ranking quality of a set of returned
-     * documents wrt. to a set of document ids labeled as relevant for this search.
+     * Evaluates a single ranking evaluation case.
      *
     * @param taskId
-     *            the id of the query for which the ranking is currently evaluated
+     *            an identifier of the query for which the search ranking is
+     *            evaluated
     * @param hits
-     *            the result hits as returned by a search request
+     *            the search result hits
     * @param ratedDocs
-     *            the documents that were ranked by human annotators for this query
-     *            case
-     * @return some metric representing the quality of the result hit list wrt. to
-     *         relevant doc ids.
+     *            the documents that contain the document rating for this query case
+     * @return an {@link EvalQueryQuality} instance that contains the metric score
+     *         with respect to the provided search hits and ratings
     */
     EvalQueryQuality evaluate(String taskId, SearchHit[] hits, List<RatedDocument> ratedDocs);
 
     /**
-     * join hits with rated documents using the joint _index/_id document key
+     * Joins hits with rated documents using the joint _index/_id document key.
     */
     static List<RatedSearchHit> joinHitsWithRatings(SearchHit[] hits, List<RatedDocument> ratedDocs) {
         Map<DocumentKey, RatedDocument> ratedDocumentMap = ratedDocs.stream()
@ -74,7 +73,7 @@ public interface EvaluationMetric extends ToXContentObject, NamedWriteable {
     }
 
     /**
-     * filter @link {@link RatedSearchHit} that don't have a rating
+     * Filter {@link RatedSearchHit}s that do not have a rating.
     */
     static List<DocumentKey> filterUnratedDocuments(List<RatedSearchHit> ratedHits) {
         return ratedHits.stream().filter(hit -> hit.getRating().isPresent() == false)
@ -82,11 +81,11 @@ public interface EvaluationMetric extends ToXContentObject, NamedWriteable {
     }
 
     /**
-     * how evaluation metrics for particular search queries get combined for the overall evaluation score.
-     * Defaults to averaging over the partial results.
+     * Combine several {@link EvalQueryQuality} results into the overall evaluation score.
+     * This defaults to averaging over the partial results, but can be overwritten to obtain a different behavior.
     */
     default double combine(Collection<EvalQueryQuality> partialResults) {
-        return partialResults.stream().mapToDouble(EvalQueryQuality::getQualityLevel).sum() / partialResults.size();
+        return partialResults.stream().mapToDouble(EvalQueryQuality::metricScore).sum() / partialResults.size();
     }
 
     /**

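The default combine above is a plain arithmetic mean over the per-query scores. A quick worked check, using the values from the Mean Reciprocal Rank YAML test near the end of this diff:

    // two queries score 1/3 and 1/2; the default combine() averages them
    double combined = (1.0 / 3.0 + 1.0 / 2.0) / 2.0;   // 5/12 ≈ 0.41666…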
View File

@ -65,6 +65,9 @@ public class ExpectedReciprocalRank implements EvaluationMetric {
 
     public static final String NAME = "expected_reciprocal_rank";
 
+    /**
+     * @param maxRelevance the highest expected relevance in the data
+     */
     public ExpectedReciprocalRank(int maxRelevance) {
         this(maxRelevance, null, DEFAULT_K);
     }
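For context on the new metric: ERR follows the cascade model of Chapelle et al. (2009), and this reading of the formula reproduces the expected values in ExpectedReciprocalRankTests further down in this diff:

    ERR@k = \sum_{r=1}^{k} \frac{1}{r} \, R_r \prod_{i=1}^{r-1} (1 - R_i),
    \qquad R_i = \frac{2^{g_i} - 1}{2^{g_{\max}}}

where g_i is the rating of the document at rank i and g_max is the maxRelevance passed to the constructor.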

View File

@ -110,8 +110,7 @@ public class MeanReciprocalRank implements EvaluationMetric {
      * Compute ReciprocalRank based on provided relevant document IDs.
      **/
     @Override
-    public EvalQueryQuality evaluate(String taskId, SearchHit[] hits,
-            List<RatedDocument> ratedDocs) {
+    public EvalQueryQuality evaluate(String taskId, SearchHit[] hits, List<RatedDocument> ratedDocs) {
         List<RatedSearchHit> ratedHits = joinHitsWithRatings(hits, ratedDocs);
         int firstRelevant = -1;
         int rank = 1;
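The signature reflow above changes no behavior; reciprocal rank itself stays as exercised by the tests below:

    RR = 1 / rank_first_relevant    (0.0 when no relevant hit appears in the top-k window)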

View File

@ -37,12 +37,17 @@ public class RankEvalNamedXContentProvider implements NamedXContentProvider {
                 MeanReciprocalRank::fromXContent));
         namedXContent.add(new NamedXContentRegistry.Entry(EvaluationMetric.class, new ParseField(DiscountedCumulativeGain.NAME),
                 DiscountedCumulativeGain::fromXContent));
+        namedXContent.add(new NamedXContentRegistry.Entry(EvaluationMetric.class, new ParseField(ExpectedReciprocalRank.NAME),
+                ExpectedReciprocalRank::fromXContent));
 
         namedXContent.add(new NamedXContentRegistry.Entry(MetricDetail.class, new ParseField(PrecisionAtK.NAME),
                 PrecisionAtK.Detail::fromXContent));
         namedXContent.add(new NamedXContentRegistry.Entry(MetricDetail.class, new ParseField(MeanReciprocalRank.NAME),
                 MeanReciprocalRank.Detail::fromXContent));
         namedXContent.add(new NamedXContentRegistry.Entry(MetricDetail.class, new ParseField(DiscountedCumulativeGain.NAME),
                 DiscountedCumulativeGain.Detail::fromXContent));
+        namedXContent.add(new NamedXContentRegistry.Entry(MetricDetail.class, new ParseField(ExpectedReciprocalRank.NAME),
+                ExpectedReciprocalRank.Detail::fromXContent));
         return namedXContent;
     }
 }

View File

@ -60,10 +60,14 @@ public class RankEvalPlugin extends Plugin implements ActionPlugin {
         namedWriteables.add(new NamedWriteableRegistry.Entry(EvaluationMetric.class, MeanReciprocalRank.NAME, MeanReciprocalRank::new));
         namedWriteables.add(
                 new NamedWriteableRegistry.Entry(EvaluationMetric.class, DiscountedCumulativeGain.NAME, DiscountedCumulativeGain::new));
+        namedWriteables.add(
+                new NamedWriteableRegistry.Entry(EvaluationMetric.class, ExpectedReciprocalRank.NAME, ExpectedReciprocalRank::new));
         namedWriteables.add(new NamedWriteableRegistry.Entry(MetricDetail.class, PrecisionAtK.NAME, PrecisionAtK.Detail::new));
         namedWriteables.add(new NamedWriteableRegistry.Entry(MetricDetail.class, MeanReciprocalRank.NAME, MeanReciprocalRank.Detail::new));
         namedWriteables.add(
                 new NamedWriteableRegistry.Entry(MetricDetail.class, DiscountedCumulativeGain.NAME, DiscountedCumulativeGain.Detail::new));
+        namedWriteables.add(
+                new NamedWriteableRegistry.Entry(MetricDetail.class, ExpectedReciprocalRank.NAME, ExpectedReciprocalRank.Detail::new));
         return namedWriteables;
     }
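The two registration sites above show the pattern a new metric needs: a NamedWriteableRegistry.Entry for the transport layer and a NamedXContentRegistry.Entry for parsing, each once for the metric and once for its Detail. A hedged sketch for some hypothetical MyMetric (the name is invented for illustration; the entry constructors are the same ones used in the diff above):

    namedWriteables.add(new NamedWriteableRegistry.Entry(EvaluationMetric.class, MyMetric.NAME, MyMetric::new));
    namedWriteables.add(new NamedWriteableRegistry.Entry(MetricDetail.class, MyMetric.NAME, MyMetric.Detail::new));
    // plus the matching NamedXContentRegistry.Entry pair, as in RankEvalNamedXContentProvider above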

View File

@ -48,15 +48,15 @@ import java.util.stream.Collectors;
 public class RankEvalResponse extends ActionResponse implements ToXContentObject {
 
     /** The overall evaluation result. */
-    private double evaluationResult;
+    private double metricScore;
     /** details about individual ranking evaluation queries, keyed by their id */
     private Map<String, EvalQueryQuality> details;
     /** exceptions for specific ranking evaluation queries, keyed by their id */
     private Map<String, Exception> failures;
 
-    public RankEvalResponse(double qualityLevel, Map<String, EvalQueryQuality> partialResults,
+    public RankEvalResponse(double metricScore, Map<String, EvalQueryQuality> partialResults,
             Map<String, Exception> failures) {
-        this.evaluationResult = qualityLevel;
+        this.metricScore = metricScore;
         this.details = new HashMap<>(partialResults);
         this.failures = new HashMap<>(failures);
     }
@ -65,8 +65,8 @@ public class RankEvalResponse extends ActionResponse implements ToXContentObject
         // only used in RankEvalAction#newResponse()
     }
 
-    public double getEvaluationResult() {
-        return evaluationResult;
+    public double getMetricScore() {
+        return metricScore;
     }
 
     public Map<String, EvalQueryQuality> getPartialResults() {
@ -85,7 +85,7 @@ public class RankEvalResponse extends ActionResponse implements ToXContentObject
     @Override
     public void writeTo(StreamOutput out) throws IOException {
         super.writeTo(out);
-        out.writeDouble(evaluationResult);
+        out.writeDouble(metricScore);
         out.writeVInt(details.size());
         for (String queryId : details.keySet()) {
             out.writeString(queryId);
@ -101,7 +101,7 @@ public class RankEvalResponse extends ActionResponse implements ToXContentObject
     @Override
     public void readFrom(StreamInput in) throws IOException {
         super.readFrom(in);
-        this.evaluationResult = in.readDouble();
+        this.metricScore = in.readDouble();
         int partialResultSize = in.readVInt();
         this.details = new HashMap<>(partialResultSize);
         for (int i = 0; i < partialResultSize; i++) {
@ -120,7 +120,7 @@ public class RankEvalResponse extends ActionResponse implements ToXContentObject
     @Override
     public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
         builder.startObject();
-        builder.field("quality_level", evaluationResult);
+        builder.field("metric_score", metricScore);
         builder.startObject("details");
         for (String key : details.keySet()) {
             details.get(key).toXContent(builder, params);
@ -137,7 +137,6 @@ public class RankEvalResponse extends ActionResponse implements ToXContentObject
         return builder;
     }
 
-    private static final ParseField QUALITY_LEVEL_FIELD = new ParseField("quality_level");
     private static final ParseField DETAILS_FIELD = new ParseField("details");
     private static final ParseField FAILURES_FIELD = new ParseField("failures");
     @SuppressWarnings("unchecked")
@ -147,7 +146,7 @@ public class RankEvalResponse extends ActionResponse implements ToXContentObject
             ((List<EvalQueryQuality>) a[1]).stream().collect(Collectors.toMap(EvalQueryQuality::getId, Function.identity())),
             ((List<Tuple<String, Exception>>) a[2]).stream().collect(Collectors.toMap(Tuple::v1, Tuple::v2))));
     static {
-        PARSER.declareDouble(ConstructingObjectParser.constructorArg(), QUALITY_LEVEL_FIELD);
+        PARSER.declareDouble(ConstructingObjectParser.constructorArg(), EvalQueryQuality.METRIC_SCORE_FIELD);
         PARSER.declareNamedObjects(ConstructingObjectParser.optionalConstructorArg(), (p, c, n) -> EvalQueryQuality.fromXContent(p, n),
                 DETAILS_FIELD);
         PARSER.declareNamedObjects(ConstructingObjectParser.optionalConstructorArg(), (p, c, n) -> {

View File

@ -76,7 +76,7 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
             hits[i].shard(new SearchShardTarget("testnode", new Index("index", "uuid"), 0, null));
         }
         DiscountedCumulativeGain dcg = new DiscountedCumulativeGain();
-        assertEquals(EXPECTED_DCG, dcg.evaluate("id", hits, rated).getQualityLevel(), DELTA);
+        assertEquals(EXPECTED_DCG, dcg.evaluate("id", hits, rated).metricScore(), DELTA);
 
         /**
          * Check with normalization: to get the maximal possible dcg, sort documents by
@ -94,7 +94,7 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
          * idcg = 14.595390756454922 (sum of last column)
          */
         dcg = new DiscountedCumulativeGain(true, null, 10);
-        assertEquals(EXPECTED_NDCG, dcg.evaluate("id", hits, rated).getQualityLevel(), DELTA);
+        assertEquals(EXPECTED_NDCG, dcg.evaluate("id", hits, rated).metricScore(), DELTA);
     }
 
     /**
@ -127,7 +127,7 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
         }
         DiscountedCumulativeGain dcg = new DiscountedCumulativeGain();
         EvalQueryQuality result = dcg.evaluate("id", hits, rated);
-        assertEquals(12.779642067948913, result.getQualityLevel(), DELTA);
+        assertEquals(12.779642067948913, result.metricScore(), DELTA);
         assertEquals(2, filterUnratedDocuments(result.getHitsAndRatings()).size());
 
         /**
@ -146,7 +146,7 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
          * idcg = 13.347184833073591 (sum of last column)
          */
         dcg = new DiscountedCumulativeGain(true, null, 10);
-        assertEquals(12.779642067948913 / 13.347184833073591, dcg.evaluate("id", hits, rated).getQualityLevel(), DELTA);
+        assertEquals(12.779642067948913 / 13.347184833073591, dcg.evaluate("id", hits, rated).metricScore(), DELTA);
     }
 
     /**
@ -184,7 +184,7 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
         }
         DiscountedCumulativeGain dcg = new DiscountedCumulativeGain();
         EvalQueryQuality result = dcg.evaluate("id", hits, ratedDocs);
-        assertEquals(12.392789260714371, result.getQualityLevel(), DELTA);
+        assertEquals(12.392789260714371, result.metricScore(), DELTA);
         assertEquals(1, filterUnratedDocuments(result.getHitsAndRatings()).size());
 
         /**
@ -204,7 +204,7 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
          * idcg = 13.347184833073591 (sum of last column)
          */
         dcg = new DiscountedCumulativeGain(true, null, 10);
-        assertEquals(12.392789260714371 / 13.347184833073591, dcg.evaluate("id", hits, ratedDocs).getQualityLevel(), DELTA);
+        assertEquals(12.392789260714371 / 13.347184833073591, dcg.evaluate("id", hits, ratedDocs).metricScore(), DELTA);
     }
 
     /**
@ -223,13 +223,13 @@ public class DiscountedCumulativeGainTests extends ESTestCase {
         SearchHit[] hits = new SearchHit[0];
         DiscountedCumulativeGain dcg = new DiscountedCumulativeGain();
         EvalQueryQuality result = dcg.evaluate("id", hits, ratedDocs);
-        assertEquals(0.0d, result.getQualityLevel(), DELTA);
+        assertEquals(0.0d, result.metricScore(), DELTA);
         assertEquals(0, filterUnratedDocuments(result.getHitsAndRatings()).size());
 
         // also check normalized
         dcg = new DiscountedCumulativeGain(true, null, 10);
         result = dcg.evaluate("id", hits, ratedDocs);
-        assertEquals(0.0d, result.getQualityLevel(), DELTA);
+        assertEquals(0.0d, result.metricScore(), DELTA);
         assertEquals(0, filterUnratedDocuments(result.getHitsAndRatings()).size());
     }

View File

@ -129,7 +129,7 @@ public class EvalQueryQualityTests extends ESTestCase {
 
     private static EvalQueryQuality mutateTestItem(EvalQueryQuality original) {
         String id = original.getId();
-        double qualityLevel = original.getQualityLevel();
+        double metricScore = original.metricScore();
         List<RatedSearchHit> ratedHits = new ArrayList<>(original.getHitsAndRatings());
         MetricDetail metricDetails = original.getMetricDetails();
         switch (randomIntBetween(0, 3)) {
@ -137,7 +137,7 @@ public class EvalQueryQualityTests extends ESTestCase {
             id = id + "_";
             break;
         case 1:
-            qualityLevel = qualityLevel + 0.1;
+            metricScore = metricScore + 0.1;
             break;
         case 2:
             if (metricDetails == null) {
@ -152,7 +152,7 @@ public class EvalQueryQualityTests extends ESTestCase {
         default:
             throw new IllegalStateException("The test should only allow four parameters mutated");
         }
-        EvalQueryQuality evalQueryQuality = new EvalQueryQuality(id, qualityLevel);
+        EvalQueryQuality evalQueryQuality = new EvalQueryQuality(id, metricScore);
         evalQueryQuality.setMetricDetails(metricDetails);
         evalQueryQuality.addHitsAndRatings(ratedHits);
         return evalQueryQuality;

View File

@ -76,10 +76,10 @@ public class ExpectedReciprocalRankTests extends ESTestCase {
         Integer[] relevanceRatings = new Integer[] { 3, 2, 0, 1};
         SearchHit[] hits = createSearchHits(rated, relevanceRatings);
         ExpectedReciprocalRank err = new ExpectedReciprocalRank(3, 0, 3);
-        assertEquals(0.8984375, err.evaluate("id", hits, rated).getQualityLevel(), DELTA);
+        assertEquals(0.8984375, err.evaluate("id", hits, rated).metricScore(), DELTA);
         // take 4th rank into window
         err = new ExpectedReciprocalRank(3, 0, 4);
-        assertEquals(0.8984375 + 0.00244140625, err.evaluate("id", hits, rated).getQualityLevel(), DELTA);
+        assertEquals(0.8984375 + 0.00244140625, err.evaluate("id", hits, rated).metricScore(), DELTA);
     }
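A worked check of the constants above, using the ERR formula with g_max = 3 and ratings (3, 2, 0, 1): the per-rank stop probabilities are R = (7/8, 3/8, 0, 1/8), so

    ERR@3 = 7/8 + (1/2)(1/8)(3/8) = 0.875 + 0.0234375 = 0.8984375
    rank 4 adds (1/4)(1/8)(5/8)(1/8) = 0.00244140625

which is exactly the pair of expected values asserted in the test.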
    /**
@ -102,11 +102,11 @@ public class ExpectedReciprocalRankTests extends ESTestCase {
         SearchHit[] hits = createSearchHits(rated, relevanceRatings);
         ExpectedReciprocalRank err = new ExpectedReciprocalRank(3, null, 4);
         EvalQueryQuality evaluation = err.evaluate("id", hits, rated);
-        assertEquals(0.875 + 0.00390625, evaluation.getQualityLevel(), DELTA);
+        assertEquals(0.875 + 0.00390625, evaluation.metricScore(), DELTA);
         assertEquals(1, ((ExpectedReciprocalRank.Detail) evaluation.getMetricDetails()).getUnratedDocs());
         // if we supply e.g. 2 as unknown docs rating, it should be the same as in the other test above
         err = new ExpectedReciprocalRank(3, 2, 4);
-        assertEquals(0.8984375 + 0.00244140625, err.evaluate("id", hits, rated).getQualityLevel(), DELTA);
+        assertEquals(0.8984375 + 0.00244140625, err.evaluate("id", hits, rated).metricScore(), DELTA);
     }
 
     private SearchHit[] createSearchHits(List<RatedDocument> rated, Integer[] relevanceRatings) {
@ -126,7 +126,7 @@ public class ExpectedReciprocalRankTests extends ESTestCase {
      */
     public void testNoResults() throws Exception {
         ExpectedReciprocalRank err = new ExpectedReciprocalRank(5, 0, 10);
-        assertEquals(0.0, err.evaluate("id", new SearchHit[0], Collections.emptyList()).getQualityLevel(), DELTA);
+        assertEquals(0.0, err.evaluate("id", new SearchHit[0], Collections.emptyList()).metricScore(), DELTA);
     }
 
     public void testParseFromXContent() throws IOException {

View File

@ -95,14 +95,14 @@ public class MeanReciprocalRankTests extends ESTestCase {
         int rankAtFirstRelevant = relevantAt + 1;
         EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, ratedDocs);
-        assertEquals(1.0 / rankAtFirstRelevant, evaluation.getQualityLevel(), Double.MIN_VALUE);
+        assertEquals(1.0 / rankAtFirstRelevant, evaluation.metricScore(), Double.MIN_VALUE);
         assertEquals(rankAtFirstRelevant, ((MeanReciprocalRank.Detail) evaluation.getMetricDetails()).getFirstRelevantRank());
 
         // check that if we have fewer search hits than relevant doc position,
-        // we don't find any result and get 0.0 quality level
+        // we don't find any result and get 0.0 score
         reciprocalRank = new MeanReciprocalRank();
         evaluation = reciprocalRank.evaluate("id", Arrays.copyOfRange(hits, 0, relevantAt), ratedDocs);
-        assertEquals(0.0, evaluation.getQualityLevel(), Double.MIN_VALUE);
+        assertEquals(0.0, evaluation.metricScore(), Double.MIN_VALUE);
     }
 
     public void testEvaluationOneRelevantInResults() {
@ -120,7 +120,7 @@ public class MeanReciprocalRankTests extends ESTestCase {
         }
         EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, ratedDocs);
-        assertEquals(1.0 / (relevantAt + 1), evaluation.getQualityLevel(), Double.MIN_VALUE);
+        assertEquals(1.0 / (relevantAt + 1), evaluation.metricScore(), Double.MIN_VALUE);
         assertEquals(relevantAt + 1, ((MeanReciprocalRank.Detail) evaluation.getMetricDetails()).getFirstRelevantRank());
     }
@ -140,7 +140,7 @@ public class MeanReciprocalRankTests extends ESTestCase {
         MeanReciprocalRank reciprocalRank = new MeanReciprocalRank(2, 10);
         EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, rated);
-        assertEquals((double) 1 / 3, evaluation.getQualityLevel(), 0.00001);
+        assertEquals((double) 1 / 3, evaluation.metricScore(), 0.00001);
         assertEquals(3, ((MeanReciprocalRank.Detail) evaluation.getMetricDetails()).getFirstRelevantRank());
     }
@ -158,13 +158,13 @@ public class MeanReciprocalRankTests extends ESTestCase {
         SearchHit[] hits = createSearchHits(0, 9, "test");
         List<RatedDocument> ratedDocs = new ArrayList<>();
         EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, ratedDocs);
-        assertEquals(0.0, evaluation.getQualityLevel(), Double.MIN_VALUE);
+        assertEquals(0.0, evaluation.metricScore(), Double.MIN_VALUE);
     }
 
     public void testNoResults() throws Exception {
         SearchHit[] hits = new SearchHit[0];
         EvalQueryQuality evaluated = (new MeanReciprocalRank()).evaluate("id", hits, Collections.emptyList());
-        assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
+        assertEquals(0.0d, evaluated.metricScore(), 0.00001);
         assertEquals(-1, ((MeanReciprocalRank.Detail) evaluated.getMetricDetails()).getFirstRelevantRank());
     }

View File

@ -53,7 +53,7 @@ public class PrecisionAtKTests extends ESTestCase {
         List<RatedDocument> rated = new ArrayList<>();
         rated.add(createRatedDoc("test", "0", RELEVANT_RATING_1));
         EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", toSearchHits(rated, "test"), rated);
-        assertEquals(1, evaluated.getQualityLevel(), 0.00001);
+        assertEquals(1, evaluated.metricScore(), 0.00001);
         assertEquals(1, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(1, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
@ -66,7 +66,7 @@ public class PrecisionAtKTests extends ESTestCase {
         rated.add(createRatedDoc("test", "3", RELEVANT_RATING_1));
         rated.add(createRatedDoc("test", "4", IRRELEVANT_RATING_0));
         EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", toSearchHits(rated, "test"), rated);
-        assertEquals((double) 4 / 5, evaluated.getQualityLevel(), 0.00001);
+        assertEquals((double) 4 / 5, evaluated.metricScore(), 0.00001);
         assertEquals(4, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(5, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
@ -85,7 +85,7 @@ public class PrecisionAtKTests extends ESTestCase {
         rated.add(createRatedDoc("test", "4", 4));
         PrecisionAtK precisionAtN = new PrecisionAtK(2, false, 5);
         EvalQueryQuality evaluated = precisionAtN.evaluate("id", toSearchHits(rated, "test"), rated);
-        assertEquals((double) 3 / 5, evaluated.getQualityLevel(), 0.00001);
+        assertEquals((double) 3 / 5, evaluated.metricScore(), 0.00001);
         assertEquals(3, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(5, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
@ -99,7 +99,7 @@ public class PrecisionAtKTests extends ESTestCase {
         rated.add(createRatedDoc("test", "2", IRRELEVANT_RATING_0));
         // the following search hits contain only the last three documents
         EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", toSearchHits(rated.subList(2, 5), "test"), rated);
-        assertEquals((double) 2 / 3, evaluated.getQualityLevel(), 0.00001);
+        assertEquals((double) 2 / 3, evaluated.metricScore(), 0.00001);
         assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(3, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
@ -114,14 +114,14 @@ public class PrecisionAtKTests extends ESTestCase {
         searchHits[2].shard(new SearchShardTarget("testnode", new Index("index", "uuid"), 0, null));
 
         EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", searchHits, rated);
-        assertEquals((double) 2 / 3, evaluated.getQualityLevel(), 0.00001);
+        assertEquals((double) 2 / 3, evaluated.metricScore(), 0.00001);
         assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(3, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
 
         // also try with setting `ignore_unlabeled`
         PrecisionAtK prec = new PrecisionAtK(1, true, 10);
         evaluated = prec.evaluate("id", searchHits, rated);
-        assertEquals((double) 2 / 2, evaluated.getQualityLevel(), 0.00001);
+        assertEquals((double) 2 / 2, evaluated.metricScore(), 0.00001);
         assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
@ -133,14 +133,14 @@ public class PrecisionAtKTests extends ESTestCase {
             hits[i].shard(new SearchShardTarget("testnode", new Index("index", "uuid"), 0, null));
         }
         EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", hits, Collections.emptyList());
-        assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
+        assertEquals(0.0d, evaluated.metricScore(), 0.00001);
         assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(5, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
 
         // also try with setting `ignore_unlabeled`
         PrecisionAtK prec = new PrecisionAtK(1, true, 10);
         evaluated = prec.evaluate("id", hits, Collections.emptyList());
-        assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
+        assertEquals(0.0d, evaluated.metricScore(), 0.00001);
         assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
@ -148,7 +148,7 @@ public class PrecisionAtKTests extends ESTestCase {
     public void testNoResults() throws Exception {
         SearchHit[] hits = new SearchHit[0];
         EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", hits, Collections.emptyList());
-        assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
+        assertEquals(0.0d, evaluated.metricScore(), 0.00001);
         assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
         assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
     }
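All of the precision expectations above follow the same rule, with ignore_unlabeled controlling the denominator:

    P@k = |relevant hits in top-k| / |counted hits in top-k|
    (with ignore_unlabeled: true, unrated hits leave the denominator — hence 2/3 becoming 2/2 above)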

View File

@ -114,7 +114,7 @@ public class RankEvalRequestIT extends ESIntegTestCase {
         // the expected Prec@ for the first query is 4/6 and the expected Prec@ for the
         // second is 1/6, divided by 2 to get the average
         double expectedPrecision = (1.0 / 6.0 + 4.0 / 6.0) / 2.0;
-        assertEquals(expectedPrecision, response.getEvaluationResult(), Double.MIN_VALUE);
+        assertEquals(expectedPrecision, response.getMetricScore(), Double.MIN_VALUE);
         Set<Entry<String, EvalQueryQuality>> entrySet = response.getPartialResults().entrySet();
         assertEquals(2, entrySet.size());
         for (Entry<String, EvalQueryQuality> entry : entrySet) {
@ -157,7 +157,7 @@ public class RankEvalRequestIT extends ESIntegTestCase {
         // if we look only at the top 3 documents, the expected P@3 for the first query is
         // 2/3 and the expected Prec@ for the second is 1/3, divided by 2 to get the average
         expectedPrecision = (1.0 / 3.0 + 2.0 / 3.0) / 2.0;
-        assertEquals(expectedPrecision, response.getEvaluationResult(), Double.MIN_VALUE);
+        assertEquals(expectedPrecision, response.getMetricScore(), Double.MIN_VALUE);
     }
 
     /**
@ -186,7 +186,7 @@ public class RankEvalRequestIT extends ESIntegTestCase {
                 new RankEvalRequest(task, new String[] { TEST_INDEX }));
         RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
 
-        assertEquals(DiscountedCumulativeGainTests.EXPECTED_DCG, response.getEvaluationResult(), 10E-14);
+        assertEquals(DiscountedCumulativeGainTests.EXPECTED_DCG, response.getMetricScore(), 10E-14);
 
         // test that a different window size k affects the result
         metric = new DiscountedCumulativeGain(false, null, 3);
@ -195,7 +195,7 @@ public class RankEvalRequestIT extends ESIntegTestCase {
         builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { TEST_INDEX }));
 
         response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
-        assertEquals(12.39278926071437, response.getEvaluationResult(), 10E-14);
+        assertEquals(12.39278926071437, response.getMetricScore(), 10E-14);
     }
 
     public void testMRRRequest() {
@ -218,7 +218,7 @@ public class RankEvalRequestIT extends ESIntegTestCase {
         // the expected reciprocal rank for the berlin_query is 1/1
         // dividing by 2 to get the average
         double expectedMRR = (1.0 + 1.0 / 5.0) / 2.0;
-        assertEquals(expectedMRR, response.getEvaluationResult(), 0.0);
+        assertEquals(expectedMRR, response.getMetricScore(), 0.0);
 
         // test that a different window size k affects the result
         metric = new MeanReciprocalRank(1, 3);
@ -231,7 +231,7 @@ public class RankEvalRequestIT extends ESIntegTestCase {
         // the reciprocal rank for the berlin_query is 1/1
         // dividing by 2 to get the average
         expectedMRR = 1.0 / 2.0;
-        assertEquals(expectedMRR, response.getEvaluationResult(), 0.0);
+        assertEquals(expectedMRR, response.getMetricScore(), 0.0);
     }
 
     /**

View File

@ -102,7 +102,7 @@ public class RankEvalResponseTests extends ESTestCase {
         try (StreamInput in = output.bytes().streamInput()) {
             RankEvalResponse deserializedResponse = new RankEvalResponse();
             deserializedResponse.readFrom(in);
-            assertEquals(randomResponse.getEvaluationResult(), deserializedResponse.getEvaluationResult(), Double.MIN_VALUE);
+            assertEquals(randomResponse.getMetricScore(), deserializedResponse.getMetricScore(), Double.MIN_VALUE);
             assertEquals(randomResponse.getPartialResults(), deserializedResponse.getPartialResults());
             assertEquals(randomResponse.getFailures().keySet(), deserializedResponse.getFailures().keySet());
             assertNotSame(randomResponse, deserializedResponse);
@ -130,7 +130,7 @@ public class RankEvalResponseTests extends ESTestCase {
         assertNotSame(testItem, parsedItem);
         // We cannot check equality of object here because some information (e.g.
         // SearchHit#shard) cannot fully be parsed back.
-        assertEquals(testItem.getEvaluationResult(), parsedItem.getEvaluationResult(), 0.0);
+        assertEquals(testItem.getMetricScore(), parsedItem.getMetricScore(), 0.0);
         assertEquals(testItem.getPartialResults().keySet(), parsedItem.getPartialResults().keySet());
         for (EvalQueryQuality metricDetail : testItem.getPartialResults().values()) {
             EvalQueryQuality parsedEvalQueryQuality = parsedItem.getPartialResults().get(metricDetail.getId());
@ -154,10 +154,10 @@ public class RankEvalResponseTests extends ESTestCase {
         XContentBuilder builder = XContentFactory.contentBuilder(XContentType.JSON);
         String xContent = BytesReference.bytes(response.toXContent(builder, ToXContent.EMPTY_PARAMS)).utf8ToString();
         assertEquals(("{" +
-                "    \"quality_level\": 0.123," +
+                "    \"metric_score\": 0.123," +
                 "    \"details\": {" +
                 "        \"coffee_query\": {" +
-                "            \"quality_level\": 0.1," +
+                "            \"metric_score\": 0.1," +
                 "            \"unrated_docs\": [{\"_index\":\"index\",\"_id\":\"456\"}]," +
                 "            \"hits\":[{\"hit\":{\"_index\":\"index\",\"_type\":\"\",\"_id\":\"123\",\"_score\":1.0}," +
                 "            \"rating\":5}," +

View File

@@ -71,8 +71,8 @@ setup:
            "metric" : { "precision": { "ignore_unlabeled" : true }}
        }

-  - match: { quality_level: 1}
-  - match: { details.amsterdam_query.quality_level: 1.0}
+  - match: { metric_score: 1}
+  - match: { details.amsterdam_query.metric_score: 1.0}
   - match: { details.amsterdam_query.unrated_docs: [ {"_index": "foo", "_id": "doc4"}]}
   - match: { details.amsterdam_query.metric_details.precision: {"relevant_docs_retrieved": 2, "docs_retrieved": 2}}
@@ -84,7 +84,7 @@ setup:
   - match: { details.amsterdam_query.hits.2.hit._id: "doc4"}
   - is_false: details.amsterdam_query.hits.2.rating
-  - match: { details.berlin_query.quality_level: 1.0}
+  - match: { details.berlin_query.metric_score: 1.0}
   - match: { details.berlin_query.unrated_docs: [ {"_index": "foo", "_id": "doc4"}]}
   - match: { details.berlin_query.metric_details.precision: {"relevant_docs_retrieved": 1, "docs_retrieved": 1}}
   - length: { details.berlin_query.hits: 2}
@@ -118,9 +118,9 @@ setup:
            "metric" : { "precision": { "ignore_unlabeled" : true }}
        }

-  - match: { quality_level: 1}
-  - match: { details.amsterdam_query.quality_level: 1.0}
-  - match: { details.berlin_query.quality_level: 1.0}
+  - match: { metric_score: 1}
+  - match: { details.amsterdam_query.metric_score: 1.0}
+  - match: { details.berlin_query.metric_score: 1.0}

 ---
 "Mean Reciprocal Rank":
@@ -150,14 +150,48 @@ setup:
        }

   # average is (1/3 + 1/2)/2 = 5/12 ~ 0.41666666666666663
-  - gt: {quality_level: 0.416}
-  - lt: {quality_level: 0.417}
-  - gt: {details.amsterdam_query.quality_level: 0.333}
-  - lt: {details.amsterdam_query.quality_level: 0.334}
+  - gt: {metric_score: 0.416}
+  - lt: {metric_score: 0.417}
+  - gt: {details.amsterdam_query.metric_score: 0.333}
+  - lt: {details.amsterdam_query.metric_score: 0.334}
   - match: {details.amsterdam_query.metric_details.mean_reciprocal_rank: {"first_relevant": 3}}
   - match: {details.amsterdam_query.unrated_docs: [ {"_index": "foo", "_id": "doc2"},
                                                     {"_index": "foo", "_id": "doc3"} ]}
-  - match: {details.berlin_query.quality_level: 0.5}
+  - match: {details.berlin_query.metric_score: 0.5}
   - match: {details.berlin_query.metric_details.mean_reciprocal_rank: {"first_relevant": 2}}
   - match: {details.berlin_query.unrated_docs: [ {"_index": "foo", "_id": "doc1"}]}
+
+---
+"Expected Reciprocal Rank":
+
+  - skip:
+      version: " - 6.3.99"
+      reason: ERR was introduced in 6.4
+
+  - do:
+      rank_eval:
+        body: {
+          "requests" : [
+            {
+              "id": "amsterdam_query",
+              "request": { "query": { "match" : {"text" : "amsterdam" }}},
+              "ratings": [{"_index": "foo", "_id": "doc4", "rating": 1}]
+            },
+            {
+              "id" : "berlin_query",
+              "request": { "query": { "match" : { "text" : "berlin" } }, "size" : 10 },
+              "ratings": [{"_index": "foo", "_id": "doc4", "rating": 1}]
+            }
+          ],
+          "metric" : {
+            "expected_reciprocal_rank": {
+              "maximum_relevance" : 1,
+              "k" : 5
+            }
+          }
+        }
+
+  - gt: {metric_score: 0.2083333}
+  - lt: {metric_score: 0.2083334}
+  - match: {details.amsterdam_query.metric_details.expected_reciprocal_rank.unrated_docs: 2}
+  - match: {details.berlin_query.metric_details.expected_reciprocal_rank.unrated_docs: 1}
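Note: the ERR bounds can be checked by hand. ERR scores a query as the sum over ranks of (1/rank) * R_rank * prod_{i<rank}(1 - R_i), with R = (2^rating - 1) / 2^maximum_relevance, so unrated documents contribute nothing. Assuming doc4 sits at rank 3 for amsterdam_query and rank 2 for berlin_query (an assumption taken from the first_relevant positions asserted in the MRR test above, not from this hunk), a minimal sketch:

    // Hand-check of the asserted metric_score; document ranks are assumptions.
    public class ErrCheck {
        public static void main(String[] args) {
            double r = (Math.pow(2, 1) - 1) / Math.pow(2, 1); // rating 1, maximum_relevance 1 -> 0.5
            double amsterdam = (1.0 / 3.0) * r;               // doc4 at rank 3, earlier docs unrated
            double berlin = (1.0 / 2.0) * r;                  // doc4 at rank 2
            System.out.println((amsterdam + berlin) / 2);     // ~0.2083333, within the asserted bounds
        }
    }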

View File

@@ -69,10 +69,10 @@
            "metric" : { "dcg": {}}
        }

-  - gt: {quality_level: 13.848263 }
-  - lt: {quality_level: 13.848264 }
-  - gt: {details.dcg_query.quality_level: 13.848263}
-  - lt: {details.dcg_query.quality_level: 13.848264}
+  - gt: {metric_score: 13.848263 }
+  - lt: {metric_score: 13.848264 }
+  - gt: {details.dcg_query.metric_score: 13.848263}
+  - lt: {details.dcg_query.metric_score: 13.848264}
   - match: {details.dcg_query.unrated_docs: [ ]}

   # reverse the order in which the results are returned (less relevant docs first)
@@ -96,10 +96,10 @@
            "metric" : { "dcg": { }}
        }

-  - gt: {quality_level: 10.299674}
-  - lt: {quality_level: 10.299675}
-  - gt: {details.dcg_query_reverse.quality_level: 10.299674}
-  - lt: {details.dcg_query_reverse.quality_level: 10.299675}
+  - gt: {metric_score: 10.299674}
+  - lt: {metric_score: 10.299675}
+  - gt: {details.dcg_query_reverse.metric_score: 10.299674}
+  - lt: {details.dcg_query_reverse.metric_score: 10.299675}
   - match: {details.dcg_query_reverse.unrated_docs: [ ]}

   # if we mix both, we should get the average
@@ -134,11 +134,11 @@
            "metric" : { "dcg": { }}
        }

-  - gt: {quality_level: 12.073969}
-  - lt: {quality_level: 12.073970}
-  - gt: {details.dcg_query.quality_level: 13.848263}
-  - lt: {details.dcg_query.quality_level: 13.848264}
+  - gt: {metric_score: 12.073969}
+  - lt: {metric_score: 12.073970}
+  - gt: {details.dcg_query.metric_score: 13.848263}
+  - lt: {details.dcg_query.metric_score: 13.848264}
   - match: {details.dcg_query.unrated_docs: [ ]}
-  - gt: {details.dcg_query_reverse.quality_level: 10.299674}
-  - lt: {details.dcg_query_reverse.quality_level: 10.299675}
+  - gt: {details.dcg_query_reverse.metric_score: 10.299674}
+  - lt: {details.dcg_query_reverse.metric_score: 10.299675}
   - match: {details.dcg_query_reverse.unrated_docs: [ ]}
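Note: the magnitudes asserted here follow the usual exponential-gain form of DCG, sum_i (2^rating_i - 1) / log2(i + 1). A minimal sketch; the rating list below is hypothetical, since the request bodies that actually produce 13.848263 are in the parts of this test not shown in the hunks:

    public class DcgSketch {
        static double dcg(int[] ratings) {
            double sum = 0;
            for (int i = 0; i < ratings.length; i++) {
                // rank is i + 1, so the discount is log2(rank + 1) = log2(i + 2)
                sum += (Math.pow(2, ratings[i]) - 1) / (Math.log(i + 2) / Math.log(2));
            }
            return sum;
        }
        public static void main(String[] args) {
            System.out.println(dcg(new int[] {3, 2, 3, 0, 1, 2})); // ~13.848 for this hypothetical list
        }
    }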

View File

@@ -34,8 +34,8 @@
            "metric" : { "precision": { "ignore_unlabeled" : true }}
        }

-  - match: { quality_level: 1}
-  - match: { details.amsterdam_query.quality_level: 1.0}
+  - match: { metric_score: 1}
+  - match: { details.amsterdam_query.metric_score: 1.0}
   - match: { details.amsterdam_query.unrated_docs: [ ]}
   - match: { details.amsterdam_query.metric_details.precision: {"relevant_docs_retrieved": 1, "docs_retrieved": 1}}
View File
@@ -84,7 +84,7 @@ setup:
            "metric" : { "precision": { }}
        }

-  - match: {quality_level: 0.9}
+  - match: {metric_score: 0.9}
   - match: {details.amsterdam_query.unrated_docs.0._id: "6"}

 ---
View File
@@ -57,7 +57,6 @@ public class RestUpdateByQueryAction extends AbstractBulkByQueryRestHandler<Upda
     }

     @Override
-    @SuppressWarnings("unchecked")
     protected UpdateByQueryRequest buildRequest(RestRequest request) throws IOException {
         /*
          * Passing the search request through UpdateByQueryRequest first allows
View File
@@ -181,12 +181,14 @@ public class Netty4HttpServerPipeliningTests extends ESTestCase {
         @Override
         public void run() {
+            try {
                 final String uri = fullHttpRequest.uri();
                 final ByteBuf buffer = Unpooled.copiedBuffer(uri, StandardCharsets.UTF_8);
                 Netty4HttpRequest httpRequest = new Netty4HttpRequest(fullHttpRequest, pipelinedRequest.getSequence());
-                Netty4HttpResponse response = httpRequest.createResponse(RestStatus.OK, new BytesArray(uri.getBytes(StandardCharsets.UTF_8)));
+                Netty4HttpResponse response =
+                    httpRequest.createResponse(RestStatus.OK, new BytesArray(uri.getBytes(StandardCharsets.UTF_8)));
                 response.headers().add(HttpHeaderNames.CONTENT_LENGTH, buffer.readableBytes());

                 final boolean slow = uri.matches("/slow/\\d+");
@@ -202,6 +204,9 @@ public class Netty4HttpServerPipeliningTests extends ESTestCase {
                 final ChannelPromise promise = ctx.newPromise();
                 ctx.writeAndFlush(response, promise);
+            } finally {
+                fullHttpRequest.release();
+            }
         }
     }
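Note: the new try/finally is the standard discipline for Netty's reference-counted messages (this is the "Networking: Fix test leaking buffer" change from the merge list): whoever consumes a FullHttpRequest must release it exactly once, or its pooled buffer leaks. A generic sketch of the pattern:

    import io.netty.util.ReferenceCountUtil;

    final class RefCountedConsumer {
        static void consume(Object msg) {
            try {
                // ... read from msg ...
            } finally {
                ReferenceCountUtil.release(msg); // decrements refCnt; frees the buffer when it hits zero
            }
        }
    }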

View File

@@ -90,7 +90,6 @@ public class SimpleNetty4TransportTests extends AbstractSimpleTransportTestCase
     @Override
     protected void closeConnectionChannel(Transport transport, Transport.Connection connection) throws IOException {
         final Netty4Transport t = (Netty4Transport) transport;
-        @SuppressWarnings("unchecked")
         final TcpTransport.NodeChannels channels = (TcpTransport.NodeChannels) connection;
         CloseableChannel.closeChannels(channels.getChannels().subList(0, randomIntBetween(1, channels.getChannels().size())), true);
     }
View File
@@ -3,7 +3,7 @@ index:
     filter:
         doublemetaphonefilter:
             type: phonetic
-            encoder: doublemetaphone
+            encoder: double_metaphone
         metaphonefilter:
             type: phonetic
             encoder: metaphone
@@ -12,16 +12,16 @@ index:
             encoder: soundex
         refinedsoundexfilter:
             type: phonetic
-            encoder: refinedsoundex
+            encoder: refined_soundex
         caverphonefilter:
             type: phonetic
             encoder: caverphone
         beidermorsefilter:
             type: phonetic
-            encoder: beidermorse
+            encoder: beider_morse
         beidermorsefilterfrench:
             type: phonetic
-            encoder: beidermorse
+            encoder: beider_morse
             languageset : [ "french" ]
         koelnerphonetikfilter:
             type: phonetic
View File
@@ -22,36 +22,6 @@ dependencies {
   compile "commons-codec:commons-codec:${versions.commonscodec}"
 }

-// needed to be consistent with ssl host checking
-String host = InetAddress.getLoopbackAddress().getHostAddress();
-
-// location of keystore and files to generate it
-File keystore = new File(project.buildDir, 'keystore/test-node.jks')
-
-// generate the keystore
-task createKey(type: LoggedExec) {
-  doFirst {
-    project.delete(keystore.parentFile)
-    keystore.parentFile.mkdirs()
-  }
-  executable = new File(project.runtimeJavaHome, 'bin/keytool')
-  standardInput = new ByteArrayInputStream('FirstName LastName\nUnit\nOrganization\nCity\nState\nNL\nyes\n\n'.getBytes('UTF-8'))
-  args '-genkey',
-       '-alias', 'test-node',
-       '-keystore', keystore,
-       '-keyalg', 'RSA',
-       '-keysize', '2048',
-       '-validity', '712',
-       '-dname', 'CN=' + host,
-       '-keypass', 'keypass',
-       '-storepass', 'keypass'
-}
-
-// add keystore to test classpath: it expects it there
-sourceSets.test.resources.srcDir(keystore.parentFile)
-processTestResources.dependsOn(createKey)
-
 dependencyLicenses {
   mapping from: /google-.*/, to: 'google'
 }
View File
@@ -1,4 +1,4 @@
-1/*
+/*
  * Licensed to Elasticsearch under one or more contributor
  * license agreements. See the NOTICE file distributed with
  * this work for additional information regarding copyright
@@ -214,25 +214,6 @@ RestIntegTestTask integTestSecureHa = project.tasks.create('integTestSecureHa',
   description = "Runs rest tests against an elasticsearch cluster with HDFS configured with HA Namenode and secured by MIT Kerberos."
 }

-if (rootProject.ext.compilerJavaVersion.isJava11()) {
-  // TODO remove when: https://github.com/elastic/elasticsearch/issues/31498
-  integTestRunner {
-    systemProperty 'tests.rest.blacklist', [
-      'hdfs_repository/30_snapshot/take snapshot',
-      'hdfs_repository/40_restore/Create a snapshot and then restore it',
-      'hdfs_repository/20_repository_verify/HDFS Repository Verify',
-      'hdfs_repository/30_snapshot_get/Get a snapshot',
-      'hdfs_repository/20_repository_create/HDFS Repository Creation',
-      'hdfs_repository/20_repository_delete/HDFS Delete Repository',
-      'hdfs_repository/30_snapshot_readonly/Get a snapshot - readonly',
-    ].join(',')
-  }
-}
-if (rootProject.ext.runtimeJavaVersion.isJava11() || rootProject.ext.compilerJavaVersion.isJava11()) {
-  // TODO remove when: https://github.com/elastic/elasticsearch/issues/31498
-  integTestHa.enabled = false
-}
 // Determine HDFS Fixture compatibility for the current build environment.
 boolean fixtureSupported = false
 if (Os.isFamily(Os.FAMILY_WINDOWS)) {
View File
@@ -61,6 +61,7 @@ grant {
   // Hadoop depends on OS level user information for simple authentication
   // Unix: UnixLoginModule: com.sun.security.auth.module.UnixSystem.UnixSystem init
+  permission java.lang.RuntimePermission "loadLibrary.jaas";
   permission java.lang.RuntimePermission "loadLibrary.jaas_unix";
   // Windows: NTLoginModule: com.sun.security.auth.module.NTSystem.loadNative
   permission java.lang.RuntimePermission "loadLibrary.jaas_nt";
View File
@@ -114,9 +114,7 @@ if (!s3PermanentAccessKey && !s3PermanentSecretKey && !s3PermanentBucket && !s3P
   useFixture = true

-} else if (!s3PermanentAccessKey || !s3PermanentSecretKey || !s3PermanentBucket || !s3PermanentBasePath
-           || !s3EC2Bucket || !s3EC2BasePath
-           || !s3ECSBucket || !s3ECSBasePath) {
+} else if (!s3PermanentAccessKey || !s3PermanentSecretKey || !s3PermanentBucket || !s3PermanentBasePath) {
   throw new IllegalArgumentException("not all options specified to run against external S3 service")
 }
@@ -349,8 +347,13 @@ processTestResources {
 project.afterEvaluate {
   if (useFixture == false) {
-    // 30_repository_temporary_credentials is not ready for CI yet
-    integTestRunner.systemProperty 'tests.rest.blacklist', 'repository_s3/30_repository_temporary_credentials/*'
+    // temporary_credentials, ec2_credentials and ecs_credentials are not ready for third-party-tests yet
+    integTestRunner.systemProperty 'tests.rest.blacklist',
+      [
+        'repository_s3/30_repository_temporary_credentials/*',
+        'repository_s3/40_repository_ec2_credentials/*',
+        'repository_s3/50_repository_ecs_credentials/*'
+      ].join(",")
   }
 }
View File
@@ -32,6 +32,7 @@ import io.netty.handler.codec.http.HttpResponseDecoder;
 import io.netty.handler.codec.http.HttpResponseStatus;
 import io.netty.handler.codec.http.HttpUtil;
 import io.netty.handler.codec.http.HttpVersion;
+
 import org.elasticsearch.common.bytes.BytesArray;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.unit.ByteSizeValue;
@@ -89,7 +90,6 @@ public class HttpReadWriteHandlerTests extends ESTestCase {
     private final ResponseDecoder responseDecoder = new ResponseDecoder();

     @Before
-    @SuppressWarnings("unchecked")
     public void setMocks() {
         transport = mock(NioHttpServerTransport.class);
         Settings settings = Settings.EMPTY;
View File
@@ -95,7 +95,6 @@ public class SimpleNioTransportTests extends AbstractSimpleTransportTestCase {
     @Override
     protected void closeConnectionChannel(Transport transport, Transport.Connection connection) throws IOException {
-        @SuppressWarnings("unchecked")
         TcpTransport.NodeChannels channels = (TcpTransport.NodeChannels) connection;
         CloseableChannel.closeChannels(channels.getChannels().subList(0, randomIntBetween(1, channels.getChannels().size())), true);
     }
View File
@@ -47,6 +47,8 @@ import static org.hamcrest.Matchers.notNullValue;
  * In depth testing of the recovery mechanism during a rolling restart.
  */
 public class RecoveryIT extends AbstractRollingTestCase {
+
+    @AwaitsFix(bugUrl = "https://github.com/elastic/elasticsearch/issues/31291")
     public void testHistoryUUIDIsGenerated() throws Exception {
         final String index = "index_history_uuid";
         if (CLUSTER_TYPE == ClusterType.OLD) {
View File
@@ -129,7 +129,7 @@ public abstract class ArchiveTestCase extends PackagingTestCase {
         });

         Platforms.onLinux(() -> {
-            final String javaPath = sh.run("which java").stdout.trim();
+            final String javaPath = sh.run("command -v java").stdout.trim();

             try {
                 sh.run("chmod -x '" + javaPath + "'");
View File
@@ -30,16 +30,20 @@ import org.junit.BeforeClass;

 import java.io.IOException;
 import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
 import java.util.regex.Matcher;
 import java.util.regex.Pattern;

 import static org.elasticsearch.packaging.util.Cleanup.cleanEverything;
 import static org.elasticsearch.packaging.util.FileUtils.assertPathsDontExist;
+import static org.elasticsearch.packaging.util.FileUtils.mv;
 import static org.elasticsearch.packaging.util.Packages.SYSTEMD_SERVICE;
 import static org.elasticsearch.packaging.util.Packages.assertInstalled;
 import static org.elasticsearch.packaging.util.Packages.assertRemoved;
 import static org.elasticsearch.packaging.util.Packages.install;
 import static org.elasticsearch.packaging.util.Packages.remove;
+import static org.elasticsearch.packaging.util.Packages.runInstallCommand;
 import static org.elasticsearch.packaging.util.Packages.startElasticsearch;
 import static org.elasticsearch.packaging.util.Packages.verifyPackageInstallation;
 import static org.elasticsearch.packaging.util.Platforms.getOsRelease;
@@ -75,6 +79,21 @@ public abstract class PackageTestCase extends PackagingTestCase {
         assumeTrue("only compatible distributions", distribution().packaging.compatible);
     }

+    public void test05InstallFailsWhenJavaMissing() {
+        final Shell sh = new Shell();
+        final Result java = sh.run("command -v java");
+
+        final Path originalJavaPath = Paths.get(java.stdout.trim());
+        final Path relocatedJavaPath = originalJavaPath.getParent().resolve("java.relocated");
+        try {
+            mv(originalJavaPath, relocatedJavaPath);
+            final Result installResult = runInstallCommand(distribution());
+            assertThat(installResult.exitCode, is(1));
+        } finally {
+            mv(relocatedJavaPath, originalJavaPath);
+        }
+    }
+
     public void test10InstallPackage() {
         assertRemoved(distribution());
         installation = install(distribution());
View File
@@ -67,7 +67,10 @@ public class Packages {
         Platforms.onDPKG(() -> {
             assertThat(status.exitCode, anyOf(is(0), is(1)));
             if (status.exitCode == 0) {
-                assertTrue(Pattern.compile("(?m)^Status:.+deinstall ok").matcher(status.stdout).find());
+                assertTrue("an uninstalled status should be indicated: " + status.stdout,
+                    Pattern.compile("(?m)^Status:.+deinstall ok").matcher(status.stdout).find() ||
+                    Pattern.compile("(?m)^Status:.+ok not-installed").matcher(status.stdout).find()
+                );
             }
         });
     }
@@ -90,13 +93,27 @@ public class Packages {
     }

     public static Installation install(Distribution distribution, String version) {
+        final Result result = runInstallCommand(distribution, version);
+        if (result.exitCode != 0) {
+            throw new RuntimeException("Installing distribution " + distribution + " version " + version + " failed: " + result);
+        }
+
+        return Installation.ofPackage(distribution.packaging);
+    }
+
+    public static Result runInstallCommand(Distribution distribution) {
+        return runInstallCommand(distribution, getCurrentVersion());
+    }
+
+    public static Result runInstallCommand(Distribution distribution, String version) {
         final Shell sh = new Shell();
         final Path distributionFile = getDistributionFile(distribution, version);

-        Platforms.onRPM(() -> sh.run("rpm -i " + distributionFile));
-        Platforms.onDPKG(() -> sh.run("dpkg -i " + distributionFile));
-
-        return Installation.ofPackage(distribution.packaging);
+        if (Platforms.isRPM()) {
+            return sh.runIgnoreExitCode("rpm -i " + distributionFile);
+        } else {
+            return sh.runIgnoreExitCode("dpkg -i " + distributionFile);
+        }
     }

     public static void remove(Distribution distribution) {
View File
@@ -0,0 +1,74 @@
+setup:
+  - skip:
+      version: " - 6.3.99"
+      reason: weighted_avg is only available as of 6.4.0
+
+  - do:
+      indices.create:
+        index: test_1
+        body:
+          settings:
+            number_of_replicas: 0
+          mappings:
+            doc:
+              properties:
+                int_field:
+                  type : integer
+                double_field:
+                  type : double
+                string_field:
+                  type: keyword
+
+  - do:
+      bulk:
+        refresh: true
+        body:
+          - index:
+              _index: test_1
+              _type: doc
+              _id: 1
+          - int_field: 1
+            double_field: 1.0
+          - index:
+              _index: test_1
+              _type: doc
+              _id: 2
+          - int_field: 2
+            double_field: 2.0
+          - index:
+              _index: test_1
+              _type: doc
+              _id: 3
+          - int_field: 3
+            double_field: 3.0
+          - index:
+              _index: test_1
+              _type: doc
+              _id: 4
+          - int_field: 4
+            double_field: 4.0
+
+---
+"Basic test":
+
+  - do:
+      search:
+        body:
+          aggs:
+            the_int_avg:
+              weighted_avg:
+                value:
+                  field: "int_field"
+                weight:
+                  field: "int_field"
+            the_double_avg:
+              weighted_avg:
+                value:
+                  field: "double_field"
+                weight:
+                  field: "double_field"
+
+  - match: { hits.total: 4 }
+  - length: { hits.hits: 4 }
+  - match: { aggregations.the_int_avg.value: 3.0 }
+  - match: { aggregations.the_double_avg.value: 3.0 }
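Note: the expected values follow directly from the definition: weighted_avg is sum(value * weight) / sum(weight), and with each field weighted by itself this is (1*1 + 2*2 + 3*3 + 4*4) / (1 + 2 + 3 + 4) = 30 / 10 = 3.0 for both fields. A one-line check:

    public class WeightedAvgCheck {
        public static void main(String[] args) {
            double num = 0, den = 0;
            for (double v : new double[] {1, 2, 3, 4}) {
                num += v * v; // value * weight, with weight == value
                den += v;
            }
            System.out.println(num / den); // 3.0
        }
    }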

View File

@@ -174,6 +174,8 @@ public class Version implements Comparable<Version>, ToXContentFragment {
     public static final Version V_6_3_1 = new Version(V_6_3_1_ID, org.apache.lucene.util.Version.LUCENE_7_3_1);
     public static final int V_6_3_2_ID = 6030299;
     public static final Version V_6_3_2 = new Version(V_6_3_2_ID, org.apache.lucene.util.Version.LUCENE_7_3_1);
+    public static final int V_6_3_3_ID = 6030399;
+    public static final Version V_6_3_3 = new Version(V_6_3_3_ID, org.apache.lucene.util.Version.LUCENE_7_3_1);
     public static final int V_6_4_0_ID = 6040099;
     public static final Version V_6_4_0 = new Version(V_6_4_0_ID, org.apache.lucene.util.Version.LUCENE_7_4_0);
     public static final int V_7_0_0_alpha1_ID = 7000001;
@@ -196,6 +198,8 @@ public class Version implements Comparable<Version>, ToXContentFragment {
                 return V_7_0_0_alpha1;
             case V_6_4_0_ID:
                 return V_6_4_0;
+            case V_6_3_3_ID:
+                return V_6_3_3;
             case V_6_3_2_ID:
                 return V_6_3_2;
             case V_6_3_1_ID:
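Note: the numeric IDs pack major/minor/revision/build into decimal fields, which is why 6.3.3 gets 6030399: build 99 marks a release, while lower build numbers mark pre-releases (7000001 for 7.0.0-alpha1 above). A decoding sketch:

    public class VersionIdDecode {
        public static void main(String[] args) {
            int id = 6030399; // V_6_3_3_ID
            System.out.printf("%d.%d.%d (build %d)%n",
                id / 1000000, (id / 10000) % 100, (id / 100) % 100, id % 100); // 6.3.3 (build 99)
        }
    }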

View File

@@ -48,7 +48,6 @@ public class ClusterGetSettingsResponse extends ActionResponse implements ToXCon
     static final String TRANSIENT_FIELD = "transient";
     static final String DEFAULTS_FIELD = "defaults";

-    @SuppressWarnings("unchecked")
     private static final ConstructingObjectParser<ClusterGetSettingsResponse, Void> PARSER =
         new ConstructingObjectParser<>(
             "cluster_get_settings_response",
View File
@@ -27,14 +27,17 @@ import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.xcontent.ToXContentObject;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentType;

 import java.io.IOException;
 import java.util.ArrayList;
+import java.util.Arrays;
 import java.util.List;
 import java.util.Map;
+import java.util.Objects;

 import static org.elasticsearch.action.ValidateActions.addValidationError;
 import static org.elasticsearch.common.settings.Settings.readSettingsFromStream;
@@ -45,7 +48,7 @@ import static org.elasticsearch.common.xcontent.support.XContentMapValues.nodeBo
 /**
  * Restore snapshot request
  */
-public class RestoreSnapshotRequest extends MasterNodeRequest<RestoreSnapshotRequest> {
+public class RestoreSnapshotRequest extends MasterNodeRequest<RestoreSnapshotRequest> implements ToXContentObject {

     private String snapshot;
     private String repository;
@@ -563,6 +566,49 @@ public class RestoreSnapshotRequest extends MasterNodeRequest<RestoreSnapshotReq
         return this;
     }

+    @Override
+    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+        builder.startObject();
+        builder.startArray("indices");
+        for (String index : indices) {
+            builder.value(index);
+        }
+        builder.endArray();
+        if (indicesOptions != null) {
+            indicesOptions.toXContent(builder, params);
+        }
+        if (renamePattern != null) {
+            builder.field("rename_pattern", renamePattern);
+        }
+        if (renameReplacement != null) {
+            builder.field("rename_replacement", renameReplacement);
+        }
+        builder.field("include_global_state", includeGlobalState);
+        builder.field("partial", partial);
+        builder.field("include_aliases", includeAliases);
+        if (settings != null) {
+            builder.startObject("settings");
+            if (settings.isEmpty() == false) {
+                settings.toXContent(builder, params);
+            }
+            builder.endObject();
+        }
+        if (indexSettings != null) {
+            builder.startObject("index_settings");
+            if (indexSettings.isEmpty() == false) {
+                indexSettings.toXContent(builder, params);
+            }
+            builder.endObject();
+        }
+        builder.startArray("ignore_index_settings");
+        for (String ignoreIndexSetting : ignoreIndexSettings) {
+            builder.value(ignoreIndexSetting);
+        }
+        builder.endArray();
+        builder.endObject();
+        return builder;
+    }
+
     @Override
     public void readFrom(StreamInput in) throws IOException {
         throw new UnsupportedOperationException("usage of Streamable is to be replaced by Writeable");
@@ -573,4 +619,37 @@ public class RestoreSnapshotRequest extends MasterNodeRequest<RestoreSnapshotReq
         return "snapshot [" + repository + ":" + snapshot + "]";
     }

+    @Override
+    public boolean equals(Object o) {
+        if (this == o) return true;
+        if (o == null || getClass() != o.getClass()) return false;
+        RestoreSnapshotRequest that = (RestoreSnapshotRequest) o;
+        return waitForCompletion == that.waitForCompletion &&
+            includeGlobalState == that.includeGlobalState &&
+            partial == that.partial &&
+            includeAliases == that.includeAliases &&
+            Objects.equals(snapshot, that.snapshot) &&
+            Objects.equals(repository, that.repository) &&
+            Arrays.equals(indices, that.indices) &&
+            Objects.equals(indicesOptions, that.indicesOptions) &&
+            Objects.equals(renamePattern, that.renamePattern) &&
+            Objects.equals(renameReplacement, that.renameReplacement) &&
+            Objects.equals(settings, that.settings) &&
+            Objects.equals(indexSettings, that.indexSettings) &&
+            Arrays.equals(ignoreIndexSettings, that.ignoreIndexSettings);
+    }
+
+    @Override
+    public int hashCode() {
+        int result = Objects.hash(snapshot, repository, indicesOptions, renamePattern, renameReplacement, waitForCompletion,
+            includeGlobalState, partial, includeAliases, settings, indexSettings);
+        result = 31 * result + Arrays.hashCode(indices);
+        result = 31 * result + Arrays.hashCode(ignoreIndexSettings);
+        return result;
+    }
+
+    @Override
+    public String toString() {
+        return Strings.toString(this);
+    }
 }
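Note: equals falls back to Arrays.equals for the two array fields because Objects.equals compares arrays by reference identity, and hashCode likewise mixes in Arrays.hashCode manually, since passing an array to Objects.hash would hash the reference. A tiny demonstration:

    import java.util.Arrays;
    import java.util.Objects;

    public class ArrayEqualityDemo {
        public static void main(String[] args) {
            String[] a = {"idx-1"};
            String[] b = {"idx-1"};
            System.out.println(Objects.equals(a, b)); // false: identity comparison
            System.out.println(Arrays.equals(a, b));  // true: element-wise comparison
        }
    }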

View File

@@ -21,15 +21,21 @@ package org.elasticsearch.action.admin.cluster.snapshots.restore;

 import org.elasticsearch.action.ActionResponse;
 import org.elasticsearch.common.Nullable;
+import org.elasticsearch.common.ParseField;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.common.xcontent.ConstructingObjectParser;
 import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.ToXContentObject;
 import org.elasticsearch.common.xcontent.XContentBuilder;
+import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.rest.RestStatus;
 import org.elasticsearch.snapshots.RestoreInfo;

 import java.io.IOException;
+import java.util.Objects;
+
+import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg;

 /**
  * Contains information about restores snapshot
@@ -86,4 +92,42 @@ public class RestoreSnapshotResponse extends ActionResponse implements ToXConten
         builder.endObject();
         return builder;
     }

+    public static final ConstructingObjectParser<RestoreSnapshotResponse, Void> PARSER = new ConstructingObjectParser<>(
+        "restore_snapshot", true, v -> {
+            RestoreInfo restoreInfo = (RestoreInfo) v[0];
+            Boolean accepted = (Boolean) v[1];
+            assert (accepted == null && restoreInfo != null) ||
+                   (accepted != null && accepted && restoreInfo == null) :
+                "accepted: [" + accepted + "], restoreInfo: [" + restoreInfo + "]";
+            return new RestoreSnapshotResponse(restoreInfo);
+        });
+
+    static {
+        PARSER.declareObject(optionalConstructorArg(), (parser, context) -> RestoreInfo.fromXContent(parser), new ParseField("snapshot"));
+        PARSER.declareBoolean(optionalConstructorArg(), new ParseField("accepted"));
+    }
+
+    public static RestoreSnapshotResponse fromXContent(XContentParser parser) throws IOException {
+        return PARSER.parse(parser, null);
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) return true;
+        if (o == null || getClass() != o.getClass()) return false;
+        RestoreSnapshotResponse that = (RestoreSnapshotResponse) o;
+        return Objects.equals(restoreInfo, that.restoreInfo);
+    }
+
+    @Override
+    public int hashCode() {
+        return Objects.hash(restoreInfo);
+    }
+
+    @Override
+    public String toString() {
+        return "RestoreSnapshotResponse{" + "restoreInfo=" + restoreInfo + '}';
+    }
 }
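Note: PARSER is lenient (the `true` argument), so unknown response fields are skipped, and both constructor arguments are optional because the server returns either a "snapshot" object or "accepted": true. A minimal usage sketch, assuming the standard 6.x XContent parser-construction helpers (they are not part of this diff):

    import org.elasticsearch.common.xcontent.DeprecationHandler;
    import org.elasticsearch.common.xcontent.NamedXContentRegistry;
    import org.elasticsearch.common.xcontent.XContentParser;
    import org.elasticsearch.common.xcontent.XContentType;

    public class ParseRestoreResponse {
        public static void main(String[] args) throws Exception {
            // parse the "accepted" shape of the response body
            try (XContentParser parser = XContentType.JSON.xContent().createParser(
                    NamedXContentRegistry.EMPTY, DeprecationHandler.THROW_UNSUPPORTED_OPERATION,
                    "{\"accepted\": true}")) {
                System.out.println(RestoreSnapshotResponse.fromXContent(parser));
                // RestoreSnapshotResponse{restoreInfo=null}
            }
        }
    }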

View File

@@ -45,7 +45,6 @@ public class QueryExplanation implements Streamable, ToXContentFragment {

     public static final int RANDOM_SHARD = -1;

-    @SuppressWarnings("unchecked")
     static ConstructingObjectParser<QueryExplanation, Void> PARSER = new ConstructingObjectParser<>(
         "query_explanation",
         true,
View File
@@ -129,7 +129,6 @@ public class GetResponse extends ActionResponse implements Iterable<DocumentFiel
     /**
      * The source of the document (As a map).
      */
-    @SuppressWarnings({"unchecked"})
     public Map<String, Object> getSourceAsMap() throws ElasticsearchParseException {
         return getResult.sourceAsMap();
     }
View File
@@ -20,7 +20,6 @@
 package org.elasticsearch.action.get;

 import org.apache.logging.log4j.message.ParameterizedMessage;
-import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.action.support.ActionFilters;
 import org.elasticsearch.action.support.TransportActions;
 import org.elasticsearch.action.support.single.shard.TransportSingleShardAction;
@@ -90,9 +89,9 @@ public class TransportShardMultiGetAction extends TransportSingleShardAction<Mul
                 GetResult getResult = indexShard.getService().get(item.type(), item.id(), item.storedFields(), request.realtime(), item.version(),
                     item.versionType(), item.fetchSourceContext());
                 response.add(request.locations.get(i), new GetResponse(getResult));
-            } catch (Exception e) {
+            } catch (RuntimeException e) {
                 if (TransportActions.isShardNotAvailableException(e)) {
-                    throw (ElasticsearchException) e;
+                    throw e;
                 } else {
                     logger.debug(() -> new ParameterizedMessage("{} failed to execute multi_get for [{}]/[{}]", shardId,
                         item.type(), item.id()), e);
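Note: this is the "CCE when re-throwing 'shard not available' exception" fix from the merge list. The old code cast whatever it caught to ElasticsearchException before rethrowing, so a plain RuntimeException that still satisfied isShardNotAvailableException blew up on the cast and masked the real failure; rethrowing `e` unchanged avoids the cast entirely. A self-contained illustration with a stand-in exception type:

    public class CastHazardDemo {
        static class ElasticsearchException extends RuntimeException {} // stand-in, not the real class

        public static void main(String[] args) {
            try {
                try {
                    throw new IllegalStateException("node closed"); // shard-not-available style failure
                } catch (RuntimeException e) {
                    throw (ElasticsearchException) e; // the old pattern: compiles, fails at runtime
                }
            } catch (ClassCastException cce) {
                System.out.println("got a ClassCastException instead of the real error");
            }
        }
    }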

View File

@@ -32,8 +32,8 @@ import org.elasticsearch.ingest.IngestDocument;

 import java.io.IOException;

-import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg;
 import static org.elasticsearch.common.xcontent.ConstructingObjectParser.constructorArg;
+import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg;

 public class SimulateProcessorResult implements Writeable, ToXContentObject {
@@ -42,7 +42,6 @@ public class SimulateProcessorResult implements Writeable, ToXContentObject {
     private final WriteableIngestDocument ingestDocument;
     private final Exception failure;

-    @SuppressWarnings("unchecked")
     private static final ConstructingObjectParser<ElasticsearchException, Void> IGNORED_ERROR_PARSER =
         new ConstructingObjectParser<>(
             "ignored_error_parser",
@@ -57,7 +56,6 @@ public class SimulateProcessorResult implements Writeable, ToXContentObject {
         );
     }

-    @SuppressWarnings("unchecked")
     public static final ConstructingObjectParser<SimulateProcessorResult, Void> PARSER =
         new ConstructingObjectParser<>(
             "simulate_processor_result",
View File
@@ -94,7 +94,6 @@ final class WriteableIngestDocument implements Writeable, ToXContentFragment {
         );
     }

-    @SuppressWarnings("unchecked")
     public static final ConstructingObjectParser<WriteableIngestDocument, Void> PARSER =
         new ConstructingObjectParser<>(
             "writeable_ingest_document",
View File
@@ -20,6 +20,7 @@
 package org.elasticsearch.action.support;

 import com.carrotsearch.hppc.cursors.IntObjectCursor;
+
 import org.elasticsearch.cluster.ClusterState;
 import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.cluster.routing.IndexRoutingTable;
@@ -205,7 +206,7 @@ public final class ActiveShardCount implements Writeable {
         if (o == null || getClass() != o.getClass()) {
             return false;
         }
-        @SuppressWarnings("unchecked") ActiveShardCount that = (ActiveShardCount) o;
+        ActiveShardCount that = (ActiveShardCount) o;
         return value == that.value;
     }
View File
@@ -72,7 +72,6 @@ public abstract class ReplicationRequestBuilder<Request extends ReplicationReque
      * shard count is passed in, instead of having to first call {@link ActiveShardCount#from(int)}
      * to get the ActiveShardCount.
      */
-    @SuppressWarnings("unchecked")
     public RequestBuilder setWaitForActiveShards(final int waitForActiveShards) {
         return setWaitForActiveShards(ActiveShardCount.from(waitForActiveShards));
     }
View File
@@ -20,7 +20,6 @@
 package org.elasticsearch.action.termvectors;

 import org.apache.logging.log4j.message.ParameterizedMessage;
-import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.action.support.ActionFilters;
 import org.elasticsearch.action.support.TransportActions;
 import org.elasticsearch.action.support.single.shard.TransportSingleShardAction;
@@ -84,13 +83,13 @@ public class TransportShardMultiTermsVectorAction extends TransportSingleShardAc
             try {
                 TermVectorsResponse termVectorsResponse = TermVectorsService.getTermVectors(indexShard, termVectorsRequest);
                 response.add(request.locations.get(i), termVectorsResponse);
-            } catch (Exception t) {
-                if (TransportActions.isShardNotAvailableException(t)) {
-                    throw (ElasticsearchException) t;
+            } catch (RuntimeException e) {
+                if (TransportActions.isShardNotAvailableException(e)) {
+                    throw e;
                 } else {
-                    logger.debug(() -> new ParameterizedMessage("{} failed to execute multi term vectors for [{}]/[{}]", shardId, termVectorsRequest.type(), termVectorsRequest.id()), t);
+                    logger.debug(() -> new ParameterizedMessage("{} failed to execute multi term vectors for [{}]/[{}]", shardId, termVectorsRequest.type(), termVectorsRequest.id()), e);
                     response.add(request.locations.get(i),
-                        new MultiTermVectorsResponse.Failure(request.index(), termVectorsRequest.type(), termVectorsRequest.id(), t));
+                        new MultiTermVectorsResponse.Failure(request.index(), termVectorsRequest.type(), termVectorsRequest.id(), e));
                 }
             }
         }
View File
@@ -20,6 +20,7 @@
 package org.elasticsearch.cluster;

 import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
+
 import org.elasticsearch.Version;
 import org.elasticsearch.cluster.ClusterState.Custom;
 import org.elasticsearch.common.collect.ImmutableOpenMap;
@@ -165,7 +166,7 @@ public class RestoreInProgress extends AbstractNamedDiffable<Custom> implements
         if (o == null || getClass() != o.getClass()) {
             return false;
         }
-        @SuppressWarnings("unchecked") Entry entry = (Entry) o;
+        Entry entry = (Entry) o;
         return snapshot.equals(entry.snapshot) &&
                    state == entry.state &&
                    indices.equals(entry.indices) &&
@@ -291,7 +292,7 @@ public class RestoreInProgress extends AbstractNamedDiffable<Custom> implements
             return false;
         }
-        @SuppressWarnings("unchecked") ShardRestoreStatus status = (ShardRestoreStatus) o;
+        ShardRestoreStatus status = (ShardRestoreStatus) o;
         return state == status.state &&
                    Objects.equals(nodeId, status.nodeId) &&
                    Objects.equals(reason, status.reason);
View File
@@ -161,7 +161,6 @@ public final class IndexGraveyard implements MetaData.Custom {
     }

     @Override
-    @SuppressWarnings("unchecked")
     public Diff<MetaData.Custom> diff(final MetaData.Custom previous) {
         return new IndexGraveyardDiff((IndexGraveyard) previous, this);
     }
@@ -321,7 +320,7 @@ public final class IndexGraveyard implements MetaData.Custom {
         @Override
         public IndexGraveyard apply(final MetaData.Custom previous) {
-            @SuppressWarnings("unchecked") final IndexGraveyard old = (IndexGraveyard) previous;
+            final IndexGraveyard old = (IndexGraveyard) previous;
             if (removedCount > old.tombstones.size()) {
                 throw new IllegalStateException("IndexGraveyardDiff cannot remove [" + removedCount + "] entries from [" +
                     old.tombstones.size() + "] tombstones.");
@@ -416,7 +415,7 @@ public final class IndexGraveyard implements MetaData.Custom {
         if (other == null || getClass() != other.getClass()) {
             return false;
         }
-        @SuppressWarnings("unchecked") Tombstone that = (Tombstone) other;
+        Tombstone that = (Tombstone) other;
         return index.equals(that.index) && deleteDateInMillis == that.deleteDateInMillis;
     }
View File
@@ -23,6 +23,7 @@ import com.carrotsearch.hppc.LongArrayList;
 import com.carrotsearch.hppc.cursors.IntObjectCursor;
 import com.carrotsearch.hppc.cursors.ObjectCursor;
 import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
+
 import org.elasticsearch.Version;
 import org.elasticsearch.action.admin.indices.rollover.RolloverInfo;
 import org.elasticsearch.action.support.ActiveShardCount;
@@ -685,7 +686,6 @@ public class IndexMetaData implements Diffable<IndexMetaData>, ToXContentFragmen
             return lookupPrototypeSafe(key).readFrom(in);
         }

-        @SuppressWarnings("unchecked")
         @Override
         public Diff<Custom> readDiff(StreamInput in, String key) throws IOException {
             return lookupPrototypeSafe(key).readDiffFrom(in);
View File
@@ -381,7 +381,6 @@ public class IndexTemplateMetaData extends AbstractDiffable<IndexTemplateMetaDat
                 aliases.build(), customs.build());
         }

-        @SuppressWarnings("unchecked")
         public static void toXContent(IndexTemplateMetaData indexTemplateMetaData, XContentBuilder builder, ToXContent.Params params)
             throws IOException {
             builder.startObject(indexTemplateMetaData.name());
View File
@@ -22,6 +22,7 @@ package org.elasticsearch.cluster.metadata;
 import com.carrotsearch.hppc.ObjectHashSet;
 import com.carrotsearch.hppc.cursors.ObjectCursor;
 import com.carrotsearch.hppc.cursors.ObjectObjectCursor;
+
 import org.apache.logging.log4j.Logger;
 import org.apache.lucene.util.CollectionUtil;
 import org.elasticsearch.action.AliasesRequest;
@@ -169,7 +170,6 @@ public class MetaData implements Iterable<IndexMetaData>, Diffable<MetaData>, To
     private final SortedMap<String, AliasOrIndex> aliasAndIndexLookup;

-    @SuppressWarnings("unchecked")
     MetaData(String clusterUUID, long version, Settings transientSettings, Settings persistentSettings,
              ImmutableOpenMap<String, IndexMetaData> indices, ImmutableOpenMap<String, IndexTemplateMetaData> templates,
              ImmutableOpenMap<String, Custom> customs, String[] allIndices, String[] allOpenIndices, String[] allClosedIndices,
@@ -1000,7 +1000,7 @@ public class MetaData implements Iterable<IndexMetaData>, Diffable<MetaData>, To
     }

     public IndexGraveyard indexGraveyard() {
-        @SuppressWarnings("unchecked") IndexGraveyard graveyard = (IndexGraveyard) getCustom(IndexGraveyard.TYPE);
+        IndexGraveyard graveyard = (IndexGraveyard) getCustom(IndexGraveyard.TYPE);
         return graveyard;
     }
View File
@@ -217,7 +217,7 @@ public abstract class RecoverySource implements Writeable, ToXContentObject {
             return false;
         }

-        @SuppressWarnings("unchecked") SnapshotRecoverySource that = (SnapshotRecoverySource) o;
+        SnapshotRecoverySource that = (SnapshotRecoverySource) o;
         return snapshot.equals(that.snapshot) && index.equals(that.index) && version.equals(that.version);
     }
View File
@@ -175,7 +175,7 @@ public abstract class AbstractAllocationDecision implements ToXContentFragment,
         if (other == null || other instanceof AbstractAllocationDecision == false) {
             return false;
         }
-        @SuppressWarnings("unchecked") AbstractAllocationDecision that = (AbstractAllocationDecision) other;
+        AbstractAllocationDecision that = (AbstractAllocationDecision) other;
         return Objects.equals(targetNode, that.targetNode) && Objects.equals(nodeDecisions, that.nodeDecisions);
     }
View File
@@ -316,7 +316,7 @@ public class AllocateUnassignedDecision extends AbstractAllocationDecision {
         if (other instanceof AllocateUnassignedDecision == false) {
             return false;
         }
-        @SuppressWarnings("unchecked") AllocateUnassignedDecision that = (AllocateUnassignedDecision) other;
+        AllocateUnassignedDecision that = (AllocateUnassignedDecision) other;
         return Objects.equals(allocationStatus, that.allocationStatus)
             && Objects.equals(allocationId, that.allocationId)
             && reuseStore == that.reuseStore
View File
@@ -300,7 +300,7 @@ public final class MoveDecision extends AbstractAllocationDecision {
         if (other instanceof MoveDecision == false) {
             return false;
         }
-        @SuppressWarnings("unchecked") MoveDecision that = (MoveDecision) other;
+        MoveDecision that = (MoveDecision) other;
         return Objects.equals(allocationDecision, that.allocationDecision)
             && Objects.equals(canRemainDecision, that.canRemainDecision)
             && Objects.equals(clusterRebalanceDecision, that.clusterRebalanceDecision)

Some files were not shown because too many files have changed in this diff.