Merge branch 'master' into feature/sql
Original commit: elastic/x-pack-elasticsearch@270ab71b19
This commit is contained in: bac9afee7e
@@ -73,6 +73,14 @@ case $key in
"-x:x-pack-elasticsearch:qa:sql:forbiddenPatterns"
"-x:x-pack-elasticsearch:qa:sql:no-security:forbiddenPatterns"
"-x:x-pack-elasticsearch:qa:sql:security:forbiddenPatterns"
releaseTest)
GRADLE_CLI_ARGS=(
"--info"
"check"
"-Dtests.network=true"
"-Dtests.badapples=true"
"-Dbuild.snapshot=false"
"-Dtests.jvm.argline=-Dbuild.snapshot=false"
)
;;
jdk9)
@@ -0,0 +1,16 @@
[role="xpack"]
[[xpack-commands]]
= {xpack} Commands

[partintro]
--

{xpack} includes commands that help you configure security:

//* <<certgen>>
//* <<setup-passwords>>
* <<users-command>>

--

include::users-command.asciidoc[]
@@ -0,0 +1,138 @@
[role="xpack"]
[[users-command]]
== Users Command
++++
<titleabbrev>users</titleabbrev>
++++

If you use file-based user authentication, the `users` command enables you to
add and remove users, assign user roles, and manage passwords.

[float]
=== Synopsis

[source,shell]
--------------------------------------------------
bin/x-pack/users
([useradd <username>] [-p <password>] [-r <roles>]) |
([list] <username>) |
([passwd <username>] [-p <password>]) |
([roles <username>] [-a <roles>] [-r <roles>]) |
([userdel <username>])
--------------------------------------------------

[float]
=== Description

If you use the built-in `file` internal realm, users are defined in local files
on each node in the cluster.

Usernames and roles must be at least 1 and no more than 1024 characters. They
can contain alphanumeric characters (`a-z`, `A-Z`, `0-9`), spaces, punctuation,
and printable symbols in the
https://en.wikipedia.org/wiki/Basic_Latin_(Unicode_block)[Basic Latin (ASCII) block].
Leading or trailing whitespace is not allowed.

Passwords must be at least 6 characters long.

For more information, see {xpack-ref}/file-realm.html[File-based User Authentication].

TIP: To ensure that {es} can read the user and role information at startup, run
`users useradd` as the same user you use to run {es}. Running the command as
root or some other user updates the permissions for the `users` and `users_roles`
files and prevents {es} from accessing them.

[float]
=== Parameters

`-a <roles>`:: If used with the `roles` parameter, adds a comma-separated list
of roles to a user.

//`-h, --help`:: Returns all of the command parameters.

`list`:: Lists the users that are registered with the `file` realm
on the local node. If you also specify a user name, the command provides
information for that user.

`-p <password>`:: Specifies the user's password. If you do not specify this
parameter, the command prompts you for the password.
+
--
TIP: Omit the `-p` option to keep
plaintext passwords out of the terminal session's command history.

--

`passwd <username>`:: Resets a user's password. You can specify the new
password directly with the `-p` parameter.

`-r <roles>`::
* If used with the `useradd` parameter, defines a user's roles. This option
accepts a comma-separated list of role names to assign to the user.
* If used with the `roles` parameter, removes a comma-separated list of roles
from a user.

`roles`:: Manages the roles of a particular user. You can combine adding and
removing roles within the same command to change a user's roles.

//`-s, --silent`:: Shows minimal output.

`useradd <username>`:: Adds a user to your local node.

`userdel <username>`:: Deletes a user from your local node.

//`-v, --verbose`:: Shows verbose output.

//[float]
//=== Authorization

[float]
=== Examples

The following example adds a new user named `jacknich` to the `file` realm. The
password for this user is `theshining`, and this user is associated with the
`network` and `monitoring` roles.

[source,shell]
-------------------------------------------------------------------
bin/x-pack/users useradd jacknich -p theshining -r network,monitoring
-------------------------------------------------------------------

The following example lists the users that are registered with the `file` realm
on the local node:

[source,shell]
----------------------------------
bin/x-pack/users list
rdeniro : admin
alpacino : power_user
jacknich : monitoring,network
----------------------------------

Users are in the left-hand column and their corresponding roles are listed in
the right-hand column.

The following example resets the `jacknich` user's password:

[source,shell]
--------------------------------------------------
bin/x-pack/users passwd jacknich
--------------------------------------------------

Since the `-p` parameter was omitted, the command prompts you to enter and
confirm a password in interactive mode.

The following example removes the `network` and `monitoring` roles from the
`jacknich` user and adds the `user` role:

[source,shell]
------------------------------------------------------------
bin/x-pack/users roles jacknich -r network,monitoring -a user
------------------------------------------------------------

The following example deletes the `jacknich` user:

[source,shell]
--------------------------------------------------
bin/x-pack/users userdel jacknich
--------------------------------------------------
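The username and password rules documented above (1 to 1024 printable Basic Latin characters, no leading or trailing whitespace, passwords of at least 6 characters) translate into a simple validation routine. The following is a hypothetical Java sketch of those rules, not the actual implementation used by the `users` tool:

```java
// Hypothetical sketch of the documented file-realm username/password rules;
// illustrative only, not the actual x-pack implementation.
public class UserValidation {

    // Usernames and roles: 1-1024 printable Basic Latin (ASCII) characters,
    // with no leading or trailing whitespace.
    static boolean isValidUsername(String name) {
        if (name == null || name.isEmpty() || name.length() > 1024) {
            return false;
        }
        if (!name.equals(name.trim())) {
            return false; // leading or trailing whitespace is not allowed
        }
        for (int i = 0; i < name.length(); i++) {
            char c = name.charAt(i);
            if (c < 0x20 || c > 0x7E) {
                return false; // outside the printable ASCII range
            }
        }
        return true;
    }

    // Passwords must be at least 6 characters long.
    static boolean isValidPassword(String password) {
        return password != null && password.length() >= 6;
    }

    public static void main(String[] args) {
        System.out.println(isValidUsername("jacknich"));   // true
        System.out.println(isValidUsername(" jacknich"));  // false
        System.out.println(isValidPassword("short"));      // false
    }
}
```

Note that interior spaces pass the check (the docs allow them); only leading and trailing whitespace is rejected, via the `trim()` comparison.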
@@ -21,9 +21,11 @@ include::{es-repo-dir}/reference/index-shared2.asciidoc[]
include::sql/index.asciidoc[]
include::rest-api/index.asciidoc[]

# NOCOMMIT before merging SQL we should make a index-shared4 in core and
# put index-shared3 between sql and rest-api.

:edit_url!:
include::commands/index.asciidoc[]

:edit_url:
include::{es-repo-dir}/reference/index-shared3.asciidoc[]
@@ -111,7 +111,7 @@ S3Object getZip() {
return client.getObject('prelert-artifacts', key)
} catch (AmazonServiceException e) {
if (e.getStatusCode() != 403) {
-    throw new GradleException('Error while trying to get ml-cpp snapshot', e)
+    throw new GradleException('Error while trying to get ml-cpp snapshot: ' + e.getMessage(), e)
}
sleep(500)
retries--
@@ -306,7 +306,7 @@ public class XPackPlugin extends Plugin implements ScriptPlugin, ActionPlugin, I
components.add(licenseState);

try {
-    components.addAll(security.createComponents(internalClient, threadPool, clusterService, resourceWatcherService,
+    components.addAll(security.createComponents(client, threadPool, clusterService, resourceWatcherService,
        extensionsService.getExtensions()));
} catch (final Exception e) {
    throw new IllegalStateException("security initialization failed", e);
@@ -342,7 +342,7 @@ public class XPackPlugin extends Plugin implements ScriptPlugin, ActionPlugin, I

components.addAll(logstash.createComponents(internalClient, clusterService));

-components.addAll(upgrade.createComponents(internalClient, clusterService, threadPool, resourceWatcherService,
+components.addAll(upgrade.createComponents(client, clusterService, threadPool, resourceWatcherService,
    scriptService, xContentRegistry));

FilteredCatalog.Filter securityCatalogFilter = XPackSettings.SECURITY_ENABLED.get(settings) ?
@@ -44,6 +44,7 @@ import org.elasticsearch.xpack.ml.action.DeleteJobAction;
import org.elasticsearch.xpack.ml.action.DeleteModelSnapshotAction;
import org.elasticsearch.xpack.ml.action.FinalizeJobExecutionAction;
import org.elasticsearch.xpack.ml.action.FlushJobAction;
+import org.elasticsearch.xpack.ml.action.ForecastJobAction;
import org.elasticsearch.xpack.ml.action.GetBucketsAction;
import org.elasticsearch.xpack.ml.action.GetCategoriesAction;
import org.elasticsearch.xpack.ml.action.GetDatafeedsAction;
@@ -108,6 +109,7 @@ import org.elasticsearch.xpack.ml.rest.filter.RestPutFilterAction;
import org.elasticsearch.xpack.ml.rest.job.RestCloseJobAction;
import org.elasticsearch.xpack.ml.rest.job.RestDeleteJobAction;
import org.elasticsearch.xpack.ml.rest.job.RestFlushJobAction;
+import org.elasticsearch.xpack.ml.rest.job.RestForecastJobAction;
import org.elasticsearch.xpack.ml.rest.job.RestGetJobStatsAction;
import org.elasticsearch.xpack.ml.rest.job.RestGetJobsAction;
import org.elasticsearch.xpack.ml.rest.job.RestOpenJobAction;
@@ -383,7 +385,8 @@ public class MachineLearning implements ActionPlugin {
new RestStartDatafeedAction(settings, restController),
new RestStopDatafeedAction(settings, restController),
new RestDeleteModelSnapshotAction(settings, restController),
-new RestDeleteExpiredDataAction(settings, restController)
+new RestDeleteExpiredDataAction(settings, restController),
+new RestForecastJobAction(settings, restController)
);
}
@@ -431,7 +434,8 @@ public class MachineLearning implements ActionPlugin {
new ActionHandler<>(CompletionPersistentTaskAction.INSTANCE, CompletionPersistentTaskAction.TransportAction.class),
new ActionHandler<>(RemovePersistentTaskAction.INSTANCE, RemovePersistentTaskAction.TransportAction.class),
new ActionHandler<>(UpdateProcessAction.INSTANCE, UpdateProcessAction.TransportAction.class),
-new ActionHandler<>(DeleteExpiredDataAction.INSTANCE, DeleteExpiredDataAction.TransportAction.class)
+new ActionHandler<>(DeleteExpiredDataAction.INSTANCE, DeleteExpiredDataAction.TransportAction.class),
+new ActionHandler<>(ForecastJobAction.INSTANCE, ForecastJobAction.TransportAction.class)
);
}
@@ -0,0 +1,234 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
package org.elasticsearch.xpack.ml.action;

import org.elasticsearch.action.Action;
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.ActionRequestBuilder;
import org.elasticsearch.action.support.ActionFilters;
import org.elasticsearch.action.support.tasks.BaseTasksResponse;
import org.elasticsearch.client.ElasticsearchClient;
import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
import org.elasticsearch.cluster.service.ClusterService;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Writeable;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.ObjectParser;
import org.elasticsearch.common.xcontent.ToXContentObject;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.TransportService;
import org.elasticsearch.xpack.ml.job.config.Job;
import org.elasticsearch.xpack.ml.job.process.autodetect.AutodetectProcessManager;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;

import java.io.IOException;
import java.util.Objects;

import static org.elasticsearch.xpack.ml.action.ForecastJobAction.Request.END_TIME;

public class ForecastJobAction extends Action<ForecastJobAction.Request, ForecastJobAction.Response, ForecastJobAction.RequestBuilder> {

    public static final ForecastJobAction INSTANCE = new ForecastJobAction();
    public static final String NAME = "cluster:admin/xpack/ml/job/forecast";

    private ForecastJobAction() {
        super(NAME);
    }

    @Override
    public RequestBuilder newRequestBuilder(ElasticsearchClient client) {
        return new RequestBuilder(client, this);
    }

    @Override
    public Response newResponse() {
        return new Response();
    }

    public static class Request extends TransportJobTaskAction.JobTaskRequest<Request> implements ToXContentObject {

        public static final ParseField END_TIME = new ParseField("end");

        private static final ObjectParser<Request, Void> PARSER = new ObjectParser<>(NAME, Request::new);

        static {
            PARSER.declareString((request, jobId) -> request.jobId = jobId, Job.ID);
            PARSER.declareString(Request::setEndTime, END_TIME);
        }

        public static Request parseRequest(String jobId, XContentParser parser) {
            Request request = PARSER.apply(parser, null);
            if (jobId != null) {
                request.jobId = jobId;
            }
            return request;
        }

        private String endTime;

        Request() {
        }

        public Request(String jobId) {
            super(jobId);
        }

        public String getEndTime() {
            return endTime;
        }

        public void setEndTime(String endTime) {
            this.endTime = endTime;
        }

        @Override
        public void readFrom(StreamInput in) throws IOException {
            super.readFrom(in);
            this.endTime = in.readOptionalString();
        }

        @Override
        public void writeTo(StreamOutput out) throws IOException {
            super.writeTo(out);
            out.writeOptionalString(endTime);
        }

        @Override
        public int hashCode() {
            return Objects.hash(jobId, endTime);
        }

        @Override
        public boolean equals(Object obj) {
            if (this == obj) {
                return true;
            }
            if (obj == null || getClass() != obj.getClass()) {
                return false;
            }
            Request other = (Request) obj;
            return Objects.equals(jobId, other.jobId) && Objects.equals(endTime, other.endTime);
        }

        @Override
        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
            builder.startObject();
            builder.field(Job.ID.getPreferredName(), jobId);
            if (endTime != null) {
                builder.field(END_TIME.getPreferredName(), endTime);
            }
            builder.endObject();
            return builder;
        }
    }

    static class RequestBuilder extends ActionRequestBuilder<Request, Response, RequestBuilder> {

        RequestBuilder(ElasticsearchClient client, ForecastJobAction action) {
            super(client, action, new Request());
        }
    }

    public static class Response extends BaseTasksResponse implements Writeable, ToXContentObject {

        private boolean acknowledged;
        private long id;

        Response() {
            super(null, null);
        }

        Response(boolean acknowledged, long id) {
            super(null, null);
            this.acknowledged = acknowledged;
            this.id = id;
        }

        public boolean isAcknowledged() {
            return acknowledged;
        }

        @Override
        public void readFrom(StreamInput in) throws IOException {
            super.readFrom(in);
            acknowledged = in.readBoolean();
        }

        @Override
        public void writeTo(StreamOutput out) throws IOException {
            super.writeTo(out);
            out.writeBoolean(acknowledged);
        }

        @Override
        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
            builder.startObject();
            builder.field("acknowledged", acknowledged);
            builder.field("id", id);
            builder.endObject();
            return builder;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) {
                return true;
            }
            if (o == null || getClass() != o.getClass()) {
                return false;
            }
            Response response = (Response) o;
            return acknowledged == response.acknowledged;
        }

        @Override
        public int hashCode() {
            return Objects.hash(acknowledged);
        }
    }

    public static class TransportAction extends TransportJobTaskAction<Request, Response> {

        @Inject
        public TransportAction(Settings settings, TransportService transportService, ThreadPool threadPool, ClusterService clusterService,
                               ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver,
                               AutodetectProcessManager processManager) {
            super(settings, ForecastJobAction.NAME, threadPool, clusterService, transportService, actionFilters,
                    indexNameExpressionResolver, ForecastJobAction.Request::new, ForecastJobAction.Response::new, ThreadPool.Names.SAME,
                    processManager);
            // ThreadPool.Names.SAME, because the operation is executed by the
            // autodetect worker thread
        }

        @Override
        protected ForecastJobAction.Response readTaskResponse(StreamInput in) throws IOException {
            Response response = new Response();
            response.readFrom(in);
            return response;
        }

        @Override
        protected void taskOperation(Request request, OpenJobAction.JobTask task, ActionListener<Response> listener) {
            ForecastParams.Builder paramsBuilder = ForecastParams.builder();
            if (request.getEndTime() != null) {
                paramsBuilder.endTime(request.getEndTime(), END_TIME);
            }

            ForecastParams params = paramsBuilder.build();
            processManager.forecastJob(task, params, e -> {
                if (e == null) {
                    listener.onResponse(new Response(true, params.getId()));
                } else {
                    listener.onFailure(e);
                }
            });
        }
    }
}
@@ -17,6 +17,7 @@ import org.elasticsearch.xpack.ml.job.results.AnomalyRecord;
import org.elasticsearch.xpack.ml.job.results.Bucket;
import org.elasticsearch.xpack.ml.job.results.BucketInfluencer;
import org.elasticsearch.xpack.ml.job.results.CategoryDefinition;
+import org.elasticsearch.xpack.ml.job.results.Forecast;
import org.elasticsearch.xpack.ml.job.results.Influence;
import org.elasticsearch.xpack.ml.job.results.Influencer;
import org.elasticsearch.xpack.ml.job.results.ModelPlot;
@@ -289,6 +290,7 @@ public class ElasticsearchMappings {
    .field(TYPE, DOUBLE)
    .endObject();

+addForecastFieldsToMapping(builder);
addAnomalyRecordFieldsToMapping(builder);
addInfluencerFieldsToMapping(builder);
addModelSizeStatsFieldsToMapping(builder);
@@ -320,6 +322,24 @@ public class ElasticsearchMappings {
    }
}

private static void addForecastFieldsToMapping(XContentBuilder builder) throws IOException {

    // Forecast Output
    builder.startObject(Forecast.FORECAST_LOWER.getPreferredName())
            .field(TYPE, DOUBLE)
        .endObject()
        .startObject(Forecast.FORECAST_UPPER.getPreferredName())
            .field(TYPE, DOUBLE)
        .endObject()
        .startObject(Forecast.FORECAST_PREDICTION.getPreferredName())
            .field(TYPE, DOUBLE)
        .endObject()
        .startObject(Forecast.FORECAST_ID.getPreferredName())
            .field(TYPE, LONG)
        .endObject();
}

/**
 * AnomalyRecord fields to be added under the 'properties' section of the mapping
 * @param builder Add properties to this builder
@@ -29,6 +29,7 @@ import org.elasticsearch.xpack.ml.job.results.AnomalyRecord;
import org.elasticsearch.xpack.ml.job.results.Bucket;
import org.elasticsearch.xpack.ml.job.results.BucketInfluencer;
import org.elasticsearch.xpack.ml.job.results.CategoryDefinition;
+import org.elasticsearch.xpack.ml.job.results.Forecast;
import org.elasticsearch.xpack.ml.job.results.Influencer;
import org.elasticsearch.xpack.ml.job.results.ModelPlot;
@@ -151,6 +152,12 @@ public class JobResultsPersister extends AbstractComponent {
    return this;
}

public Builder persistForecast(Forecast forecast) {
    logger.trace("[{}] ES BULK ACTION: index forecast to index [{}] with ID [{}]", jobId, indexName, forecast.getId());
    indexResult(forecast.getId(), forecast, Forecast.RESULT_TYPE_VALUE);
    return this;
}

private void indexResult(String id, ToXContent resultDoc, String resultType) {
    try (XContentBuilder content = toXContentBuilder(resultDoc)) {
        bulkRequest.add(new IndexRequest(indexName, DOC_TYPE, id).source(content));
@@ -27,6 +27,7 @@ import org.elasticsearch.xpack.ml.job.process.autodetect.output.AutoDetectResult
import org.elasticsearch.xpack.ml.job.process.autodetect.output.FlushAcknowledgement;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.DataLoadParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.FlushJobParams;
+import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.DataCounts;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSizeStats;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSnapshot;
@@ -205,6 +206,13 @@ public class AutodetectCommunicator implements Closeable {
    }, handler);
}

public void forecastJob(ForecastParams params, BiConsumer<Void, Exception> handler) {
    submitOperation(() -> {
        autodetectProcess.forecastJob(params);
        return null;
    }, handler);
}

@Nullable
FlushAcknowledgement waitFlushToCompletion(String flushId) {
    LOGGER.debug("[{}] waiting for flush", job.getId());
@@ -10,6 +10,7 @@ import org.elasticsearch.xpack.ml.job.config.ModelPlotConfig;
import org.elasticsearch.xpack.ml.job.persistence.StateStreamer;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.DataLoadParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.FlushJobParams;
+import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSnapshot;
import org.elasticsearch.xpack.ml.job.results.AutodetectResult;
@@ -85,6 +86,14 @@ public interface AutodetectProcess extends Closeable {
 */
String flushJob(FlushJobParams params) throws IOException;

/**
 * Do a forecast on a running job.
 *
 * @param params The forecast parameters
 * @throws IOException If the write fails
 */
void forecastJob(ForecastParams params) throws IOException;

/**
 * Flush the output data stream
 */
@@ -42,6 +42,7 @@ import org.elasticsearch.xpack.ml.job.process.autodetect.output.FlushAcknowledge
import org.elasticsearch.xpack.ml.job.process.autodetect.params.AutodetectParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.DataLoadParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.FlushJobParams;
+import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.DataCounts;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSizeStats;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSnapshot;
@@ -49,6 +50,7 @@ import org.elasticsearch.xpack.ml.job.process.normalizer.NormalizerFactory;
import org.elasticsearch.xpack.ml.job.process.normalizer.Renormalizer;
import org.elasticsearch.xpack.ml.job.process.normalizer.ScoresUpdater;
import org.elasticsearch.xpack.ml.job.process.normalizer.ShortCircuitingRenormalizer;
+import org.elasticsearch.xpack.ml.job.results.Forecast;
import org.elasticsearch.xpack.ml.notifications.Auditor;
import org.elasticsearch.xpack.ml.utils.ExceptionsHelper;
import org.elasticsearch.xpack.persistent.PersistentTasksCustomMetaData.PersistentTask;
@@ -240,6 +242,33 @@ public class AutodetectProcessManager extends AbstractComponent {
    });
}

/**
 * Do a forecast for the running job.
 *
 * @param jobTask The job task
 * @param params Forecast parameters
 */
public void forecastJob(JobTask jobTask, ForecastParams params, Consumer<Exception> handler) {
    logger.debug("Forecasting job {}", jobTask.getJobId());
    AutodetectCommunicator communicator = autoDetectCommunicatorByOpenJob.get(jobTask.getAllocationId());
    if (communicator == null) {
        String message = String.format(Locale.ROOT, "Cannot forecast because job [%s] is not open", jobTask.getJobId());
        logger.debug(message);
        handler.accept(ExceptionsHelper.conflictStatusException(message));
        return;
    }

    communicator.forecastJob(params, (aVoid, e) -> {
        if (e == null) {
            handler.accept(null);
        } else {
            String msg = String.format(Locale.ROOT, "[%s] exception while forecasting job", jobTask.getJobId());
            logger.error(msg, e);
            handler.accept(ExceptionsHelper.serverError(msg, e));
        }
    });
}

public void writeUpdateProcessMessage(JobTask jobTask, List<JobUpdate.DetectorUpdate> updates, ModelPlotConfig config,
                                      Consumer<Exception> handler) {
    AutodetectCommunicator communicator = autoDetectCommunicatorByOpenJob.get(jobTask.getAllocationId());
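The forecastJob method above follows a callback convention where the handler receives null on success and an exception on failure. A self-contained sketch of that convention (class and method names here are invented for illustration, not the actual ES classes):

```java
import java.util.function.Consumer;

// Illustrative sketch of the "null exception means success" callback
// convention used by forecastJob above.
public class CallbackDemo {

    static void forecast(boolean jobOpen, Consumer<Exception> handler) {
        if (!jobOpen) {
            // Mirrors the "cannot forecast because job is not open" path.
            handler.accept(new IllegalStateException("job is not open"));
            return;
        }
        handler.accept(null); // null signals success
    }

    public static void main(String[] args) {
        forecast(true, e -> System.out.println(e == null ? "forecast accepted" : "failed: " + e.getMessage()));
        forecast(false, e -> System.out.println(e == null ? "forecast accepted" : "failed: " + e.getMessage()));
    }
}
```

One advantage of this style over returning a value is that the same handler can be invoked asynchronously from another thread, which is how the autodetect worker thread reports completion here.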
@@ -11,6 +11,7 @@ import org.elasticsearch.xpack.ml.job.persistence.StateStreamer;
import org.elasticsearch.xpack.ml.job.process.autodetect.output.FlushAcknowledgement;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.DataLoadParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.FlushJobParams;
+import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSnapshot;
import org.elasticsearch.xpack.ml.job.process.autodetect.state.Quantiles;
import org.elasticsearch.xpack.ml.job.results.AutodetectResult;
@@ -78,7 +79,7 @@ public class BlackHoleAutodetectProcess implements AutodetectProcess {
@Override
public String flushJob(FlushJobParams params) throws IOException {
    FlushAcknowledgement flushAcknowledgement = new FlushAcknowledgement(FLUSH_ID, null);
-    AutodetectResult result = new AutodetectResult(null, null, null, null, null, null, null, null, flushAcknowledgement);
+    AutodetectResult result = new AutodetectResult(null, null, null, null, null, null, null, null, null, null, flushAcknowledgement);
    results.add(result);
    return FLUSH_ID;
}
@@ -91,7 +92,7 @@ public class BlackHoleAutodetectProcess implements AutodetectProcess {
public void close() throws IOException {
    if (open) {
        Quantiles quantiles = new Quantiles(jobId, new Date(), "black hole quantiles");
-        AutodetectResult result = new AutodetectResult(null, null, null, quantiles, null, null, null, null, null);
+        AutodetectResult result = new AutodetectResult(null, null, null, quantiles, null, null, null, null, null, null, null);
        results.add(result);
        open = false;
    }
@@ -147,4 +148,8 @@ public class BlackHoleAutodetectProcess implements AutodetectProcess {
public String readError() {
    return "";
}

@Override
public void forecastJob(ForecastParams params) throws IOException {
}
}
|
@ -17,6 +17,7 @@ import org.elasticsearch.xpack.ml.job.process.autodetect.output.AutodetectResult
|
|||
import org.elasticsearch.xpack.ml.job.process.autodetect.output.StateProcessor;
|
||||
import org.elasticsearch.xpack.ml.job.process.autodetect.params.DataLoadParams;
|
||||
import org.elasticsearch.xpack.ml.job.process.autodetect.params.FlushJobParams;
|
||||
import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;
|
||||
import org.elasticsearch.xpack.ml.job.process.autodetect.state.ModelSnapshot;
|
||||
import org.elasticsearch.xpack.ml.job.process.autodetect.writer.ControlMsgToProcessWriter;
|
||||
import org.elasticsearch.xpack.ml.job.process.autodetect.writer.LengthEncodedWriter;
|
||||
|
@@ -94,7 +95,9 @@ class NativeAutodetectProcess implements AutodetectProcess {
if (processCloseInitiated == false && processKilled == false) {
    // The log message doesn't say "crashed", as the process could have been killed
    // by a user or other process (e.g. the Linux OOM killer)
-    LOGGER.error("[{}] autodetect process stopped unexpectedly", jobId);
+
+    String errors = cppLogHandler.getErrors();
+    LOGGER.error("[{}] autodetect process stopped unexpectedly: {}", jobId, errors);
    onProcessCrash.run();
}
}
@@ -163,6 +166,12 @@ class NativeAutodetectProcess implements AutodetectProcess {
    return writer.writeFlushMessage();
}

@Override
public void forecastJob(ForecastParams params) throws IOException {
    ControlMsgToProcessWriter writer = new ControlMsgToProcessWriter(recordWriter, numberOfAnalysisFields);
    writer.writeForecastMessage(params);
}

@Override
public void flushStream() throws IOException {
    recordWriter.flush();
@@ -26,6 +26,8 @@ import org.elasticsearch.xpack.ml.job.results.AnomalyRecord;
import org.elasticsearch.xpack.ml.job.results.AutodetectResult;
import org.elasticsearch.xpack.ml.job.results.Bucket;
import org.elasticsearch.xpack.ml.job.results.CategoryDefinition;
+import org.elasticsearch.xpack.ml.job.results.Forecast;
+import org.elasticsearch.xpack.ml.job.results.ForecastStats;
import org.elasticsearch.xpack.ml.job.results.Influencer;
import org.elasticsearch.xpack.ml.job.results.ModelPlot;
@@ -184,6 +186,20 @@ public class AutoDetectResultProcessor {
if (modelPlot != null) {
    context.bulkResultsPersister.persistModelPlot(modelPlot);
}
Forecast forecast = result.getForecast();
if (forecast != null) {
    context.bulkResultsPersister.persistForecast(forecast);
}
ForecastStats forecastStats = result.getForecastStats();
if (forecastStats != null) {
    // forecast stats are sent by autodetect but do not get persisted;
    // still, they mark the end of a forecast

    LOGGER.trace("Received Forecast Stats [{}]", forecastStats.getId());

    // forecast stats mark the end of a forecast, therefore commit whatever we have
    context.bulkResultsPersister.executeRequest();
}
ModelSizeStats modelSizeStats = result.getModelSizeStats();
if (modelSizeStats != null) {
    LOGGER.trace("[{}] Parsed ModelSizeStats: {} / {} / {} / {} / {} / {}",
@@ -0,0 +1,102 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
package org.elasticsearch.xpack.ml.job.process.autodetect.params;

import org.elasticsearch.ElasticsearchParseException;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.joda.DateMathParser;
import org.elasticsearch.index.mapper.DateFieldMapper;
import org.elasticsearch.xpack.ml.job.messages.Messages;

import java.util.Objects;

public class ForecastParams {

    private final long endTime;
    private final long id;

    private ForecastParams(long id, long endTime) {
        this.id = id;
        this.endTime = endTime;
    }

    /**
     * The forecast end time in seconds from the epoch
     * @return The end time in seconds from the epoch
     */
    public long getEndTime() {
        return endTime;
    }

    /**
     * The forecast id
     *
     * @return The forecast Id
     */
    public long getId() {
        return id;
    }

    @Override
    public int hashCode() {
        return Objects.hash(id, endTime);
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (obj == null || getClass() != obj.getClass()) {
            return false;
        }
        ForecastParams other = (ForecastParams) obj;
        return Objects.equals(id, other.id) && Objects.equals(endTime, other.endTime);
    }

    public static Builder builder() {
        return new Builder();
    }

    public static class Builder {
        private long endTimeEpochSecs;
        private long startTime;
        private long forecastId;

        private Builder() {
            startTime = System.currentTimeMillis();
            endTimeEpochSecs = tomorrow(startTime);
            forecastId = generateId();
        }

        static long tomorrow(long now) {
            return (now / 1000) + (60 * 60 * 24);
        }

        private long generateId() {
            return startTime;
        }

        public Builder endTime(String endTime, ParseField paramName) {
            DateMathParser dateMathParser = new DateMathParser(DateFieldMapper.DEFAULT_DATE_TIME_FORMATTER);

            try {
                endTimeEpochSecs = dateMathParser.parse(endTime, System::currentTimeMillis) / 1000;
            } catch (Exception e) {
                String msg = Messages.getMessage(Messages.REST_INVALID_DATETIME_PARAMS, paramName.getPreferredName(), endTime);
                throw new ElasticsearchParseException(msg, e);
            }

            return this;
        }

        public ForecastParams build() {
            return new ForecastParams(forecastId, endTimeEpochSecs);
        }
    }
}

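The `Builder` defaults the forecast end time to one day ahead, converting the millisecond wall-clock time to epoch seconds. A minimal standalone sketch of that conversion, assuming nothing beyond plain Java (`tomorrow` mirrors the helper above):

```java
public class ForecastEndTimeSketch {
    // Mirrors Builder.tomorrow: epoch seconds one day after the given
    // millisecond timestamp (divide by 1000, then add 24h of seconds).
    static long tomorrow(long nowMillis) {
        return (nowMillis / 1000) + (60 * 60 * 24);
    }

    public static void main(String[] args) {
        System.out.println(tomorrow(0L));                 // 86400
        System.out.println(tomorrow(1_500_000_000_000L)); // 1500086400
    }
}
```

Note the mixed units: the builder keeps `startTime` in milliseconds (it doubles as the forecast id) while `endTimeEpochSecs` is in seconds, matching what the C++ process expects.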
@@ -7,11 +7,13 @@ package org.elasticsearch.xpack.ml.job.process.autodetect.writer;

import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.json.JsonXContent;
import org.elasticsearch.xpack.ml.job.config.DetectionRule;
import org.elasticsearch.xpack.ml.job.config.ModelPlotConfig;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.DataLoadParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.FlushJobParams;
import org.elasticsearch.xpack.ml.job.process.autodetect.params.ForecastParams;

import java.io.IOException;
import java.io.OutputStream;

@@ -37,6 +39,11 @@ public class ControlMsgToProcessWriter {
     */
    private static final String FLUSH_MESSAGE_CODE = "f";

    /**
     * This must match the code defined in the api::CAnomalyDetector C++ class.
     */
    private static final String FORECAST_MESSAGE_CODE = "p";

    /**
     * This must match the code defined in the api::CAnomalyDetector C++ class.
     */

@@ -137,14 +144,32 @@ public class ControlMsgToProcessWriter {
        String flushId = Long.toString(ms_FlushNumber.getAndIncrement());
        writeMessage(FLUSH_MESSAGE_CODE + flushId);

        char[] spaces = new char[FLUSH_SPACES_LENGTH];
        Arrays.fill(spaces, ' ');
        writeMessage(new String(spaces));
        fillCommandBuffer();

        lengthEncodedWriter.flush();
        return flushId;
    }

    public void writeForecastMessage(ForecastParams params) throws IOException {
        XContentBuilder builder = XContentFactory.jsonBuilder()
            .startObject()
            .field("forecast_id", params.getId())
            .field("end_time", params.getEndTime())
            .endObject();

        writeMessage(FORECAST_MESSAGE_CODE + builder.string());
        fillCommandBuffer();
        lengthEncodedWriter.flush();
    }

    // todo(hendrikm): workaround, see
    // https://github.com/elastic/machine-learning-cpp/issues/123
    private void fillCommandBuffer() throws IOException {
        char[] spaces = new char[FLUSH_SPACES_LENGTH];
        Arrays.fill(spaces, ' ');
        writeMessage(new String(spaces));
    }

    public void writeResetBucketsMessage(DataLoadParams params) throws IOException {
        writeControlCodeFollowedByTimeRange(RESET_BUCKETS_MESSAGE_CODE, params.getStart(), params.getEnd());
    }

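The forecast control message is just the `p` code followed by a small JSON object. A hedged sketch of the resulting payload string, using plain string formatting as a stand-in for `XContentBuilder` (field names taken from the writer above; the framing into the length-encoded stream is omitted):

```java
public class ForecastMessageSketch {
    // Builds the "p{...}" control message body that writeForecastMessage
    // hands to writeMessage: control code, then compact JSON.
    static String forecastMessage(long forecastId, long endTimeEpochSecs) {
        return "p" + String.format("{\"forecast_id\":%d,\"end_time\":%d}", forecastId, endTimeEpochSecs);
    }

    public static void main(String[] args) {
        System.out.println(forecastMessage(1L, 86400L));
    }
}
```

The trailing `fillCommandBuffer()` call of spaces is a workaround (see the linked machine-learning-cpp issue) to push the short command through the C++ side's input buffering.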
@@ -5,6 +5,7 @@
 */
package org.elasticsearch.xpack.ml.job.results;

import org.elasticsearch.Version;
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;

@@ -31,7 +32,7 @@ public class AutodetectResult implements ToXContentObject, Writeable {
            TYPE.getPreferredName(), a -> new AutodetectResult((Bucket) a[0], (List<AnomalyRecord>) a[1], (List<Influencer>) a[2],
                    (Quantiles) a[3], a[4] == null ? null : ((ModelSnapshot.Builder) a[4]).build(),
                    a[5] == null ? null : ((ModelSizeStats.Builder) a[5]).build(),
                    (ModelPlot) a[6], (CategoryDefinition) a[7], (FlushAcknowledgement) a[8]));
                    (ModelPlot) a[6], (Forecast) a[7], (ForecastStats) a[8], (CategoryDefinition) a[9], (FlushAcknowledgement) a[10]));

    static {
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), Bucket.PARSER, Bucket.RESULT_TYPE_FIELD);

@@ -42,6 +43,8 @@ public class AutodetectResult implements ToXContentObject, Writeable {
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), ModelSizeStats.PARSER,
                ModelSizeStats.RESULT_TYPE_FIELD);
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), ModelPlot.PARSER, ModelPlot.RESULTS_FIELD);
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), Forecast.PARSER, Forecast.RESULTS_FIELD);
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), ForecastStats.PARSER, ForecastStats.RESULTS_FIELD);
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), CategoryDefinition.PARSER, CategoryDefinition.TYPE);
        PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), FlushAcknowledgement.PARSER, FlushAcknowledgement.TYPE);
    }

@@ -53,12 +56,14 @@ public class AutodetectResult implements ToXContentObject, Writeable {
    private final ModelSnapshot modelSnapshot;
    private final ModelSizeStats modelSizeStats;
    private final ModelPlot modelPlot;
    private final Forecast forecast;
    private final ForecastStats forecastStats;
    private final CategoryDefinition categoryDefinition;
    private final FlushAcknowledgement flushAcknowledgement;

    public AutodetectResult(Bucket bucket, List<AnomalyRecord> records, List<Influencer> influencers, Quantiles quantiles,
            ModelSnapshot modelSnapshot, ModelSizeStats modelSizeStats, ModelPlot modelPlot,
            CategoryDefinition categoryDefinition, FlushAcknowledgement flushAcknowledgement) {
            ModelSnapshot modelSnapshot, ModelSizeStats modelSizeStats, ModelPlot modelPlot, Forecast forecast,
            ForecastStats forecastStats, CategoryDefinition categoryDefinition, FlushAcknowledgement flushAcknowledgement) {
        this.bucket = bucket;
        this.records = records;
        this.influencers = influencers;

@@ -66,6 +71,8 @@ public class AutodetectResult implements ToXContentObject, Writeable {
        this.modelSnapshot = modelSnapshot;
        this.modelSizeStats = modelSizeStats;
        this.modelPlot = modelPlot;
        this.forecast = forecast;
        this.forecastStats = forecastStats;
        this.categoryDefinition = categoryDefinition;
        this.flushAcknowledgement = flushAcknowledgement;
    }

@@ -116,6 +123,22 @@ public class AutodetectResult implements ToXContentObject, Writeable {
        } else {
            this.flushAcknowledgement = null;
        }

        if (in.getVersion().onOrAfter(Version.V_6_1_0)) {
            if (in.readBoolean()) {
                this.forecast = new Forecast(in);
            } else {
                this.forecast = null;
            }
            if (in.readBoolean()) {
                this.forecastStats = new ForecastStats(in);
            } else {
                this.forecastStats = null;
            }
        } else {
            this.forecast = null;
            this.forecastStats = null;
        }
    }

    @Override

@@ -129,6 +152,11 @@ public class AutodetectResult implements ToXContentObject, Writeable {
        writeNullable(modelPlot, out);
        writeNullable(categoryDefinition, out);
        writeNullable(flushAcknowledgement, out);

        if (out.getVersion().onOrAfter(Version.V_6_1_0)) {
            writeNullable(forecast, out);
            writeNullable(forecastStats, out);
        }
    }

    private static void writeNullable(Writeable writeable, StreamOutput out) throws IOException {

@@ -157,6 +185,8 @@ public class AutodetectResult implements ToXContentObject, Writeable {
        addNullableField(ModelSnapshot.TYPE, modelSnapshot, builder);
        addNullableField(ModelSizeStats.RESULT_TYPE_FIELD, modelSizeStats, builder);
        addNullableField(ModelPlot.RESULTS_FIELD, modelPlot, builder);
        addNullableField(Forecast.RESULTS_FIELD, forecast, builder);
        addNullableField(ForecastStats.RESULTS_FIELD, forecastStats, builder);
        addNullableField(CategoryDefinition.TYPE, categoryDefinition, builder);
        addNullableField(FlushAcknowledgement.TYPE, flushAcknowledgement, builder);
        builder.endObject();

@@ -203,6 +233,14 @@ public class AutodetectResult implements ToXContentObject, Writeable {
        return modelPlot;
    }

    public Forecast getForecast() {
        return forecast;
    }

    public ForecastStats getForecastStats() {
        return forecastStats;
    }

    public CategoryDefinition getCategoryDefinition() {
        return categoryDefinition;
    }

@@ -213,8 +251,8 @@ public class AutodetectResult implements ToXContentObject, Writeable {

    @Override
    public int hashCode() {
        return Objects.hash(bucket, records, influencers, categoryDefinition, flushAcknowledgement, modelPlot, modelSizeStats,
                modelSnapshot, quantiles);
        return Objects.hash(bucket, records, influencers, categoryDefinition, flushAcknowledgement, modelPlot, forecast, forecastStats,
                modelSizeStats, modelSnapshot, quantiles);
    }

    @Override

@@ -232,6 +270,8 @@ public class AutodetectResult implements ToXContentObject, Writeable {
                Objects.equals(categoryDefinition, other.categoryDefinition) &&
                Objects.equals(flushAcknowledgement, other.flushAcknowledgement) &&
                Objects.equals(modelPlot, other.modelPlot) &&
                Objects.equals(forecast, other.forecast) &&
                Objects.equals(forecastStats, other.forecastStats) &&
                Objects.equals(modelSizeStats, other.modelSizeStats) &&
                Objects.equals(modelSnapshot, other.modelSnapshot) &&
                Objects.equals(quantiles, other.quantiles);

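The version-gated `writeNullable`/`readBoolean` pairing above is the standard optional-field wire pattern: a boolean presence flag, then the payload only when present, and both sides must agree on the order. A minimal sketch with plain `DataOutputStream`/`DataInputStream` instead of Elasticsearch's stream classes (class and method names here are illustrative, not from the source):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class OptionalFieldSketch {
    // Write side: presence flag first, value only when non-null.
    static void writeNullableLong(Long value, DataOutputStream out) throws IOException {
        out.writeBoolean(value != null);
        if (value != null) {
            out.writeLong(value);
        }
    }

    // Read side: exact mirror of the write side.
    static Long readNullableLong(DataInputStream in) throws IOException {
        return in.readBoolean() ? in.readLong() : null;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        writeNullableLong(42L, out);
        writeNullableLong(null, out);

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        System.out.println(readNullableLong(in)); // 42
        System.out.println(readNullableLong(in)); // null
    }
}
```

Guarding the new fields behind `Version.V_6_1_0` keeps the stream compatible with older nodes, which neither write nor expect the extra flag-plus-payload bytes.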
@@ -0,0 +1,308 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
package org.elasticsearch.xpack.ml.job.results;

import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Writeable;
import org.elasticsearch.common.xcontent.ConstructingObjectParser;
import org.elasticsearch.common.xcontent.ObjectParser.ValueType;
import org.elasticsearch.common.xcontent.ToXContentObject;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentParser.Token;
import org.elasticsearch.xpack.ml.job.config.Job;
import org.elasticsearch.xpack.ml.utils.time.TimeUtils;

import java.io.IOException;
import java.util.Date;
import java.util.Objects;

/**
 * Model Forecast POJO.
 */
public class Forecast implements ToXContentObject, Writeable {
    /**
     * Result type
     */
    public static final String RESULT_TYPE_VALUE = "model_forecast";
    public static final ParseField RESULTS_FIELD = new ParseField(RESULT_TYPE_VALUE);

    public static final ParseField FORECAST_ID = new ParseField("forecast_id");
    public static final ParseField PARTITION_FIELD_NAME = new ParseField("partition_field_name");
    public static final ParseField PARTITION_FIELD_VALUE = new ParseField("partition_field_value");
    public static final ParseField OVER_FIELD_NAME = new ParseField("over_field_name");
    public static final ParseField OVER_FIELD_VALUE = new ParseField("over_field_value");
    public static final ParseField BY_FIELD_NAME = new ParseField("by_field_name");
    public static final ParseField BY_FIELD_VALUE = new ParseField("by_field_value");
    public static final ParseField MODEL_FEATURE = new ParseField("model_feature");
    public static final ParseField FORECAST_LOWER = new ParseField("forecast_lower");
    public static final ParseField FORECAST_UPPER = new ParseField("forecast_upper");
    public static final ParseField FORECAST_PREDICTION = new ParseField("forecast_prediction");
    public static final ParseField BUCKET_SPAN = new ParseField("bucket_span");

    public static final ConstructingObjectParser<Forecast, Void> PARSER =
            new ConstructingObjectParser<>(RESULT_TYPE_VALUE, a -> new Forecast((String) a[0], (long) a[1], (Date) a[2], (long) a[3]));

    static {
        PARSER.declareString(ConstructingObjectParser.constructorArg(), Job.ID);
        PARSER.declareLong(ConstructingObjectParser.constructorArg(), FORECAST_ID);
        PARSER.declareField(ConstructingObjectParser.constructorArg(), p -> {
            if (p.currentToken() == Token.VALUE_NUMBER) {
                return new Date(p.longValue());
            } else if (p.currentToken() == Token.VALUE_STRING) {
                return new Date(TimeUtils.dateStringToEpoch(p.text()));
            }
            throw new IllegalArgumentException("unexpected token [" + p.currentToken() + "] for ["
                    + Result.TIMESTAMP.getPreferredName() + "]");
        }, Result.TIMESTAMP, ValueType.VALUE);
        PARSER.declareLong(ConstructingObjectParser.constructorArg(), BUCKET_SPAN);
        PARSER.declareString((modelForecast, s) -> {}, Result.RESULT_TYPE);
        PARSER.declareString(Forecast::setPartitionFieldName, PARTITION_FIELD_NAME);
        PARSER.declareString(Forecast::setPartitionFieldValue, PARTITION_FIELD_VALUE);
        PARSER.declareString(Forecast::setOverFieldName, OVER_FIELD_NAME);
        PARSER.declareString(Forecast::setOverFieldValue, OVER_FIELD_VALUE);
        PARSER.declareString(Forecast::setByFieldName, BY_FIELD_NAME);
        PARSER.declareString(Forecast::setByFieldValue, BY_FIELD_VALUE);
        PARSER.declareString(Forecast::setModelFeature, MODEL_FEATURE);
        PARSER.declareDouble(Forecast::setForecastLower, FORECAST_LOWER);
        PARSER.declareDouble(Forecast::setForecastUpper, FORECAST_UPPER);
        PARSER.declareDouble(Forecast::setForecastPrediction, FORECAST_PREDICTION);
    }

    private final String jobId;
    private final long forecastId;
    private final Date timestamp;
    private final long bucketSpan;
    private String partitionFieldName;
    private String partitionFieldValue;
    private String overFieldName;
    private String overFieldValue;
    private String byFieldName;
    private String byFieldValue;
    private String modelFeature;
    private double forecastLower;
    private double forecastUpper;
    private double forecastPrediction;

    public Forecast(String jobId, long forecastId, Date timestamp, long bucketSpan) {
        this.jobId = jobId;
        this.forecastId = forecastId;
        this.timestamp = timestamp;
        this.bucketSpan = bucketSpan;
    }

    public Forecast(StreamInput in) throws IOException {
        jobId = in.readString();
        forecastId = in.readLong();
        timestamp = new Date(in.readLong());
        partitionFieldName = in.readOptionalString();
        partitionFieldValue = in.readOptionalString();
        overFieldName = in.readOptionalString();
        overFieldValue = in.readOptionalString();
        byFieldName = in.readOptionalString();
        byFieldValue = in.readOptionalString();
        modelFeature = in.readOptionalString();
        forecastLower = in.readDouble();
        forecastUpper = in.readDouble();
        forecastPrediction = in.readDouble();
        bucketSpan = in.readLong();
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        out.writeString(jobId);
        out.writeLong(forecastId);
        out.writeLong(timestamp.getTime());
        out.writeOptionalString(partitionFieldName);
        out.writeOptionalString(partitionFieldValue);
        out.writeOptionalString(overFieldName);
        out.writeOptionalString(overFieldValue);
        out.writeOptionalString(byFieldName);
        out.writeOptionalString(byFieldValue);
        out.writeOptionalString(modelFeature);
        out.writeDouble(forecastLower);
        out.writeDouble(forecastUpper);
        out.writeDouble(forecastPrediction);
        out.writeLong(bucketSpan);
    }

    @Override
    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject();
        builder.field(Job.ID.getPreferredName(), jobId);
        builder.field(FORECAST_ID.getPreferredName(), forecastId);
        builder.field(Result.RESULT_TYPE.getPreferredName(), RESULT_TYPE_VALUE);
        builder.field(BUCKET_SPAN.getPreferredName(), bucketSpan);
        if (timestamp != null) {
            builder.dateField(Result.TIMESTAMP.getPreferredName(),
                    Result.TIMESTAMP.getPreferredName() + "_string", timestamp.getTime());
        }
        if (partitionFieldName != null) {
            builder.field(PARTITION_FIELD_NAME.getPreferredName(), partitionFieldName);
        }
        if (partitionFieldValue != null) {
            builder.field(PARTITION_FIELD_VALUE.getPreferredName(), partitionFieldValue);
        }
        if (overFieldName != null) {
            builder.field(OVER_FIELD_NAME.getPreferredName(), overFieldName);
        }
        if (overFieldValue != null) {
            builder.field(OVER_FIELD_VALUE.getPreferredName(), overFieldValue);
        }
        if (byFieldName != null) {
            builder.field(BY_FIELD_NAME.getPreferredName(), byFieldName);
        }
        if (byFieldValue != null) {
            builder.field(BY_FIELD_VALUE.getPreferredName(), byFieldValue);
        }
        if (modelFeature != null) {
            builder.field(MODEL_FEATURE.getPreferredName(), modelFeature);
        }
        builder.field(FORECAST_LOWER.getPreferredName(), forecastLower);
        builder.field(FORECAST_UPPER.getPreferredName(), forecastUpper);
        builder.field(FORECAST_PREDICTION.getPreferredName(), forecastPrediction);
        builder.endObject();
        return builder;
    }

    public String getJobId() {
        return jobId;
    }

    public long getForecastId() {
        return forecastId;
    }

    public String getId() {
        int valuesHash = Objects.hash(byFieldValue, overFieldValue, partitionFieldValue);
        int length = (byFieldValue == null ? 0 : byFieldValue.length()) +
                (overFieldValue == null ? 0 : overFieldValue.length()) +
                (partitionFieldValue == null ? 0 : partitionFieldValue.length());
        return jobId + "_model_forecast_" + forecastId + "_" + timestamp.getTime() + "_" + bucketSpan + "_"
                + (modelFeature == null ? "" : modelFeature) + "_" + valuesHash + "_" + length;
    }

    public Date getTimestamp() {
        return timestamp;
    }

    public long getBucketSpan() {
        return bucketSpan;
    }

    public String getPartitionFieldName() {
        return partitionFieldName;
    }

    public void setPartitionFieldName(String partitionFieldName) {
        this.partitionFieldName = partitionFieldName;
    }

    public String getPartitionFieldValue() {
        return partitionFieldValue;
    }

    public void setPartitionFieldValue(String partitionFieldValue) {
        this.partitionFieldValue = partitionFieldValue;
    }

    public String getOverFieldName() {
        return overFieldName;
    }

    public void setOverFieldName(String overFieldName) {
        this.overFieldName = overFieldName;
    }

    public String getOverFieldValue() {
        return overFieldValue;
    }

    public void setOverFieldValue(String overFieldValue) {
        this.overFieldValue = overFieldValue;
    }

    public String getByFieldName() {
        return byFieldName;
    }

    public void setByFieldName(String byFieldName) {
        this.byFieldName = byFieldName;
    }

    public String getByFieldValue() {
        return byFieldValue;
    }

    public void setByFieldValue(String byFieldValue) {
        this.byFieldValue = byFieldValue;
    }

    public String getModelFeature() {
        return modelFeature;
    }

    public void setModelFeature(String modelFeature) {
        this.modelFeature = modelFeature;
    }

    public double getForecastLower() {
        return forecastLower;
    }

    public void setForecastLower(double forecastLower) {
        this.forecastLower = forecastLower;
    }

    public double getForecastUpper() {
        return forecastUpper;
    }

    public void setForecastUpper(double forecastUpper) {
        this.forecastUpper = forecastUpper;
    }

    public double getForecastPrediction() {
        return forecastPrediction;
    }

    public void setForecastPrediction(double forecastPrediction) {
        this.forecastPrediction = forecastPrediction;
    }

    @Override
    public boolean equals(Object other) {
        if (this == other) {
            return true;
        }
        if (other instanceof Forecast == false) {
            return false;
        }
        Forecast that = (Forecast) other;
        return Objects.equals(this.jobId, that.jobId) &&
                forecastId == that.forecastId &&
                Objects.equals(this.timestamp, that.timestamp) &&
                Objects.equals(this.partitionFieldValue, that.partitionFieldValue) &&
                Objects.equals(this.partitionFieldName, that.partitionFieldName) &&
                Objects.equals(this.overFieldValue, that.overFieldValue) &&
                Objects.equals(this.overFieldName, that.overFieldName) &&
                Objects.equals(this.byFieldValue, that.byFieldValue) &&
                Objects.equals(this.byFieldName, that.byFieldName) &&
                Objects.equals(this.modelFeature, that.modelFeature) &&
                this.forecastLower == that.forecastLower &&
                this.forecastUpper == that.forecastUpper &&
                this.forecastPrediction == that.forecastPrediction &&
                this.bucketSpan == that.bucketSpan;
    }

    @Override
    public int hashCode() {
        return Objects.hash(jobId, forecastId, timestamp, partitionFieldName, partitionFieldValue,
                overFieldName, overFieldValue, byFieldName, byFieldValue,
                modelFeature, forecastLower, forecastUpper, forecastPrediction, bucketSpan);
    }
}

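`Forecast.getId()` composes the per-document id from the job id, forecast id, timestamp, bucket span, model feature, and a hash-plus-total-length digest of the by/over/partition field values; the hash-and-length pair keeps documents for different split-field values from colliding. A self-contained sketch of the same composition (hypothetical helper name, plain Java):

```java
import java.util.Objects;

public class ForecastIdSketch {
    // Mirrors Forecast.getId(): hash of the split-field values plus their
    // combined length, appended to the fixed id components.
    static String forecastDocId(String jobId, long forecastId, long timestampMs, long bucketSpan,
                                String modelFeature, String byVal, String overVal, String partVal) {
        int valuesHash = Objects.hash(byVal, overVal, partVal);
        int length = (byVal == null ? 0 : byVal.length())
                + (overVal == null ? 0 : overVal.length())
                + (partVal == null ? 0 : partVal.length());
        return jobId + "_model_forecast_" + forecastId + "_" + timestampMs + "_" + bucketSpan + "_"
                + (modelFeature == null ? "" : modelFeature) + "_" + valuesHash + "_" + length;
    }

    public static void main(String[] args) {
        // With all split fields null, Objects.hash(null, null, null) is 29791.
        System.out.println(forecastDocId("job-1", 7L, 1000L, 3600L, null, null, null, null));
        // job-1_model_forecast_7_1000_3600__29791_0
    }
}
```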
@@ -0,0 +1,114 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
package org.elasticsearch.xpack.ml.job.results;

import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Writeable;
import org.elasticsearch.common.xcontent.ConstructingObjectParser;
import org.elasticsearch.common.xcontent.ToXContentObject;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.xpack.ml.job.config.Job;

import java.io.IOException;
import java.util.Objects;

/**
 * Model ForecastStats POJO.
 *
 * Note forecast stats are sent from the autodetect process but do not get
 * indexed.
 */
public class ForecastStats implements ToXContentObject, Writeable {
    /**
     * Result type
     */
    public static final String RESULT_TYPE_VALUE = "model_forecast_stats";
    public static final ParseField RESULTS_FIELD = new ParseField(RESULT_TYPE_VALUE);

    public static final ParseField FORECAST_ID = new ParseField("forecast_id");
    public static final ParseField RECORD_COUNT = new ParseField("forecast_record_count");

    public static final ConstructingObjectParser<ForecastStats, Void> PARSER =
            new ConstructingObjectParser<>(RESULT_TYPE_VALUE, a -> new ForecastStats((String) a[0], (long) a[1]));

    static {
        PARSER.declareString(ConstructingObjectParser.constructorArg(), Job.ID);
        PARSER.declareLong(ConstructingObjectParser.constructorArg(), FORECAST_ID);

        PARSER.declareString((modelForecastStats, s) -> {}, Result.RESULT_TYPE);
        PARSER.declareLong(ForecastStats::setRecordCount, RECORD_COUNT);
    }

    private final String jobId;
    private final long forecastId;
    private long recordCount;

    public ForecastStats(String jobId, long forecastId) {
        this.jobId = jobId;
        this.forecastId = forecastId;
    }

    public ForecastStats(StreamInput in) throws IOException {
        jobId = in.readString();
        forecastId = in.readLong();
        recordCount = in.readLong();
    }

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        out.writeString(jobId);
        out.writeLong(forecastId);
        out.writeLong(recordCount);
    }

    @Override
    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject();
        builder.field(Job.ID.getPreferredName(), jobId);
        builder.field(Result.RESULT_TYPE.getPreferredName(), RESULT_TYPE_VALUE);
        builder.field(FORECAST_ID.getPreferredName(), forecastId);
        builder.field(RECORD_COUNT.getPreferredName(), recordCount);
        builder.endObject();
        return builder;
    }

    public String getJobId() {
        return jobId;
    }

    public String getId() {
        return jobId + "_model_forecast_stats";
    }

    public void setRecordCount(long recordCount) {
        this.recordCount = recordCount;
    }

    public double getRecordCount() {
        return recordCount;
    }

    @Override
    public boolean equals(Object other) {
        if (this == other) {
            return true;
        }
        if (other instanceof ForecastStats == false) {
            return false;
        }
        ForecastStats that = (ForecastStats) other;
        return Objects.equals(this.jobId, that.jobId) &&
                this.forecastId == that.forecastId &&
                this.recordCount == that.recordCount;
    }

    @Override
    public int hashCode() {
        return Objects.hash(jobId, forecastId, recordCount);
    }
}

@@ -127,6 +127,10 @@ public final class ReservedFieldNames {
            ModelPlot.MODEL_UPPER.getPreferredName(), ModelPlot.MODEL_MEDIAN.getPreferredName(),
            ModelPlot.ACTUAL.getPreferredName(),

            Forecast.FORECAST_LOWER.getPreferredName(), Forecast.FORECAST_UPPER.getPreferredName(),
            Forecast.FORECAST_PREDICTION.getPreferredName(),
            Forecast.FORECAST_ID.getPreferredName(),

            ModelSizeStats.MODEL_BYTES_FIELD.getPreferredName(),
            ModelSizeStats.TOTAL_BY_FIELD_COUNT_FIELD.getPreferredName(),
            ModelSizeStats.TOTAL_OVER_FIELD_COUNT_FIELD.getPreferredName(),

@@ -0,0 +1,50 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
package org.elasticsearch.xpack.ml.rest.job;

import org.elasticsearch.client.node.NodeClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.rest.BaseRestHandler;
import org.elasticsearch.rest.RestController;
import org.elasticsearch.rest.RestRequest;
import org.elasticsearch.rest.action.RestToXContentListener;
import org.elasticsearch.xpack.ml.MachineLearning;
import org.elasticsearch.xpack.ml.action.ForecastJobAction;
import org.elasticsearch.xpack.ml.job.config.Job;

import java.io.IOException;

public class RestForecastJobAction extends BaseRestHandler {

    public RestForecastJobAction(Settings settings, RestController controller) {
        super(settings);
        controller.registerHandler(RestRequest.Method.POST,
                MachineLearning.BASE_PATH + "anomaly_detectors/{" + Job.ID.getPreferredName() + "}/_forecast", this);
    }

    @Override
    public String getName() {
        return "xpack_ml_forecast_job_action";
    }

    @Override
    protected RestChannelConsumer prepareRequest(RestRequest restRequest, NodeClient client) throws IOException {
        String jobId = restRequest.param(Job.ID.getPreferredName());
        final ForecastJobAction.Request request;
        if (restRequest.hasContentOrSourceParam()) {
            XContentParser parser = restRequest.contentOrSourceParamParser();
            request = ForecastJobAction.Request.parseRequest(jobId, parser);
        } else {
            request = new ForecastJobAction.Request(restRequest.param(Job.ID.getPreferredName()));
            if (restRequest.hasParam(ForecastJobAction.Request.END_TIME.getPreferredName())) {
                request.setEndTime(restRequest.param(ForecastJobAction.Request.END_TIME.getPreferredName()));
            }
        }

        return channel -> client.execute(ForecastJobAction.INSTANCE, request, new RestToXContentListener<>(channel));
    }
}

@@ -27,6 +27,7 @@ import org.elasticsearch.search.SearchHit;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.xpack.XPackSettings;
 import org.elasticsearch.xpack.security.authc.Authentication;
+import org.elasticsearch.xpack.security.user.User;
 import org.elasticsearch.xpack.security.user.XPackUser;
 
 import java.io.IOException;
@@ -48,15 +49,21 @@ public class InternalClient extends FilterClient {
 
     private final String nodeName;
     private final boolean securityEnabled;
+    private final User user;
 
     /**
      * Constructs an InternalClient.
      * If security is enabled the client is secure. Otherwise this client is a passthrough.
      */
     public InternalClient(Settings settings, ThreadPool threadPool, Client in) {
+        this(settings, threadPool, in, XPackUser.INSTANCE);
+    }
+
+    InternalClient(Settings settings, ThreadPool threadPool, Client in, User user) {
         super(settings, threadPool, in);
         this.nodeName = Node.NODE_NAME_SETTING.get(settings);
         this.securityEnabled = XPackSettings.SECURITY_ENABLED.get(settings);
+        this.user = user;
     }
 
     @Override
@@ -80,7 +87,7 @@ public class InternalClient extends FilterClient {
     protected void processContext(ThreadContext threadContext) {
         try {
-            Authentication authentication = new Authentication(XPackUser.INSTANCE,
+            Authentication authentication = new Authentication(user,
                     new Authentication.RealmRef("__attach", "__attach", nodeName), null);
             authentication.writeToContext(threadContext);
         } catch (IOException ioe) {
@@ -0,0 +1,23 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.security;
+
+import org.elasticsearch.client.Client;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.threadpool.ThreadPool;
+import org.elasticsearch.xpack.security.user.XPackSecurityUser;
+
+/**
+ * A special filter client for internal usage by security to modify the security index.
+ *
+ * The {@link XPackSecurityUser} user is added to the execution context before each action is executed.
+ */
+public class InternalSecurityClient extends InternalClient {
+
+    public InternalSecurityClient(Settings settings, ThreadPool threadPool, Client in) {
+        super(settings, threadPool, in, XPackSecurityUser.INSTANCE);
+    }
+}
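The `InternalSecurityClient` above is a one-line subclass that pins the user its parent `InternalClient` attaches to each request's context. A minimal sketch of that constructor-delegation pattern, using hypothetical simplified types rather than the real x-pack classes:

```java
// Hypothetical, simplified sketch of the pattern: the subclass fixes the
// user that the base client runs as. Class names only mirror the diff.
class User {
    final String name;
    User(String name) { this.name = name; }
}

class InternalClientSketch {
    protected final User user;

    InternalClientSketch(User user) {
        // in the real code this user is attached to the thread context
        // before every action is executed
        this.user = user;
    }

    String runAs() {
        return user.name;
    }
}

public class InternalSecurityClientSketch extends InternalClientSketch {
    InternalSecurityClientSketch() {
        super(new User("_xpack_security")); // always the security user
    }

    public static void main(String[] args) {
        System.out.println(new InternalSecurityClientSketch().runAs());
    }
}
```

The design choice mirrored here: rather than threading a `User` parameter through every caller, callers pick a client type and the user follows from it.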
@@ -31,7 +31,7 @@ class PkiRealmBootstrapCheck implements BootstrapCheck {
      * least one network communication layer.
      */
     @Override
-    public boolean check(BootstrapContext context) {
+    public BootstrapCheckResult check(BootstrapContext context) {
         final Settings settings = context.settings;
         final boolean pkiRealmEnabled = settings.getGroups(RealmSettings.PREFIX).values().stream()
                 .filter(s -> PkiRealm.TYPE.equals(s.get("type")))
@@ -42,34 +42,30 @@ class PkiRealmBootstrapCheck implements BootstrapCheck {
             Settings httpSSLSettings = SSLService.getHttpTransportSSLSettings(settings);
             final boolean httpClientAuth = sslService.isSSLClientAuthEnabled(httpSSLSettings);
             if (httpSsl && httpClientAuth) {
-                return false;
+                return BootstrapCheckResult.success();
             }
 
             // Default Transport
             final Settings transportSSLSettings = settings.getByPrefix(setting("transport.ssl."));
             final boolean clientAuthEnabled = sslService.isSSLClientAuthEnabled(transportSSLSettings);
             if (clientAuthEnabled) {
-                return false;
+                return BootstrapCheckResult.success();
             }
 
             // Transport Profiles
             Map<String, Settings> groupedSettings = settings.getGroups("transport.profiles.");
             for (Map.Entry<String, Settings> entry : groupedSettings.entrySet()) {
                 if (sslService.isSSLClientAuthEnabled(SecurityNetty4Transport.profileSslSettings(entry.getValue()), transportSSLSettings)) {
-                    return false;
+                    return BootstrapCheckResult.success();
                 }
             }
-            return true;
+            return BootstrapCheckResult.failure(
+                    "a PKI realm is enabled but cannot be used as neither HTTP or Transport have SSL and client authentication enabled");
         } else {
-            return false;
+            return BootstrapCheckResult.success();
         }
     }
 
-    @Override
-    public String errorMessage() {
-        return "A PKI realm is enabled but cannot be used as neither HTTP or Transport have SSL and client authentication enabled";
-    }
-
     @Override
     public boolean alwaysEnforce() {
         return true;
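The recurring refactor across these bootstrap-check hunks replaces the `boolean check()` plus separate `errorMessage()` pair with a single result object that carries its own failure message. A minimal sketch of that pattern, with simplified stand-in classes rather than the actual Elasticsearch API:

```java
// Simplified sketch of the BootstrapCheckResult pattern used in the diff.
// BootstrapCheckResult here is a stand-in, not the real Elasticsearch type.
final class BootstrapCheckResult {
    private final String message; // null means the check passed

    private BootstrapCheckResult(String message) {
        this.message = message;
    }

    static BootstrapCheckResult success() {
        return new BootstrapCheckResult(null);
    }

    static BootstrapCheckResult failure(String message) {
        return new BootstrapCheckResult(message);
    }

    boolean isFailure() {
        return message != null;
    }

    String getMessage() {
        return message;
    }
}

public class BootstrapCheckSketch {
    // Before the refactor: check() returned true on failure and a separate
    // errorMessage() method supplied the text. After: one result value.
    static BootstrapCheckResult checkClientAuth(boolean clientAuthEnabled) {
        if (clientAuthEnabled) {
            return BootstrapCheckResult.success();
        }
        return BootstrapCheckResult.failure("a PKI realm is enabled but client authentication is not");
    }

    public static void main(String[] args) {
        System.out.println(checkClientAuth(false).getMessage());
    }
}
```

Coupling the pass/fail flag with its message removes the risk of `check()` and `errorMessage()` drifting out of sync, which is why the per-class `errorMessage()` overrides disappear in these hunks.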
@@ -14,6 +14,7 @@ import org.elasticsearch.action.ActionResponse;
 import org.elasticsearch.action.support.ActionFilter;
 import org.elasticsearch.action.support.DestructiveOperations;
 import org.elasticsearch.bootstrap.BootstrapCheck;
+import org.elasticsearch.client.Client;
 import org.elasticsearch.cluster.ClusterState;
 import org.elasticsearch.cluster.LocalNodeMasterListener;
 import org.elasticsearch.cluster.NamedDiff;
@@ -295,12 +296,13 @@ public class Security implements ActionPlugin, IngestPlugin, NetworkPlugin, Clus
         return modules;
     }
 
-    public Collection<Object> createComponents(InternalClient client, ThreadPool threadPool, ClusterService clusterService,
+    public Collection<Object> createComponents(Client nodeClient, ThreadPool threadPool, ClusterService clusterService,
                                                ResourceWatcherService resourceWatcherService,
                                                List<XPackExtension> extensions) throws Exception {
         if (enabled == false) {
             return Collections.emptyList();
         }
+        final InternalSecurityClient client = new InternalSecurityClient(settings, threadPool, nodeClient);
         threadContext.set(threadPool.getThreadContext());
         List<Object> components = new ArrayList<>();
         securityContext.set(new SecurityContext(settings, threadPool.getThreadContext()));
@@ -57,7 +57,7 @@ public class SecurityLifecycleService extends AbstractComponent implements Clust
     private final IndexLifecycleManager securityIndex;
 
     public SecurityLifecycleService(Settings settings, ClusterService clusterService,
-                                    ThreadPool threadPool, InternalClient client,
+                                    ThreadPool threadPool, InternalSecurityClient client,
                                     @Nullable IndexAuditTrail indexAuditTrail) {
         super(settings);
         this.settings = settings;
@@ -11,24 +11,29 @@ import org.elasticsearch.common.network.NetworkModule;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.xpack.XPackSettings;
 
+import java.util.Locale;
+
 /**
  * Bootstrap check to ensure that the user has enabled HTTPS when using the token service
  */
 final class TokenSSLBootstrapCheck implements BootstrapCheck {
 
     @Override
-    public boolean check(BootstrapContext context) {
-        if (NetworkModule.HTTP_ENABLED.get(context.settings)) {
-            return XPackSettings.HTTP_SSL_ENABLED.get(context.settings) == false && XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.get
-                    (context.settings);
+    public BootstrapCheckResult check(BootstrapContext context) {
+        final Boolean httpEnabled = NetworkModule.HTTP_ENABLED.get(context.settings);
+        final Boolean httpsEnabled = XPackSettings.HTTP_SSL_ENABLED.get(context.settings);
+        final Boolean tokenServiceEnabled = XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.get(context.settings);
+        if (httpEnabled && httpsEnabled == false && tokenServiceEnabled) {
+            final String message = String.format(
+                    Locale.ROOT,
+                    "HTTPS is required in order to use the token service; "
+                            + "please enable HTTPS using the [%s] setting or disable the token service using the [%s] setting",
+                    XPackSettings.HTTP_SSL_ENABLED.getKey(),
+                    XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey());
+            return BootstrapCheckResult.failure(message);
+        } else {
+            return BootstrapCheckResult.success();
         }
-        return false;
     }
-
-    @Override
-    public String errorMessage() {
-        return "HTTPS is required in order to use the token service. Please enable HTTPS using the [" +
-                XPackSettings.HTTP_SSL_ENABLED.getKey() + "] setting or disable the token service using the [" +
-                XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey() + "] setting.";
-    }
 }
@@ -48,6 +48,7 @@ import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.transport.TransportMessage;
 import org.elasticsearch.xpack.XPackPlugin;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.audit.AuditLevel;
 import org.elasticsearch.xpack.security.audit.AuditTrail;
 import org.elasticsearch.xpack.security.authc.AuthenticationToken;
@@ -177,7 +178,7 @@ public class IndexAuditTrail extends AbstractComponent implements AuditTrail, Cl
         return NAME;
     }
 
-    public IndexAuditTrail(Settings settings, InternalClient client, ThreadPool threadPool, ClusterService clusterService) {
+    public IndexAuditTrail(Settings settings, InternalSecurityClient client, ThreadPool threadPool, ClusterService clusterService) {
         super(settings);
         this.threadPool = threadPool;
         this.clusterService = clusterService;
@@ -18,6 +18,7 @@ import org.elasticsearch.index.query.QueryBuilders;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.threadpool.ThreadPool.Names;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 
 import java.time.Instant;
@@ -30,12 +31,12 @@ import static org.elasticsearch.action.support.TransportActions.isShardNotAvaila
  */
 final class ExpiredTokenRemover extends AbstractRunnable {
 
-    private final InternalClient client;
+    private final InternalSecurityClient client;
     private final AtomicBoolean inProgress = new AtomicBoolean(false);
     private final Logger logger;
     private final TimeValue timeout;
 
-    ExpiredTokenRemover(Settings settings, InternalClient internalClient) {
+    ExpiredTokenRemover(Settings settings, InternalSecurityClient internalClient) {
         this.client = internalClient;
         this.logger = Loggers.getLogger(getClass(), settings);
         this.timeout = TokenService.DELETE_TIMEOUT.get(settings);
@@ -50,6 +50,7 @@ import org.elasticsearch.rest.RestStatus;
 import org.elasticsearch.xpack.XPackPlugin;
 import org.elasticsearch.xpack.XPackSettings;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 
 import javax.crypto.Cipher;
@@ -132,7 +133,7 @@ public final class TokenService extends AbstractComponent {
     private final Clock clock;
     private final TimeValue expirationDelay;
     private final TimeValue deleteInterval;
-    private final InternalClient internalClient;
+    private final InternalSecurityClient internalClient;
     private final SecurityLifecycleService lifecycleService;
     private final ExpiredTokenRemover expiredTokenRemover;
     private final boolean enabled;
@@ -148,7 +149,7 @@ public final class TokenService extends AbstractComponent {
     * @param clock the clock that will be used for comparing timestamps
     * @param internalClient the client to use when checking for revocations
     */
-    public TokenService(Settings settings, Clock clock, InternalClient internalClient,
+    public TokenService(Settings settings, Clock clock, InternalSecurityClient internalClient,
                         SecurityLifecycleService lifecycleService, ClusterService clusterService) throws GeneralSecurityException {
         super(settings);
         byte[] saltArr = new byte[SALT_BYTES];
@@ -36,6 +36,7 @@ import org.elasticsearch.index.query.QueryBuilders;
 import org.elasticsearch.search.SearchHit;
 import org.elasticsearch.xpack.XPackPlugin;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.action.realm.ClearRealmCacheRequest;
 import org.elasticsearch.xpack.security.action.realm.ClearRealmCacheResponse;
@@ -73,12 +74,12 @@ public class NativeUsersStore extends AbstractComponent {
 
     private final Hasher hasher = Hasher.BCRYPT;
-    private final InternalClient client;
+    private final InternalSecurityClient client;
     private final boolean isTribeNode;
 
     private volatile SecurityLifecycleService securityLifecycleService;
 
-    public NativeUsersStore(Settings settings, InternalClient client, SecurityLifecycleService securityLifecycleService) {
+    public NativeUsersStore(Settings settings, InternalSecurityClient client, SecurityLifecycleService securityLifecycleService) {
         super(settings);
         this.client = client;
         this.isTribeNode = XPackPlugin.isTribeNode(settings);
@@ -20,30 +20,22 @@ public class RoleMappingFileBootstrapCheck implements BootstrapCheck {
     private final RealmConfig realmConfig;
     private final Path path;
 
-    private final SetOnce<String> error = new SetOnce<>();
-
-    public RoleMappingFileBootstrapCheck(RealmConfig config, Path path) {
+    RoleMappingFileBootstrapCheck(RealmConfig config, Path path) {
         this.realmConfig = config;
         this.path = path;
     }
 
     @Override
-    public boolean check(BootstrapContext context) {
+    public BootstrapCheckResult check(BootstrapContext context) {
         try {
             DnRoleMapper.parseFile(path, realmConfig.logger(getClass()), realmConfig.type(), realmConfig.name(), true);
-            return false;
+            return BootstrapCheckResult.success();
         } catch (Exception e) {
-            error.set(e.getMessage());
-            return true;
+            return BootstrapCheckResult.failure(e.getMessage());
         }
     }
 
-    @Override
-    public String errorMessage() {
-        return error.get();
-    }
-
     @Override
     public boolean alwaysEnforce() {
         return true;
@@ -56,4 +48,5 @@ public class RoleMappingFileBootstrapCheck implements BootstrapCheck {
         }
         return null;
     }
 
 }
@@ -38,6 +38,7 @@ import org.elasticsearch.index.query.QueryBuilder;
 import org.elasticsearch.index.query.QueryBuilders;
 import org.elasticsearch.xpack.XPackPlugin;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.action.rolemapping.DeleteRoleMappingRequest;
 import org.elasticsearch.xpack.security.action.rolemapping.PutRoleMappingRequest;
@@ -70,12 +71,12 @@ public class NativeRoleMappingStore extends AbstractComponent implements UserRol
 
     private static final String SECURITY_GENERIC_TYPE = "doc";
 
-    private final InternalClient client;
+    private final InternalSecurityClient client;
     private final boolean isTribeNode;
     private final SecurityLifecycleService securityLifecycleService;
     private final List<String> realmsToRefresh = new CopyOnWriteArrayList<>();
 
-    public NativeRoleMappingStore(Settings settings, InternalClient client, SecurityLifecycleService securityLifecycleService) {
+    public NativeRoleMappingStore(Settings settings, InternalSecurityClient client, SecurityLifecycleService securityLifecycleService) {
         super(settings);
         this.client = client;
         this.isTribeNode = XPackPlugin.isTribeNode(settings);
@@ -63,6 +63,7 @@ import org.elasticsearch.xpack.security.support.Automatons;
 import org.elasticsearch.xpack.security.user.AnonymousUser;
 import org.elasticsearch.xpack.security.user.SystemUser;
 import org.elasticsearch.xpack.security.user.User;
+import org.elasticsearch.xpack.security.user.XPackSecurityUser;
 import org.elasticsearch.xpack.security.user.XPackUser;
 import org.elasticsearch.xpack.sql.plugin.jdbc.action.JdbcAction;
 import org.elasticsearch.xpack.sql.plugin.sql.action.SqlAction;
@@ -442,7 +443,11 @@ public class AuthorizationService extends AbstractComponent {
                     " roles");
         }
         if (XPackUser.is(user)) {
-            assert XPackUser.INSTANCE.roles().length == 1 && ReservedRolesStore.SUPERUSER_ROLE.name().equals(XPackUser.INSTANCE.roles()[0]);
+            assert XPackUser.INSTANCE.roles().length == 1;
             roleActionListener.onResponse(XPackUser.ROLE);
             return;
         }
+        if (XPackSecurityUser.is(user)) {
+            roleActionListener.onResponse(ReservedRolesStore.SUPERUSER_ROLE);
+            return;
+        }
@@ -518,7 +523,6 @@
         }
         if (indicesAccessControl.getIndexPermissions(SecurityLifecycleService.SECURITY_INDEX_NAME) != null
                 && indicesAccessControl.getIndexPermissions(SecurityLifecycleService.SECURITY_INDEX_NAME).isGranted()
-                && XPackUser.is(authentication.getUser()) == false
                 && MONITOR_INDEX_PREDICATE.test(action) == false
                 && isSuperuser(authentication.getUser()) == false) {
             // only the superusers are allowed to work with this index, but we should allow indices monitoring actions through for debugging
@@ -38,6 +38,7 @@ import org.elasticsearch.license.LicenseUtils;
 import org.elasticsearch.license.XPackLicenseState;
 import org.elasticsearch.xpack.XPackPlugin;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.action.role.ClearRolesCacheRequest;
 import org.elasticsearch.xpack.security.action.role.ClearRolesCacheResponse;
@@ -79,14 +80,14 @@ public class NativeRolesStore extends AbstractComponent {
             TimeValue.timeValueMinutes(20), Property.NodeScope, Property.Deprecated);
     private static final String ROLE_DOC_TYPE = "doc";
 
-    private final InternalClient client;
+    private final InternalSecurityClient client;
     private final XPackLicenseState licenseState;
     private final boolean isTribeNode;
 
     private SecurityClient securityClient;
     private final SecurityLifecycleService securityLifecycleService;
 
-    public NativeRolesStore(Settings settings, InternalClient client, XPackLicenseState licenseState,
+    public NativeRolesStore(Settings settings, InternalSecurityClient client, XPackLicenseState licenseState,
                             SecurityLifecycleService securityLifecycleService) {
         super(settings);
         this.client = client;
@@ -12,6 +12,7 @@ import org.elasticsearch.xpack.security.authz.permission.Role;
 import org.elasticsearch.xpack.security.support.MetadataUtils;
 import org.elasticsearch.xpack.security.user.KibanaUser;
 import org.elasticsearch.xpack.security.user.SystemUser;
+import org.elasticsearch.xpack.security.user.XPackUser;
 import org.elasticsearch.xpack.watcher.execution.TriggeredWatchStore;
 import org.elasticsearch.xpack.watcher.history.HistoryStore;
 import org.elasticsearch.xpack.watcher.watch.Watch;
@@ -126,7 +127,7 @@ public class ReservedRolesStore {
     }
 
     public static boolean isReserved(String role) {
-        return RESERVED_ROLES.containsKey(role) || SystemUser.ROLE_NAME.equals(role);
+        return RESERVED_ROLES.containsKey(role) || SystemUser.ROLE_NAME.equals(role) || XPackUser.ROLE_NAME.equals(role);
     }
 
 }
@@ -38,6 +38,7 @@ import org.elasticsearch.common.component.AbstractComponent;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.template.TemplateUtils;
 import org.elasticsearch.xpack.upgrade.IndexUpgradeCheck;
 
@@ -58,7 +59,7 @@ public class IndexLifecycleManager extends AbstractComponent {
 
     private final String indexName;
     private final String templateName;
-    private final InternalClient client;
+    private final InternalSecurityClient client;
 
     private final List<BiConsumer<ClusterIndexHealth, ClusterIndexHealth>> indexHealthChangeListeners = new CopyOnWriteArrayList<>();
 
@@ -70,7 +71,7 @@ public class IndexLifecycleManager extends AbstractComponent {
     private volatile boolean mappingIsUpToDate;
     private volatile Version mappingVersion;
 
-    public IndexLifecycleManager(Settings settings, InternalClient client, String indexName, String templateName) {
+    public IndexLifecycleManager(Settings settings, InternalSecurityClient client, String indexName, String templateName) {
         super(settings);
         this.client = client;
         this.indexName = indexName;
@@ -184,6 +184,8 @@ public class User implements ToXContentObject {
             return SystemUser.INSTANCE;
         } else if (XPackUser.is(username)) {
             return XPackUser.INSTANCE;
+        } else if (XPackSecurityUser.is(username)) {
+            return XPackSecurityUser.INSTANCE;
         }
         throw new IllegalStateException("user [" + username + "] is not an internal user");
     }
@@ -214,6 +216,9 @@ public class User implements ToXContentObject {
         } else if (XPackUser.is(user)) {
             output.writeBoolean(true);
             output.writeString(XPackUser.NAME);
+        } else if (XPackSecurityUser.is(user)) {
+            output.writeBoolean(true);
+            output.writeString(XPackSecurityUser.NAME);
         } else {
             if (user.authenticatedUser == null) {
                 // no backcompat necessary, since there is no inner user
@@ -0,0 +1,38 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.security.user;
+
+/**
+ * internal user that manages xpack security. Has all cluster/indices permissions.
+ */
+public class XPackSecurityUser extends User {
+
+    public static final String NAME = "_xpack_security";
+    public static final XPackSecurityUser INSTANCE = new XPackSecurityUser();
+    private static final String ROLE_NAME = "superuser";
+
+    private XPackSecurityUser() {
+        super(NAME, ROLE_NAME);
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        return INSTANCE == o;
+    }
+
+    @Override
+    public int hashCode() {
+        return System.identityHashCode(this);
+    }
+
+    public static boolean is(User user) {
+        return INSTANCE.equals(user);
+    }
+
+    public static boolean is(String principal) {
+        return NAME.equals(principal);
+    }
+}
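`XPackSecurityUser` above relies on reference identity for equality, which is safe only because its private constructor guarantees a single instance. The idiom in isolation, as an illustrative stand-alone class rather than the real one:

```java
// Illustrative sketch of the singleton-with-identity-equality idiom used by
// XPackSecurityUser. SingletonUser is a hypothetical stand-in class.
public class SingletonUser {
    public static final SingletonUser INSTANCE = new SingletonUser();
    public static final String NAME = "_xpack_security";

    private SingletonUser() {
        // private constructor: no other instances can ever exist
    }

    @Override
    public boolean equals(Object o) {
        return this == o; // identity is sufficient for a singleton
    }

    @Override
    public int hashCode() {
        return System.identityHashCode(this);
    }

    // convenience checks mirroring the is(User) / is(String) pair in the diff
    public static boolean is(Object user) {
        return INSTANCE.equals(user);
    }

    public static boolean is(String principal) {
        return NAME.equals(principal);
    }

    public static void main(String[] args) {
        System.out.println(is(INSTANCE));
    }
}
```

Because the instance is unique, identity equality also makes the `is(User)` checks scattered through the authorization code both cheap and unforgeable by a deserialized look-alike object.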
@@ -5,13 +5,22 @@
  */
 package org.elasticsearch.xpack.security.user;
 
+import org.elasticsearch.xpack.security.authz.RoleDescriptor;
+import org.elasticsearch.xpack.security.authz.permission.Role;
+import org.elasticsearch.xpack.security.support.MetadataUtils;
+
 /**
- * XPack internal user that manages xpack. Has all cluster/indices permissions for x-pack to operate.
+ * XPack internal user that manages xpack. Has all cluster/indices permissions for x-pack to operate excluding security permissions.
  */
 public class XPackUser extends User {
 
     public static final String NAME = "_xpack";
-    private static final String ROLE_NAME = "superuser";
+    public static final String ROLE_NAME = NAME;
+    public static final Role ROLE = Role.builder(new RoleDescriptor(ROLE_NAME, new String[] { "all" },
+            new RoleDescriptor.IndicesPrivileges[] {
+                    RoleDescriptor.IndicesPrivileges.builder().indices("/@&~(\\.security*)/").privileges("all").build()},
+            new String[] { "*" },
+            MetadataUtils.DEFAULT_RESERVED_METADATA), null).build();
     public static final XPackUser INSTANCE = new XPackUser();
 
     private XPackUser() {
@@ -42,10 +42,15 @@ public final class SSLBootstrapCheck implements BootstrapCheck {
     }
 
     @Override
-    public boolean check(BootstrapContext context) {
+    public BootstrapCheckResult check(BootstrapContext context) {
         final Settings transportSSLSettings = context.settings.getByPrefix(XPackSettings.TRANSPORT_SSL_PREFIX);
-        return sslService.sslConfiguration(transportSSLSettings).keyConfig() == KeyConfig.NONE
-                || isDefaultCACertificateTrusted() || isDefaultPrivateKeyUsed();
+        if (sslService.sslConfiguration(transportSSLSettings).keyConfig() == KeyConfig.NONE
+                || isDefaultCACertificateTrusted() || isDefaultPrivateKeyUsed()) {
+            return BootstrapCheckResult.failure(
+                    "default SSL key and certificate do not provide security; please generate keys and certificates");
+        } else {
+            return BootstrapCheckResult.success();
+        }
     }
 
     /**
@@ -91,8 +96,4 @@ public final class SSLBootstrapCheck implements BootstrapCheck {
                 .anyMatch(defaultPrivateKey::equals);
     }
 
-    @Override
-    public String errorMessage() {
-        return "Default SSL key and certificate do not provide security; please generate keys and certificates";
-    }
 }
@@ -8,6 +8,7 @@ package org.elasticsearch.xpack.upgrade;
 import org.elasticsearch.Version;
 import org.elasticsearch.action.ActionRequest;
 import org.elasticsearch.action.ActionResponse;
+import org.elasticsearch.client.Client;
 import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
 import org.elasticsearch.cluster.node.DiscoveryNodes;
@@ -24,6 +25,7 @@ import org.elasticsearch.script.ScriptService;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.watcher.ResourceWatcherService;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.upgrade.actions.IndexUpgradeAction;
 import org.elasticsearch.xpack.upgrade.actions.IndexUpgradeInfoAction;
 import org.elasticsearch.xpack.upgrade.rest.RestIndexUpgradeAction;
@@ -53,12 +55,13 @@ public class Upgrade implements ActionPlugin {
         this.upgradeCheckFactories = new ArrayList<>();
     }
 
-    public Collection<Object> createComponents(InternalClient internalClient, ClusterService clusterService, ThreadPool threadPool,
+    public Collection<Object> createComponents(Client client, ClusterService clusterService, ThreadPool threadPool,
                                                ResourceWatcherService resourceWatcherService, ScriptService scriptService,
                                                NamedXContentRegistry xContentRegistry) {
+        final InternalSecurityClient internalSecurityClient = new InternalSecurityClient(settings, threadPool, client);
         List<IndexUpgradeCheck> upgradeChecks = new ArrayList<>(upgradeCheckFactories.size());
         for (BiFunction<InternalClient, ClusterService, IndexUpgradeCheck> checkFactory : upgradeCheckFactories) {
-            upgradeChecks.add(checkFactory.apply(internalClient, clusterService));
+            upgradeChecks.add(checkFactory.apply(internalSecurityClient, clusterService));
         }
         return Collections.singletonList(new IndexUpgradeService(settings, Collections.unmodifiableList(upgradeChecks)));
     }
@ -23,27 +23,28 @@ final class EncryptSensitiveDataBootstrapCheck implements BootstrapCheck {
|
|||
}
|
||||
|
||||
@Override
|
||||
-    public boolean check(BootstrapContext context) {
-        return Watcher.ENCRYPT_SENSITIVE_DATA_SETTING.get(context.settings)
-                && Watcher.ENCRYPTION_KEY_SETTING.exists(context.settings) == false;
-    }
-
-    @Override
-    public String errorMessage() {
-        final Path sysKeyPath = environment.configFile().resolve(XPackPlugin.NAME).resolve("system_key").toAbsolutePath();
-        if (Files.exists(sysKeyPath)) {
-            return "Encryption of sensitive data requires the key to be placed in the secure setting store. Run " +
-                    "'bin/elasticsearch-keystore add-file " + Watcher.ENCRYPTION_KEY_SETTING.getKey() + " " +
-                    environment.configFile().resolve(XPackPlugin.NAME).resolve("system_key").toAbsolutePath() +
-                    "' to import the file.\nAfter importing, the system_key file should be removed from the " +
-                    "filesystem.\nRepeat this on every node in the cluster.";
-        } else {
-            return "Encryption of sensitive data requires a key to be placed in the secure setting store. First run the " +
-                    "bin/x-pack/syskeygen tool to generate a key file.\nThen run 'bin/elasticsearch-keystore add-file " +
-                    Watcher.ENCRYPTION_KEY_SETTING.getKey() + " " +
-                    environment.configFile().resolve(XPackPlugin.NAME).resolve("system_key").toAbsolutePath() + "' to import the key into" +
-                    " the secure setting store. Finally, remove the system_key file from the filesystem.\n" +
-                    "Repeat this on every node in the cluster";
-        }
-    }
+    public BootstrapCheckResult check(BootstrapContext context) {
+        if (Watcher.ENCRYPT_SENSITIVE_DATA_SETTING.get(context.settings)
+                && Watcher.ENCRYPTION_KEY_SETTING.exists(context.settings) == false) {
+            final Path systemKeyPath = environment.configFile().resolve(XPackPlugin.NAME).resolve("system_key").toAbsolutePath();
+            final String message;
+            if (Files.exists(systemKeyPath)) {
+                message = "Encryption of sensitive data requires the key to be placed in the secure setting store. Run " +
+                        "'bin/elasticsearch-keystore add-file " + Watcher.ENCRYPTION_KEY_SETTING.getKey() + " " +
+                        systemKeyPath +
+                        "' to import the file.\nAfter importing, the system_key file should be removed from the " +
+                        "filesystem.\nRepeat this on every node in the cluster.";
+            } else {
+                message = "Encryption of sensitive data requires a key to be placed in the secure setting store. First run the " +
+                        "bin/x-pack/syskeygen tool to generate a key file.\nThen run 'bin/elasticsearch-keystore add-file " +
+                        Watcher.ENCRYPTION_KEY_SETTING.getKey() + " " +
+                        systemKeyPath + "' to import the key into" +
+                        " the secure setting store. Finally, remove the system_key file from the filesystem.\n" +
+                        "Repeat this on every node in the cluster";
+            }
+            return BootstrapCheckResult.failure(message);
+        } else {
+            return BootstrapCheckResult.success();
+        }
+    }
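The hunk above replaces the old boolean `check()` plus `errorMessage()` pair with a single method that returns a result object carrying both the outcome and the failure message. As an illustration only, that pattern can be sketched in isolation — `BootstrapCheckResult` and `check()` below are simplified stand-ins, not the actual Elasticsearch classes:

```java
// Sketch of the result-object pattern: BootstrapCheckResult here is a
// hypothetical stand-in reduced to the success/failure API used above.
final class BootstrapCheckResult {
    private final String message; // null means the check passed

    private BootstrapCheckResult(String message) {
        this.message = message;
    }

    static BootstrapCheckResult success() {
        return new BootstrapCheckResult(null);
    }

    static BootstrapCheckResult failure(String message) {
        return new BootstrapCheckResult(message);
    }

    boolean isFailure() {
        return message != null;
    }

    String getMessage() {
        return message;
    }
}

class BootstrapCheckResultDemo {
    // Toy version of the rewritten check: fail only when encryption is
    // enabled but no key has been placed in the secure setting store.
    static BootstrapCheckResult check(boolean encryptEnabled, boolean keyConfigured) {
        if (encryptEnabled && !keyConfigured) {
            return BootstrapCheckResult.failure("encryption of sensitive data requires a key in the secure setting store");
        }
        return BootstrapCheckResult.success();
    }

    public static void main(String[] args) {
        System.out.println(check(true, false).isFailure());  // true
        System.out.println(check(true, true).isFailure());   // false
    }
}
```

Folding the message into the result is what lets the test assertions later in this commit switch from `check(...)` / `errorMessage()` to `check(...).isFailure()` and `result.getMessage()`.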
@@ -35,6 +35,7 @@ import org.elasticsearch.xpack.XPackPlugin;
 import org.elasticsearch.xpack.XPackSettings;
 import org.elasticsearch.xpack.ml.MachineLearning;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.Security;
 import org.elasticsearch.xpack.security.client.SecurityClient;
 import org.junit.AfterClass;
@@ -438,6 +439,11 @@ public abstract class SecurityIntegTestCase extends ESIntegTestCase {
         return internalCluster().getInstance(InternalClient.class);
     }

+    protected InternalSecurityClient internalSecurityClient() {
+        Client client = client();
+        return new InternalSecurityClient(client.settings(), client.threadPool(), client);
+    }
+
     protected SecurityClient securityClient() {
         return securityClient(client());
     }
@@ -0,0 +1,35 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.ml.action;
+
+import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.test.AbstractStreamableXContentTestCase;
+import org.elasticsearch.xpack.ml.action.ForecastJobAction.Request;
+
+public class ForecastJobActionRequestTests extends AbstractStreamableXContentTestCase<Request> {
+
+    @Override
+    protected Request doParseInstance(XContentParser parser) {
+        return Request.parseRequest(null, parser);
+    }
+
+    @Override
+    protected boolean supportsUnknownFields() {
+        return false;
+    }
+
+    @Override
+    protected Request createTestInstance() {
+        Request request = new Request(randomAlphaOfLengthBetween(1, 20));
+        return request;
+    }
+
+    @Override
+    protected Request createBlankInstance() {
+        return new Request();
+    }
+
+}
@@ -0,0 +1,23 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.ml.action;
+
+import org.elasticsearch.test.AbstractStreamableTestCase;
+import org.elasticsearch.xpack.ml.action.ForecastJobAction.Response;
+
+public class ForecastJobActionResponseTests extends AbstractStreamableTestCase<Response> {
+
+    @Override
+    protected Response createTestInstance() {
+        return new Response(randomBoolean(), randomNonNegativeLong());
+    }
+
+    @Override
+    protected Response createBlankInstance() {
+        return new Response();
+    }
+
+}
@@ -357,47 +357,47 @@ public class AutodetectResultProcessorIT extends XPackSingleNodeTestCase {
         private List<AutodetectResult> results = new ArrayList<>();

         ResultsBuilder addBucket(Bucket bucket) {
-            results.add(new AutodetectResult(Objects.requireNonNull(bucket), null, null, null, null, null, null, null, null));
+            results.add(new AutodetectResult(Objects.requireNonNull(bucket), null, null, null, null, null, null, null, null, null, null));
             return this;
         }

         ResultsBuilder addRecords(List<AnomalyRecord> records) {
-            results.add(new AutodetectResult(null, records, null, null, null, null, null, null, null));
+            results.add(new AutodetectResult(null, records, null, null, null, null, null, null, null, null, null));
             return this;
         }

         ResultsBuilder addInfluencers(List<Influencer> influencers) {
-            results.add(new AutodetectResult(null, null, influencers, null, null, null, null, null, null));
+            results.add(new AutodetectResult(null, null, influencers, null, null, null, null, null, null, null, null));
             return this;
         }

         ResultsBuilder addCategoryDefinition(CategoryDefinition categoryDefinition) {
-            results.add(new AutodetectResult(null, null, null, null, null, null, null, categoryDefinition, null));
+            results.add(new AutodetectResult(null, null, null, null, null, null, null, null, null, categoryDefinition, null));
             return this;
         }

         ResultsBuilder addmodelPlot(ModelPlot modelPlot) {
-            results.add(new AutodetectResult(null, null, null, null, null, null, modelPlot, null, null));
+            results.add(new AutodetectResult(null, null, null, null, null, null, modelPlot, null, null, null, null));
             return this;
         }

         ResultsBuilder addModelSizeStats(ModelSizeStats modelSizeStats) {
-            results.add(new AutodetectResult(null, null, null, null, null, modelSizeStats, null, null, null));
+            results.add(new AutodetectResult(null, null, null, null, null, modelSizeStats, null, null, null, null, null));
             return this;
         }

         ResultsBuilder addModelSnapshot(ModelSnapshot modelSnapshot) {
-            results.add(new AutodetectResult(null, null, null, null, modelSnapshot, null, null, null, null));
+            results.add(new AutodetectResult(null, null, null, null, modelSnapshot, null, null, null, null, null, null));
             return this;
         }

         ResultsBuilder addQuantiles(Quantiles quantiles) {
-            results.add(new AutodetectResult(null, null, null, quantiles, null, null, null, null, null));
+            results.add(new AutodetectResult(null, null, null, quantiles, null, null, null, null, null, null, null));
             return this;
         }

         ResultsBuilder addFlushAcknowledgement(FlushAcknowledgement flushAcknowledgement) {
-            results.add(new AutodetectResult(null, null, null, null, null, null, null, null, flushAcknowledgement));
+            results.add(new AutodetectResult(null, null, null, null, null, null, null, null, null, null, flushAcknowledgement));
             return this;
         }
@@ -0,0 +1,44 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.ml.job.process.autodetect.params;
+
+import org.elasticsearch.ElasticsearchParseException;
+import org.elasticsearch.common.ParseField;
+import org.elasticsearch.test.ESTestCase;
+import org.elasticsearch.xpack.ml.job.messages.Messages;
+
+import static org.hamcrest.Matchers.greaterThanOrEqualTo;
+import static org.hamcrest.Matchers.lessThanOrEqualTo;
+
+public class ForecastParamsTests extends ESTestCase {
+
+    private static ParseField END = new ParseField("end");
+
+    public void testDefault_GivesTomorrowTimeInSeconds() {
+        long nowSecs = System.currentTimeMillis() / 1000;
+        nowSecs += 60 * 60 * 24;
+
+        ForecastParams params = ForecastParams.builder().build();
+        assertThat(params.getEndTime(), greaterThanOrEqualTo(nowSecs));
+        assertThat(params.getEndTime(), lessThanOrEqualTo(nowSecs + 1));
+    }
+
+    public void test_UnparseableEndTimeThrows() {
+        ElasticsearchParseException e =
+                ESTestCase.expectThrows(ElasticsearchParseException.class, () -> ForecastParams.builder().endTime("bad", END).build());
+        assertEquals(Messages.getMessage(Messages.REST_INVALID_DATETIME_PARAMS, "end", "bad"), e.getMessage());
+    }
+
+    public void testFormats() {
+        assertEquals(10L, ForecastParams.builder().endTime("10000", END).build().getEndTime());
+        assertEquals(1462096800L, ForecastParams.builder().endTime("2016-05-01T10:00:00Z", END).build().getEndTime());
+
+        long nowSecs = System.currentTimeMillis() / 1000;
+        long end = ForecastParams.builder().endTime("now+2H", END).build().getEndTime();
+        assertThat(end, greaterThanOrEqualTo(nowSecs + 7200));
+        assertThat(end, lessThanOrEqualTo(nowSecs + 7200 + 1));
+    }
+}
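Independently of the `ForecastParams` implementation, the two fixed expectations in `testFormats` above can be cross-checked with plain `java.time` (a sketch; `EpochSecondsDemo` is an illustrative name, not part of the commit):

```java
import java.time.Instant;

class EpochSecondsDemo {
    public static void main(String[] args) {
        // An ISO-8601 instant resolves to the epoch-seconds value the test asserts.
        long secs = Instant.parse("2016-05-01T10:00:00Z").getEpochSecond();
        System.out.println(secs); // 1462096800

        // Per the test's expectation, a bare numeric string such as "10000" is
        // interpreted as epoch milliseconds, i.e. 10 seconds.
        long fromMillis = Instant.ofEpochMilli(Long.parseLong("10000")).getEpochSecond();
        System.out.println(fromMillis); // 10
    }
}
```

The remaining assertion on `"now+2H"` depends on Elasticsearch date-math parsing, which has no direct `java.time` equivalent, so the test brackets it against the wall clock instead.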
@@ -35,6 +35,8 @@ public class AutodetectResultTests extends AbstractSerializingTestCase<AutodetectResult> {
     ModelSnapshot modelSnapshot;
     ModelSizeStats.Builder modelSizeStats;
     ModelPlot modelPlot;
+    Forecast forecast;
+    ForecastStats forecastStats;
     CategoryDefinition categoryDefinition;
     FlushAcknowledgement flushAcknowledgement;
     String jobId = "foo";
@@ -84,6 +86,16 @@ public class AutodetectResultTests extends AbstractSerializingTestCase<AutodetectResult> {
         } else {
             modelPlot = null;
         }
+        if (randomBoolean()) {
+            forecast = new Forecast(jobId, randomNonNegativeLong(), new Date(randomLong()), randomNonNegativeLong());
+        } else {
+            forecast = null;
+        }
+        if (randomBoolean()) {
+            forecastStats = new ForecastStats(jobId, randomNonNegativeLong());
+        } else {
+            forecastStats = null;
+        }
         if (randomBoolean()) {
             categoryDefinition = new CategoryDefinition(jobId);
             categoryDefinition.setCategoryId(randomLong());
@@ -96,7 +108,8 @@ public class AutodetectResultTests extends AbstractSerializingTestCase<AutodetectResult> {
             flushAcknowledgement = null;
         }
         return new AutodetectResult(bucket, records, influencers, quantiles, modelSnapshot,
-                modelSizeStats == null ? null : modelSizeStats.build(), modelPlot, categoryDefinition, flushAcknowledgement);
+                modelSizeStats == null ? null : modelSizeStats.build(), modelPlot, forecast, forecastStats, categoryDefinition,
+                flushAcknowledgement);
     }

     @Override
@@ -0,0 +1,46 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.ml.job.results;
+
+import org.elasticsearch.common.io.stream.Writeable.Reader;
+import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.test.AbstractSerializingTestCase;
+
+import java.io.IOException;
+
+public class ForecastStatsTests extends AbstractSerializingTestCase<ForecastStats> {
+
+    @Override
+    protected ForecastStats parseInstance(XContentParser parser) {
+        return ForecastStats.PARSER.apply(parser, null);
+    }
+
+    @Override
+    protected ForecastStats createTestInstance() {
+        return createTestInstance("ForecastStatsTest", randomNonNegativeLong());
+    }
+
+    public ForecastStats createTestInstance(String jobId, long forecastId) {
+        ForecastStats forecastStats = new ForecastStats(jobId, forecastId);
+
+        if (randomBoolean()) {
+            forecastStats.setRecordCount(randomLong());
+        }
+
+        return forecastStats;
+    }
+
+    @Override
+    protected Reader<ForecastStats> instanceReader() {
+        return ForecastStats::new;
+    }
+
+    @Override
+    protected ForecastStats doParseInstance(XContentParser parser) throws IOException {
+        return ForecastStats.PARSER.apply(parser, null);
+    }
+
+}
@@ -0,0 +1,68 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License;
+ * you may not use this file except in compliance with the Elastic License.
+ */
+package org.elasticsearch.xpack.ml.job.results;
+
+import org.elasticsearch.common.io.stream.Writeable.Reader;
+import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.test.AbstractSerializingTestCase;
+
+import java.io.IOException;
+import java.util.Date;
+
+public class ForecastTests extends AbstractSerializingTestCase<Forecast> {
+
+    @Override
+    protected Forecast parseInstance(XContentParser parser) {
+        return Forecast.PARSER.apply(parser, null);
+    }
+
+    @Override
+    protected Forecast createTestInstance() {
+        return createTestInstance("ForecastTest");
+    }
+
+    public Forecast createTestInstance(String jobId) {
+        Forecast forecast = new Forecast(jobId, randomNonNegativeLong(), new Date(randomLong()), randomNonNegativeLong());
+
+        if (randomBoolean()) {
+            forecast.setByFieldName(randomAlphaOfLengthBetween(1, 20));
+        }
+        if (randomBoolean()) {
+            forecast.setByFieldValue(randomAlphaOfLengthBetween(1, 20));
+        }
+        if (randomBoolean()) {
+            forecast.setPartitionFieldName(randomAlphaOfLengthBetween(1, 20));
+        }
+        if (randomBoolean()) {
+            forecast.setPartitionFieldValue(randomAlphaOfLengthBetween(1, 20));
+        }
+        if (randomBoolean()) {
+            forecast.setModelFeature(randomAlphaOfLengthBetween(1, 20));
+        }
+        if (randomBoolean()) {
+            forecast.setForecastLower(randomDouble());
+        }
+        if (randomBoolean()) {
+            forecast.setForecastUpper(randomDouble());
+        }
+        if (randomBoolean()) {
+            forecast.setForecastPrediction(randomDouble());
+        }
+
+        return forecast;
+    }
+
+    @Override
+    protected Reader<Forecast> instanceReader() {
+        return Forecast::new;
+    }
+
+    @Override
+    protected Forecast doParseInstance(XContentParser parser) throws IOException {
+        return Forecast.PARSER.apply(parser, null);
+    }
+
+}
@@ -17,7 +17,7 @@ public class PkiRealmBootstrapCheckTests extends ESTestCase {
     public void testPkiRealmBootstrapDefault() throws Exception {
         assertFalse(new PkiRealmBootstrapCheck(new SSLService(Settings.EMPTY,
                 new Environment(Settings.builder().put("path.home", createTempDir()).build()))).check((new BootstrapContext(Settings
-                .EMPTY, null))));
+                .EMPTY, null))).isFailure());
     }

     public void testBootstrapCheckWithPkiRealm() throws Exception {
@@ -26,42 +26,42 @@ public class PkiRealmBootstrapCheckTests extends ESTestCase {
                 .put("path.home", createTempDir())
                 .build();
         Environment env = new Environment(settings);
-        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());

         // disable client auth default
         settings = Settings.builder().put(settings)
                 .put("xpack.ssl.client_authentication", "none")
                 .build();
         env = new Environment(settings);
-        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());

         // enable ssl for http
         settings = Settings.builder().put(settings)
                 .put("xpack.security.http.ssl.enabled", true)
                 .build();
         env = new Environment(settings);
-        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());

         // enable client auth for http
         settings = Settings.builder().put(settings)
                 .put("xpack.security.http.ssl.client_authentication", randomFrom("required", "optional"))
                 .build();
         env = new Environment(settings);
-        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());

         // disable http ssl
         settings = Settings.builder().put(settings)
                 .put("xpack.security.http.ssl.enabled", false)
                 .build();
         env = new Environment(settings);
-        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());

         // set transport client auth
         settings = Settings.builder().put(settings)
                 .put("xpack.security.transport.client_authentication", randomFrom("required", "optional"))
                 .build();
         env = new Environment(settings);
-        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertTrue(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());

         // test with transport profile
         settings = Settings.builder().put(settings)
@@ -69,7 +69,7 @@ public class PkiRealmBootstrapCheckTests extends ESTestCase {
                 .put("transport.profiles.foo.xpack.security.ssl.client_authentication", randomFrom("required", "optional"))
                 .build();
         env = new Environment(settings);
-        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());
     }

     public void testBootstrapCheckWithDisabledRealm() throws Exception {
@@ -80,6 +80,6 @@ public class PkiRealmBootstrapCheckTests extends ESTestCase {
                 .put("path.home", createTempDir())
                 .build();
         Environment env = new Environment(settings);
-        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)));
+        assertFalse(new PkiRealmBootstrapCheck(new SSLService(settings, env)).check(new BootstrapContext(settings, null)).isFailure());
     }
 }
@@ -64,7 +64,7 @@ public class SecurityLifecycleServiceTests extends ESTestCase {

         threadPool = new TestThreadPool("security template service tests");
         transportClient = new MockTransportClient(Settings.EMPTY);
-        class IClient extends InternalClient {
+        class IClient extends InternalSecurityClient {
             IClient(Client transportClient) {
                 super(Settings.EMPTY, null, transportClient);
             }
@@ -79,7 +79,7 @@ public class SecurityLifecycleServiceTests extends ESTestCase {
             }
         }

-        InternalClient client = new IClient(transportClient);
+        InternalSecurityClient client = new IClient(transportClient);
         securityLifecycleService = new SecurityLifecycleService(Settings.EMPTY, clusterService,
                 threadPool, client, mock(IndexAuditTrail.class));
         listeners = new CopyOnWriteArrayList<>();
@@ -77,9 +77,9 @@ public class SecurityTests extends ESTestCase {
         allowedSettings.addAll(ClusterSettings.BUILT_IN_CLUSTER_SETTINGS);
         ClusterSettings clusterSettings = new ClusterSettings(settings, allowedSettings);
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
-        InternalClient client = new InternalClient(Settings.EMPTY, threadPool, mock(Client.class));
         when(threadPool.relativeTimeInMillis()).thenReturn(1L);
-        return security.createComponents(client, threadPool, clusterService, mock(ResourceWatcherService.class), Arrays.asList(extensions));
+        return security.createComponents(mock(Client.class), threadPool, clusterService, mock(ResourceWatcherService.class),
+                Arrays.asList(extensions));
     }

     private <T> T findComponent(Class<T> type, Collection<Object> components) {
@@ -16,29 +16,29 @@ public class TokenSSLBootsrapCheckTests extends ESTestCase {
     public void testTokenSSLBootstrapCheck() {
         Settings settings = Settings.EMPTY;

-        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)));
+        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)).isFailure());

         settings = Settings.builder()
                 .put(NetworkModule.HTTP_ENABLED.getKey(), false)
                 .put(XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey(), true).build();
-        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)));
+        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)).isFailure());

         settings = Settings.builder().put(XPackSettings.HTTP_SSL_ENABLED.getKey(), true).build();
-        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)));
+        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)).isFailure());

         // XPackSettings.HTTP_SSL_ENABLED default false
         settings = Settings.builder().put(XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey(), true).build();
-        assertTrue(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)));
+        assertTrue(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)).isFailure());

         settings = Settings.builder()
                 .put(XPackSettings.HTTP_SSL_ENABLED.getKey(), false)
                 .put(XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey(), true).build();
-        assertTrue(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)));
+        assertTrue(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)).isFailure());

         settings = Settings.builder()
                 .put(XPackSettings.HTTP_SSL_ENABLED.getKey(), false)
                 .put(XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey(), true)
                 .put(NetworkModule.HTTP_ENABLED.getKey(), false).build();
-        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)));
+        assertFalse(new TokenSSLBootstrapCheck().check(new BootstrapContext(settings, null)).isFailure());
     }
 }
@@ -141,7 +141,7 @@ public class AuditTrailTests extends SecurityIntegTestCase {
         return eventsRef.get();
     }

     private Collection<Map<String, Object>> getAuditEvents() throws Exception {
-        final InternalClient client = internalClient();
+        final InternalClient client = internalSecurityClient();
         DateTime now = new DateTime(DateTimeZone.UTC);
         String indexName = IndexNameResolver.resolve(IndexAuditTrail.INDEX_NAME_PREFIX, now, IndexNameResolver.Rollover.DAILY);
@@ -22,6 +22,7 @@ import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.transport.MockTransportClient;
 import org.elasticsearch.transport.TransportMessage;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.audit.index.IndexAuditTrail.State;
 import org.elasticsearch.xpack.security.authc.AuthenticationToken;
 import org.elasticsearch.xpack.security.transport.filter.SecurityIpFilterRule;
@@ -43,7 +44,7 @@ import static org.mockito.Mockito.when;

 public class IndexAuditTrailMutedTests extends ESTestCase {

-    private InternalClient client;
+    private InternalSecurityClient client;
     private TransportClient transportClient;
     private ThreadPool threadPool;
     private ClusterService clusterService;
@@ -62,7 +63,7 @@ public class IndexAuditTrailMutedTests extends ESTestCase {
         threadPool = new TestThreadPool("index audit trail tests");
         transportClient = new MockTransportClient(Settings.EMPTY);
         clientCalled = new AtomicBoolean(false);
-        class IClient extends InternalClient {
+        class IClient extends InternalSecurityClient {
             IClient(Client transportClient){
                 super(Settings.EMPTY, threadPool, transportClient);
             }
@@ -296,7 +296,7 @@ public class IndexAuditTrailTests extends SecurityIntegTestCase {
         when(nodes.isLocalNodeElectedMaster()).thenReturn(true);
         threadPool = new TestThreadPool("index audit trail tests");
         enqueuedMessage = new SetOnce<>();
-        auditor = new IndexAuditTrail(settings, internalClient(), threadPool, clusterService) {
+        auditor = new IndexAuditTrail(settings, internalSecurityClient(), threadPool, clusterService) {
             @Override
             void enqueue(Message message, String type) {
                 enqueuedMessage.set(message);
@@ -50,7 +50,7 @@ public class IndexAuditTrailUpdateMappingTests extends SecurityIntegTestCase {
         when(localNode.getHostAddress()).thenReturn(buildNewFakeTransportAddress().toString());
         ClusterService clusterService = mock(ClusterService.class);
         when(clusterService.localNode()).thenReturn(localNode);
-        auditor = new IndexAuditTrail(settings, internalClient(), threadPool, clusterService);
+        auditor = new IndexAuditTrail(settings, internalSecurityClient(), threadPool, clusterService);

         // before starting we add an event
         auditor.authenticationFailed(new FakeRestRequest());
@@ -48,6 +48,7 @@ import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.transport.TransportMessage;
 import org.elasticsearch.xpack.XPackSettings;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.audit.AuditTrailService;
 import org.elasticsearch.xpack.security.authc.Authentication.RealmRef;
@@ -135,7 +136,7 @@ public class AuthenticationServiceTests extends ESTestCase {
         threadPool = new ThreadPool(settings,
                 new FixedExecutorBuilder(settings, TokenService.THREAD_POOL_NAME, 1, 1000, "xpack.security.authc.token.thread_pool"));
         threadContext = threadPool.getThreadContext();
-        InternalClient internalClient = new InternalClient(Settings.EMPTY, threadPool, client);
+        InternalSecurityClient internalClient = new InternalSecurityClient(Settings.EMPTY, threadPool, client);
         lifecycleService = mock(SecurityLifecycleService.class);
         ClusterService clusterService = new ClusterService(settings, new ClusterSettings(settings, ClusterSettings
                 .BUILT_IN_CLUSTER_SETTINGS), threadPool, Collections.emptyMap());
@@ -56,7 +56,7 @@ public class TokenAuthIntegTests extends SecurityIntegTestCase {
     }

     public void testTokenServiceBootstrapOnNodeJoin() throws Exception {
-        final Client client = internalClient();
+        final Client client = internalSecurityClient();
         SecurityClient securityClient = new SecurityClient(client);
         CreateTokenResponse response = securityClient.prepareCreateToken()
                 .setGrantType("password")
@@ -84,7 +84,7 @@ public class TokenAuthIntegTests extends SecurityIntegTestCase {


     public void testTokenServiceCanRotateKeys() throws Exception {
-        final Client client = internalClient();
+        final Client client = internalSecurityClient();
         SecurityClient securityClient = new SecurityClient(client);
         CreateTokenResponse response = securityClient.prepareCreateToken()
                 .setGrantType("password")
@@ -116,7 +116,7 @@ public class TokenAuthIntegTests extends SecurityIntegTestCase {
     }

     public void testExpiredTokensDeletedAfterExpiration() throws Exception {
-        final Client client = internalClient();
+        final Client client = internalSecurityClient();
         SecurityClient securityClient = new SecurityClient(client);
         CreateTokenResponse response = securityClient.prepareCreateToken()
                 .setGrantType("password")
@@ -24,6 +24,7 @@ import org.elasticsearch.threadpool.FixedExecutorBuilder;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.xpack.XPackSettings;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.authc.Authentication.RealmRef;
 import org.elasticsearch.xpack.security.authc.TokenService.BytesKey;
@@ -49,7 +50,7 @@ import static org.mockito.Mockito.when;

 public class TokenServiceTests extends ESTestCase {

-    private InternalClient internalClient;
+    private InternalSecurityClient internalClient;
     private static ThreadPool threadPool;
     private static final Settings settings = Settings.builder().put(Node.NODE_NAME_SETTING.getKey(), "TokenServiceTests")
             .put(XPackSettings.TOKEN_SERVICE_ENABLED_SETTING.getKey(), true).build();
@@ -63,7 +64,7 @@ public class TokenServiceTests extends ESTestCase {
     @Before
     public void setupClient() throws GeneralSecurityException {
         client = mock(Client.class);
-        internalClient = new InternalClient(settings, threadPool, client);
+        internalClient = new InternalSecurityClient(settings, threadPool, client);
         lifecycleService = mock(SecurityLifecycleService.class);
         when(lifecycleService.isSecurityIndexWriteable()).thenReturn(true);
         doAnswer(invocationOnMock -> {
@@ -29,6 +29,7 @@ import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.index.get.GetResult;
 import org.elasticsearch.test.ESTestCase;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.authc.AuthenticationResult;
 import org.elasticsearch.xpack.security.authc.support.Hasher;
@@ -54,12 +55,12 @@ public class NativeUsersStoreTests extends ESTestCase {
     private static final String PASSWORD_FIELD = User.Fields.PASSWORD.getPreferredName();
     private static final String BLANK_PASSWORD = "";

-    private InternalClient internalClient;
+    private InternalSecurityClient internalClient;
     private final List<Tuple<ActionRequest, ActionListener<? extends ActionResponse>>> requests = new CopyOnWriteArrayList<>();

     @Before
     public void setupMocks() {
-        internalClient = new InternalClient(Settings.EMPTY, null, null) {
+        internalClient = new InternalSecurityClient(Settings.EMPTY, null, null) {

             @Override
             protected <
@@ -46,7 +46,7 @@ public class RoleMappingFileBootstrapCheckTests extends ESTestCase {
         final BootstrapCheck check = RoleMappingFileBootstrapCheck.create(config);
         assertThat(check, notNullValue());
         assertThat(check.alwaysEnforce(), equalTo(true));
-        assertThat(check.check(new BootstrapContext(settings, null)), equalTo(false));
+        assertFalse(check.check(new BootstrapContext(settings, null)).isFailure());
     }

     public void testBootstrapCheckOfMissingFile() {

@@ -59,10 +59,11 @@ public class RoleMappingFileBootstrapCheckTests extends ESTestCase {
         final BootstrapCheck check = RoleMappingFileBootstrapCheck.create(config);
         assertThat(check, notNullValue());
         assertThat(check.alwaysEnforce(), equalTo(true));
-        assertThat(check.check(new BootstrapContext(settings, null)), equalTo(true));
-        assertThat(check.errorMessage(), containsString("the-realm-name"));
-        assertThat(check.errorMessage(), containsString(fileName));
-        assertThat(check.errorMessage(), containsString("does not exist"));
+        final BootstrapCheck.BootstrapCheckResult result = check.check(new BootstrapContext(settings, null));
+        assertTrue(result.isFailure());
+        assertThat(result.getMessage(), containsString("the-realm-name"));
+        assertThat(result.getMessage(), containsString(fileName));
+        assertThat(result.getMessage(), containsString("does not exist"));
     }

     public void testBootstrapCheckWithInvalidYaml() throws IOException {

@@ -77,10 +78,11 @@ public class RoleMappingFileBootstrapCheckTests extends ESTestCase {
         final BootstrapCheck check = RoleMappingFileBootstrapCheck.create(config);
         assertThat(check, notNullValue());
         assertThat(check.alwaysEnforce(), equalTo(true));
-        assertThat(check.check(new BootstrapContext(settings, null)), equalTo(true));
-        assertThat(check.errorMessage(), containsString("the-realm-name"));
-        assertThat(check.errorMessage(), containsString(file.toString()));
-        assertThat(check.errorMessage(), containsString("could not read"));
+        final BootstrapCheck.BootstrapCheckResult result = check.check(new BootstrapContext(settings, null));
+        assertTrue(result.isFailure());
+        assertThat(result.getMessage(), containsString("the-realm-name"));
+        assertThat(result.getMessage(), containsString(file.toString()));
+        assertThat(result.getMessage(), containsString("could not read"));
     }

     public void testBootstrapCheckWithInvalidDn() throws IOException {

@@ -95,10 +97,11 @@ public class RoleMappingFileBootstrapCheckTests extends ESTestCase {
         final BootstrapCheck check = RoleMappingFileBootstrapCheck.create(config);
         assertThat(check, notNullValue());
         assertThat(check.alwaysEnforce(), equalTo(true));
-        assertThat(check.check(new BootstrapContext(settings, null)), equalTo(true));
-        assertThat(check.errorMessage(), containsString("the-realm-name"));
-        assertThat(check.errorMessage(), containsString(file.toString()));
-        assertThat(check.errorMessage(), containsString("invalid DN"));
-        assertThat(check.errorMessage(), containsString("not-a-dn"));
+        final BootstrapCheck.BootstrapCheckResult result = check.check(new BootstrapContext(settings, null));
+        assertTrue(result.isFailure());
+        assertThat(result.getMessage(), containsString("the-realm-name"));
+        assertThat(result.getMessage(), containsString(file.toString()));
+        assertThat(result.getMessage(), containsString("invalid DN"));
+        assertThat(result.getMessage(), containsString("not-a-dn"));
     }
 }
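The hunks above migrate assertions from a boolean `check()` to a result object exposing `isFailure()` and `getMessage()`. As a standalone illustration of that shape, here is a minimal mimic (names are illustrative; the real class is `BootstrapCheck.BootstrapCheckResult` in Elasticsearch core, not this sketch):

```java
// Minimal mimic of a success/failure result object, for illustration only.
public class BootstrapCheckResultSketch {

    static final class Result {
        private final String message; // null signals success

        private Result(String message) {
            this.message = message;
        }

        static Result success() {
            return new Result(null);
        }

        static Result failure(String message) {
            return new Result(message);
        }

        boolean isFailure() {
            return message != null;
        }

        String getMessage() {
            return message;
        }
    }

    public static void main(String[] args) {
        Result ok = Result.success();
        Result bad = Result.failure("role mapping file for realm [the-realm-name] does not exist");
        System.out.println(ok.isFailure());  // false
        System.out.println(bad.isFailure()); // true
        System.out.println(bad.getMessage());
    }
}
```

Carrying a message alongside the failure flag is what lets the new assertions check `result.getMessage()` for the realm name and file path instead of probing a separate `errorMessage()` accessor.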
@@ -17,6 +17,7 @@ import org.elasticsearch.common.util.concurrent.ThreadContext;
 import org.elasticsearch.env.Environment;
 import org.elasticsearch.test.ESTestCase;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.authc.RealmConfig;
 import org.elasticsearch.xpack.security.authc.support.UserRoleMapper;

@@ -53,7 +54,7 @@ public class NativeUserRoleMapperTests extends ESTestCase {
                 Collections.singletonList(FieldPredicate.create("cn=mutants,ou=groups,ou=dept_h,o=forces,dc=gc,dc=ca"))),
                 Arrays.asList("mutants"), Collections.emptyMap(), false);

-        final InternalClient client = mock(InternalClient.class);
+        final InternalSecurityClient client = mock(InternalSecurityClient.class);
         final SecurityLifecycleService lifecycleService = mock(SecurityLifecycleService.class);
         when(lifecycleService.isSecurityIndexAvailable()).thenReturn(true);
@@ -789,7 +789,7 @@ public class AuthorizationServiceTests extends ESTestCase {
         }
     }

-    public void testXPackUserAndSuperusersCanExecuteOperationAgainstSecurityIndex() {
+    public void testSuperusersCanExecuteOperationAgainstSecurityIndex() {
         final User superuser = new User("custom_admin", ReservedRolesStore.SUPERUSER_ROLE_DESCRIPTOR.getName());
         roleMap.put(ReservedRolesStore.SUPERUSER_ROLE_DESCRIPTOR.getName(), ReservedRolesStore.SUPERUSER_ROLE_DESCRIPTOR);
         ClusterState state = mock(ClusterState.class);

@@ -800,37 +800,35 @@ public class AuthorizationServiceTests extends ESTestCase {
                 .numberOfShards(1).numberOfReplicas(0).build(), true)
                 .build());

-        for (User user : Arrays.asList(XPackUser.INSTANCE, superuser)) {
-            List<Tuple<String, TransportRequest>> requests = new ArrayList<>();
-            requests.add(new Tuple<>(DeleteAction.NAME, new DeleteRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
-            requests.add(new Tuple<>(BulkAction.NAME + "[s]",
-                    createBulkShardRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, DeleteRequest::new)));
-            requests.add(new Tuple<>(UpdateAction.NAME, new UpdateRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
-            requests.add(new Tuple<>(IndexAction.NAME, new IndexRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
-            requests.add(new Tuple<>(BulkAction.NAME + "[s]",
-                    createBulkShardRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, IndexRequest::new)));
-            requests.add(new Tuple<>(SearchAction.NAME, new SearchRequest(SecurityLifecycleService.SECURITY_INDEX_NAME)));
-            requests.add(new Tuple<>(TermVectorsAction.NAME,
-                    new TermVectorsRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
-            requests.add(new Tuple<>(GetAction.NAME, new GetRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
-            requests.add(new Tuple<>(TermVectorsAction.NAME,
-                    new TermVectorsRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
-            requests.add(new Tuple<>(IndicesAliasesAction.NAME, new IndicesAliasesRequest()
-                    .addAliasAction(AliasActions.add().alias("security_alias").index(SecurityLifecycleService.SECURITY_INDEX_NAME))));
-            requests.add(new Tuple<>(ClusterHealthAction.NAME, new ClusterHealthRequest(SecurityLifecycleService.SECURITY_INDEX_NAME)));
-            requests.add(new Tuple<>(ClusterHealthAction.NAME,
-                    new ClusterHealthRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "foo", "bar")));
+        List<Tuple<String, TransportRequest>> requests = new ArrayList<>();
+        requests.add(new Tuple<>(DeleteAction.NAME, new DeleteRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
+        requests.add(new Tuple<>(BulkAction.NAME + "[s]",
+                createBulkShardRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, DeleteRequest::new)));
+        requests.add(new Tuple<>(UpdateAction.NAME, new UpdateRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
+        requests.add(new Tuple<>(IndexAction.NAME, new IndexRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
+        requests.add(new Tuple<>(BulkAction.NAME + "[s]",
+                createBulkShardRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, IndexRequest::new)));
+        requests.add(new Tuple<>(SearchAction.NAME, new SearchRequest(SecurityLifecycleService.SECURITY_INDEX_NAME)));
+        requests.add(new Tuple<>(TermVectorsAction.NAME,
+                new TermVectorsRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
+        requests.add(new Tuple<>(GetAction.NAME, new GetRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
+        requests.add(new Tuple<>(TermVectorsAction.NAME,
+                new TermVectorsRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "type", "id")));
+        requests.add(new Tuple<>(IndicesAliasesAction.NAME, new IndicesAliasesRequest()
+                .addAliasAction(AliasActions.add().alias("security_alias").index(SecurityLifecycleService.SECURITY_INDEX_NAME))));
+        requests.add(new Tuple<>(ClusterHealthAction.NAME, new ClusterHealthRequest(SecurityLifecycleService.SECURITY_INDEX_NAME)));
+        requests.add(new Tuple<>(ClusterHealthAction.NAME,
+                new ClusterHealthRequest(SecurityLifecycleService.SECURITY_INDEX_NAME, "foo", "bar")));

-            for (Tuple<String, TransportRequest> requestTuple : requests) {
-                String action = requestTuple.v1();
-                TransportRequest request = requestTuple.v2();
-                authorize(createAuthentication(user), action, request);
-                verify(auditTrail).accessGranted(user, action, request, null);
-            }
-        }
+        for (Tuple<String, TransportRequest> requestTuple : requests) {
+            String action = requestTuple.v1();
+            TransportRequest request = requestTuple.v2();
+            authorize(createAuthentication(superuser), action, request);
+            verify(auditTrail).accessGranted(superuser, action, request, null);
+        }
     }

-    public void testXPackUserAndSuperusersCanExecuteOperationAgainstSecurityIndexWithWildcard() {
+    public void testSuperusersCanExecuteOperationAgainstSecurityIndexWithWildcard() {
         final User superuser = new User("custom_admin", ReservedRolesStore.SUPERUSER_ROLE_DESCRIPTOR.getName());
         roleMap.put(ReservedRolesStore.SUPERUSER_ROLE_DESCRIPTOR.getName(), ReservedRolesStore.SUPERUSER_ROLE_DESCRIPTOR);
         ClusterState state = mock(ClusterState.class);

@@ -843,11 +841,6 @@ public class AuthorizationServiceTests extends ESTestCase {

         String action = SearchAction.NAME;
         SearchRequest request = new SearchRequest("_all");
-        authorize(createAuthentication(XPackUser.INSTANCE), action, request);
-        verify(auditTrail).accessGranted(XPackUser.INSTANCE, action, request, null);
-        assertThat(request.indices(), arrayContaining(".security"));
-
-        request = new SearchRequest("_all");
         authorize(createAuthentication(superuser), action, request);
         verify(auditTrail).accessGranted(superuser, action, request, null);
         assertThat(request.indices(), arrayContaining(".security"));

@@ -1227,7 +1220,7 @@ public class AuthorizationServiceTests extends ESTestCase {
         PlainActionFuture<Role> rolesFuture = new PlainActionFuture<>();
         authorizationService.roles(XPackUser.INSTANCE, rolesFuture);
         final Role roles = rolesFuture.actionGet();
-        assertThat(roles, equalTo(ReservedRolesStore.SUPERUSER_ROLE));
+        assertThat(roles, equalTo(XPackUser.ROLE));
         verifyZeroInteractions(rolesStore);
     }
@@ -69,6 +69,7 @@ import org.elasticsearch.xpack.security.authz.store.ReservedRolesStore;
 import org.elasticsearch.xpack.security.test.SecurityTestUtils;
 import org.elasticsearch.xpack.security.user.AnonymousUser;
 import org.elasticsearch.xpack.security.user.User;
+import org.elasticsearch.xpack.security.user.XPackSecurityUser;
 import org.elasticsearch.xpack.security.user.XPackUser;
 import org.junit.Before;

@@ -1191,22 +1192,29 @@ public class IndicesAndAliasesResolverTests extends ESTestCase {
         }
     }

-    public void testXPackUserHasAccessToSecurityIndex() {
+    public void testXPackSecurityUserHasAccessToSecurityIndex() {
         SearchRequest request = new SearchRequest();
         {
-            final AuthorizedIndices authorizedIndices = buildAuthorizedIndices(XPackUser.INSTANCE, SearchAction.NAME);
+            final AuthorizedIndices authorizedIndices = buildAuthorizedIndices(XPackSecurityUser.INSTANCE, SearchAction.NAME);
             List<String> indices = resolveIndices(request, authorizedIndices).getLocal();
             assertThat(indices, hasItem(SecurityLifecycleService.SECURITY_INDEX_NAME));
         }
         {
             IndicesAliasesRequest aliasesRequest = new IndicesAliasesRequest();
             aliasesRequest.addAliasAction(AliasActions.add().alias("security_alias").index(SecurityLifecycleService.SECURITY_INDEX_NAME));
-            final AuthorizedIndices authorizedIndices = buildAuthorizedIndices(XPackUser.INSTANCE, IndicesAliasesAction.NAME);
+            final AuthorizedIndices authorizedIndices = buildAuthorizedIndices(XPackSecurityUser.INSTANCE, IndicesAliasesAction.NAME);
             List<String> indices = resolveIndices(aliasesRequest, authorizedIndices).getLocal();
             assertThat(indices, hasItem(SecurityLifecycleService.SECURITY_INDEX_NAME));
         }
     }

+    public void testXPackUserDoesNotHaveAccessToSecurityIndex() {
+        SearchRequest request = new SearchRequest();
+        final AuthorizedIndices authorizedIndices = buildAuthorizedIndices(XPackUser.INSTANCE, SearchAction.NAME);
+        List<String> indices = resolveIndices(request, authorizedIndices).getLocal();
+        assertThat(indices, not(hasItem(SecurityLifecycleService.SECURITY_INDEX_NAME)));
+    }
+
     public void testNonXPackUserAccessingSecurityIndex() {
         User allAccessUser = new User("all_access", "all_access");
         roleMap.put("all_access", new RoleDescriptor("all_access", new String[] { "all" },
@@ -38,6 +38,7 @@ import org.elasticsearch.test.ESTestCase;
 import org.elasticsearch.threadpool.TestThreadPool;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.SecurityLifecycleService;
 import org.elasticsearch.xpack.security.action.role.PutRoleRequest;
 import org.elasticsearch.xpack.security.audit.index.IndexAuditTrail;

@@ -184,7 +185,7 @@ public class NativeRolesStoreTests extends ESTestCase {
     }

     public void testPutOfRoleWithFlsDlsUnlicensed() throws IOException {
-        final InternalClient internalClient = mock(InternalClient.class);
+        final InternalSecurityClient internalClient = mock(InternalSecurityClient.class);
         final ClusterService clusterService = mock(ClusterService.class);
         final XPackLicenseState licenseState = mock(XPackLicenseState.class);
         final AtomicBoolean methodCalled = new AtomicBoolean(false);
@@ -80,6 +80,7 @@ import org.elasticsearch.xpack.security.authz.accesscontrol.IndicesAccessControl
 import org.elasticsearch.xpack.security.authz.permission.FieldPermissionsCache;
 import org.elasticsearch.xpack.security.authz.permission.Role;
 import org.elasticsearch.xpack.security.user.SystemUser;
+import org.elasticsearch.xpack.security.user.XPackUser;
 import org.elasticsearch.xpack.watcher.execution.TriggeredWatchStore;
 import org.elasticsearch.xpack.watcher.history.HistoryStore;
 import org.elasticsearch.xpack.watcher.transport.actions.ack.AckWatchAction;

@@ -123,6 +124,7 @@ public class ReservedRolesStoreTests extends ESTestCase {
         assertThat(ReservedRolesStore.isReserved("watcher_user"), is(true));
         assertThat(ReservedRolesStore.isReserved("watcher_admin"), is(true));
         assertThat(ReservedRolesStore.isReserved("kibana_dashboard_only_user"), is(true));
+        assertThat(ReservedRolesStore.isReserved(XPackUser.ROLE_NAME), is(true));
     }

     public void testIngestAdminRole() {
@@ -45,6 +45,7 @@ import org.elasticsearch.index.shard.ShardId;
 import org.elasticsearch.test.ESTestCase;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.xpack.security.InternalClient;
+import org.elasticsearch.xpack.security.InternalSecurityClient;
 import org.elasticsearch.xpack.security.test.SecurityTestUtils;
 import org.elasticsearch.xpack.template.TemplateUtils;
 import org.hamcrest.Matchers;

@@ -71,7 +72,7 @@ public class IndexLifecycleManagerTests extends ESTestCase {
         when(threadPool.getThreadContext()).thenReturn(new ThreadContext(Settings.EMPTY));

         actions = new LinkedHashMap<>();
-        final InternalClient client = new InternalClient(Settings.EMPTY, threadPool, mockClient) {
+        final InternalSecurityClient client = new InternalSecurityClient(Settings.EMPTY, threadPool, mockClient) {
             @Override
             protected <Request extends ActionRequest,
                     Response extends ActionResponse,
@@ -16,7 +16,7 @@ public class SSLBootstrapCheckTests extends ESTestCase {
     public void testSSLBootstrapCheckWithNoKey() throws Exception {
         SSLService sslService = new SSLService(Settings.EMPTY, null);
         SSLBootstrapCheck bootstrapCheck = new SSLBootstrapCheck(sslService, null);
-        assertTrue(bootstrapCheck.check(new BootstrapContext(Settings.EMPTY, null)));
+        assertTrue(bootstrapCheck.check(new BootstrapContext(Settings.EMPTY, null)).isFailure());
     }

     public void testSSLBootstrapCheckWithKey() throws Exception {

@@ -33,7 +33,7 @@ public class SSLBootstrapCheckTests extends ESTestCase {
                 .build();
         final Environment env = randomBoolean() ? new Environment(settings) : null;
         SSLBootstrapCheck bootstrapCheck = new SSLBootstrapCheck(new SSLService(settings, env), env);
-        assertFalse(bootstrapCheck.check(new BootstrapContext(settings, null)));
+        assertFalse(bootstrapCheck.check(new BootstrapContext(settings, null)).isFailure());
     }

     public void testSSLBootstrapCheckWithDefaultCABeingTrusted() throws Exception {

@@ -53,14 +53,14 @@ public class SSLBootstrapCheckTests extends ESTestCase {
                 .build();
         final Environment env = randomBoolean() ? new Environment(settings) : null;
         SSLBootstrapCheck bootstrapCheck = new SSLBootstrapCheck(new SSLService(settings, env), env);
-        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)));
+        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)).isFailure());

         settings = Settings.builder().put(settings.filter((s) -> s.contains(".certificate_authorities")))
                 .put("xpack.security.http.ssl.certificate_authorities",
                         getDataPath("/org/elasticsearch/xpack/ssl/ca.pem").toString())
                 .build();
         bootstrapCheck = new SSLBootstrapCheck(new SSLService(settings, env), env);
-        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)));
+        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)).isFailure());
     }

     public void testSSLBootstrapCheckWithDefaultKeyBeingUsed() throws Exception {

@@ -79,7 +79,7 @@ public class SSLBootstrapCheckTests extends ESTestCase {
                 .build();
         final Environment env = randomBoolean() ? new Environment(settings) : null;
         SSLBootstrapCheck bootstrapCheck = new SSLBootstrapCheck(new SSLService(settings, env), env);
-        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)));
+        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)).isFailure());

         settings = Settings.builder().put(settings.filter((s) -> s.contains(".http.ssl.")))
                 .put("xpack.security.transport.profiles.foo.xpack.security.ssl.key",

@@ -88,6 +88,6 @@ public class SSLBootstrapCheckTests extends ESTestCase {
                         getDataPath("/org/elasticsearch/xpack/ssl/ca.pem").toString())
                 .build();
         bootstrapCheck = new SSLBootstrapCheck(new SSLService(settings, env), env);
-        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)));
+        assertTrue(bootstrapCheck.check(new BootstrapContext(settings, null)).isFailure());
     }
 }
@@ -18,7 +18,7 @@ public class EncryptSensitiveDataBootstrapCheckTests extends ESTestCase {
         Settings settings = Settings.builder().put("path.home", createTempDir()).build();
         Environment env = new Environment(settings);
         EncryptSensitiveDataBootstrapCheck check = new EncryptSensitiveDataBootstrapCheck(env);
-        assertFalse(check.check(new BootstrapContext(settings, null)));
+        assertFalse(check.check(new BootstrapContext(settings, null)).isFailure());
         assertTrue(check.alwaysEnforce());
     }

@@ -29,7 +29,7 @@ public class EncryptSensitiveDataBootstrapCheckTests extends ESTestCase {
                 .build();
         Environment env = new Environment(settings);
         EncryptSensitiveDataBootstrapCheck check = new EncryptSensitiveDataBootstrapCheck(env);
-        assertTrue(check.check(new BootstrapContext(settings, null)));
+        assertTrue(check.check(new BootstrapContext(settings, null)).isFailure());
     }

     public void testKeyInKeystore() {

@@ -42,6 +42,7 @@ public class EncryptSensitiveDataBootstrapCheckTests extends ESTestCase {
                 .build();
         Environment env = new Environment(settings);
         EncryptSensitiveDataBootstrapCheck check = new EncryptSensitiveDataBootstrapCheck(env);
-        assertFalse(check.check(new BootstrapContext(settings, null)));
+        assertFalse(check.check(new BootstrapContext(settings, null)).isFailure());
     }
+
 }
@@ -161,3 +161,4 @@ indices:data/write/update/byquery
 indices:data/write/delete/byquery
 indices:data/write/reindex
 cluster:admin/xpack/deprecation/info
+cluster:admin/xpack/ml/job/forecast
@@ -0,0 +1,24 @@
+{
+  "xpack.ml.forecast": {
+    "methods": [ "POST" ],
+    "url": {
+      "path": "/_xpack/ml/anomaly_detectors/{job_id}/_forecast",
+      "paths": [ "/_xpack/ml/anomaly_detectors/{job_id}/_forecast" ],
+      "parts": {
+        "job_id": {
+          "type": "string",
+          "required": true,
+          "description": "The ID of the job to forecast for"
+        }
+      },
+      "params": {
+        "end": {
+          "type": "string",
+          "required": false,
+          "description": "The end time of the forecast"
+        }
+      }
+    },
+    "body": null
+  }
+}
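The spec above registers a POST endpoint whose path contains a `{job_id}` placeholder and an optional `end` query parameter. As a quick illustration (a sketch, not shipped code; the class and method names here are invented), the path template expands like this:

```java
// Sketch only: expands the {job_id} placeholder from the
// xpack.ml.forecast REST spec above.
public class ForecastPathSketch {

    static String forecastPath(String jobId) {
        // "path": "/_xpack/ml/anomaly_detectors/{job_id}/_forecast"
        return "/_xpack/ml/anomaly_detectors/" + jobId + "/_forecast";
    }

    public static void main(String[] args) {
        // The optional "end" query parameter rides along as a query string.
        System.out.println("POST " + forecastPath("forecast-job") + "?end=tomorrow");
    }
}
```

This is the same request shape the YAML tests below exercise: `xpack.ml.forecast` with a `job_id` and, in the bad-parameter case, an unparseable `end` value.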
@@ -0,0 +1,41 @@
+setup:
+  - do:
+      xpack.ml.put_job:
+        job_id: forecast-job
+        body: >
+          {
+            "description":"A forecast job",
+            "analysis_config" : {
+                "detectors" :[{"function":"metric","field_name":"responsetime","by_field_name":"airline"}]
+            },
+            "data_description" : {
+                "format":"xcontent"
+            }
+          }
+
+---
+"Test forecast unknown job":
+  - do:
+      catch: missing
+      xpack.ml.forecast:
+        job_id: "non-existing-job"
+
+---
+"Test forecast on closed job":
+  - do:
+      catch: /status_exception/
+      xpack.ml.forecast:
+        job_id: "forecast-job"
+
+---
+"Test bad end param errors":
+
+  - do:
+      xpack.ml.open_job:
+        job_id: "forecast-job"
+
+  - do:
+      catch: /parse_exception/
+      xpack.ml.forecast:
+        job_id: "forecast-job"
+        end: "tomorrow"