[DOCS] Updates terms in machine learning datafeed APIs (#44883)
This commit is contained in:
parent d4b2d21339
commit cef375f883
@@ -4,10 +4,12 @@
 :response: AcknowledgedResponse
 --
 [id="{upid}-delete-datafeed"]
-=== Delete Datafeed API
+=== Delete datafeed API

 Deletes an existing datafeed.

 [id="{upid}-{api}-request"]
-==== Delete Datafeed Request
+==== Delete datafeed request

 A +{request}+ object requires a non-null `datafeedId` and can optionally set `force`.
@@ -15,18 +17,17 @@ A +{request}+ object requires a non-null `datafeedId` and can optionally set `force`.
 ---------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-request]
 ---------------------------------------------------
-<1> Use to forcefully delete a started datafeed;
-this method is quicker than stopping and deleting the datafeed.
-Defaults to `false`.
+<1> Use to forcefully delete a started datafeed. This method is quicker than
+stopping and deleting the datafeed. Defaults to `false`.

 include::../execution.asciidoc[]

 [id="{upid}-{api}-response"]
-==== Delete Datafeed Response
+==== Delete datafeed response

 The returned +{response}+ object indicates the acknowledgement of the request:
 ["source","java",subs="attributes,callouts,macros"]
 ---------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-response]
 ---------------------------------------------------
-<1> `isAcknowledged` was the deletion request acknowledged or not
+<1> `isAcknowledged` indicates whether the deletion request was acknowledged.

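The `include-tagged::` directives above pull the Java snippets from the doc tests at build time, so the code itself is not visible in this diff. A minimal sketch of the delete call in the high-level REST client of this era, assuming an already-configured `RestHighLevelClient` named `client` (package locations vary slightly between client versions):

["source","java"]
---------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.core.AcknowledgedResponse;
import org.elasticsearch.client.ml.DeleteDatafeedRequest;

// A non-null datafeedId is required; "datafeed-example" is a placeholder.
DeleteDatafeedRequest request = new DeleteDatafeedRequest("datafeed-example");
request.setForce(true); // optional: forcefully delete a datafeed that is started

AcknowledgedResponse response =
    client.machineLearning().deleteDatafeed(request, RequestOptions.DEFAULT);
boolean acknowledged = response.isAcknowledged(); // whether the deletion was acknowledged
---------------------------------------------------
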
@@ -4,14 +4,13 @@
 :response: PutDatafeedResponse
 --
 [id="{upid}-{api}"]
-=== Put Datafeed API
+=== Put datafeed API

-The Put Datafeed API can be used to create a new {ml} datafeed
-in the cluster. The API accepts a +{request}+ object
+Creates a new {ml} datafeed in the cluster. The API accepts a +{request}+ object
 as a request and returns a +{response}+.

 [id="{upid}-{api}-request"]
-==== Put Datafeed Request
+==== Put datafeed request

 A +{request}+ requires the following argument:
@@ -19,10 +18,10 @@ A +{request}+ requires the following argument:
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-request]
 --------------------------------------------------
-<1> The configuration of the {ml} datafeed to create
+<1> The configuration of the {ml} datafeed to create.

 [id="{upid}-{api}-config"]
-==== Datafeed Configuration
+==== Datafeed configuration

 The `DatafeedConfig` object contains all the details about the {ml} datafeed
 configuration.
@@ -33,10 +32,10 @@ A `DatafeedConfig` requires the following arguments:
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-config]
 --------------------------------------------------
-<1> The datafeed ID and the job ID
-<2> The indices that contain the data to retrieve and feed into the job
+<1> The datafeed ID and the {anomaly-job} ID.
+<2> The indices that contain the data to retrieve and feed into the {anomaly-job}.

-==== Optional Arguments
+==== Optional arguments
 The following arguments are optional:

 ["source","java",subs="attributes,callouts,macros"]
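For reference, a hedged sketch of what the `{api}-config` required-arguments snippet renders as in the high-level REST client; the IDs are placeholders:

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.ml.datafeed.DatafeedConfig;

String datafeedId = "datafeed-example"; // placeholder
String jobId = "job-example";           // placeholder

DatafeedConfig.Builder datafeedBuilder =
    new DatafeedConfig.Builder(datafeedId, jobId)  // the datafeed ID and the job ID
        .setIndices("index_1", "index_2");         // indices with the data to retrieve
--------------------------------------------------
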
@@ -49,7 +48,8 @@ include-tagged::{doc-tests-file}[{api}-config-set-chunking-config]
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-config-set-frequency]
 --------------------------------------------------
-<1> The interval at which scheduled queries are made while the datafeed runs in real time.
+<1> The interval at which scheduled queries are made while the datafeed runs in
+real time.

 ["source","java",subs="attributes,callouts,macros"]
 --------------------------------------------------
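Continuing the `datafeedBuilder` sketch above, the chunking and frequency options covered by these snippets would be set roughly as follows (setter names per the 7.x client; treat them as assumptions rather than the exact doc-test code):

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.ml.datafeed.ChunkingConfig;
import org.elasticsearch.common.unit.TimeValue;

// Let the datafeed decide how data searches are split into time chunks.
datafeedBuilder.setChunkingConfig(ChunkingConfig.newAuto());

// The interval at which scheduled queries are made while running in real time.
datafeedBuilder.setFrequency(TimeValue.timeValueSeconds(30));
--------------------------------------------------
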
@@ -72,8 +72,9 @@ The window must be larger than the Job's bucket size, but smaller than 24 hours,
 and span less than 10,000 buckets.
 Defaults to `null`, which causes an appropriate window span to be calculated when
 the datafeed runs.
-The default `check_window` span calculation is the max between `2h` or `8 * bucket_span`.
-To explicitly disable, pass `DelayedDataCheckConfig.disabledDelayedDataCheckConfig()`.
+The default `check_window` span is calculated as the maximum of `2h` and
+`8 * bucket_span`. To explicitly disable the check, pass
+`DelayedDataCheckConfig.disabledDelayedDataCheckConfig()`.

 ["source","java",subs="attributes,callouts,macros"]
 --------------------------------------------------
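A short sketch of both delayed-data-check variants described above; `disabledDelayedDataCheckConfig()` is named in the doc text itself, while the enabled variant and its window argument follow the 7.x client:

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.ml.datafeed.DelayedDataCheckConfig;
import org.elasticsearch.common.unit.TimeValue;

// Enable the check with an explicit check_window (must exceed the bucket span).
datafeedBuilder.setDelayedDataCheckConfig(
    DelayedDataCheckConfig.enabledDelayedDataCheckConfig(TimeValue.timeValueHours(2)));

// ...or disable the delayed data check entirely.
datafeedBuilder.setDelayedDataCheckConfig(
    DelayedDataCheckConfig.disabledDelayedDataCheckConfig());
--------------------------------------------------
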
@@ -101,4 +102,4 @@ default values:
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-response]
 --------------------------------------------------
-<1> The created datafeed
+<1> The created datafeed.

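Putting the pieces together, a minimal end-to-end sketch of the put call and its response, again assuming a configured `client`:

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.ml.PutDatafeedRequest;
import org.elasticsearch.client.ml.PutDatafeedResponse;
import org.elasticsearch.client.ml.datafeed.DatafeedConfig;

PutDatafeedRequest request = new PutDatafeedRequest(datafeedBuilder.build());
PutDatafeedResponse response =
    client.machineLearning().putDatafeed(request, RequestOptions.DEFAULT);
DatafeedConfig createdDatafeed = response.getResponse(); // the created datafeed
--------------------------------------------------
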
@@ -4,14 +4,13 @@
 :response: StartDatafeedResponse
 --
 [id="{upid}-{api}"]
-=== Start Datafeed API
+=== Start datafeed API

-The Start Datafeed API provides the ability to start a {ml} datafeed in the cluster.
-It accepts a +{request}+ object and responds
-with a +{response}+ object.
+Starts a {ml} datafeed in the cluster. It accepts a +{request}+ object and
+responds with a +{response}+ object.

 [id="{upid}-{api}-request"]
-==== Start Datafeed Request
+==== Start datafeed request

 A +{request}+ object is created referencing a non-null `datafeedId`.
 All other fields are optional for the request.
@@ -20,9 +19,9 @@ All other fields are optional for the request.
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-request]
 --------------------------------------------------
-<1> Constructing a new request referencing an existing `datafeedId`
+<1> Constructing a new request referencing an existing `datafeedId`.

-==== Optional Arguments
+==== Optional arguments

 The following arguments are optional.

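A minimal sketch of the start call with the optional fields filled in (the datafeed ID and the timestamps are placeholders; setter names follow the 7.x client):

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.ml.StartDatafeedRequest;
import org.elasticsearch.client.ml.StartDatafeedResponse;
import org.elasticsearch.common.unit.TimeValue;

StartDatafeedRequest request = new StartDatafeedRequest("datafeed-example"); // non-null datafeedId
request.setStart("2019-08-20T00:00:00Z");            // optional: analyze data from this time on
request.setEnd("2019-08-30T00:00:00Z");              // optional: stop analyzing at this time
request.setTimeout(TimeValue.timeValueSeconds(30));  // optional: how long to wait for the start

StartDatafeedResponse response =
    client.machineLearning().startDatafeed(request, RequestOptions.DEFAULT);
boolean started = response.isStarted(); // whether the datafeed was started
--------------------------------------------------
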
@@ -4,14 +4,13 @@
 :response: PutDatafeedResponse
 --
 [id="{upid}-{api}"]
-=== Update Datafeed API
+=== Update datafeed API

-The Update Datafeed API can be used to update a {ml} datafeed
-in the cluster. The API accepts a +{request}+ object
+Updates a {ml} datafeed in the cluster. The API accepts a +{request}+ object
 as a request and returns a +{response}+.

 [id="{upid}-{api}-request"]
-==== Update Datafeed Request
+==== Update datafeed request

 A +{request}+ requires the following argument:
@@ -22,7 +21,7 @@ include-tagged::{doc-tests-file}[{api}-request]
 <1> The updated configuration of the {ml} datafeed

 [id="{upid}-{api}-config"]
-==== Updated Datafeed Arguments
+==== Updated datafeed arguments

 A `DatafeedUpdate` requires an existing non-null `datafeedId` and
 allows updating various settings.
@@ -31,12 +30,15 @@ allows updating various settings.
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-config]
 --------------------------------------------------
-<1> Mandatory, non-null `datafeedId` referencing an existing {ml} datafeed
-<2> Optional, set the datafeed Aggregations for data gathering
-<3> Optional, the indices that contain the data to retrieve and feed into the job
+<1> Mandatory, non-null `datafeedId` referencing an existing {ml} datafeed.
+<2> Optional, set the datafeed aggregations for data gathering.
+<3> Optional, the indices that contain the data to retrieve and feed into the
+{anomaly-job}.
 <4> Optional, specifies how data searches are split into time chunks.
-<5> Optional, the interval at which scheduled queries are made while the datafeed runs in real time.
-<6> Optional, a query to filter the search results by. Defaults to the `match_all` query.
+<5> Optional, the interval at which scheduled queries are made while the
+datafeed runs in real time.
+<6> Optional, a query to filter the search results by. Defaults to the
+`match_all` query.
 <7> Optional, the time interval behind real time that data is queried.
 <8> Optional, allows the use of script fields.
 <9> Optional, the `size` parameter used in the searches.

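A hedged sketch of a `DatafeedUpdate` covering a few of the callouts above. The builder taking the `datafeedId` in its constructor is an assumption inferred from callout <1>, and only a subset of the optional setters is shown:

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.ml.datafeed.DatafeedUpdate;
import org.elasticsearch.index.query.QueryBuilders;

DatafeedUpdate update = new DatafeedUpdate.Builder("datafeed-example") // <1> existing datafeedId
    .setIndices("index_1", "index_2")         // <3> indices to retrieve data from
    .setQuery(QueryBuilders.matchAllQuery())  // <6> query to filter the search results by
    .setScrollSize(1000)                      // <9> the `size` parameter used in the searches
    .build();
--------------------------------------------------
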
@@ -53,4 +55,4 @@ the updated {ml} datafeed if it has been successfully updated.
 --------------------------------------------------
 include-tagged::{doc-tests-file}[{api}-response]
 --------------------------------------------------
-<1> The updated datafeed
+<1> The updated datafeed.

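As with put, the update call returns a +PutDatafeedResponse+ (per the `:response:` attribute above) carrying the updated configuration; a minimal sketch, assuming the `update` object from the previous snippet and a configured `client`:

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.ml.PutDatafeedResponse;
import org.elasticsearch.client.ml.UpdateDatafeedRequest;
import org.elasticsearch.client.ml.datafeed.DatafeedConfig;

UpdateDatafeedRequest request = new UpdateDatafeedRequest(update);
PutDatafeedResponse response =
    client.machineLearning().updateDatafeed(request, RequestOptions.DEFAULT);
DatafeedConfig updatedDatafeed = response.getResponse(); // the updated datafeed
--------------------------------------------------
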
@@ -18,7 +18,7 @@ Instantiates a {dfeed}.
 [[ml-put-datafeed-prereqs]]
 ==== {api-prereq-title}

-* You must create a job before you create a {dfeed}.
+* You must create an {anomaly-job} before you create a {dfeed}.
 * If {es} {security-features} are enabled, you must have `manage_ml` or `manage`
 cluster privileges to use this API. See
 {stack-ov}/security-privileges.html[Security privileges].
@@ -26,7 +26,7 @@ cluster privileges to use this API. See
 [[ml-put-datafeed-desc]]
 ==== {api-description-title}

-You can associate only one {dfeed} to each job.
+You can associate only one {dfeed} to each {anomaly-job}.

 [IMPORTANT]
 ====
@@ -75,7 +75,7 @@ those same roles.

 `job_id`::
 (Required, string) A numerical character string that uniquely identifies the
-job.
+{anomaly-job}.

 `query`::
 (Optional, object) The {es} query domain-specific language (DSL). This value
@@ -18,8 +18,8 @@ Starts one or more {dfeeds}.
 [[ml-start-datafeed-prereqs]]
 ==== {api-prereq-title}

-* Before you can start a {dfeed}, the job must be open. Otherwise, an error
-occurs.
+* Before you can start a {dfeed}, the {anomaly-job} must be open. Otherwise, an
+error occurs.
 * If {es} {security-features} are enabled, you must have `manage_ml` or `manage`
 cluster privileges to use this API. See
 {stack-ov}/security-privileges.html[Security privileges].
@@ -36,7 +36,8 @@ If you want to analyze from the beginning of a dataset, you can specify any date
 earlier than that beginning date.

 If you do not specify a start time and the {dfeed} is associated with a new
-job, the analysis starts from the earliest time for which data is available.
+{anomaly-job}, the analysis starts from the earliest time for which data is
+available.

 When you start a {dfeed}, you can also specify an end time. If you do so, the
 job analyzes data from the start time until the end time, at which point the
@@ -67,7 +67,7 @@ The following properties can be updated after the {dfeed} is created:

 `job_id`::
 (Optional, string) A numerical character string that uniquely identifies the
-job.
+{anomaly-job}.

 `query`::
 (Optional, object) The {es} query domain-specific language (DSL). This value