Add Execute Command to API documentation. (#711)

* Add Execute Command to API documentation.

Signed-off-by: Naarcha-AWS <naarcha@amazon.com>

* Remove Anomaly

Signed-off-by: Naarcha-AWS <naarcha@amazon.com>

* Add editorial review

Signed-off-by: Naarcha-AWS <naarcha@amazon.com>

* Final feedback

Signed-off-by: Naarcha-AWS <naarcha@amazon.com>

* Make sure note renders properly

Signed-off-by: Naarcha-AWS <naarcha@amazon.com>

* Fix capitalization

Signed-off-by: Naarcha-AWS <naarcha@amazon.com>
Naarcha-AWS 2022-07-06 11:16:03 -05:00 committed by GitHub
parent f0b4eab1c9
commit 16b3099e2f
2 changed files with 146 additions and 4 deletions

@@ -198,9 +198,9 @@ time_zone | string | The time zone for the time_field field | "UTC"
For FIT RCF, you can train the model with historical data and store the trained model in your index. The model will be deserialized and predict new data points when using the Predict API. However, the model in the index will not be refreshed with new data, because the model is fixed in time.
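In practice, that flow pairs the Train API with the Predict API: training stores the model and returns a `model_id`, which the Predict API then references. The following is only a rough sketch of the sequence; the exact request bodies and the RCF algorithm name are defined in the Train and Predict API sections, so treat the placeholders below as assumptions rather than documented values.

```json
# Sketch only: train a model on historical data; the response returns a model_id.
POST /_plugins/_ml/_train/<algorithm_name>

# Reuse the stored (fixed-in-time) model to predict new data points.
POST /_plugins/_ml/_predict/<algorithm_name>/<model_id>
```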
-## Anomaly Localization
+## Localization
-The Anomaly Localization algorithm finds subset level-information for aggregate data (for example, aggregated over time) that demonstrates the activity of interest, such as spikes, drops, changes, or anomalies. Localization can be applied in different scenarios, such as data exploration or root cause analysis, to expose the contributors driving the activity of interest in the aggregate data.
+The Localization algorithm finds subset-level information for aggregate data (for example, aggregated over time) that demonstrates the activity of interest, such as spikes, drops, changes, or anomalies. Localization can be applied in different scenarios, such as data exploration or root cause analysis, to expose the contributors driving the activity of interest in the aggregate data.
### Parameters
@@ -219,9 +219,9 @@ num_outputs | integer | The maximum number of values from localization/slicing |
filter_query | QueryBuilder | (Optional) Reduces the collection of data for analysis | Optional.empty()
anomaly_start | Long | (Optional) The time after which the data will be analyzed | Optional.empty()
-### Example
+### Example: Execute localization
-The following example executes Anomaly Localization against an RCA index.
+The following example executes Localization against an RCA index.
**Request**

@@ -644,6 +644,9 @@ GET /_plugins/_ml/tasks/_search
Delete a task based on the task_id.
ML Commons does not check the task status when running the `Delete` request. There is a risk that a currently running task could be deleted before the task completes. To check the status of a task, run `GET /_plugins/_ml/tasks/<task_id>` before task deletion.
{: .note}
```json
DELETE /_plugins/_ml/tasks/{task_id}
```
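For example, you can retrieve the task first and confirm that it has finished before sending the delete request. The snippet below is a sketch of that sequence; the exact fields in the task response, such as a state value like `COMPLETED`, are assumptions to verify against the Get Task documentation.

```json
# Sketch only: check the task (for example, that its state is COMPLETED) before deleting it.
GET /_plugins/_ml/tasks/<task_id>

# Delete the task only after confirming it is no longer running.
DELETE /_plugins/_ml/tasks/<task_id>
```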
@@ -729,8 +732,147 @@ GET /_plugins/_ml/stats
}
```
## Execute
Some algorithms, such as [Localization]({{site.url}}{{site.baseurl}}/ml-commons-plugin/algorithms#localization), don't require trained models. You can run no-model-based algorithms using the `execute` API.
```json
POST _plugins/_ml/_execute/<algorithm_name>
```
### Example: Execute localization
The following example uses the Localization algorithm to find subset-level information for aggregate data (for example, aggregated over time) that demonstrates the activity of interest, such as spikes, drops, changes, or anomalies.
```json
POST /_plugins/_ml/_execute/anomaly_localization
{
  "index_name": "rca-index",
  "attribute_field_names": [
    "attribute"
  ],
  "aggregations": [
    {
      "sum": {
        "sum": {
          "field": "value"
        }
      }
    }
  ],
  "time_field_name": "timestamp",
  "start_time": 1620630000000,
  "end_time": 1621234800000,
  "min_time_interval": 86400000,
  "num_outputs": 10
}
```
Upon execution, the API returns the following:
```json
"results" : [
{
"name" : "sum",
"result" : {
"buckets" : [
{
"start_time" : 1620630000000,
"end_time" : 1620716400000,
"overall_aggregate_value" : 65.0
},
{
"start_time" : 1620716400000,
"end_time" : 1620802800000,
"overall_aggregate_value" : 75.0,
"entities" : [
{
"key" : [
"attr0"
],
"contribution_value" : 1.0,
"base_value" : 2.0,
"new_value" : 3.0
},
{
"key" : [
"attr1"
],
"contribution_value" : 1.0,
"base_value" : 3.0,
"new_value" : 4.0
},
{
"key" : [
"attr2"
],
"contribution_value" : 1.0,
"base_value" : 4.0,
"new_value" : 5.0
},
{
"key" : [
"attr3"
],
"contribution_value" : 1.0,
"base_value" : 5.0,
"new_value" : 6.0
},
{
"key" : [
"attr4"
],
"contribution_value" : 1.0,
"base_value" : 6.0,
"new_value" : 7.0
},
{
"key" : [
"attr5"
],
"contribution_value" : 1.0,
"base_value" : 7.0,
"new_value" : 8.0
},
{
"key" : [
"attr6"
],
"contribution_value" : 1.0,
"base_value" : 8.0,
"new_value" : 9.0
},
{
"key" : [
"attr7"
],
"contribution_value" : 1.0,
"base_value" : 9.0,
"new_value" : 10.0
},
{
"key" : [
"attr8"
],
"contribution_value" : 1.0,
"base_value" : 10.0,
"new_value" : 11.0
},
{
"key" : [
"attr9"
],
"contribution_value" : 1.0,
"base_value" : 11.0,
"new_value" : 12.0
}
]
},
...
]
}
}
]
}
```