//lcawley: Verified example output 2017-04-11
[[ml-post-data]]
==== Post Data to Jobs

The post data API allows you to send data to an anomaly detection job for analysis.
The job must have been opened prior to sending data.

===== Request

`POST _xpack/ml/anomaly_detectors/<job_id>/_data --data-binary @<data-file.json>`

===== Description

File sizes are limited to 100 MB. If your file is larger, split it into multiple
files and upload each one separately in sequential time order.
When running in real time, it is generally recommended that you perform many
small uploads rather than queueing data to upload larger files.
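
For example, a large file that is already sorted by time can be split into
smaller chunks and posted one chunk at a time. The following is a minimal
sketch only; the file name `large-data-file.json`, the chunk size, and the
`it_ops_new_kpi` job name are assumptions used for illustration:

[source,js]
--------------------------------------------------
# Split a newline-delimited JSON file (already in time order) into chunks
# of 100,000 records, then upload each chunk in sequence.
$ split -l 100000 large-data-file.json chunk_
$ for f in chunk_*; do
    curl -s -H "Content-type: application/json" \
      -X POST http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data \
      --data-binary @"$f"
  done
--------------------------------------------------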

IMPORTANT: Data can only be accepted from a single connection. Use a single
connection synchronously to send data to, close, flush, or delete a single job.
It is not currently possible to post data to multiple jobs by using wildcards
or a comma-separated list.

You must have `manage_ml` or `manage` cluster privileges to use this API.
For more information, see <<privileges-list-cluster>>.
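
For example, a role that grants this privilege could be created through the
X-Pack security role API. This is a hedged sketch only; the role name
`ml_data_writer` is hypothetical, and your deployment may manage roles
differently:

[source,js]
--------------------------------------------------
# Create a role that has the manage_ml cluster privilege.
$ curl -s -H "Content-type: application/json" \
  -X POST http://localhost:9200/_xpack/security/role/ml_data_writer \
  -d '{ "cluster": [ "manage_ml" ] }'
--------------------------------------------------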

===== Path Parameters

`job_id` (required)::
  (string) Identifier for the job

===== Request Body

`reset_start`::
  (string) Specifies the start of the bucket resetting range

`reset_end`::
  (string) Specifies the end of the bucket resetting range
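
As an illustrative sketch only, a bucket resetting range might be supplied
together with the data as follows; the placement of `reset_start` and
`reset_end` as URL parameters, the timestamp values, and the corrected file
name are assumptions rather than details confirmed by this page:

[source,js]
--------------------------------------------------
# Post corrected data while resetting the buckets in the given range.
$ curl -s -H "Content-type: application/json" \
  -X POST 'http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data?reset_start=1454020569000&reset_end=1454024169000' \
  --data-binary @it_ops_new_kpi_corrected.json
--------------------------------------------------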
////
===== Responses
200
(EmptyResponse) The cluster has been successfully deleted
404
(BasicFailedReply) The cluster specified by {cluster_id} cannot be found (code: clusters.cluster_not_found)
412
(BasicFailedReply) The Elasticsearch cluster has not been shutdown yet (code: clusters.cluster_plan_state_error)
The following example sends data from file `data-file.json` to a job called `my_analysis`.
////
===== Examples

The following example posts data from the `it_ops_new_kpi.json` file to the
`it_ops_new_kpi` job:

[source,js]
--------------------------------------------------
$ curl -s -H "Content-type: application/json" \
  -X POST http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data \
  --data-binary @it_ops_new_kpi.json
--------------------------------------------------
// CONSOLE
// TEST[skip:todo]
//TBD: Create example of how to post a small data example in Kibana?
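
A small amount of data can also be posted inline. The following is a hedged
sketch in {kib} console format; the field names and values are hypothetical
and assume a job whose data description expects JSON documents with an
epoch-millisecond `time` field:

[source,js]
--------------------------------------------------
POST _xpack/ml/anomaly_detectors/it_ops_new_kpi/_data
{"time":1454020569000,"kpi_indicator":"online_purchases","actual":1235.0}
{"time":1454020629000,"kpi_indicator":"online_purchases","actual":1187.0}
--------------------------------------------------
// CONSOLE
// TEST[skip:todo]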
When the data is sent, you receive information about the operational progress of the job.
For example:

----
{
  "job_id":"it_ops_new_kpi",
  "processed_record_count":21435,
  "processed_field_count":64305,
  "input_bytes":2589063,
  "input_field_count":85740,
  "invalid_date_count":0,
  "missing_field_count":0,
  "out_of_order_timestamp_count":0,
  "empty_bucket_count":16,
  "sparse_bucket_count":0,
  "bucket_count":2165,
  "earliest_record_timestamp":1454020569000,
  "latest_record_timestamp":1455318669000,
  "last_data_time":1491952300658,
  "latest_empty_bucket_timestamp":1454541600000,
  "input_record_count":21435
}
----

For more information about these properties, see <<ml-jobcounts,Job Counts>>.