//lcawley: Verified example output 2017-04-11
[[ml-post-data]]
=== Post Data to Jobs

The post data API enables you to send data to an anomaly detection job for
analysis.

==== Request

`POST _xpack/ml/anomaly_detectors/<job_id>/_data --data-binary @<data-file.json>`

==== Description

The job must have a state of `open` to receive and process the data.
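
For example, a job can be opened with the open jobs API before data is posted.
The following is a minimal request against the `it_ops_new_kpi` job that is
used in the examples below:

[source,js]
--------------------------------------------------
$ curl -s -X POST http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_open
--------------------------------------------------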
The data that you send to the job must use the JSON format.
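
The data is typically sent as a sequence of JSON documents, one per line. The
field names in the following sketch are hypothetical; the actual fields and
time format must match the job configuration:

[source,js]
--------------------------------------------------
{"time":"2016-01-28T22:36:09Z","kpi_indicator":"online_purchases","actual":1235}
{"time":"2016-01-28T22:41:09Z","kpi_indicator":"online_purchases","actual":1187}
--------------------------------------------------
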
File sizes are limited to 100 MB. If your file is larger, split it into multiple
files and upload each one separately in sequential time order. When running in
real time, it is generally recommended that you perform many small uploads,
rather than queueing data to upload larger files.
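
For example, a large newline-delimited file that is already in time order could
be split into numbered chunks and uploaded in sequence. This is only a sketch;
it assumes GNU `split` and reuses the `it_ops_new_kpi` job from the examples
below:

[source,js]
--------------------------------------------------
split -d -a 4 -l 100000 it_ops_new_kpi.json chunk_
for f in chunk_*; do
  curl -s -H "Content-type: application/json" \
    -X POST http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data \
    --data-binary @"$f"
done
--------------------------------------------------
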
When uploading data, check the <<ml-datacounts,job data counts>> for progress.
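
The data counts are returned as part of the job statistics. For example, the
following request retrieves the statistics, including the `data_counts` object,
for the `it_ops_new_kpi` job used in the examples below:

[source,js]
--------------------------------------------------
$ curl -s -X GET http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_stats
--------------------------------------------------
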
The following records will not be processed:

* Records not in chronological order and outside the latency window
* Records with an invalid timestamp

//TBD link to Working with Out of Order timeseries concept doc

IMPORTANT: Data can only be accepted from a single connection. Use a single
connection synchronously to send data to, close, flush, or delete a single job.
It is not currently possible to post data to multiple jobs using wildcards
or a comma-separated list.

==== Path Parameters

`job_id` (required)::
  (string) Identifier for the job

==== Request Body

`reset_start`::
  (string) Specifies the start of the bucket resetting range

`reset_end`::
  (string) Specifies the end of the bucket resetting range
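
The following is only a sketch of how these values might be supplied; it
assumes they are passed as URL parameters on the post data request, and the
epoch-millisecond timestamps are hypothetical:

[source,js]
--------------------------------------------------
$ curl -s -H "Content-type: application/json" \
-X POST 'http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data?reset_start=1454020569000&reset_end=1454106969000' \
--data-binary @it_ops_new_kpi.json
--------------------------------------------------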
==== Authorization

You must have `manage_ml` or `manage` cluster privileges to use this API.
For more information, see <<privileges-list-cluster>>.

==== Examples

The following example posts data from the it_ops_new_kpi.json file to the
`it_ops_new_kpi` job:

[source,js]
--------------------------------------------------
$ curl -s -H "Content-type: application/json" \
-X POST http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data \
--data-binary @it_ops_new_kpi.json
--------------------------------------------------
//TBD: Create example of how to post a small data example in Kibana?
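
An equivalent request can be sent from the Kibana Console. The following is a
sketch with a single record; the field names are hypothetical and must match
the job configuration:

[source,js]
--------------------------------------------------
POST _xpack/ml/anomaly_detectors/it_ops_new_kpi/_data
{"time":"2016-01-28T22:36:09Z","kpi_indicator":"online_purchases","actual":1235}
--------------------------------------------------
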
When the data is sent, you receive information about the operational progress
of the job. For example:

[source,js]
----
{
"job_id":"it_ops_new_kpi",
"processed_record_count":21435,
"processed_field_count":64305,
"input_bytes":2589063,
"input_field_count":85740,
"invalid_date_count":0,
"missing_field_count":0,
"out_of_order_timestamp_count":0,
"empty_bucket_count":16,
"sparse_bucket_count":0,
"bucket_count":2165,
"earliest_record_timestamp":1454020569000,
"latest_record_timestamp":1455318669000,
"last_data_time":1491952300658,
"latest_empty_bucket_timestamp":1454541600000,
"input_record_count":21435
}
----
For more information about these properties, see <<ml-jobstats,Job Stats>>.