[DOCS] Fix ML post_data docs (elastic/x-pack-elasticsearch#2689)

It was pointed out in
https://github.com/elastic/elasticsearch-net/pull/2856#discussion_r142830656
that our post_data docs incorrectly say that reset_start and reset_end are
body parameters.  In fact they are query parameters.

There were also a number of other errors and omissions on this page that I
have attempted to correct.
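
For clarity, a hedged sketch of the corrected usage, with `reset_start` and `reset_end` passed as query parameters (the host, port, and timestamp values below are illustrative assumptions, not taken from the docs page; the job and data file names reuse the example in the page):

[source,js]
--------------------------------------------------
$ curl -s -H "Content-Type: application/json" \
  -X POST "http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data?reset_start=1477501200&reset_end=1477504800" \
  --data-binary @it_ops_new_kpi.json
--------------------------------------------------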

Original commit: elastic/x-pack-elasticsearch@c83decacc7
David Roberts 2017-10-11 10:47:07 +01:00 committed by GitHub
parent a4f7db4f66
commit c84d69fde3
1 changed file with 19 additions and 11 deletions


@@ -7,17 +7,22 @@ The post data API enables you to send data to an anomaly detection job for analy
 ==== Request
 
-`POST _xpack/ml/anomaly_detectors/<job_id>/_data --data-binary @<data-file.json>`
+`POST _xpack/ml/anomaly_detectors/<job_id>/_data`
 
 ==== Description
 
 The job must have a state of `open` to receive and process the data.
 
-The data that you send to the job must use the JSON format.
+The data that you send to the job must use the JSON format. Multiple JSON
+documents can be sent, either adjacent with no separator in between them or
+whitespace separated. Newline delimited JSON (NDJSON) is a possible whitespace
+separated format, and for this the `Content-Type` header should be set to
+`application/x-ndjson`.
 
-File sizes are limited to 100 Mb. If your file is larger, split it into multiple
-files and upload each one separately in sequential time order. When running in
+Upload sizes are limited to the Elasticsearch HTTP receive buffer size
+(default 100 Mb). If your data is larger, split it into multiple chunks
+and upload each one separately in sequential time order. When running in
 real time, it is generally recommended that you perform many small uploads,
 rather than queueing data to upload larger files.
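
To make the NDJSON option described in the hunk above concrete, here is a hedged sketch of posting two newline-separated documents inline (the host, field names, and values are invented for illustration and are not part of the commit):

[source,js]
--------------------------------------------------
$ curl -s -H "Content-Type: application/x-ndjson" \
  -X POST "http://localhost:9200/_xpack/ml/anomaly_detectors/it_ops_new_kpi/_data" \
  --data-binary $'{"time":"2017-10-11T10:00:00Z","kpi":1.0}\n{"time":"2017-10-11T10:01:00Z","kpi":1.2}\n'
--------------------------------------------------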
@@ -29,9 +34,8 @@ The following records will not be processed:
 //TBD link to Working with Out of Order timeseries concept doc
 
-IMPORTANT: Data can only be accepted from a single connection. Use a single
-connection synchronously to send data, close, flush, or delete a single job.
-It is not currently possible to post data to multiple jobs using wildcards
+IMPORTANT: For each job, data can only be accepted from a single connection at
+a time. It is not currently possible to post data to multiple jobs using wildcards
 or a comma-separated list.
@@ -41,7 +45,7 @@ or a comma-separated list.
 (string) Identifier for the job
 
-==== Request Body
+==== Query Parameters
 
 `reset_start`::
 (string) Specifies the start of the bucket resetting range
@@ -50,6 +54,12 @@ or a comma-separated list.
 (string) Specifies the end of the bucket resetting range
 
+==== Request Body
+
+A sequence of one or more JSON documents containing the data to be analyzed.
+Only whitespace characters are permitted in between the documents.
+
 ==== Authorization
 
 You must have `manage_ml`, or `manage` cluster privileges to use this API.
@@ -60,7 +70,7 @@ For more information, see
 ==== Examples
 
-The following example posts data from the farequote.json file to the `farequote` job:
+The following example posts data from the it_ops_new_kpi.json file to the `it_ops_new_kpi` job:
 
 [source,js]
 --------------------------------------------------
@@ -69,8 +79,6 @@ $ curl -s -H "Content-type: application/json"
 --data-binary @it_ops_new_kpi.json
 --------------------------------------------------
 
-//TBD: Create example of how to post a small data example in Kibana?
-
 When the data is sent, you receive information about the operational progress of the job.
 For example: