[role="xpack"]
[testenv="platinum"]
[[update-dfanalytics]]
=== Update {dfanalytics-jobs} API
[subs="attributes"]
++++
<titleabbrev>Update {dfanalytics-jobs}</titleabbrev>
++++

Updates an existing {dfanalytics-job}.

experimental[]

[[ml-update-dfanalytics-request]]
==== {api-request-title}

`POST _ml/data_frame/analytics/<data_frame_analytics_id>/_update`

[[ml-update-dfanalytics-prereq]]
==== {api-prereq-title}
If the {es} {security-features} are enabled, you must have the following
built-in roles and privileges:

* `machine_learning_admin`
* `kibana_admin` (UI only)
* source indices: `read`, `view_index_metadata`
* destination index: `read`, `create_index`, `manage` and `index`
* cluster: `monitor` (UI only)

For more information, see <<security-privileges>> and <<built-in-roles>>.

NOTE: The {dfanalytics-job} remembers which roles the user who created it had at
the time of creation. When you start the job, it performs the analysis using
those same roles. If you provide
<<http-clients-secondary-authorization,secondary authorization headers>>,
those credentials are used instead.

[[ml-update-dfanalytics-desc]]
==== {api-description-title}
This API updates an existing {dfanalytics-job} that performs an analysis on the
source indices and stores the outcome in a destination index.

[[ml-update-dfanalytics-path-params]]
==== {api-path-parms-title}

`<data_frame_analytics_id>`::
(Required, string)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=job-id-data-frame-analytics-define]

[role="child_attributes"]
[[ml-update-dfanalytics-request-body]]
==== {api-request-body-title}
`allow_lazy_start`::
(Optional, boolean)
Specifies whether this job can start when there is insufficient {ml} node
capacity for it to be immediately assigned to a node. The default is `false`; if
a {ml} node with capacity to run the job cannot immediately be found, the API
returns an error. However, this is also subject to the cluster-wide
`xpack.ml.max_lazy_ml_nodes` setting. See <<advanced-ml-settings>>. If this
option is set to `true`, the API does not return an error and the job waits in
the `starting` state until sufficient {ml} node capacity is available.

`description`::
(Optional, string)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=description-dfa]

`model_memory_limit`::
(Optional, string)
The approximate maximum amount of memory resources that are permitted for
analytical processing. The default value for {dfanalytics-jobs} is `1gb`. If
your `elasticsearch.yml` file contains an `xpack.ml.max_model_memory_limit`
setting, an error occurs when you try to create {dfanalytics-jobs} that have
`model_memory_limit` values greater than that setting. For more information, see
<<ml-settings>>.

[[ml-update-dfanalytics-example]]
==== {api-examples-title}
[[ml-update-dfanalytics-example-preprocess]]
===== Updating model memory limit example

The following example shows how to update the model memory limit for the
existing {dfanalytics} configuration.

[source,console]
--------------------------------------------------
POST _ml/data_frame/analytics/model-flight-delays/_update
{
"model_memory_limit": "200mb"
}
--------------------------------------------------
// TEST[skip:setup kibana sample data]
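
[[ml-update-dfanalytics-example-multiple]]
===== Updating multiple properties example

A single update request can change more than one property. As an illustrative
sketch that reuses the `model-flight-delays` job from the previous example and
a hypothetical description value, the following request updates the job's
`description` and `allow_lazy_start` settings in one call; only the properties
included in the request body are changed.

[source,console]
--------------------------------------------------
POST _ml/data_frame/analytics/model-flight-delays/_update
{
  "description": "Flight delay regression analysis",
  "allow_lazy_start": true
}
--------------------------------------------------
// TEST[skip:setup kibana sample data]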