--
:api: put-trained-model
:request: PutTrainedModelRequest
:response: PutTrainedModelResponse
--
[role="xpack"]
[id="{upid}-{api}"]
=== Put trained models API

experimental::[]

Creates a new trained model for {infer}.
The API accepts a +{request}+ object as a request and returns a +{response}+.

[id="{upid}-{api}-request"]
==== Put trained models request

A +{request}+ requires the following argument:

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-request]
--------------------------------------------------
<1> The configuration of the {infer} trained model to create

[id="{upid}-{api}-config"]
==== Trained model configuration

The `TrainedModelConfig` object contains all the details about the trained
model configuration and takes the following arguments:

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-config]
--------------------------------------------------
<1> The {infer} definition for the model
<2> Optionally, if the {infer} definition is large, you may choose to compress
    it for transport. Do not supply both the compressed and uncompressed
    definitions.
<3> The unique model ID
<4> The input field names for the model definition
<5> Optionally, a human-readable description
<6> Optionally, an object map containing metadata about the model
<7> Optionally, an array of tags to organize the model
<8> The default {infer} config to use with the model. It must match the
    `target_type` of the underlying definition.

include::../execution.asciidoc[]

[id="{upid}-{api}-response"]
==== Response

The returned +{response}+ contains the newly created trained model.
The +{response}+ omits the model definition as a precaution against
streaming large model definitions back to the client.

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-response]
--------------------------------------------------
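
For orientation, the following is a minimal, untested sketch of how the pieces
above fit together end to end. It assumes the `TrainedModelConfig.Builder`
setters, the `client.machineLearning().putTrainedModel` entry point, and the
package locations shown here; the `buildDefinition` helper is a hypothetical
placeholder for however you produce your model definition. Treat the tagged
snippets above as the authoritative, tested code.

["source","java"]
--------------------------------------------------
import java.util.Arrays;
import java.util.Collections;

import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.ml.PutTrainedModelRequest;
import org.elasticsearch.client.ml.PutTrainedModelResponse;
import org.elasticsearch.client.ml.inference.TrainedModelConfig;
import org.elasticsearch.client.ml.inference.TrainedModelDefinition;
import org.elasticsearch.client.ml.inference.TrainedModelInput;
import org.elasticsearch.client.ml.inference.trainedmodel.RegressionConfig;

public class PutTrainedModelSketch {

    // Hypothetical helper: producing a TrainedModelDefinition depends
    // entirely on the model you are importing.
    static TrainedModelDefinition buildDefinition() {
        throw new UnsupportedOperationException("supply your own definition");
    }

    public static void putModel(RestHighLevelClient client) throws Exception {
        // Assemble the configuration. Setter names follow the
        // TrainedModelConfig.Builder pattern; verify against your client version.
        TrainedModelConfig config = TrainedModelConfig.builder()
            .setModelId("my-new-trained-model")                              // unique model ID
            .setInput(new TrainedModelInput(Arrays.asList("col1", "col2")))  // input field names
            .setDefinition(buildDefinition())                                // uncompressed definition
            .setDescription("example regression model")                      // optional description
            .setMetadata(Collections.singletonMap("source", "example"))      // optional metadata
            .setTags("my_regression_models")                                 // optional tags
            .setInferenceConfig(new RegressionConfig("value", 0))            // must match target_type
            .build();

        // Wrap the configuration in the request and execute it synchronously.
        PutTrainedModelRequest request = new PutTrainedModelRequest(config);
        PutTrainedModelResponse response =
            client.machineLearning().putTrainedModel(request, RequestOptions.DEFAULT);

        // The returned configuration omits the definition by design.
        TrainedModelConfig created = response.getResponse();
        System.out.println("Created model: " + created.getModelId());
    }
}
--------------------------------------------------

Only one of the definition setters should be used: either the parsed
definition (as above) or its compressed form, never both, mirroring callouts
<1> and <2> in the configuration snippet.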