Add flow framework documentation (#6257)

* Add flow framework documentation

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Tech review comments

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Update _automating-workflows/api/create-workflow.md

Co-authored-by: Owais Kazi <owaiskazi19@gmail.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Add callout of edges being optional

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Add unregister to word list

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Add registering local pretrained and custom models

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Apply suggestions from code review

Co-authored-by: Melissa Vagi <vagimeli@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Melissa Vagi <vagimeli@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Update _automating-workflows/api/deprovision-workflow.md

Co-authored-by: Melissa Vagi <vagimeli@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Update _automating-workflows/workflow-steps.md

Co-authored-by: Melissa Vagi <vagimeli@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Melissa Vagi <vagimeli@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Added Flow Framework plugin to Vale

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Apply suggestions from code review

Co-authored-by: Nathan Bower <nbower@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* More editorial comments

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Update _automating-workflows/api/get-workflow-status.md

Co-authored-by: Nathan Bower <nbower@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Update _automating-workflows/api/get-workflow-status.md

Co-authored-by: Nathan Bower <nbower@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Add note about provisioning

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Update _automating-workflows/index.md

Co-authored-by: Nathan Bower <nbower@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* Update _automating-workflows/workflow-steps.md

Co-authored-by: Nathan Bower <nbower@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>

* More editorial comments

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Removed code font from headings

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Add agent documentation links

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Add experimental label and more links

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Add sample templates link

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

* Added a tracking issue to warning

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>

---------

Signed-off-by: Fanit Kolchina <kolchfa@amazon.com>
Signed-off-by: kolchfa-aws <105444904+kolchfa-aws@users.noreply.github.com>
Co-authored-by: Owais Kazi <owaiskazi19@gmail.com>
Co-authored-by: Melissa Vagi <vagimeli@amazon.com>
Co-authored-by: Nathan Bower <nbower@amazon.com>
This commit is contained in:
kolchfa-aws 2024-02-07 18:44:49 -05:00 committed by GitHub
parent e91cf1ff93
commit 6c92b54eba
20 changed files with 1510 additions and 1 deletion

View File

@@ -4,6 +4,7 @@ Asynchronous Search plugin
Crypto plugin
Cross-Cluster Replication plugin
Custom Codecs plugin
Flow Framework plugin
Maps plugin
Notebooks plugin
Notifications plugin

View File

@@ -20,6 +20,7 @@ Boolean
[Dd]eallocate
[Dd]eduplicates?
[Dd]eduplication
[Dd]eprovision(s|ed|ing)?
[Dd]eserialize
[Dd]eserialization
Dev
@@ -131,6 +132,7 @@ tebibyte
[Uu]nigram
[Uu]nnesting
[Uu]nrecovered
[Uu]nregister(s|ed|ing)?
[Uu]pdatable
[Uu]psert
[Ww]alkthrough

View File

@ -0,0 +1,255 @@
---
layout: default
title: Create or update a workflow
parent: Workflow APIs
nav_order: 10
---
# Create or update a workflow
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
Creating a workflow adds the content of a workflow template to the flow framework system index. You can provide workflows in JSON format (by specifying `Content-Type: application/json`) or YAML format (by specifying `Content-Type: application/yaml`). By default, the workflow is validated to help identify invalid configurations, including:
* Workflow steps requiring an OpenSearch plugin that is not installed.
* Workflow steps relying on previous node input that is not provided by the preceding steps.
* Workflow step fields with invalid values.
* Workflow graph (node/edge) configurations containing cycles or duplicate IDs.
To obtain the validation template for workflow steps, call the [Get Workflow Steps API]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-steps/).
Once a workflow is created, provide its `workflow_id` to other APIs.
The `POST` method creates a new workflow. The `PUT` method updates an existing workflow.
You can only update a workflow if it has not yet been provisioned.
{: .note}
## Path and HTTP methods
```json
POST /_plugins/_flow_framework/workflow
PUT /_plugins/_flow_framework/workflow/<workflow_id>
```
## Path parameters
The following table lists the available path parameters.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `workflow_id` | String | The ID of the workflow to be updated. Required for the `PUT` method. |
## Query parameters
Workflows are normally created and provisioned in separate steps. However, once you have thoroughly tested the workflow, you can combine the create and provision steps by including the `provision` query parameter:
```json
POST /_plugins/_flow_framework/workflow?provision=true
```
{% include copy-curl.html %}
When set to `true`, the [Provision Workflow API]({{site.url}}{{site.baseurl}}/automating-workflows/api/provision-workflow/) is executed immediately following creation.
By default, workflows are validated when they are created to ensure that the syntax is valid and that the graph does not contain cycles. This behavior can be controlled with the `validation` query parameter. If `validation` is set to `all`, OpenSearch performs a complete template validation. Any other value of the `validation` parameter suppresses validation, allowing an incomplete/work-in-progress template to be saved. To disable template validation, set `validation` to `none`:
```json
POST /_plugins/_flow_framework/workflow?validation=none
```
{% include copy-curl.html %}
The following table lists the available query parameters. All query parameters are optional.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `provision` | Boolean | Whether to provision the workflow as part of the request. Default is `false`. |
| `validation` | String | Whether to validate the workflow. Valid values are `all` (validate the template) and `none` (do not validate the template). Default is `all`. |
## Request fields
The following table lists the available request fields.
|Field |Data type |Required/Optional |Description |
|:--- |:--- |:--- |:--- |
|`name` |String |Required |The name of the workflow. |
|`description` |String |Optional |A description of the workflow. |
|`use_case` |String |Optional | A use case, which can be used with the Search Workflow API to find related workflows. In the future, OpenSearch may provide some standard use cases to ease categorization, but currently you can use this field to specify custom values. |
|`version` |Object |Optional | A key-value map with two fields: `template`, which identifies the template version, and `compatibility`, which identifies a list of minimum required OpenSearch versions. |
|`workflows` |Object |Optional |A map of workflows. Presently, only the `provision` key is supported. The value of the `provision` key is a key-value map that includes a `user_params` field and lists of `nodes` and `edges`. |
#### Example request: Register and deploy an externally hosted model (YAML)
To provide a template in YAML format, specify `Content-Type: application/yaml` in the request header:
```bash
curl -XPOST "http://localhost:9200/_plugins/_flow_framework/workflow" -H 'Content-Type: application/yaml'
```
YAML templates permit comments.
{: .tip}
The following is an example YAML template for registering and deploying an externally hosted model:
```yaml
# This name is required
name: createconnector-registerremotemodel-deploymodel
# Other fields are optional but useful
description: This template creates a connector to a remote model, registers it, and
deploys that model
# Other templates with a similar use case can be searched
use_case: REMOTE_MODEL_DEPLOYMENT
version:
# Templates may be versioned by their authors
template: 1.0.0
# Compatibility with OpenSearch 2.12.0 and higher and 3.0.0 and higher
compatibility:
- 2.12.0
- 3.0.0
# One or more workflows can be included, presently only provision is supported
workflows:
provision:
# These nodes are the workflow steps corresponding to ML Commons APIs
nodes:
# This ID must be unique to this workflow
- id: create_connector_1
# There may be multiple steps with the same type
type: create_connector
# These inputs match the Create Connector API body
user_inputs:
name: OpenAI Chat Connector
description: The connector to public OpenAI model service for GPT 3.5
version: '1'
protocol: http
parameters:
endpoint: api.openai.com
model: gpt-3.5-turbo
credential:
openAI_key: '12345'
actions:
- action_type: predict
method: POST
url: https://${parameters.endpoint}/v1/chat/completions
# This ID must be unique to this workflow
- id: register_model_2
type: register_remote_model
# This step needs the connector_id produced as an output of the previous step
previous_node_inputs:
create_connector_1: connector_id
# These inputs match the Register Model API body
user_inputs:
name: openAI-gpt-3.5-turbo
function_name: remote
description: test model
# This ID must be unique to this workflow
- id: deploy_model_3
type: deploy_model
# This step needs the model_id produced as an output of the previous step
previous_node_inputs:
register_model_2: model_id
# Since the nodes include previous_node_inputs these are optional to define
# They will be added automatically and included in the stored template
# Additional edges may also be added here if required for sequencing
edges:
- source: create_connector_1
dest: register_model_2
- source: register_model_2
dest: deploy_model_3
```
{% include copy-curl.html %}
#### Example request: Register and deploy an externally hosted model (JSON)
To provide a template in JSON format, specify `Content-Type: application/json` in the request header:
```bash
curl -XPOST "http://localhost:9200/_plugins/_flow_framework/workflow" -H 'Content-Type: application/json'
```
The following JSON template is equivalent to the YAML template provided in the previous section:
```json
{
"name": "createconnector-registerremotemodel-deploymodel",
"description": "This template creates a connector to a remote model, registers it, and deploys that model",
"use_case": "REMOTE_MODEL_DEPLOYMENT",
"version": {
"template": "1.0.0",
"compatibility": [
"2.12.0",
"3.0.0"
]
},
"workflows": {
"provision": {
"nodes": [
{
"id": "create_connector_1",
"type": "create_connector",
"user_inputs": {
"name": "OpenAI Chat Connector",
"description": "The connector to public OpenAI model service for GPT 3.5",
"version": "1",
"protocol": "http",
"parameters": {
"endpoint": "api.openai.com",
"model": "gpt-3.5-turbo"
},
"credential": {
"openAI_key": "12345"
},
"actions": [
{
"action_type": "predict",
"method": "POST",
"url": "https://${parameters.endpoint}/v1/chat/completions"
}
]
}
},
{
"id": "register_model_2",
"type": "register_remote_model",
"previous_node_inputs": {
"create_connector_1": "connector_id"
},
"user_inputs": {
"name": "openAI-gpt-3.5-turbo",
"function_name": "remote",
"description": "test model"
}
},
{
"id": "deploy_model_3",
"type": "deploy_model",
"previous_node_inputs": {
"register_model_2": "model_id"
}
}
],
"edges": [
{
"source": "create_connector_1",
"dest": "register_model_2"
},
{
"source": "register_model_2",
"dest": "deploy_model_3"
}
]
}
}
}
```
{% include copy-curl.html %}
#### Example response
OpenSearch responds with the `workflow_id`:
```json
{
"workflow_id" : "8xL8bowB8y25Tqfenm50"
}
```
Once you have created a workflow, you can use other workflow APIs with the `workflow_id`.
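For example, if you did not set the `provision` query parameter to `true` when creating the workflow, you can provision it later by passing the returned `workflow_id` to the [Provision Workflow API]({{site.url}}{{site.baseurl}}/automating-workflows/api/provision-workflow/):
```json
POST /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_provision
```
{% include copy-curl.html %}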

View File

@ -0,0 +1,56 @@
---
layout: default
title: Delete a workflow
parent: Workflow APIs
nav_order: 80
---
# Delete a workflow
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
When you no longer need a workflow template, you can delete it by calling the Delete Workflow API.
Note that deleting a workflow only deletes the stored template but does not deprovision its resources.
## Path and HTTP methods
```json
DELETE /_plugins/_flow_framework/workflow/<workflow_id>
```
## Path parameters
The following table lists the available path parameters.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `workflow_id` | String | The ID of the workflow to be deleted. Required. |
#### Example request
```json
DELETE /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50
```
{% include copy-curl.html %}
#### Example response
If the workflow exists, a delete response contains the status of the deletion, where the `result` field is set to `deleted` on success or `not_found` if the workflow does not exist (it may have already been deleted):
```json
{
"_index": ".plugins-flow_framework-templates",
"_id": "8xL8bowB8y25Tqfenm50",
"_version": 2,
"result": "deleted",
"_shards": {
"total": 1,
"successful": 1,
"failed": 0
},
"_seq_no": 2,
"_primary_term": 1
}
```

View File

@ -0,0 +1,61 @@
---
layout: default
title: Deprovision a workflow
parent: Workflow APIs
nav_order: 70
---
# Deprovision a workflow
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
When you no longer need a workflow, you can deprovision its resources. Most workflow steps that create a resource have corresponding workflow steps to reverse that action. To retrieve all resources currently created for a workflow, call the [Get Workflow Status API]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-status/). When you call the Deprovision Workflow API, resources included in the `resources_created` field of the Get Workflow Status API response will be removed using a workflow step corresponding to the one that provisioned them.
The workflow executes the provisioning workflow steps in reverse order. If failures occur because of resource dependencies, such as preventing deletion of a registered model if it is still deployed, the workflow attempts retries.
## Path and HTTP methods
```json
POST /_plugins/_flow_framework/workflow/<workflow_id>/_deprovision
```
## Path parameters
The following table lists the available path parameters.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `workflow_id` | String | The ID of the workflow to be deprovisioned. Required. |
#### Example request
```json
POST /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_deprovision
```
{% include copy-curl.html %}
#### Example response
If deprovisioning is successful, OpenSearch responds with the same `workflow_id` that was used in the request:
```json
{
"workflow_id" : "8xL8bowB8y25Tqfenm50"
}
```
If deprovisioning did not completely remove all resources, OpenSearch responds with a `202 (ACCEPTED)` status and identifies the resources that were not deprovisioned:
```json
{
"error": "Failed to deprovision some resources: [connector_id Lw7PX4wBfVtHp98y06wV]."
}
```
In some cases, the failure happens because of another dependent resource that took some time to be removed. In this case, you can attempt to send the same request again.
{: .tip}
To obtain a more detailed deprovisioning status than is provided by the summary in the error response, query the [Get Workflow Status API]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-status/).
On success, the workflow returns to a `NOT_STARTED` state. If some resources have not yet been removed, they are provided in the response.
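For example, the following request returns the detailed workflow state, including any remaining `resources_created`, for the workflow used on this page:
```json
GET /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_status?all=true
```
{% include copy-curl.html %}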

View File

@ -0,0 +1,111 @@
---
layout: default
title: Get a workflow status
parent: Workflow APIs
nav_order: 40
---
# Get a workflow status
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
[Provisioning a workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/provision-workflow/) may take a significant amount of time, particularly when the action is associated with OpenSearch indexing operations. The Get Workflow State API permits monitoring of the provisioning deployment status until it is complete.
## Path and HTTP methods
```json
GET /_plugins/_flow_framework/workflow/<workflow_id>/_status
```
## Path parameters
The following table lists the available path parameters.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `workflow_id` | String | The ID of the workflow from which to obtain the status. Required. |
## Query parameters
The `all` parameter specifies whether the response should return all fields.
When set to `false` (the default), the response contains the following fields:
- `workflow_id`
- any `error` state
- `state`
- a list of `resources_created`
When set to `true`, the response contains the following additional fields:
- `provisioning_progress`
- `provision_start_time`
- `provision_end_time`
- `user`
- `user_outputs`
To receive all available fields in the response, set `all` to `true`:
```json
GET /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_status?all=true
```
{% include copy-curl.html %}
#### Example request
```json
GET /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_status
```
{% include copy-curl.html %}
#### Example response
OpenSearch responds with a summary of the provisioning status and a list of created resources.
Before provisioning has begun, OpenSearch does not return any resources:
```json
{
"workflow_id" : "8xL8bowB8y25Tqfenm50",
"state": "NOT_STARTED"
}
```
While provisioning is in progress, OpenSearch returns a partial resource list:
```json
{
"workflow_id" : "8xL8bowB8y25Tqfenm50",
"state": "PROVISIONING",
"resources_created": [
{
"workflow_step_name": "create_connector",
"workflow_step_id": "create_connector_1",
"connector_id": "NdjCQYwBLmvn802B0IwE"
}
]
}
```
Upon provisioning completion, OpenSearch returns the full resource list:
```json
{
"workflow_id" : "8xL8bowB8y25Tqfenm50",
"state": "COMPLETED",
"resources_created": [
{
"workflow_step_name": "create_connector",
"workflow_step_id": "create_connector_1",
"connector_id": "NdjCQYwBLmvn802B0IwE"
},
{
"workflow_step_name": "register_remote_model",
"workflow_step_id": "register_model_2",
"model_id": "N9jCQYwBLmvn802B0oyh"
}
]
}
```

View File

@ -0,0 +1,63 @@
---
layout: default
title: Get workflow steps
parent: Workflow APIs
nav_order: 50
---
# Get workflow steps
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
OpenSearch validates workflows by using the validation template that lists the required inputs, generated outputs, and required plugins for all steps. For example, for the `register_remote_model` step, the validation template appears as follows:
```json
{
"register_remote_model": {
"inputs": [
"name",
"connector_id"
],
"outputs": [
"model_id",
"register_model_status"
],
"required_plugins": [
"opensearch-ml"
]
}
}
```
The Get Workflow Steps API retrieves this validation template.
## Path and HTTP methods
```json
GET /_plugins/_flow_framework/workflow/_steps
```
#### Example request
```json
GET /_plugins/_flow_framework/workflow/_steps
```
{% include copy-curl.html %}
#### Example response
OpenSearch responds with the validation template containing the steps. The order of fields in the returned steps may not exactly match the original JSON but will function identically.
To retrieve the template in YAML format, specify `Content-Type: application/yaml` in the request header:
```bash
curl -XGET "http://localhost:9200/_plugins/_flow_framework/workflow/_steps" -H 'Content-Type: application/yaml'
```
To retrieve the template in JSON format, specify `Content-Type: application/json` in the request header:
```bash
curl -XGET "http://localhost:9200/_plugins/_flow_framework/workflow/_steps" -H 'Content-Type: application/json'
```

View File

@ -0,0 +1,50 @@
---
layout: default
title: Get a workflow
parent: Workflow APIs
nav_order: 20
---
# Get a workflow
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
The Get Workflow API retrieves the workflow template.
## Path and HTTP methods
```json
GET /_plugins/_flow_framework/workflow/<workflow_id>
```
## Path parameters
The following table lists the available path parameters.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `workflow_id` | String | The ID of the workflow to be retrieved. Required. |
#### Example request
```json
GET /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50
```
{% include copy-curl.html %}
#### Example response
To retrieve a template in YAML format, specify `Content-Type: application/yaml` in the request header:
```bash
curl -XGET "http://localhost:9200/_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50" -H 'Content-Type: application/yaml'
```
To retrieve a template in JSON format, specify `Content-Type: application/json` in the request header:
```bash
curl -XGET "http://localhost:9200/_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50" -H 'Content-Type: application/json'
```
OpenSearch responds with the stored template containing the same content as the body of the [create workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/create-workflow/) request. The order of fields in the returned template may not exactly match the original template but will function identically.

View File

@ -0,0 +1,23 @@
---
layout: default
title: Workflow APIs
nav_order: 40
has_children: true
has_toc: false
---
# Workflow APIs
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
OpenSearch supports the following workflow APIs:
* [Create or update workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/create-workflow/)
* [Get workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow/)
* [Provision workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/provision-workflow/)
* [Get workflow status]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-status/)
* [Get workflow steps]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-steps/)
* [Search workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/search-workflow/)
* [Deprovision workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/deprovision-workflow/)
* [Delete workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/delete-workflow/)

View File

@ -0,0 +1,51 @@
---
layout: default
title: Provision a workflow
parent: Workflow APIs
nav_order: 30
---
# Provision a workflow
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
Provisioning a workflow is a one-time setup process usually performed by a cluster administrator to create resources that will be used by end users.
The `workflows` template field may contain multiple workflows. The workflow with the `provision` key can be executed with this API. This API is also executed when the [Create or Update Workflow API]({{site.url}}{{site.baseurl}}/automating-workflows/api/create-workflow/) is called with the `provision` parameter set to `true`.
You can only provision a workflow if it has not yet been provisioned. Deprovision the workflow if you need to repeat provisioning.
{: .note}
## Path and HTTP methods
```json
POST /_plugins/_flow_framework/workflow/<workflow_id>/_provision
```
## Path parameters
The following table lists the available path parameters.
| Parameter | Data type | Description |
| :--- | :--- | :--- |
| `workflow_id` | String | The ID of the workflow to be provisioned. Required. |
#### Example request
```json
POST /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_provision
```
{% include copy-curl.html %}
#### Example response
OpenSearch responds with the same `workflow_id` that was used in the request:
```json
{
"workflow_id" : "8xL8bowB8y25Tqfenm50"
}
```
To obtain the provisioning status, query the [Get Workflow State API]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-status/).
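For example, the following request returns the provisioning status of the workflow provisioned in the preceding request:
```json
GET /_plugins/_flow_framework/workflow/8xL8bowB8y25Tqfenm50/_status
```
{% include copy-curl.html %}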

View File

@ -0,0 +1,50 @@
---
layout: default
title: Search for a workflow
parent: Workflow APIs
nav_order: 60
---
# Search for a workflow
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
You can retrieve created workflows with their `workflow_id` or search for workflows by using a query matching a field. You can use the `use_case` field to search for similar workflows.
## Path and HTTP methods
```json
GET /_plugins/_flow_framework/workflow/_search
POST /_plugins/_flow_framework/workflow/_search
```
#### Example request: All created workflows
```json
GET /_plugins/_flow_framework/workflow/_search
{
"query": {
"match_all": {}
}
}
```
{% include copy-curl.html %}
#### Example request: All workflows with a `use_case` of `REMOTE_MODEL_DEPLOYMENT`
```json
GET /_plugins/_flow_framework/workflow/_search
{
"query": {
"match": {
"use_case": "REMOTE_MODEL_DEPLOYMENT"
}
}
}
```
{% include copy-curl.html %}
#### Example response
OpenSearch responds with a list of workflow templates matching the search parameters.
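The response follows the standard OpenSearch search response format. The following is an abbreviated sketch of a single match, assuming that the externally hosted model template from [Create or update a workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/create-workflow/) has been created; fields such as `_shards` and scores are omitted, and only part of the stored template is shown in `_source`:
```json
{
  "took": 1,
  "timed_out": false,
  "hits": {
    "total": {
      "value": 1,
      "relation": "eq"
    },
    "hits": [
      {
        "_index": ".plugins-flow_framework-templates",
        "_id": "8xL8bowB8y25Tqfenm50",
        "_source": {
          "name": "createconnector-registerremotemodel-deploymodel",
          "use_case": "REMOTE_MODEL_DEPLOYMENT"
        }
      }
    ]
  }
}
```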

View File

@ -0,0 +1,49 @@
---
layout: default
title: Automating workflows
nav_order: 1
has_children: false
nav_exclude: true
redirect_from: /automating-workflows/
---
# Automating workflows
**Introduced 2.12**
{: .label .label-purple }
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
You can automate complex OpenSearch setup and preprocessing tasks by providing templates for common use cases. For example, automating machine learning (ML) setup tasks streamlines the use of OpenSearch ML offerings.
In OpenSearch 2.12, workflow automation is limited to ML tasks.
{: .info}
OpenSearch use case templates provide a compact description of the setup process in a JSON or YAML document. These templates describe automated workflow configurations for conversational chat or query generation, AI connectors, tools, agents, and other components that prepare OpenSearch as a backend for generative models. For template examples, see [Sample templates](https://github.com/opensearch-project/flow-framework/tree/main/sample-templates).
## Key features
Workflow automation provides the following benefits:
* **Use case templates**: Get started with predefined templates that outline the setup process for your general use cases.
* **Customizable workflows**: Customize the workflow templates to your specific use case.
* **Setup automation**: Easily configure AI connectors, tools, agents, and other components in a single API call.
## Overview
**Templates** implement workflow automation in OpenSearch. You can provide these templates in JSON or YAML format. A template describes the sequence of steps required for a particular use case and consists of the following elements (a minimal example follows this list):
* **Metadata**: A name, description, use case category, template version, and OpenSearch version compatibility range.
* **User input**: Parameters expected from the user that are common to all automation steps across all workflows, such as an index name.
* **Workflows**: One or more workflows containing the following elements:
* **User input**: Parameters expected from the user that are specific to the steps in this workflow.
* **Workflow steps**: The workflow steps, described as a directed acyclic graph (DAG):
* ***Nodes*** describe steps of the process, which may be executed in parallel. For the syntax of workflow steps, see [Workflow steps]({{site.url}}{{site.baseurl}}/automating-workflows/workflow-steps/).
* ***Edges*** sequence nodes so that a step is executed after the previous step is complete and may use the output fields of previous steps. When a node includes a key in the `previous_node_inputs` map referring to a previous node's workflow step, a corresponding edge is automatically added to the template during parsing and can be omitted for simplicity.
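The following is a minimal sketch of this structure in YAML format, adapted from the externally hosted model template in [Create or update a workflow]({{site.url}}{{site.baseurl}}/automating-workflows/api/create-workflow/). The step IDs and inputs are illustrative, and the connector creation step is omitted for brevity:
```yaml
# Metadata
name: minimal-remote-model-example
description: Registers and deploys an externally hosted model
use_case: REMOTE_MODEL_DEPLOYMENT
version:
  template: 1.0.0
  compatibility:
    - 2.12.0
    - 3.0.0
# Workflows (presently, only provision is supported)
workflows:
  provision:
    # Nodes are the workflow steps
    nodes:
      # A create_connector step that outputs the connector_id is omitted here for brevity
      - id: register_model
        type: register_remote_model
        previous_node_inputs:
          create_connector: connector_id
        user_inputs:
          name: example-model
          function_name: remote
      - id: deploy_model
        type: deploy_model
        previous_node_inputs:
          register_model: model_id
    # Edges are optional when previous_node_inputs is specified
    edges:
      - source: register_model
        dest: deploy_model
```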
## Next steps
- For supported APIs, see [Workflow APIs]({{site.url}}{{site.baseurl}}/automating-workflows/api/index/).
- For the workflow step syntax, see [Workflow steps]({{site.url}}{{site.baseurl}}/automating-workflows/workflow-steps/).
- For a complete example, see [Workflow tutorial]({{site.url}}{{site.baseurl}}/automating-workflows/workflow-tutorial/).
- For configurable settings, see [Workflow settings]({{site.url}}{{site.baseurl}}/automating-workflows/workflow-settings/).

View File

@ -0,0 +1,20 @@
---
layout: default
title: Workflow settings
nav_order: 30
---
# Workflow settings
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
The following keys represent configurable workflow settings.
|Setting |Data type |Default value |Description |
|:--- |:--- |:--- |:--- |
|`plugins.flow_framework.enabled` |Boolean |`false` |Whether the Flow Framework API is enabled. |
|`plugins.flow_framework.max_workflows` |Integer |`1000` | The maximum number of workflows that you can create. When the limit is set above 1,000, the number of existing workflows is computed as a lower bound for performance reasons, so the actual maximum may slightly exceed this value. |
|`plugins.flow_framework.max_workflow_steps` |Integer |`50` |The maximum number of steps a workflow can have. |
|`plugins.flow_framework.request_timeout` |Time units |`10s` |The default timeout for REST requests, which applies to internal search queries. |
|`plugins.flow_framework.task_request_retry_duration` |Time units |`5s` | When steps correspond to an API that produces a `task_id`, OpenSearch will retry them at this interval until completion. |
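Assuming that these are dynamic cluster settings (as is typical for OpenSearch plugin settings), you can update them using the Cluster Settings API. The following sketch raises the workflow limit to an illustrative value:
```json
PUT /_cluster/settings
{
  "persistent": {
    "plugins.flow_framework.max_workflows": 2000
  }
}
```
{% include copy-curl.html %}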

View File

@ -0,0 +1,64 @@
---
layout: default
title: Workflow steps
nav_order: 10
---
# Workflow steps
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
_Workflow steps_ form basic "building blocks" for process automation. Most steps directly correspond to OpenSearch or plugin API operations, such as CRUD operations on machine learning (ML) connectors, models, and agents. Some steps simplify the configuration by reusing the body expected by these APIs across multiple steps. For example, once you configure a _tool_, you can use it with multiple _agents_.
## Workflow step fields
Workflow steps are actively being developed to expand automation capabilities. Workflow step (graph node) configuration includes the following fields.
|Field |Data type |Required/Optional |Description |
|:--- |:--- |:--- |:--- |
|`id` |String |Required | A user-provided ID for the step. The ID must be unique within a given workflow and is useful for identifying resources created by the step. For example, a `register_agent` step returns an `agent_id` for the registered agent. Using this ID, you can determine which step produced which resource. |
|`type` |String |Required |The type of action to take, such as `deploy_model`, which corresponds to the API for which the step is used. Multiple steps may share the same type but must each have their own unique ID. For a list of supported types, see [Workflow step types](#workflow-step-types). |
|`previous_node_inputs` |Object |Optional | A key-value map specifying user inputs that are produced by a previous step in the workflow. For each key-value pair, the key is the previous step's `id` and the value is an API body field name (such as `model_id`) that will be produced as an output of a previous step in the workflow. For example, `register_remote_model` (key) may produce a `model_id` (value) that is required for a subsequent `deploy_model` step. <br> A graph edge is automatically added to the workflow connecting the previous step's key as the source and the current node as the destination. <br>In some cases, you can include [additional inputs](#additional-fields) in this field. |
|`user_inputs` |Object |Optional | A key-value map of inputs supported by the corresponding API for this specific step. Some inputs are required for an API, while others are optional. Required inputs may be specified here, if known, or in the `previous_node_inputs` field. The [Get Workflow Steps API]({{site.url}}{{site.baseurl}}/automating-workflows/api/get-workflow-steps/) identifies required inputs and step outputs. <br> Substitutions are supported in string values, lists of strings, and maps with string values. The pattern `{% raw %}${{previous_step_id.output_key}}{% endraw %}` will be replaced by the value in the previous step's output with the given key. For example, if a parameter map in the user inputs includes a key `embedding_model_id` with a value `{% raw %}${{deploy_embedding_model.model_id}}{% endraw %}`, then the `model_id` output of the `deploy_embedding_model` step will be substituted here. This performs a similar function to the `previous_node_input` map but is not validated and does not automatically infer edges. <br>In some cases, you can include [additional inputs](#additional-fields) in this field. |
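The following sketch illustrates both mechanisms described in the preceding table, using a hypothetical previous step with the ID `deploy_embedding_model` that outputs a `model_id`; the tool parameters are illustrative:
{% raw %}
```yaml
- id: embedding_tool
  type: create_tool
  # Passes the previous step's model_id output directly as an input to this step
  previous_node_inputs:
    deploy_embedding_model: model_id
  user_inputs:
    name: MLModelTool
    type: MLModelTool
    description: A tool that uses the deployed embedding model
    parameters:
      # Substitutes the same output into a string value
      embedding_model_id: ${{deploy_embedding_model.model_id}}
```
{% endraw %}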
## Workflow step types
The following table lists the workflow step types. The `user_inputs` fields for these steps correspond directly to the linked APIs.
|Step type |Corresponding API |Description |
|--- |--- |--- |
|`noop` |No API | A no-operation (no-op) step that does nothing. It may be useful in some cases for synchronizing parallel steps. |
|`create_connector` |[Create Connector]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/connector-apis/create-connector/) |Creates a connector to a model hosted on a third-party platform. |
|`delete_connector` |[Delete Connector]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/connector-apis/delete-connector/) |Deletes a connector to a model hosted on a third-party platform. |
|`register_model_group` |[Register Model Group]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-group-apis/register-model-group/) |Registers a model group. The model group will be deleted automatically once no model is present in the group. |
|`register_remote_model` |[Register Model (remote)]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#register-a-model-hosted-on-a-third-party-platform) |Registers a model hosted on a third-party platform. If the `user_inputs` field contains a `deploy` key that is set to `true`, also deploys the model. |
|`register_local_pretrained_model` |[Register Model (pretrained)]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#register-a-pretrained-text-embedding-model) | Registers an OpenSearch-provided pretrained text embedding model that is hosted on your OpenSearch cluster. If the `user_inputs` field contains a `deploy` key that is set to `true`, also deploys the model. |
|`register_local_sparse_encoding_model` |[Register Model (sparse)]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#register-a-pretrained-sparse-encoding-model) | Registers an OpenSearch-provided pretrained sparse encoding model that is hosted on your OpenSearch cluster. If the `user_inputs` field contains a `deploy` key that is set to `true`, also deploys the model. |
|`register_local_custom_model` |[Register Model (custom)]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#register-a-custom-model) | Registers a custom model that is hosted on your OpenSearch cluster. If the `user_inputs` field contains a `deploy` key that is set to `true`, also deploys the model. |
|`delete_model` |[Delete Model]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/delete-model/) |Unregisters and deletes a model. |
|`deploy_model` |[Deploy Model]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/deploy-model/) |Deploys a registered model into memory. |
|`undeploy_model` |[Undeploy Model]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/undeploy-model/) |Undeploys a deployed model from memory. |
|`register_agent` |[Register Agent API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/agent-apis/register-agent/) |Registers an agent as part of the ML Commons Agent Framework. |
|`delete_agent` |[Delete Agent API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/agent-apis/delete-agent/) |Deletes an agent. |
|`create_tool` |No API | A special-case non-API step encapsulating the specification of a tool for an agent in the ML Commons Agent Framework. These will be listed as `previous_node_inputs` for the appropriate register agent step, with the value set to `tools`. |
## Additional fields
You can include the following additional fields in the `user_inputs` field when indicated.
|Field |Data type |Description |
|--- |--- |--- |
|`node_timeout` |Time units |A user-provided timeout for this step. For example, `20s` for a 20-second timeout. |
|`deploy` |Boolean |Applicable to the Register Model step type. If set to `true`, also executes the Deploy Model step. |
|`tools_order` |List |Applicable only to the Register Agent step type. Specifies the ordering of `tools`. For example, specify `["foo_tool", "bar_tool"]` to sequence those tools in that order. |
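For example, the following sketch, a variation on the remote model registration step used elsewhere in this documentation, sets a step timeout and deploys the model as part of registration so that a separate `deploy_model` step is not needed:
```yaml
- id: register_model_2
  type: register_remote_model
  previous_node_inputs:
    create_connector_1: connector_id
  user_inputs:
    name: openAI-gpt-3.5-turbo
    function_name: remote
    # Additional fields
    node_timeout: 20s
    deploy: true
```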
You can include the following additional fields in the `previous_node_inputs` field when indicated.
|Field |Data type |Description |
|--- |--- |--- |
|`model_id` |String |The `model_id` is used as an input for several steps. As a special case for the Register Agent step type, if the `llm.model_id` field is present in neither `user_inputs` nor `previous_node_inputs`, then the `model_id` value from the previous node may be used as a fallback for the LLM model ID. |
## Example workflow steps
For example workflow step implementations, see the [Workflow tutorial]({{site.url}}{{site.baseurl}}/automating-workflows/workflow-tutorial/).

View File

@ -0,0 +1,636 @@
---
layout: default
title: Workflow tutorial
nav_order: 20
---
# Workflow tutorial
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/flow-framework/issues/475).
{: .warning}
You can automate the setup of common use cases, such as conversational chat, using a Chain-of-Thought (CoT) agent. An _agent_ orchestrates and runs ML models and tools. A _tool_ performs a set of specific tasks. This page presents a complete example of setting up a CoT agent. For more information about agents and tools, see [Agents and tools]({{site.url}}{{site.baseurl}}/ml-commons-plugin/agents-tools/index/).
The setup requires the following sequence of API requests, with provisioned resources used in subsequent requests. The following list provides an overview of the steps required for this workflow. The step names correspond to the names in the template:
1. **Deploy a model on the cluster**
* [`create_connector_1`](#create_connector_1): Create a connector to an externally hosted model.
* [`register_model_2`](#register_model_2): Register a model using the connector that you created.
* [`deploy_model_3`](#deploy_model_3): Deploy the model.
1. **Use the deployed model for inference**
* Set up several tools that perform specific tasks:
* [`math_tool`](#math_tool): Set up a math tool.
* [`ml_model_tool`](#ml_model_tool): Set up a machine learning (ML) model tool.
* Set up one or more agents that use some combination of the tools:
* [`sub_agent`](#sub_agent): Create an agent that uses the math tool.
* Set up tools representing these agents:
* [`agent_tool`](#agent_tool): Wrap the `sub_agent` so that you can use it as a tool.
* [`root_agent`](#root_agent): Set up a root agent that may delegate the task to either a tool or another agent.
The following sections describe the steps in detail. For the complete workflow template, see [Complete YAML workflow template](#complete-yaml-workflow-template).
## Workflow graph
The workflow described in the previous section is organized into a [template](#complete-yaml-workflow-template). Note that you can order the steps in several ways. In the example template, the `ml_model_tool` step is specified right before the `root_agent` step, but you can specify it at any point after the `deploy_model_3` step and before the `root_agent` step. The following diagram shows the directed acyclic graph (DAG) that OpenSearch creates for all of the steps in the order specified in the template.
![Example workflow steps graph]({{site.url}}{{site.baseurl}}/images/automatic-workflow-dag.png){:style="width: 100%; max-width: 600px;" class="img-centered"}
## 1. Deploy a model on the cluster
To deploy a model on the cluster, you need to create a connector to the model, register the model, and deploy the model.
<!-- vale off -->
### create_connector_1
<!-- vale on -->
The first step in the workflow is to create a connector to an externally hosted model (in the following example, this step is called `create_connector_1`). The content of the `user_inputs` field exactly matches the ML Commons [Create Connector API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/connector-apis/create-connector/):
```yaml
nodes:
- id: create_connector_1
type: create_connector
user_inputs:
name: OpenAI Chat Connector
description: The connector to public OpenAI model service for GPT 3.5
version: '1'
protocol: http
parameters:
endpoint: api.openai.com
model: gpt-3.5-turbo
credential:
openAI_key: '12345'
actions:
- action_type: predict
method: POST
url: https://${parameters.endpoint}/v1/chat/completions
```
When you create a connector, OpenSearch returns a `connector_id`, which you need in order to register the model.
<!-- vale off -->
### register_model_2
<!-- vale on -->
When registering a model, the `previous_node_inputs` field tells OpenSearch to obtain the required `connector_id` from the output of the `create_connector_1` step. Other inputs required by the [Register Model API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/) are included in the `user_inputs` field:
```yaml
- id: register_model_2
type: register_remote_model
previous_node_inputs:
create_connector_1: connector_id
user_inputs:
name: openAI-gpt-3.5-turbo
function_name: remote
description: test model
```
The output of this step is a `model_id`. You must then deploy the registered model to the cluster.
<!-- vale off -->
### deploy_model_3
<!-- vale on -->
The [Deploy Model API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/deploy-model/) requires the `model_id` from the previous step, as specified in the `previous_node_inputs` field:
```yaml
- id: deploy_model_3
type: deploy_model
# This step needs the model_id produced as an output of the previous step
previous_node_inputs:
register_model_2: model_id
```
When you call the Deploy Model API directly, it returns a task ID, and you must use the [Tasks API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/tasks-apis/get-task/) to determine when the deployment is complete. The automated workflow eliminates the manual status check and returns the final `model_id` directly.
### Ordering steps
To order these steps in a sequence, you must connect them by an edge in the graph. When a `previous_node_inputs` field is present in a step, OpenSearch automatically creates an edge with `source` and `dest` fields for this step. The output of the `source` is required as input for the `dest`. For example, the `register_model_2` step requires the `connector_id` from the `create_connector_1` step. Similarly, the `deploy_model_3` step requires the `model_id` from the `register_model_2` step. Thus, OpenSearch creates the first two edges in the graph as follows in order to match the outputs with the required inputs and to raise errors if a required input is missing:
```yaml
edges:
- source: create_connector_1
dest: register_model_2
- source: register_model_2
dest: deploy_model_3
```
If you define `previous_node_inputs`, then defining edges is optional.
{: .note}
## 2. Use the deployed model for inference
A CoT agent can use the deployed model in a tool. This step doesn't strictly correspond to an API but represents a component of the body required by the [Register Agent API]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/agent-apis/register-agent/). This simplifies the register request and allows reuse of the same tool in multiple agents. For more information about agents and tools, see [Agents and tools]({{site.url}}{{site.baseurl}}/ml-commons-plugin/agents-tools/index/).
<!-- vale off -->
### math_tool
<!-- vale on -->
You can configure other tools to be used by the CoT agent. For example, you can configure a math tool as follows. This tool does not depend on any previous steps:
```yaml
- id: math_tool
type: create_tool
user_inputs:
name: MathTool
type: MathTool
description: A general tool to calculate any math problem. The action input
must be a valid math expression, like 2+3
parameters:
max_iteration: 5
```
<!-- vale off -->
### sub_agent
<!-- vale on -->
To use the math tool in the agent configuration, specify it as one of the tools in the `previous_node_inputs` field of the agent. You can add other tools to `previous_node_inputs` as necessary. The agent also needs a large language model (LLM) in order to reason with the tools. The LLM is defined by the `llm.model_id` field. This example assumes that the `model_id` from the `deploy_model_3` step will be used. However, if another model is already deployed, the `model_id` of that previously deployed model could be included in the `user_inputs` field instead:
```yaml
- id: sub_agent
type: register_agent
previous_node_inputs:
# When llm.model_id is not present this can be used as a fallback value
deploy_model_3: model_id
math_tool: tools
user_inputs:
name: Sub Agent
type: conversational
description: this is a test agent
parameters:
hello: world
llm.parameters:
max_iteration: '5'
stop_when_no_tool_found: 'true'
memory:
type: conversation_index
app_type: chatbot
```
OpenSearch will automatically create the following edges so that the agent can retrieve the fields from the previous node:
```yaml
- source: math_tool
dest: sub_agent
- source: deploy_model_3
dest: sub_agent
```
<!-- vale off -->
### agent_tool
<!-- vale on -->
You can use an agent as a tool for another agent. Registering an agent produces an `agent_id` in the output. The following step defines a tool that uses the `agent_id` from the previous step:
```yaml
- id: agent_tool
type: create_tool
previous_node_inputs:
sub_agent: agent_id
user_inputs:
name: AgentTool
type: AgentTool
description: Agent Tool
parameters:
max_iteration: 5
```
OpenSearch automatically creates an edge because this step specifies `previous_node_inputs`:
```yaml
- source: sub_agent
dest: agent_tool
```
<!-- vale off -->
### ml_model_tool
<!-- vale on -->
A tool may reference an ML model. This example gets the required `model_id` from the model deployed in a previous step:
```yaml
- id: ml_model_tool
type: create_tool
previous_node_inputs:
deploy_model_3: model_id
user_inputs:
name: MLModelTool
type: MLModelTool
alias: language_model_tool
description: A general tool to answer any question.
parameters:
prompt: Answer the question as best you can.
response_filter: choices[0].message.content
```
OpenSearch automatically creates an edge in order to use the `previous_node_inputs`:
```yaml
- source: deploy_model_3
dest: ml_model_tool
```
<!-- vale off -->
### root_agent
<!-- vale on -->
A conversational chat application will communicate with a single root agent that includes the ML model tool and the agent tool in its `tools` field. It will also obtain the `llm.model_id` from the deployed model. Some agents require tools to be in a specific order, which can be enforced by including the `tools_order` field in the user inputs:
```yaml
- id: root_agent
type: register_agent
previous_node_inputs:
deploy_model_3: model_id
ml_model_tool: tools
agent_tool: tools
user_inputs:
name: DEMO-Test_Agent_For_CoT
type: conversational
description: this is a test agent
parameters:
prompt: Answer the question as best you can.
llm.parameters:
max_iteration: '5'
stop_when_no_tool_found: 'true'
tools_order: ['agent_tool', 'ml_model_tool']
memory:
type: conversation_index
app_type: chatbot
```
OpenSearch automatically creates edges for the `previous_node_inputs` sources:
```yaml
- source: deploy_model_3
dest: root_agent
- source: ml_model_tool
dest: root_agent
- source: agent_tool
dest: root_agent
```
For the complete DAG that OpenSearch creates for this workflow, see the [workflow graph](#workflow-graph).
## Complete YAML workflow template
The following is the final template including all of the `provision` workflow steps in YAML format:
<details open markdown="block">
<summary>
YAML template
</summary>
{: .text-delta}
```yaml
# This template demonstrates provisioning the resources for a
# Chain-of-Thought chat bot
name: tool-register-agent
description: test case
use_case: REGISTER_AGENT
version:
template: 1.0.0
compatibility:
- 2.12.0
- 3.0.0
workflows:
# This workflow defines the actions to be taken when the Provision Workflow API is used
provision:
nodes:
# The first three nodes create a connector to a remote model, register the model, and deploy it
- id: create_connector_1
type: create_connector
user_inputs:
name: OpenAI Chat Connector
description: The connector to public OpenAI model service for GPT 3.5
version: '1'
protocol: http
parameters:
endpoint: api.openai.com
model: gpt-3.5-turbo
credential:
openAI_key: '12345'
actions:
- action_type: predict
method: POST
url: https://${parameters.endpoint}/v1/chat/completions
- id: register_model_2
type: register_remote_model
previous_node_inputs:
create_connector_1: connector_id
user_inputs:
# deploy: true could be added here instead of the deploy step below
name: openAI-gpt-3.5-turbo
description: test model
- id: deploy_model_3
type: deploy_model
previous_node_inputs:
register_model_2: model_id
# For example purposes, the model_id obtained as the output of the deploy_model_3 step will be used
# for several of the steps below. However, any other deployed model_id can be used for those steps.
# This is one example tool from the Agent Framework.
- id: math_tool
type: create_tool
user_inputs:
name: MathTool
type: MathTool
description: A general tool to calculate any math problem. The action input
must be a valid math expression, like 2+3
parameters:
max_iteration: 5
# This simple agent only has one tool, but could be configured with many tools
- id: sub_agent
type: register_agent
previous_node_inputs:
deploy_model_3: model_id
math_tool: tools
user_inputs:
name: Sub Agent
type: conversational
description: this is a test agent
parameters:
hello: world
llm.parameters:
max_iteration: '5'
stop_when_no_tool_found: 'true'
memory:
type: conversation_index
app_type: chatbot
# An agent can be used itself as a tool in a nested relationship
- id: agent_tool
type: create_tool
previous_node_inputs:
sub_agent: agent_id
user_inputs:
name: AgentTool
type: AgentTool
description: Agent Tool
parameters:
max_iteration: 5
# An ML Model can be used as a tool
- id: ml_model_tool
type: create_tool
previous_node_inputs:
deploy_model_3: model_id
user_inputs:
name: MLModelTool
type: MLModelTool
alias: language_model_tool
description: A general tool to answer any question.
parameters:
prompt: Answer the question as best you can.
response_filter: choices[0].message.content
# This final agent will be the interface for the CoT chat user
# When using a flow agent type, tools_order matters
- id: root_agent
type: register_agent
previous_node_inputs:
deploy_model_3: model_id
ml_model_tool: tools
agent_tool: tools
user_inputs:
name: DEMO-Test_Agent
type: flow
description: this is a test agent
parameters:
prompt: Answer the question as best you can.
llm.parameters:
max_iteration: '5'
stop_when_no_tool_found: 'true'
tools_order: ['agent_tool', 'ml_model_tool']
memory:
type: conversation_index
app_type: chatbot
# These edges are all automatically created with previous_node_inputs
edges:
- source: create_connector_1
dest: register_model_2
- source: register_model_2
dest: deploy_model_3
- source: math_tool
dest: sub_agent
- source: deploy_model_3
dest: sub_agent
- source: sub_agent
dest: agent_tool
- source: deploy_model_3
dest: ml_model_tool
- source: deploy_model_3
dest: root_agent
- source: ml_model_tool
dest: root_agent
- source: agent_tool
dest: root_agent
```
</details>
## Complete JSON workflow template
The following is the same template in JSON format:
<details open markdown="block">
<summary>
JSON template
</summary>
{: .text-delta}
```json
{
"name": "tool-register-agent",
"description": "test case",
"use_case": "REGISTER_AGENT",
"version": {
"template": "1.0.0",
"compatibility": [
"2.12.0",
"3.0.0"
]
},
"workflows": {
"provision": {
"nodes": [
{
"id": "create_connector_1",
"type": "create_connector",
"user_inputs": {
"name": "OpenAI Chat Connector",
"description": "The connector to public OpenAI model service for GPT 3.5",
"version": "1",
"protocol": "http",
"parameters": {
"endpoint": "api.openai.com",
"model": "gpt-3.5-turbo"
},
"credential": {
"openAI_key": "12345"
},
"actions": [
{
"action_type": "predict",
"method": "POST",
"url": "https://${parameters.endpoint}/v1/chat/completions"
}
]
}
},
{
"id": "register_model_2",
"type": "register_remote_model",
"previous_node_inputs": {
"create_connector_1": "connector_id"
},
"user_inputs": {
"name": "openAI-gpt-3.5-turbo",
"description": "test model"
}
},
{
"id": "deploy_model_3",
"type": "deploy_model",
"previous_node_inputs": {
"register_model_2": "model_id"
}
},
{
"id": "math_tool",
"type": "create_tool",
"user_inputs": {
"name": "MathTool",
"type": "MathTool",
"description": "A general tool to calculate any math problem. The action input must be a valid math expression, like 2+3",
"parameters": {
"max_iteration": 5
}
}
},
{
"id": "sub_agent",
"type": "register_agent",
"previous_node_inputs": {
"deploy-model-3": "llm.model_id",
"math_tool": "tools"
},
"user_inputs": {
"name": "Sub Agent",
"type": "conversational",
"description": "this is a test agent",
"parameters": {
"hello": "world"
},
"llm.parameters": {
"max_iteration": "5",
"stop_when_no_tool_found": "true"
},
"memory": {
"type": "conversation_index"
},
"app_type": "chatbot"
}
},
{
"id": "agent_tool",
"type": "create_tool",
"previous_node_inputs": {
"sub_agent": "agent_id"
},
"user_inputs": {
"name": "AgentTool",
"type": "AgentTool",
"description": "Agent Tool",
"parameters": {
"max_iteration": 5
}
}
},
{
"id": "ml_model_tool",
"type": "create_tool",
"previous_node_inputs": {
"deploy-model-3": "model_id"
},
"user_inputs": {
"name": "MLModelTool",
"type": "MLModelTool",
"alias": "language_model_tool",
"description": "A general tool to answer any question.",
"parameters": {
"prompt": "Answer the question as best you can.",
"response_filter": "choices[0].message.content"
}
}
},
{
"id": "root_agent",
"type": "register_agent",
"previous_node_inputs": {
"deploy-model-3": "llm.model_id",
"ml_model_tool": "tools",
"agent_tool": "tools"
},
"user_inputs": {
"name": "DEMO-Test_Agent",
"type": "flow",
"description": "this is a test agent",
"parameters": {
"prompt": "Answer the question as best you can."
},
"llm.parameters": {
"max_iteration": "5",
"stop_when_no_tool_found": "true"
},
"tools_order": [
"agent_tool",
"ml_model_tool"
],
"memory": {
"type": "conversation_index"
},
"app_type": "chatbot"
}
}
],
"edges": [
{
"source": "create_connector_1",
"dest": "register_model_2"
},
{
"source": "register_model_2",
"dest": "deploy_model_3"
},
{
"source": "math_tool",
"dest": "sub_agent"
},
{
"source": "deploy_model_3",
"dest": "sub_agent"
},
{
"source": "sub_agent",
"dest": "agent_tool"
},
{
"source": "deploy-model-3",
"dest": "ml_model_tool"
},
{
"source": "deploy-model-3",
"dest": "root_agent"
},
{
"source": "ml_model_tool",
"dest": "root_agent"
},
{
"source": "agent_tool",
"dest": "root_agent"
}
]
}
}
}
```
</details>
## Next steps
To learn more about agents and tools, see [Agents and tools]({{site.url}}{{site.baseurl}}/ml-commons-plugin/agents-tools/index/).

View File

@@ -112,6 +112,9 @@ collections:
about:
permalink: /:collection/:path/
output: true
automating-workflows:
permalink: /:collection/:path/
output: true
opensearch_collection:
# Define the collections used in the theme
@@ -166,6 +169,9 @@ opensearch_collection:
ml-commons-plugin:
name: Machine learning
nav_fold: true
automating-workflows:
name: Automating workflows
nav_fold: true
monitoring-your-cluster:
name: Monitoring your cluster
nav_fold: true

View File

@@ -25,6 +25,10 @@ For information about asynchronous search settings, see [Asynchronous Search set
For information about cross-cluster replication settings, see [Replication settings]({{site.url}}{{site.baseurl}}/tuning-your-cluster/replication-plugin/settings/).
## Flow Framework plugin settings
For information about automatic workflow settings, see [Workflow settings]({{site.url}}{{site.baseurl}}/automating-workflows/workflow-settings/).
## Geospatial plugin settings
For information about the Geospatial plugin's IP2Geo processor settings, see [Cluster settings]({{site.url}}{{site.baseurl}}/ingest-pipelines/processors/ip2geo/#cluster-settings).

View File

@@ -241,13 +241,14 @@ Major, minor, and patch plugin versions must match OpenSearch major, minor, and
The following plugins are bundled with all OpenSearch distributions except for minimum distribution packages.
| Plugin name | Repository | Earliest available version |
| :--- | :--- | :--- |
| Alerting | [opensearch-alerting](https://github.com/opensearch-project/alerting) | 1.0.0 |
| Anomaly Detection | [opensearch-anomaly-detection](https://github.com/opensearch-project/anomaly-detection) | 1.0.0 |
| Asynchronous Search | [opensearch-asynchronous-search](https://github.com/opensearch-project/asynchronous-search) | 1.0.0 |
| Cross Cluster Replication | [opensearch-cross-cluster-replication](https://github.com/opensearch-project/cross-cluster-replication) | 1.1.0 |
| Custom Codecs | [opensearch-custom-codecs](https://github.com/opensearch-project/custom-codecs) | 2.10.0 |
| Flow Framework | [flow-framework](https://github.com/opensearch-project/flow-framework) | 2.12.0 |
| Notebooks<sup>1</sup> | [opensearch-notebooks](https://github.com/opensearch-project/dashboards-notebooks) | 1.0.0 to 1.1.0 |
| Notifications | [notifications](https://github.com/opensearch-project/notifications) | 2.0.0 |
| Reports Scheduler | [opensearch-reports-scheduler](https://github.com/opensearch-project/dashboards-reports) | 1.0.0 |

View File

@@ -153,6 +153,9 @@ img {
@extend .panel;
}
.img-centered {
max-width: 100%;
margin: 0 auto;
display: block;
}
.no-border {
border: none;
box-shadow: none;

Binary file not shown (added image, 60 KiB).