Re-arranging sections for append and replace docs. (#15497)

Benjamin Hopp 2023-12-06 14:13:05 -07:00 committed by GitHub
parent 6a64f72c67
commit fea53c7084


@@ -233,6 +233,10 @@ When using concurrent append and replace, keep the following in mind:
### Configure concurrent append and replace
##### Update the compaction settings with the UI
In the **Compaction config** for a datasource, set **Allow concurrent compactions (experimental)** to **True**.
##### Update the compaction settings with the API
Prepare your datasource for concurrent append and replace by setting its task lock type to `REPLACE`.
@@ -249,9 +253,6 @@ curl --location --request POST 'http://localhost:8081/druid/coordinator/v1/confi
}'
```
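The `curl` call in the hunk above is truncated in this view. As a rough sketch of the shape of the full request, assuming the coordinator's automatic compaction configuration endpoint and a placeholder datasource name (`YOUR_DATASOURCE` is not from the original):

```
# Sketch only: posts a compaction config whose task context sets the lock type to REPLACE.
# The datasource name is a placeholder; the original example may carry additional settings.
curl --location --request POST 'http://localhost:8081/druid/coordinator/v1/config/compaction' \
--header 'Content-Type: application/json' \
--data-raw '{
    "dataSource": "YOUR_DATASOURCE",
    "taskContext": {
        "taskLockType": "REPLACE"
    }
}'
```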
##### Update the compaction settings with the UI
In the **Compaction config** for a datasource, set **Allow concurrent compactions (experimental)** to **True**.
#### Add a task lock type to your ingestion job
@@ -262,6 +263,13 @@ Next, you need to configure the task lock type for your ingestion job:
You can provide the context parameter through the API like any other parameter for the ingestion job, or through the UI.
##### Add a task lock using the Druid console
As part of the **Load data** wizard for classic batch (JSON-based ingestion) and streaming ingestion, you can configure the task lock type for the ingestion during the **Publish** step:
- If you set **Append to existing** to **True**, you can then set **Allow concurrent append tasks (experimental)** to **True**.
- If you set **Append to existing** to **False**, you can then set **Allow concurrent replace tasks (experimental)** to **True**.
##### Add the task lock type through the API
Add the following JSON snippet to your supervisor or ingestion spec if you're using the API:
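The JSON snippet referenced here falls outside the hunks shown in this diff. As a minimal sketch of its shape, assuming the lock type is passed through the spec's task `context` (the value `APPEND` is only an illustrative choice):

```
"context": {
    "taskLockType": "APPEND"
}
```

Use `APPEND` for concurrent append jobs and `REPLACE` for concurrent replace jobs, matching the console options described above.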
@@ -295,14 +303,6 @@ Set `taskLockType` to `REPLACE` if you're replacing data. For example, if you u
- dynamic partitioning with append to existing set to `false`
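As an illustration of that pairing (a sketch only; the spec is heavily abridged, and the `index_parallel` and `appendToExisting` fields are assumptions about a native batch job rather than text from this diff):

```
{
    "type": "index_parallel",
    "spec": {
        "ioConfig": {
            "type": "index_parallel",
            "appendToExisting": false
        }
    },
    "context": {
        "taskLockType": "REPLACE"
    }
}
```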
##### Add a task lock using the Druid console
As part of the **Load data** wizard for classic batch (JSON-based ingestion) and streaming ingestion, you can configure the task lock type for the ingestion during the **Publish** step:
- If you set **Append to existing** to **True**, you can then set **Allow concurrent append tasks (experimental)** to **True**.
- If you set **Append to existing** to **False**, you can then set **Allow concurrent replace tasks (experimental)** to **True**.
## Learn more
See the following topics for more information: