Reformat Deep Storage options to use table.
parent baeef08c4c
commit 5b76a52c99
@@ -12,10 +12,10 @@ A local mount can be used for storage of segments as well. This allows you to u
In order to use a local mount for deep storage, you need to set the following configuration in your common configs.
```
druid.storage.type=local
druid.storage.storageDirectory=<directory for storing segments>
```
|Property|Possible Values|Description|Default|
|--------|---------------|-----------|-------|
|`druid.storage.type`|local||Must be set.|
|`druid.storage.storageDirectory`||Directory for storing segments.|Must be set.|
Note that you should generally set `druid.storage.storageDirectory` to something different from `druid.segmentCache.locations` and `druid.segmentCache.infoDir`.
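For illustration only, a filled-in version of the above might look like the sketch below; the mount path is a hypothetical example and should point at storage that is distinct from the segment cache locations mentioned above.

```
# Hypothetical local deep storage configuration (example path only).
druid.storage.type=local
# A directory separate from druid.segmentCache.locations and druid.segmentCache.infoDir.
druid.storage.storageDirectory=/mnt/druid/deepstorage
```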
@@ -28,21 +28,22 @@ S3-compatible deep storage is basically either S3 or something like Google Stora
The S3 configuration parameters are:
```
druid.s3.accessKey=<S3 access key>
druid.s3.secretKey=<S3 secret_key>
druid.storage.bucket=<bucket to store in>
druid.storage.baseKey=<base key prefix to use, i.e. what directory>
```
|Property|Possible Values|Description|Default|
|--------|---------------|-----------|-------|
|`druid.s3.accessKey`||S3 access key.|Must be set.|
|`druid.s3.secretKey`||S3 secret key.|Must be set.|
|`druid.storage.bucket`||Bucket to store in.|Must be set.|
|`druid.storage.baseKey`||Base key prefix to use, i.e. what directory.|Must be set.|
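As a concrete illustration, the same properties might be filled in as in the sketch below; the bucket name, key prefix, and credentials are placeholder values, not real ones.

```
# Hypothetical S3 deep storage configuration (placeholder credentials and bucket).
druid.s3.accessKey=AKIAEXAMPLEACCESSKEY
druid.s3.secretKey=exampleSecretKeyDoNotUse
druid.storage.bucket=my-druid-deepstorage
druid.storage.baseKey=druid/segments
```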
### HDFS
In order to use HDFS for deep storage, you need to set the following configuration in your common configs.
```
druid.storage.type=hdfs
druid.storage.storageDirectory=<directory for storing segments>
```
|Property|Possible Values|Description|Default|
|--------|---------------|-----------|-------|
|`druid.storage.type`|hdfs||Must be set.|
|`druid.storage.storageDirectory`||Directory for storing segments.|Must be set.|
If you are using the Hadoop indexer, set your output directory to a location on HDFS and it will work.
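A filled-in example is sketched below; the NameNode host and path are hypothetical, and whether you use a fully qualified `hdfs://` URI or a path relative to the default filesystem depends on your Hadoop configuration.

```
# Hypothetical HDFS deep storage configuration (example NameNode and path).
druid.storage.type=hdfs
druid.storage.storageDirectory=hdfs://namenode.example.com:9000/druid/segments
```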
@@ -58,13 +59,13 @@ Please note that this is a community contributed module and does not support Cas
[Microsoft Azure Storage](http://azure.microsoft.com/en-us/services/storage/) is another option for deep storage. This requires some additional Druid configuration.
```
druid.storage.type=azure
druid.azure.account=<azure storage account>
druid.azure.key=<azure storage account key>
druid.azure.container=<azure storage container>
druid.azure.protocol=<optional; valid options: https or http; default: https>
druid.azure.maxTries=<optional; number of tries before giving up on an Azure operation; default: 3; min: 1>
```
|Property|Possible Values|Description|Default|
|--------|---------------|-----------|-------|
|`druid.storage.type`|azure||Must be set.|
|`druid.azure.account`||Azure Storage account name.|Must be set.|
|`druid.azure.key`||Azure Storage account key.|Must be set.|
|`druid.azure.container`||Azure Storage container name.|Must be set.|
|`druid.azure.protocol`|http or https||https|
|`druid.azure.maxTries`||Number of tries before giving up on an Azure operation.|3|
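A filled-in example is sketched below; the account name, key, and container are placeholders, and the optional properties are shown with their default values.

```
# Hypothetical Azure deep storage configuration (placeholder account, key, and container).
druid.storage.type=azure
druid.azure.account=mydruidstorageaccount
druid.azure.key=exampleBase64AccountKey==
druid.azure.container=druid-segments
# Optional settings shown with their defaults.
druid.azure.protocol=https
druid.azure.maxTries=3
```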
Please note that this is a community contributed module. See [Azure Services](http://azure.microsoft.com/en-us/pricing/free-trial/) for more information.