Fix batch ingestion doc

Bingkun 2015-08-26 15:16:21 -05:00
parent 4befd6bfef
commit ae1f104c10
1 changed file with 2 additions and 0 deletions


@@ -176,7 +176,9 @@ It is a type of inputSpec that reads data already stored inside druid. It is use
|maxSplitSize|Number|Enables combining multiple segments into a single Hadoop InputSplit according to segment size. Default is none.|no|
Here is what goes inside "ingestionSpec" (an example sketch follows the table):
+|Field|Type|Description|Required|
+|-----|----|-----------|--------|
|dataSource|String|Druid dataSource name from which you are loading the data.|yes|
|interval|String|A String representing an ISO-8601 interval.|yes|
|granularity|String|Defines the granularity of the query used to load data. Default value is "none". See [Granularities](../querying/granularities.html).|no|
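For reference, here is a minimal sketch of a "dataSource" inputSpec that combines the fields from the two tables above; the dataSource name, interval, granularity, and maxSplitSize values are illustrative assumptions, not values from this doc:

```json
{
  "type": "dataSource",
  "ingestionSpec": {
    "dataSource": "wikipedia",
    "interval": "2014-10-20T00:00:00Z/P2W",
    "granularity": "day"
  },
  "maxSplitSize": 500000000
}
```

Only "dataSource" and "interval" inside "ingestionSpec" are required; "granularity" and the outer "maxSplitSize" can be omitted to accept their defaults.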