Fix various broken links in the docs. (#2833)

Gian Merlino 2016-04-13 13:30:01 -07:00 committed by Fangjin Yang
parent 725ee1401d
commit e320d13385
7 changed files with 8 additions and 8 deletions

@@ -65,7 +65,7 @@ druid.emitter.graphite.eventConverter={"type":"all", "namespacePrefix": "druid.t
 The second implementation, called `whiteList`, will send only the white-listed metrics and dimensions.
 As with the `all` converter, the user has control of `<namespacePrefix>.[<druid service name>].[<druid hostname>].`
-White-list based converter comes with the following default white list map located under resources [defaultWhiteListMap.json](./src/main/resources/defaultWhiteListMap.json)
+White-list based converter comes with the following default white list map located under resources in `./src/main/resources/defaultWhiteListMap.json`
 The user can override the default white list map by supplying a property called `mapPath`.
 This property is a String containing the path to the file containing the **white list map JSON object**.
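
As a sketch of how this might look in practice (the converter type and `mapPath` property come from the text above; the prefix and file path shown here are hypothetical), a `whiteList` converter with a custom map could be configured as:

```
druid.emitter.graphite.eventConverter={"type":"whiteList", "namespacePrefix": "druid.test", "mapPath": "/opt/druid/conf/whiteListMap.json"}
```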

@@ -5,7 +5,7 @@ layout: doc_page
 # Kafka Namespaced Lookup
 <div class="note caution">
-Lookups are an <a href="../development/experimental.html">experimental</a> feature.
+Lookups are an <a href="../experimental.html">experimental</a> feature.
 </div>
 Make sure to [include](../../operations/including-extensions.html) `druid-namespace-lookup` and `druid-kafka-extraction-namespace` as an extension.
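
For reference, a hedged sketch of what including these two extensions typically looks like in `common.runtime.properties` (the `loadList` property is the mechanism the linked page describes; other settings are omitted):

```
druid.extensions.loadList=["druid-namespace-lookup", "druid-kafka-extraction-namespace"]
```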

@@ -51,5 +51,5 @@ Make sure to [include](../../operations/including-extensions.html) `mysql-metada
 Note: the metadata storage extension is not packaged within the main Druid tarball; it is
 packaged in a separate tarball that can be downloaded from [here](http://druid.io/downloads.html).
-You can also get it using [pull-deps](../pull-deps.html), or you can build
-it from source code; see [Build from Source](../development/build.html).
+You can also get it using [pull-deps](../../operations/pull-deps.html), or you can build
+it from source code; see [Build from Source](../build.html).

@@ -67,7 +67,7 @@ a `pollPeriod` at the end of which time they poll the remote resource of interes
 # Supported Lookups
-For additional lookups, please see our [extensions list](../development/extensions.html).
+For additional lookups, please see our [extensions list](../extensions.html).
 ## URI namespace update

@@ -62,7 +62,7 @@ Tools
 Community Helper Libraries
 --------------------------
-* [madvertise/druid-dumbo](https://github.com/madvertise/druid-dumbo) - Scripts to help generate batch configs for the ingestion of data into Druid
+* [liquidm/druid-dumbo](https://github.com/liquidm/druid-dumbo) - Scripts to help generate batch configs for the ingestion of data into Druid
 * [housejester/druid-test-harness](https://github.com/housejester/druid-test-harness) - A set of scripts to simplify standing up some servers and seeing how things work
 Community Extensions

@@ -194,7 +194,7 @@ Uses [HyperLogLog](http://algo.inria.fr/flajolet/Publications/FlFuGaMe07.pdf) to
 { "type" : "hyperUnique", "name" : <output_name>, "fieldName" : <metric_name> }
 ```
-For more approximate aggregators, please see [theta sketches](../development/datasketches-aggregators.html).
+For more approximate aggregators, please see [theta sketches](../development/extensions-core/datasketches-aggregators.html).
 ## Miscellaneous Aggregations

@@ -96,7 +96,7 @@ Let's send some data! We'll start with these three records:
 ```
 Druid streaming ingestion requires relatively current messages (relative to a slack time controlled by the
-[windowPeriod](ingestion-streams.html#segmentgranularity-and-windowperiod) value), so you should
+[windowPeriod](../ingestion/stream-push.html#segmentgranularity-and-windowperiod) value), so you should
 replace `2000-01-01T00:00:00Z` in these messages with the current time in ISO8601 format. You can
 get this by running:
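
The command the docs go on to show is cut off in this view; a minimal equivalent for printing the current UTC time in ISO8601 format (the docs' original command may differ) is:

```shell
# Print the current time in UTC, formatted as ISO8601, e.g. 2016-04-13T20:30:01Z
date -u +"%Y-%m-%dT%H:%M:%SZ"
```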