NIFI-2801 Edited Kafka processor documentation to explicitly state which Kafka versions are supported by each processor

This closes #1119.

Signed-off-by: Andy LoPresto <alopresto@apache.org>
Author: Andrew Lim, 2016-10-10 12:43:15 -04:00 (committed by Andy LoPresto)
parent 893fed794c
commit 979b4d8ab9
11 changed files with 28 additions and 29 deletions


@@ -371,8 +371,7 @@ categorizing them by their functions.
   the content fetched from HDFS.
 - *FetchS3Object*: Fetches the contents of an object from the Amazon Web Services (AWS) Simple Storage Service (S3). The outbound FlowFile contains the contents
   received from S3.
-- *GetKafka*: Consumes messages from Apache Kafka. The messages can be emitted as a FlowFile per message or can be batched together using a user-specified
-  delimiter.
+- *GetKafka*: Fetches messages from Apache Kafka, specifically for 0.8.x versions. The messages can be emitted as a FlowFile per message or can be batched together using a user-specified delimiter.
 - *GetMongo*: Executes a user-specified query against MongoDB and writes the contents to a new FlowFile.
 - *GetTwitter*: Allows Users to register a filter to listen to the Twitter "garden hose" or Enterprise endpoint, creating a FlowFile for each tweet
   that is received.
@@ -386,7 +385,7 @@ categorizing them by their functions.
 - *PutSQL*: Executes the contents of a FlowFile as a SQL DDL Statement (INSERT, UPDATE, or DELETE). The contents of the FlowFile must be a valid
   SQL statement. Attributes can be used as parameters so that the contents of the FlowFile can be parameterized SQL statements in order to avoid
   SQL injection attacks.
-- *PutKafka*: Sends the contents of a FlowFile to Kafka as a message. The FlowFile can be sent as a single message or a delimiter, such as a
+- *PutKafka*: Sends the contents of a FlowFile as a message to Apache Kafka, specifically for 0.8.x versions. The FlowFile can be sent as a single message or a delimiter, such as a
   new-line can be specified, in order to send many messages for a single FlowFile.
 - *PutMongo*: Sends the contents of a FlowFile to Mongo as an INSERT or an UPDATE.
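
All of the source changes that follow apply one pattern: each processor's @CapabilityDescription and @Tags annotations gain an explicit Kafka version (0.8.x, 0.9.x, or 0.10.x) and a pointer to the complementary processor. NiFi generates each processor's documentation page from these annotations, so this wording becomes user-visible. A minimal sketch of the pattern, using an illustrative class name and tag set rather than any actual NiFi source:

    import org.apache.nifi.annotation.documentation.CapabilityDescription;
    import org.apache.nifi.annotation.documentation.Tags;
    import org.apache.nifi.processor.AbstractProcessor;
    import org.apache.nifi.processor.ProcessContext;
    import org.apache.nifi.processor.ProcessSession;
    import org.apache.nifi.processor.exception.ProcessException;

    // Illustrative only: the framework reads these annotations to build the
    // processor's documentation, so stating "0.8.x" here surfaces in the docs.
    @Tags({"Kafka", "Apache", "Get", "Ingest", "0.8.x"})
    @CapabilityDescription("Fetches messages from Apache Kafka, specifically for 0.8.x versions. "
            + "The complementary NiFi processor for sending messages is PutKafka.")
    public class VersionTaggedKafkaProcessor extends AbstractProcessor {
        @Override
        public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
            // message handling omitted
        }
    }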


@@ -52,11 +52,11 @@ import org.apache.nifi.processor.util.StandardValidators;
 import static org.apache.nifi.processors.kafka.pubsub.KafkaProcessorUtils.HEX_ENCODING;
 import static org.apache.nifi.processors.kafka.pubsub.KafkaProcessorUtils.UTF8_ENCODING;
-@CapabilityDescription("Consumes messages from Apache Kafka specifically built against the Kafka 0.10 Consumer API. "
+@CapabilityDescription("Consumes messages from Apache Kafka specifically built against the Kafka 0.10.x Consumer API. "
         + " Please note there are cases where the publisher can get into an indefinite stuck state. We are closely monitoring"
-        + " how this evolves in the Kafka community and will take advantage of those fixes as soon as we can. In the mean time"
-        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on.")
-@Tags({"Kafka", "Get", "Ingest", "Ingress", "Topic", "PubSub", "Consume", "0.10"})
+        + " how this evolves in the Kafka community and will take advantage of those fixes as soon as we can. In the meantime"
+        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on. The complementary NiFi processor for sending messages is PublishKafka_0_10.")
+@Tags({"Kafka", "Get", "Ingest", "Ingress", "Topic", "PubSub", "Consume", "0.10.x"})
 @WritesAttributes({
         @WritesAttribute(attribute = KafkaProcessorUtils.KAFKA_COUNT, description = "The number of messages written if more than one"),
         @WritesAttribute(attribute = KafkaProcessorUtils.KAFKA_KEY, description = "The key of message if present and if single message. "


@@ -56,13 +56,13 @@ import org.apache.nifi.processor.io.InputStreamCallback;
 import org.apache.nifi.processor.util.FlowFileFilters;
 import org.apache.nifi.processor.util.StandardValidators;
-@Tags({"Apache", "Kafka", "Put", "Send", "Message", "PubSub", "0.9.x"})
-@CapabilityDescription("Sends the contents of a FlowFile as a message to Apache Kafka using the Kafka 0.9 producer. "
+@Tags({"Apache", "Kafka", "Put", "Send", "Message", "PubSub", "0.10.x"})
+@CapabilityDescription("Sends the contents of a FlowFile as a message to Apache Kafka using the Kafka 0.10.x Producer API."
         + "The messages to send may be individual FlowFiles or may be delimited, using a "
         + "user-specified delimiter, such as a new-line. "
         + " Please note there are cases where the publisher can get into an indefinite stuck state. We are closely monitoring"
-        + " how this evolves in the Kafka community and will take advantage of those fixes as soon as we can. In the mean time"
-        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on.")
+        + " how this evolves in the Kafka community and will take advantage of those fixes as soon as we can. In the meantime"
+        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on. The complementary NiFi processor for fetching messages is ConsumeKafka_0_10.")
 @InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
 @DynamicProperty(name = "The name of a Kafka configuration property.", value = "The value of a given Kafka configuration property.",
         description = "These properties will be added on the Kafka configuration after loading any provided configuration properties."


@@ -24,8 +24,8 @@
 <!-- Processor Documentation ================================================== -->
 <h2>Description:</h2>
 <p>
-    This Processors polls <a href="http://kafka.apache.org/">Apache Kafka</a>
-    for data using KafkaConsumer API available with Kafka 0.10+. When a message is received
+    This Processor polls <a href="http://kafka.apache.org/">Apache Kafka</a>
+    for data using KafkaConsumer API available with Kafka 0.10.x. When a message is received
     from Kafka, this Processor emits a FlowFile where the content of the FlowFile is the value
     of the Kafka message.
 </p>
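
For reference, the 0.10.x KafkaConsumer API the Processor polls with follows a subscribe-and-poll pattern roughly like this sketch (broker, group, and topic names are placeholders, not NiFi code):

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class ConsumeSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // placeholder broker
            props.put("group.id", "example-group");             // placeholder group
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
                ConsumerRecords<String, String> records = consumer.poll(1000); // 0.10.x poll(long millis)
                for (ConsumerRecord<String, String> record : records) {
                    // In NiFi terms, the record value would become the FlowFile content.
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }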


@@ -24,9 +24,9 @@
 <!-- Processor Documentation ================================================== -->
 <h2>Description:</h2>
 <p>
-    This Processors puts the contents of a FlowFile to a Topic in
+    This Processor puts the contents of a FlowFile to a Topic in
     <a href="http://kafka.apache.org/">Apache Kafka</a> using KafkaProducer API available
-    with Kafka 0.10+ API. The content of a FlowFile becomes the contents of a Kafka message.
+    with Kafka 0.10.x API. The content of a FlowFile becomes the contents of a Kafka message.
     This message is optionally assigned a key by using the &lt;Kafka Key&gt; Property.
 </p>
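
The corresponding 0.10.x KafkaProducer call is sketched below; the record value stands in for the FlowFile content and the optional key mirrors the <Kafka Key> property described above (all names are placeholders, not the Processor's actual code):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class PublishSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // placeholder broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Value plays the role of the FlowFile content; the key is optional.
                producer.send(new ProducerRecord<>("my-topic", "optional-key", "flowfile content"));
            }
        }
    }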


@@ -69,8 +69,8 @@ import kafka.message.MessageAndMetadata;
 @SupportsBatching
 @InputRequirement(Requirement.INPUT_FORBIDDEN)
-@CapabilityDescription("Fetches messages from Apache Kafka")
-@Tags({"Kafka", "Apache", "Get", "Ingest", "Ingress", "Topic", "PubSub"})
+@CapabilityDescription("Fetches messages from Apache Kafka, specifically for 0.8.x versions. The complementary NiFi processor for sending messages is PutKafka.")
+@Tags({"Kafka", "Apache", "Get", "Ingest", "Ingress", "Topic", "PubSub", "0.8.x"})
 @WritesAttributes({
         @WritesAttribute(attribute = "kafka.topic", description = "The name of the Kafka Topic from which the message was received"),
         @WritesAttribute(attribute = "kafka.key", description = "The key of the Kafka message, if it exists and batch size is 1. If"


@@ -53,9 +53,9 @@ import org.apache.nifi.processor.util.StandardValidators;
 import org.apache.nifi.processors.kafka.KafkaPublisher.KafkaPublisherResult;
 @InputRequirement(Requirement.INPUT_REQUIRED)
-@Tags({ "Apache", "Kafka", "Put", "Send", "Message", "PubSub" })
-@CapabilityDescription("Sends the contents of a FlowFile as a message to Apache Kafka. The messages to send may be individual FlowFiles or may be delimited, using a "
-        + "user-specified delimiter, such as a new-line.")
+@Tags({ "Apache", "Kafka", "Put", "Send", "Message", "PubSub", "0.8.x"})
+@CapabilityDescription("Sends the contents of a FlowFile as a message to Apache Kafka, specifically for 0.8.x versions. The messages to send may be individual FlowFiles or may be delimited, using a "
+        + "user-specified delimiter, such as a new-line. The complementary NiFi processor for fetching messages is GetKafka.")
 @DynamicProperty(name = "The name of a Kafka configuration property.", value = "The value of a given Kafka configuration property.",
         description = "These properties will be added on the Kafka configuration after loading any provided configuration properties."
         + " In the event a dynamic property represents a property that was already set as part of the static properties, its value wil be"


@@ -52,11 +52,11 @@ import org.apache.nifi.processor.util.StandardValidators;
 import static org.apache.nifi.processors.kafka.pubsub.KafkaProcessorUtils.HEX_ENCODING;
 import static org.apache.nifi.processors.kafka.pubsub.KafkaProcessorUtils.UTF8_ENCODING;
-@CapabilityDescription("Consumes messages from Apache Kafka specifically built against the Kafka 0.9 Consumer API. "
+@CapabilityDescription("Consumes messages from Apache Kafka specifically built against the Kafka 0.9.x Consumer API. "
         + " Please note there are cases where the publisher can get into an indefinite stuck state. We are closely monitoring"
         + " how this evolves in the Kafka community and will take advantage of those fixes as soon as we can. In the mean time"
-        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on.")
-@Tags({"Kafka", "Get", "Ingest", "Ingress", "Topic", "PubSub", "Consume", "0.9"})
+        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on. The complementary NiFi processor for sending messages is PublishKafka.")
+@Tags({"Kafka", "Get", "Ingest", "Ingress", "Topic", "PubSub", "Consume", "0.9.x"})
 @WritesAttributes({
         @WritesAttribute(attribute = KafkaProcessorUtils.KAFKA_COUNT, description = "The number of messages written if more than one"),
         @WritesAttribute(attribute = KafkaProcessorUtils.KAFKA_KEY, description = "The key of message if present and if single message. "


@@ -57,12 +57,12 @@ import org.apache.nifi.processor.util.FlowFileFilters;
 import org.apache.nifi.processor.util.StandardValidators;
 @Tags({"Apache", "Kafka", "Put", "Send", "Message", "PubSub", "0.9.x"})
-@CapabilityDescription("Sends the contents of a FlowFile as a message to Apache Kafka using the Kafka 0.9 producer. "
+@CapabilityDescription("Sends the contents of a FlowFile as a message to Apache Kafka using the Kafka 0.9.x Producer. "
         + "The messages to send may be individual FlowFiles or may be delimited, using a "
         + "user-specified delimiter, such as a new-line. "
         + " Please note there are cases where the publisher can get into an indefinite stuck state. We are closely monitoring"
         + " how this evolves in the Kafka community and will take advantage of those fixes as soon as we can. In the mean time"
-        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on.")
+        + " it is possible to enter states where the only resolution will be to restart the JVM NiFi runs on. The complementary NiFi processor for fetching messages is ConsumeKafka.")
 @InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
 @DynamicProperty(name = "The name of a Kafka configuration property.", value = "The value of a given Kafka configuration property.",
         description = "These properties will be added on the Kafka configuration after loading any provided configuration properties."


@@ -24,8 +24,8 @@
 <!-- Processor Documentation ================================================== -->
 <h2>Description:</h2>
 <p>
-    This Processors polls <a href="http://kafka.apache.org/">Apache Kafka</a>
-    for data using KafkaConsumer API available with Kafka 0.9+. When a message is received
+    This Processor polls <a href="http://kafka.apache.org/">Apache Kafka</a>
+    for data using KafkaConsumer API available with Kafka 0.9.x. When a message is received
     from Kafka, this Processor emits a FlowFile where the content of the FlowFile is the value
     of the Kafka message.
 </p>


@@ -24,9 +24,9 @@
 <!-- Processor Documentation ================================================== -->
 <h2>Description:</h2>
 <p>
-    This Processors puts the contents of a FlowFile to a Topic in
+    This Processor puts the contents of a FlowFile to a Topic in
     <a href="http://kafka.apache.org/">Apache Kafka</a> using KafkaProducer API available
-    with Kafka 0.9+ API. The content of a FlowFile becomes the contents of a Kafka message.
+    with Kafka 0.9.x API. The content of a FlowFile becomes the contents of a Kafka message.
     This message is optionally assigned a key by using the &lt;Kafka Key&gt; Property.
 </p>