KafkaRecordSink_2_0

Deprecation notice:

Please be aware this controller service is deprecated and may be removed in the near future.

Please consider using one of the following alternatives: KafkaRecordSink_2_6

Description:

Provides a service to write records to a Kafka 2.x topic.

Tags:

kafka, record, sink

Properties:

In the list below, the names of required properties appear in bold. Any other properties (not in bold) are considered optional. The list also indicates each property's API name, any default value, its allowable values where applicable, and whether the property supports the NiFi Expression Language.

Kafka Brokers
  API Name: bootstrap.servers
  Default Value: localhost:9092
  Description: Comma-separated list of Kafka Brokers in the format host:port
  Supports Expression Language: true (will be evaluated using variable registry only)

Topic Name
  API Name: topic
  Description: The name of the Kafka Topic to publish to.
  Supports Expression Language: true (will be evaluated using variable registry only)

Record Writer
  API Name: record-sink-record-writer
  Controller Service API: RecordSetWriterFactory
  Implementations: FreeFormTextRecordSetWriter, CSVRecordSetWriter, ParquetRecordSetWriter, RecordSetWriterLookup, ScriptedRecordSetWriter, XMLRecordSetWriter, JsonRecordSetWriter, AvroRecordSetWriter
  Description: Specifies the Controller Service to use for writing out the records.

Delivery Guarantee
  API Name: acks
  Default Value: Best Effort
  Allowable Values:
    • Best Effort: Records are considered 'transmitted successfully' after successfully writing the content to a Kafka node, without waiting for a response. This provides the best performance but may result in data loss.
    • Guarantee Single Node Delivery: Records are considered 'transmitted successfully' if the message is received by a single Kafka node, whether or not it is replicated. This is faster than <Guarantee Replicated Delivery> but can result in data loss if a Kafka node crashes.
    • Guarantee Replicated Delivery: Records are considered 'transmitted unsuccessfully' unless the message is replicated to the appropriate number of Kafka Nodes according to the Topic configuration.
  Description: Specifies the requirement for guaranteeing that a message is sent to Kafka. Corresponds to Kafka's 'acks' property.

Message Header Encoding
  API Name: message-header-encoding
  Default Value: UTF-8
  Description: For any attribute that is added as a message header, as configured via the <Attributes to Send as Headers> property, this property indicates the Character Encoding to use for serializing the headers.

Security Protocol
  API Name: security.protocol
  Default Value: PLAINTEXT
  Allowable Values:
    • PLAINTEXT
    • SSL
    • SASL_PLAINTEXT
    • SASL_SSL
  Description: Security protocol used to communicate with brokers. Corresponds to the Kafka Client security.protocol property.

SASL Mechanism
  API Name: sasl.mechanism
  Default Value: GSSAPI
  Allowable Values:
    • GSSAPI: Generic Security Services API for Kerberos authentication
    • PLAIN: Plain username and password authentication
    • SCRAM-SHA-256: Salted Challenge Response Authentication Mechanism using SHA-256 with username and password
    • SCRAM-SHA-512: Salted Challenge Response Authentication Mechanism using SHA-512 with username and password
  Description: SASL mechanism used for authentication. Corresponds to the Kafka Client sasl.mechanism property.

Kerberos Credentials Service
  API Name: kerberos-credentials-service
  Controller Service API: KerberosCredentialsService
  Implementation: KeytabCredentialsService
  Description: Service supporting generalized credentials authentication with Kerberos.

Kerberos Service Name
  API Name: sasl.kerberos.service.name
  Description: The service name that matches the primary name of the Kafka server configured in the broker JAAS configuration.
  Supports Expression Language: true (will be evaluated using variable registry only)

SSL Context Service
  API Name: ssl.context.service
  Controller Service API: SSLContextService
  Implementations: StandardSSLContextService, StandardRestrictedSSLContextService
  Description: Service supporting SSL communication with Kafka brokers.

Max Request Size
  API Name: max.request.size
  Default Value: 1 MB
  Description: The maximum size of a request in bytes. Corresponds to Kafka's 'max.request.size' property and defaults to 1 MB (1048576 bytes).

Acknowledgment Wait Time
  API Name: ack.wait.time
  Default Value: 5 secs
  Description: After sending a message to Kafka, this indicates the amount of time that we are willing to wait for a response from Kafka. If Kafka does not acknowledge the message within this time period, the FlowFile will be routed to 'failure'.

Max Metadata Wait Time
  API Name: max.block.ms
  Default Value: 5 sec
  Description: The amount of time the publisher will wait to obtain metadata or wait for the buffer to flush during the 'send' call before failing the entire 'send' call. Corresponds to Kafka's 'max.block.ms' property.
  Supports Expression Language: true (will be evaluated using variable registry only)

Compression Type
  API Name: compression.type
  Default Value: none
  Allowable Values:
    • none
    • gzip
    • snappy
    • lz4
  Description: This parameter allows you to specify the compression codec for all data generated by this producer.
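
Most of the Kafka-facing properties above map directly onto standard Apache Kafka producer settings. The following sketch, written against the plain kafka-clients Java API rather than against NiFi itself, shows roughly how those settings translate; the broker address, topic name, and payload are placeholder values, and in an actual flow this controller service manages the producer and serializes records through the configured Record Writer.

import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;

public class KafkaRecordSinkConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Kafka Brokers -> bootstrap.servers
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Delivery Guarantee -> acks: "0" = Best Effort, "1" = Guarantee Single Node Delivery, "all" = Guarantee Replicated Delivery
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        // Max Request Size -> max.request.size (1 MB default)
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "1048576");
        // Max Metadata Wait Time -> max.block.ms (5 sec default)
        props.put(ProducerConfig.MAX_BLOCK_MS_CONFIG, "5000");
        // Compression Type -> compression.type
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");
        // The service serializes each record with the configured Record Writer before publishing,
        // so the producer itself only ever sees raw bytes.
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());

        try (KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(props)) {
            byte[] payload = "{\"example\": true}".getBytes(StandardCharsets.UTF_8); // placeholder record content
            producer.send(new ProducerRecord<>("my-topic", payload)); // Topic Name property
            producer.flush();
        }
    }
}

In NiFi these values are entered in the controller service's configuration dialog; the code is only meant to show which Kafka producer options each property feeds into.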

Dynamic Properties:

Supports Sensitive Dynamic Properties: No

Dynamic Properties allow the user to specify both the name and value of a property.

Name: The name of a Kafka configuration property.
Value: The value of a given Kafka configuration property.
Description: These properties will be added to the Kafka configuration after loading any provided configuration properties. In the event a dynamic property represents a property that was already set, its value will be ignored and a WARN message logged. For the list of available Kafka properties please refer to: http://kafka.apache.org/documentation.html#configuration.
Supports Expression Language: true (will be evaluated using variable registry only)
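
For example, a user-defined dynamic property such as the following (a hypothetical illustration, not a property defined by this service):

  Name: linger.ms
  Value: 100

would be passed through to the underlying Kafka producer as its standard linger.ms batching-delay setting, because no explicit property of this service already sets it. If a dynamic property duplicated one of the explicit properties above (for instance acks), its value would be ignored and a WARN message logged, as described above.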

State management:

This component does not store state.

Restricted:

This component is not restricted.

System Resource Considerations:

None specified.