Kafka Producer
Supported pipeline types:
When you configure a Kafka Producer, you define the connection information, the partition strategy, and the data format to use. You can also configure the Kafka Producer to determine the topic to write to at runtime.
The Kafka Producer passes data to partitions in the Kafka topic based on the partition strategy that you choose. You can optionally write a batch of records to the Kafka cluster as a single message. When you want the destination to send responses to a microservice origin within a microservice pipeline, you specify the type of response to send.
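For context, the following sketch uses the plain Apache Kafka Java client to show what these concepts mean at the Kafka level: the topic is resolved per record at runtime, and the message key drives the default partitioner. The broker address, topic name, and key values are placeholders; the Kafka Producer destination handles the equivalent behavior through configuration rather than code.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class RuntimeTopicSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String payload = "{\"type\":\"orders\",\"id\":\"42\"}";
            // Resolve the topic at runtime, for example from a field in the record.
            String topic = "orders";
            // With no explicit partition, Kafka's default partitioner hashes the
            // message key to choose the partition the record is written to.
            producer.send(new ProducerRecord<>(topic, "42", payload));
        }
    }
}
```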
You can add additional Kafka configuration properties as needed. You can configure the destination to use Kafka security features. You can also configure the destination to pass message key values stored in the record to Kafka as Kafka message keys.
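As a rough illustration of what additional configuration properties, security settings, and message keys look like at the Kafka client level, the sketch below passes extra producer properties and a per-record key. The hostname, truststore path, and property values are placeholders, and a real SASL setup also requires a JAAS configuration; the destination exposes these as configuration options rather than code.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SecureKeyedProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9093"); // placeholder host
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Additional Kafka configuration properties, including security settings.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("ssl.truststore.location", "/etc/kafka/truststore.jks"); // placeholder path

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The second argument is the Kafka message key; a destination that passes
            // key values stored in the record would populate it the same way.
            producer.send(new ProducerRecord<>("customers", "customer-1001", "{\"name\":\"Ada\"}"));
        }
    }
}
```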
You can configure the Kafka Producer to work with the Confluent Schema Registry. The Confluent Schema Registry is a distributed storage layer for Avro schemas that uses Kafka as its underlying storage mechanism.
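The sketch below shows, with the plain Kafka Java client and Confluent's Avro serializer, how a producer typically writes Avro data against Schema Registry. The registry URL, schema, and topic are placeholders; the destination provides equivalent options through its data format configuration.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SchemaRegistrySketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers and looks up schemas in Schema Registry.
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder URL

        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");
        GenericRecord event = new GenericData.Record(schema);
        event.put("id", "42");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", event));
        }
    }
}
```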
You can also use a connection to configure the destination.