Kafka Consumer (deprecated)

The Kafka Consumer origin reads data from a single topic in an Apache Kafka cluster. For information about supported versions, see Supported Systems and Versions in the Data Collector documentation.
Important: This stage is deprecated and may be removed in a future release. To read from Kafka, use the Kafka Multitopic Consumer, which can read from multiple topics using multiple threads.

When you configure a Kafka Consumer, you configure the consumer group name, topic, and ZooKeeper connection information.
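As a rough sketch, these basic properties might look like the following. The host names, group name, and topic are placeholders for illustration, not defaults, and the exact property labels can vary by Data Collector version:

```
# Consumer group that this origin joins
Consumer Group : myConsumerGroup
# Topic to read from
Topic          : myTopic
# ZooKeeper quorum used to locate the Kafka brokers
ZooKeeper URI  : zk-host1:2181,zk-host2:2181
```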

You can configure the Kafka Consumer to work with the Confluent Schema Registry. The Confluent Schema Registry is a distributed storage layer for Avro schemas that uses Kafka as its underlying storage mechanism.
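When reading Avro data with the Schema Registry, the origin looks up the schema instead of requiring it inline. A minimal sketch, assuming a registry running on its default port (the URL and subject name here are examples, not defaults):

```
# Avro data format configuration using Confluent Schema Registry
Avro Schema Location : Confluent Schema Registry
Schema Registry URL  : http://schema-registry:8081
```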

You can add Kafka configuration properties as needed and configure the origin to use Kafka security features. You can also configure the origin to capture Kafka message keys and store them in the record.
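Additional properties are passed through to the underlying Kafka consumer. A hedged sketch of entries you might add in the stage's Kafka configuration section; these are standard Kafka consumer properties, but which ones apply depends on your Kafka version and security setup:

```
# Start from the earliest available offset when no committed offset exists
auto.offset.reset          : earliest
# Example security settings for a SASL/SSL-secured cluster
security.protocol          : SASL_SSL
sasl.kerberos.service.name : kafka
```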

Kafka Consumer includes record header attributes that enable you to use information about the record in pipeline processing.
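For example, later stages can read these attributes with the record expression language. The attribute names below (`topic`, `partition`, `offset`) are assumptions for illustration; check the stage documentation for the attributes your version generates:

```
# Hypothetical Expression Evaluator entries copying Kafka metadata
# from record header attributes into record fields
/kafkaTopic     = ${record:attribute('topic')}
/kafkaPartition = ${record:attribute('partition')}
/kafkaOffset    = ${record:attribute('offset')}
```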

You can also use a connection to configure the origin.