About

A converter is a Kafka Connect concept.

It is the code used to serialize and deserialize data as it moves between a Connector and Kafka.

A converter implements both directions of the interface: writing (serializing) records to Kafka and reading (deserializing) them back.

Converters are decoupled from connectors themselves to allow for reuse.

For example, using the same Avro converter:

  • the JDBC Source Connector can write Avro data to Kafka
  • and the HDFS Sink Connector can read Avro data from Kafka.
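Concretely, that reuse can look like one shared converter block in the worker configuration. A minimal sketch, assuming the Confluent AvroConverter and a placeholder Schema Registry URL:

```
# worker.properties (sketch) — the same Avro converter serves both
# the source worker writing to Kafka and the sink worker reading from it
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```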

Converter Basics

Management

List

Converters are specified via the converter properties in the worker configuration file.

Type                                                 Encodes / decodes
key.converter                                        record keys
value.converter                                      record values
internal.key.converter / internal.value.converter    internal storage topics
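As a sketch, the corresponding lines in a worker configuration file might read as follows (the JsonConverter is chosen arbitrarily for illustration):

```
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# converters for Connect's internal offset/config/status topics
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
```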

Configuration

Each converter implementation has its own configuration requirements.

To pass configuration parameters to key and value converters, prefix them with key.converter. or value.converter.

Example:

  • An AvroConverter bundled with the Schema Registry:

    key.converter=io.confluent.connect.avro.AvroConverter
    # required when bundled with the Schema Registry
    key.converter.schema.registry.url=http://localhost:8081

  • A JsonConverter without schemas:

    key.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=false

Built-in

Initially, converters are provided for:

  • Avro (default / recommended)
  • JSON
  • String
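For instance, switching the value side to the built-in String converter is a one-line change (sketch):

```
value.converter=org.apache.kafka.connect.storage.StringConverter
```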

Documentation / Reference