About
You can read and write Kafka data via the REST Proxy.
The REST Proxy depends on the Schema Registry when producing or consuming Avro data, so you need to pass in the details of the separately running Schema Registry host.
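- Example (sketch of the write path, assuming the proxy is already running on localhost:8082 as shown under Startup and a topic named bar): produce one Avro record; the value schema embedded in the request is registered through the Schema Registry.
curl -X POST -H "Content-Type: application/vnd.kafka.avro.v1+json" \
--data '{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"f1\", \"type\": \"string\"}]}", "records": [{"value": {"f1": "value1"}}]}' \
http://localhost:8082/topics/bar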
Articles Related
Management
Startup
- Docker
docker run -d \
--net=host \
--name=kafka-rest \
-e KAFKA_REST_ZOOKEEPER_CONNECT=localhost:32181 \
-e KAFKA_REST_LISTENERS=http://localhost:8082 \
-e KAFKA_REST_SCHEMA_REGISTRY_URL=http://localhost:8081 \
-e KAFKA_REST_HOST_NAME=localhost \
confluentinc/cp-kafka-rest:4.0.0
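To check that the proxy started correctly, you can inspect the container logs (using the container name kafka-rest chosen above):
docker logs kafka-rest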
Port
8082 (the default, set above via KAFKA_REST_LISTENERS)
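A quick way to confirm the proxy is reachable on this port is to list the cluster topics through its metadata API (a sketch, assuming the localhost listener configured above):
curl http://localhost:8082/topics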
Consume
Create a consumer instance
curl -X POST -H "Content-Type: application/vnd.kafka.v1+json" \
--data '{"name": "my_consumer_instance", "format": "avro", "auto.offset.reset": "smallest"}' \
http://localhost:8082/consumers/my_avro_consumer
{"instance_id":"my_consumer_instance","base_uri":"http://localhost:8082/consumers/my_avro_consumer/instances/my_consumer_instance"}
Retrieve data from a topic
The messages are decoded from Avro, translated to JSON, and included in the response. The schema used for deserialization is retrieved automatically from the Schema Registry service configured at startup (KAFKA_REST_SCHEMA_REGISTRY_URL).
- Example: Retrieve data from the bar topic
curl -X GET -H "Accept: application/vnd.kafka.avro.v1+json" \
http://localhost:8082/consumers/my_avro_consumer/instances/my_consumer_instance/topics/bar
[{"key":null,"value":{"f1":"value1"},"partition":0,"offset":0},{"key":null,"value":{"f1":"value2"},"partition":0,"offset":1},{"key":null,"value":{"f1":"value3"},"partition":0,"offset":2}]