Connect Kafka to Redis

Simplify and speed up your Kafka to Redis sink with a Kafka-compatible connector, managed via the Lenses UI/CLI, a native plugin, or Helm charts for Kubernetes deployments


About the Kafka to Redis connector

License: Apache 2.0

The Redis connector allows you to write data from Kafka to Redis. The connector takes the value from each Kafka Connect SinkRecord and inserts a new entry into Redis. This is an open-source project and is therefore not covered by Lenses support SLAs.
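As a rough illustration, a sink instance is configured with the Redis connection details and a KCQL statement that maps a topic into Redis. The sketch below shows the shape of such a configuration as a Python dict; the property names and connector class follow the stream-reactor Redis sink documentation, and the topic name, key field, and key prefix are made up for illustration, so verify everything against the connector version you install:

```python
# A minimal sketch of a Redis sink configuration, expressed as the payload
# you would submit to Kafka Connect (or enter in the Lenses UI).
# Property names follow the stream-reactor Redis sink docs; verify them
# against your connector version.
redis_sink_config = {
    "name": "redis-sink",  # connector instance name (your choice)
    "config": {
        # Connector class shipped with the stream-reactor Redis plugin;
        # newer releases may use a different package name, so check your jar.
        "connector.class": "com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkConnector",
        "tasks.max": "1",
        "topics": "orders",                 # Kafka topic(s) to sink (example value)
        "connect.redis.host": "redis",      # Redis host reachable from the Connect worker
        "connect.redis.port": "6379",
        # KCQL: write each record into Redis, keyed by the 'id' field,
        # with 'order-' used as the key prefix (all example values).
        "connect.redis.kcql": "INSERT INTO order- SELECT * FROM orders PK id",
    },
}
```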

Options to connect Kafka to Redis

Docker to test the connector

Test in our pre-configured Lenses & Kafka sandbox packed with connectors

Use Lenses with your Kafka

Manage the connector in Lenses against your Kafka and data.

Kafka to Redis GitHub connector

Download the connector directly from GitHub

Connector benefits

  • Flexible Deployment
  • Powerful Integration Syntax
  • Monitoring & Alerting
  • Integrate with your GitOps

Why use Lenses.io to write data from Kafka to Redis

Managing the Redis stream-reactor connector (and all the other connectors on your Kafka Connect cluster) through Lenses.io saves you from terminal commands and endless back-and-forth when sinking data from Kafka to Redis. From Lenses you can monitor, process, and deploy data with the following features:

  • Error handling
  • Fine-grained permissions
  • Data observability

How to push data from Kafka to Redis

Launch the stack:

  1. Copy the docker-compose file
  2. Bring up the stack
  3. Start the connector (a sketch of this step follows)
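The connector can be started from the Lenses UI, the Lenses CLI, or the Kafka Connect REST API. As a minimal sketch of the REST route, the snippet below registers a connector against a Connect worker assumed to be reachable at http://localhost:8083 (the Kafka Connect default; adjust the host, port, and configuration values to your stack):

```python
import json
import urllib.request

# Assumed Connect REST endpoint; 8083 is the Kafka Connect default.
# Adjust to whatever your docker-compose stack exposes.
CONNECT_URL = "http://localhost:8083/connectors"

# Configuration shaped like the sketch shown earlier on this page
# (topic, key field, and prefix are example values).
payload = {
    "name": "redis-sink",
    "config": {
        "connector.class": "com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "connect.redis.host": "redis",
        "connect.redis.port": "6379",
        "connect.redis.kcql": "INSERT INTO order- SELECT * FROM orders PK id",
    },
}

# POST /connectors is the standard Kafka Connect REST call for creating a connector.
request = urllib.request.Request(
    CONNECT_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```

Once created, the connector appears in the Connectors view in Lenses, where it can be monitored, paused, or restarted.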
