New: Kafka to Kafka replication pattern

Confident Kafka replication, complete autonomy

Kafka replication meant choosing between cloud providers and clunky operations. Until now.

Kafka replication needs to change

Distributing streaming data across environments and clouds has become an essential pattern in the modern data stack, supporting everything from disaster recovery to adopting a Data Mesh architecture.
To be successful, engineers need:

  • Self-service control, avoiding dependence on central teams
  • Straightforward operations and strong governance
  • Reliability, including exactly-once semantics
  • Management through GitOps

But choices are limited. Most tools solve only part of the problem, leaving teams to waste time stitching together different solutions. And as more companies adopt multiple clouds, they need a single tool to unify data sharing across domains.

Lenses K2K: universal, user-friendly Kafka replication

A new Kafka replicator that gives any user the power to easily and reliably share real-time data across their business.

Vendor agnostic

Avoid vendor lock-in and replicate data across any cluster and vendor, from Confluent Cloud and Redpanda to MSK and Azure Event Hubs.

Self-service

Full self-service capabilities empower teams to replicate data, all governed by multi-cluster Identity and Access Management and auditing.

Enterprise grade

Built by the elite engineering team behind the Lenses SQL engines.

See the Kafka replicator in action

Replicate a topic from one Kafka cluster to another with a click. Define your replication in a simple YAML file; Lenses does the rest.
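As a rough illustration, a YAML replication definition could look something like the sketch below. The field names (`source`, `target`, `topics`, `semantics`, and the cluster names) are assumptions for the sake of the example, not the actual Lenses K2K schema:

```yaml
# Hypothetical K2K replication definition.
# Field names and values are illustrative, not the real Lenses schema.
replication:
  name: orders-dr
  source:
    cluster: prod-msk          # e.g. an Amazon MSK cluster
  target:
    cluster: dr-confluent      # e.g. a Confluent Cloud cluster
  topics:
    include:
      - orders.*               # replicate all topics matching the pattern
  delivery:
    semantics: exactly-once    # no duplicates or gaps across clusters
```

Stored in Git, a file like this can be reviewed, versioned, and rolled out through the same GitOps workflow as the rest of your platform configuration.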

Get a private preview