Kafka replication used to mean choosing between cloud provider lock-in and clunky operations. Until now.
Distributing streaming data across environments and clouds has become an essential pattern of the modern data stack, supporting everything from disaster recovery to adopting a Data Mesh architecture.
To be successful, engineers need the right tools, but there isn't much choice. Most solutions solve only part of the problem, and teams waste time switching between them. And as more companies run on multiple clouds, they need one tool to unify data sharing across different domains.
Now there is a choice: a new Kafka replicator that gives any user the power to easily and reliably share real-time data across their business.

- Avoid vendor lock-in and replicate data across any cluster and vendor, from Confluent Cloud and Redpanda to MSK and Azure Event Hubs.
- Full self-service capabilities empower teams to replicate data, all governed by multi-cluster Identity and Access Management and auditing.
- Built by the elite engineering team behind the Lenses SQL engines.
Replicate a topic from one Kafka cluster to another with a click. Define your replication in a simple YAML file; Lenses does the rest.
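As a rough sketch of what such a definition might look like (the field names and structure below are hypothetical, not the actual Lenses configuration schema), a YAML replication could be as simple as:

```yaml
# Hypothetical sketch only: field names and structure are illustrative,
# not the real Lenses replication schema.
replication:
  name: orders-to-dr
  source:
    cluster: prod-msk              # e.g. an Amazon MSK cluster
    topic: orders
  target:
    cluster: dr-confluent-cloud    # e.g. a Confluent Cloud cluster
    topic: orders
  options:
    start-from: earliest           # replay from the earliest available offset
    preserve-partitioning: true    # keep the source partition assignment
```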