Lenses 6.1 - Kafka connectivity to your Copilot & self-service data replication

By

Stefan Bocutiu

Oct 27, 2025

Here at Lenses we're, as always, laser-focused on making building streaming apps and managing Kafka not just less stressful, but delightful.

6.1 is another big step forward.  

It starts with the Lenses MCP Server. It connects AI assistants such as Cursor and Claude to Lenses, combining the knowledge of the internet with the context of your Kafka environment. It has the power to transform the work of engineers building and managing streaming apps.

But AI cannot do anything without enterprise-grade tools and security as a foundation.  

That's why we're also wrapping our K2K data replicator in our governance framework, giving both humans and AI agents self-service workflows to replicate data across Kafka clusters.

And if you’re a human reading this, we’ve revamped SQL Studio to bring an IDE-like experience to working with streams. It’s designed to streamline how engineers explore & work across all their Kafka assets.  

Last but not least, with many of you running dozens, even hundreds, of Kafka clusters, we now offer centralised management of connections to your Kafka environments in Lenses HQ, with error handling, whilst still promoting an as-code experience to fit your CI/CD practices.

Kafka AI-Enablement via MCP

It's difficult to find an engineer who doesn't use Claude or ChatGPT on a daily basis. Some may even be vibe-coding in Windsurf or Cursor. When asked questions, these tools can't, of course, answer with the context of your streaming apps or environment. For example, the optimal number of app runners to configure for consuming from a topic depends on, among other things, the number of partitions and how data is distributed across them. This is easily something an AI assistant can help with, as long as it has the context of your Kafka environment.
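As a concrete illustration of the runner question, here is the kind of legwork an assistant with Kafka context can do for you, and which you would otherwise script by hand. This is a minimal sketch using the confluent-kafka Python client; the broker address and topic name are placeholders:

```python
from confluent_kafka.admin import AdminClient

# Placeholder broker address and topic name; purely illustrative.
admin = AdminClient({"bootstrap.servers": "localhost:9092"})
metadata = admin.list_topics(topic="orders", timeout=10)

# A consumer group can never have more active members than the topic has
# partitions: any extra runners sit idle with nothing assigned to them.
partition_count = len(metadata.topics["orders"].partitions)
print(f"'orders' has {partition_count} partitions, "
      f"so at most {partition_count} runners will do useful work")
```

With the Lenses MCP Server in place, the assistant can pull this context itself rather than you wiring up admin clients by hand.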

Kafka MCP Server - prompt to get context on runners

But it’s not just context. It’s also taking action. This is where the game changes - tasks that may have taken days or even weeks can be automated in minutes.

Create an entire Kafka pipeline with an AI prompt

Read more about the opportunities of Lenses MCP in our MCP release blog.

Self-Service Kafka Data Replication

If you've been following us, you'll know Lenses has been developing K2K: a vendor-neutral, enterprise-class data replicator for distributing data from one cluster to another. It offers an alternative to the complexity of MirrorMaker2 and to Confluent's Replicator and Cluster Linking.

Until now, this was available as a standalone offering only, self-deployed as a container on Kubernetes.

But our goal was not just to provide a robust replicator, but to offer a delightful experience for managing replication: to give developers self-service control over replicating their data, with security, governance and a premium developer experience.

For example, say an engineer wants to replicate some data from a production cluster to staging to test an application. Lenses security model permitting, they can do this themselves with just a few clicks, avoiding MirrorMaker2 complexity and Confluent costs.

Kafka Connections

This is driven through a new entity in Lenses: Kafka Connections.  

It allows Platform Engineers to set up connections for Kafka to Kafka replicators, providing control over deployment locations in Kubernetes. This up-front work by Admins allows end users (such as devs) to self-deploy their own K2K replicators. 

Kafka Connections abstract all credentials away from the user. They even mean Lenses never needs to store them, including within K2K configuration files.

For now, Kafka Connections target our Kafka to Kafka replicator, but future releases will open this up to all application types, including AI Agents.
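To give a flavour of what self-service replication could look like when driven programmatically rather than through the UI, here is a rough sketch of requesting a replication via HQ. The endpoint path, payload fields and connection names are hypothetical; they illustrate the pattern (Kafka Connections defined by a Platform Engineer and referenced by name, no credentials in the request) rather than the actual Lenses API:

```python
import requests

# Hypothetical illustration only: the endpoint path, payload fields and
# connection names below are assumptions, not the actual Lenses API.
HQ_URL = "https://lenses-hq.example.com"
TOKEN = "my-service-account-token"  # scoped by the Lenses security model

replication_request = {
    "name": "orders-prod-to-staging",
    "sourceConnection": "kafka-prod",     # Kafka Connection set up by a Platform Engineer
    "targetConnection": "kafka-staging",  # credentials are never exposed to the requester
    "topics": ["orders"],
}

resp = requests.post(
    f"{HQ_URL}/api/v1/k2k/replications",  # hypothetical path
    json=replication_request,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Replication requested:", resp.status_code)
```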

New IDE-style Studio

The SQL Studio has been the central interface for exploring data streams, thanks to the Lenses SQL Snapshot engine.

But as a developer, working with streams involves more than just exploring data. It requires managing and evolving Schemas, validating ACLs, Consumer Groups, Quotas, deploying stream processing jobs and more. 

And although this was possible in Lenses 6.0, it required jumping into different screens. Meanwhile, engineers love working in their IDEs. 

So we have revamped the SQL Studio - offering an IDE-style unified workbench to manage all your Kafka assets with a single experience.

For example, the new asset explorer allows you to browse Topics, Schemas and Consumers in a tree structure, with Quotas, ACLs and Apps coming in the next releases.

It's full of productivity hacks that users of VS Code will be familiar with. One of the favourites will be the new bookmarking option, where the assets you work with most (for example, specific topics) can be pinned.

New Environment Connection Experience

Lenses is designed for enterprise Kafka environments - those with dozens, perhaps hundreds, of clusters, potentially across different flavours & vendors.

This meant two things:

  • Connections to Kafka environments should be managed as-code in your git repo

  • A horizontally-scalable architecture of “Agents” deployed next to each Kafka environment, each of which holds the connections to Kafka (and Schema Registry, Kafka Connect, etc.).

Agents speak to HQ, not the other way round. That way they can be in a secure network without requiring HQ to penetrate into a higher impact-level network. 

This architecture also ensures there is no data movement, and it can scale to hundreds of clusters, as HQ requests are federated down to the Agents.

The challenge for engineers setting up Lenses is that there are multiple connections for each Kafka environment - to the brokers, to the schema registry, to Kafka Connect clusters, to auditing solutions, etc. This has all been controlled via a file called “provisioning”, but mistakes in configuring connections could easily be made.

To make this simpler, we reworked the Lenses Agent to boot with a minimal configuration: all it needs is a connection to Lenses HQ with an Agent key. An API now allows users to push the provisioning down to the Agent from HQ.

But more than this, we lifted the JSON Schema that defines the configuration of Lenses Agents into HQ itself, added validation rules, and combined this with error reporting and a streamlined wizard to provide an IDE-like experience for setting up and deploying the Lenses platform.

You can now quickly select your infrastructure type and have the configuration updated and managed for you, with validation, IntelliSense and autocompletion. The new provisioning APIs allow the Agent to reconcile the desired configuration state, applying the file contents and any updates, and adding or removing connections.

This not only allows managing the configuration via APIs, but also provides enhanced support in your IDE for adding Lenses to your CI/CD pipelines.
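As a rough idea of how this could slot into a CI/CD pipeline, the sketch below pushes a desired-state provisioning document to HQ, which validates it and hands it to the Agent to reconcile. The endpoint path, field names and connection shapes are assumptions for illustration, not the documented Lenses API:

```python
import requests

# Hypothetical sketch: the endpoint path and the shape of the provisioning
# document are assumptions for illustration, not the documented Lenses API.
HQ_URL = "https://lenses-hq.example.com"
TOKEN = "ci-pipeline-token"  # service account token used by the CI/CD pipeline

desired_state = {
    "connections": [
        {"name": "kafka", "type": "kafka", "bootstrapServers": ["broker-1:9092"]},
        {"name": "schema-registry", "type": "schema-registry", "url": "http://sr:8081"},
    ]
}

# HQ validates the document against the JSON Schema and reports any errors;
# the Agent then reconciles towards this desired state, adding or removing
# connections as needed.
resp = requests.put(
    f"{HQ_URL}/api/v1/environments/prod/provisioning",  # hypothetical path
    json=desired_state,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
```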

Where to get it

Lenses 6.1 will be released on the 6th of November.