
Lenses 6.2 - Trusting Agents to build & operate event-driven applications

By Andrew Stevenson, April 15, 2026
In this article:
  • 01. Modern engineering and operations of streaming applications built for the Agentic era.
  • 02. Enterprise-grade MCP security for Agents
  • 03. VS Code Extension Preview
  • 04. Improved global IDE Studio
  • 05. K2K Replicator
  • 06. App Migration Workflows
  • 07. Available now

Modern engineering and operations of streaming applications built for the Agentic era.

At Lenses, our goal has always been to help organizations get the most out of their streaming data. We started with visibility into Apache Kafka, then moved up to the part that drives value: the application layer, and now the Agentic layer.

Lenses 6 moved us into a multi-Kafka world: increasingly, our clients aren't running just one type of Kafka anymore, and as sovereign cloud becomes more topical (no pun intended), this trend is only accelerating.

We already provide the governance and workflows for software engineers, data engineers, and business users who need secure access to real-time streams to build and operate streaming applications.

But we also live in a new age: Agentic Engineering for streaming data, where Agents are as much users as anyone else. When it comes to security, however, Agents need to be treated with more care than humans, not less.

Lenses 6.2 is leading the way in the industry, enabling Agents to build event-driven and AI applications, giving them safe and governed access to your streaming context.

Enterprise-grade MCP security for Agents

Lenses MCP quickly demonstrated its potential for productivity, allowing engineers to use an AI assistant for tasks like analysing streams, querying schemas or consumer group lag while coding.

However, the initial setup was cumbersome. Engineers had to clone and run the MCP server locally, then manually create a long-lived, static Lenses API key with broad production access. This key was stored in a local config file.
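As an illustration, a typical local setup looked something like the following MCP client configuration (the path, server name and key shown here are hypothetical, not an actual Lenses configuration):

```json
{
  "mcpServers": {
    "lenses": {
      "command": "node",
      "args": ["/Users/alice/lenses-mcp/dist/index.js"],
      "env": {
        "LENSES_URL": "https://lenses.example.com",
        "LENSES_API_KEY": "a-long-lived-static-key-with-broad-access"
      }
    }
  }
}
```

Everything sensitive lives in plain text on the engineer's laptop, and the key never expires on its own.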

While tolerable for solo experimentation, security teams raised concerns about static, broadly permissive API keys stored in plain text, and the lack of a session-specific audit trail differentiating AI actions from engineer actions.

Trusting non-deterministic Agents and their sub-Agents, with access to the internet, to act responsibly on our behalf with critical infrastructure and streaming data is the synopsis of a 2026 horror movie.

That’s why Lenses 6.2 expands Lenses’ IAM model for Kafka with support for OAuth 2.1, as required by the MCP specification. This ensures Agents are only given scoped, short-lived, revocable and audited access to Lenses and Kafka.
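Conceptually, the shift is from a static key to short-lived bearer tokens that must be refreshed and can be revoked server-side. A minimal sketch of the client-side half of that pattern (this is an illustration of OAuth-style token handling, not Lenses' actual API):

```python
import time


class ShortLivedToken:
    """Caches an OAuth access token and refreshes it shortly before expiry."""

    def __init__(self, fetch_token, skew=30):
        # fetch_token() performs the token-endpoint request and returns a dict
        # like {"access_token": "...", "expires_in": 3600}.
        self._fetch = fetch_token
        self._skew = skew          # refresh this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def bearer(self):
        """Return a valid Authorization header value, refreshing if needed."""
        if self._token is None or time.time() >= self._expires_at - self._skew:
            resp = self._fetch()
            self._token = resp["access_token"]
            self._expires_at = time.time() + resp["expires_in"]
        return f"Bearer {self._token}"
```

Because the token expires on its own and is minted per session with specific scopes, revocation and per-session auditing become possible in a way a static API key never allowed.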

We’re packaging a remote MCP Server inside Lenses for you, which saves users from having to deploy MCP Servers locally on their machines.

And in the coming weeks we’ll be including support for the Client ID Metadata Documents (CIMD) and Dynamic Client Registration (DCR) protocols.

VS Code Extension Preview

Engineering is in the middle of a fundamental shift. Engineers are being asked to both take on a wider set of responsibilities and be 5x more productive.

This requires more context switching than ever before, especially between different tools: from an IDE and AI Copilot to domain-specific monitoring solutions, log viewers, and CI systems.

The first wave of AI-assisted IDEs offered autocomplete and code generation. This next wave, Agentic Engineering, is different: Agents are connected through MCP to the actual tools and infrastructure engineers work with, such as ticketing systems, code repositories, streaming infrastructure and, yes, Lenses.

So thanks to Agents, engineers can spend more time in the IDE. But there’s still a need to manually execute workflows or validate information, and this should be possible without leaving the IDE.

Thanks to the Lenses VS Code extension, we’re proud to offer the Lenses experience fully integrated into the IDE, from exploring topics with SQL and browsing schemas to deploying schemas and inspecting consumer lag, all whilst ensuring the same enterprise-grade security that direct Lenses users have benefited from.

You can download the VSIX file from: https://docs.lenses.io/latest/vs-code-extension
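Once downloaded, the VSIX can be installed with VS Code's standard command-line extension flag (the filename here is illustrative):

```shell
code --install-extension lenses-vscode.vsix
```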

Improved global IDE Studio

Not every Lenses user is in VS Code or IDEs built on top of it, like Cursor, but they still expect an as-code, modern, IDE-like experience for managing their Kafka resources and executing workflows.

Lenses 6.0 brought a new centralized place to view all your Kafka environments and introduced global SQL Studio to globally explore data streams.

In 6.2 we enable you to explore and manage all your Kafka resources (Connectors, Schemas, Consumer Groups, SQL Processors, …) in an IDE-like experience, without needing to drill down to a specific cluster. We have renamed SQL Studio to just Studio.

You’ll see huge productivity improvements and streamlined drilldowns and context menus. One of our favourite (sic) features is, well, Favourites, where you can star your most-used Kafka resources for easy access in its own dedicated section.

New Studio: this GIF showcases what the new Studio looks like and its improvements.

K2K Replicator

We have made huge improvements to the K2K Replicator, optimizing it for cross-cluster migrations and disaster recovery use cases. Key updates in the K2K standalone release include:

  • Seamless Offset Mapping:

    Continuously replicate and map consumer group offsets from source to target clusters. Your consumers can fail over or migrate and resume exactly where they left off without data loss or duplication.

  • Advanced Topic Routing:

    Move beyond basic 1-to-1 replication. You can now avoid naming collisions or consolidate data on the fly by adding prefixes/suffixes, routing multiple topics to a single static destination (many-to-1), or using Regex for complex pattern-based mapping.

  • Point-in-Time Starts:

    Save bandwidth by copying only the slice of data you actually need. You can now start replication anywhere between the earliest and latest offsets, driven by either fixed offsets or precise timestamps.

  • Streamlined Offset Management:

    Because cross-cluster replication requires dedicated consumer group management, we’ve completely simplified the user flow to surface critical details and reduce configuration friction.
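The routing rules above can be sketched as a simple mapping function (this illustrates the concepts only; it is not K2K's actual configuration syntax):

```python
import re


def route_topic(topic, prefix="", suffix="", static_dest=None, regex_rules=None):
    """Map a source topic name to a destination topic name.

    - prefix/suffix avoid naming collisions on the target cluster
    - static_dest consolidates many source topics into one (many-to-1)
    - regex_rules is a list of (pattern, replacement) pairs for
      pattern-based mapping, applied first-match-wins
    """
    if static_dest is not None:
        return static_dest
    for pattern, replacement in (regex_rules or []):
        if re.fullmatch(pattern, topic):
            return re.sub(pattern, replacement, topic)
    return f"{prefix}{topic}{suffix}"
```

For example, every `logs-*` topic on the source could be rewritten into a region-scoped name on the target, while everything else picks up a cluster prefix.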

Stay tuned for K2K version 2.0 in the coming weeks, with major developments supporting disaster recovery use cases.

App Migration Workflows

Kafka migrations seem to be a never-ending project - be it as part of a cloud migration, vendor migration, self-managed to managed migration or workload isolation.

We know that migrating apps to a new Kafka cluster is a notoriously complex challenge. Leveraging the improvements in K2K 1.3, we are introducing a workflow in Studio to streamline application migrations, reducing the time they take, their complexity and their risk. Most importantly, it provides a reliable, built-in rollback option, giving you the safety net you need without abstracting away the underlying mechanics of how your Kafka consumer groups operate.
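One of those underlying mechanics: offsets are cluster-specific, so failing over (or rolling back) a consumer group means translating committed offsets between clusters, typically by timestamp. A toy sketch of the idea, assuming a sorted per-partition index on each side (this is not the actual Lenses/K2K implementation):

```python
import bisect


def translate_offset(source_offset, source_index, target_index):
    """Translate a committed offset from a source partition to a target partition.

    source_index: sorted list of (offset, timestamp_ms) for the source partition
    target_index: sorted list of (timestamp_ms, offset) for the target partition

    Returns the earliest target offset whose timestamp is at or after the
    source record's timestamp, so the consumer resumes without missing
    records (at the cost of possible re-delivery).
    """
    # look up the timestamp of the committed source offset
    offsets = [o for o, _ in source_index]
    i = bisect.bisect_left(offsets, source_offset)
    ts = source_index[i][1]
    # find the first target record at or after that timestamp
    timestamps = [t for t, _ in target_index]
    j = bisect.bisect_left(timestamps, ts)
    return target_index[j][1]
```

Running this mapping in both directions is what makes a rollback a resume rather than a replay-from-zero.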

Check out this blog by Jonas and Patrick.

Available now

If you’re an existing customer, 6.2 is an easy upgrade from 6.* and available now. For community users, download the latest Lenses Community Edition for free now!
