Lenses MCP for Kafka
New Era in AI Enablement for Streaming App Development


Tun Shwe
If 2024 was the year enterprises adopted Generative AI, 2025 is the year Agentic AI became a reality.
In the last six months, the conversations I’m having with engineering leaders have quickly shifted from simple chatbots to AI-enabled IDEs, copilots and autonomous systems that take action.
This leap forward was made possible in large part by the Model Context Protocol (MCP). To date, the community has built more than 10,000 MCP integrations, so to call it a success is an understatement.
Released by Anthropic in November 2024, it’s an open standard that gives Large Language Models (LLMs) and the agents powered by them the ability to discover, select and use tools, bridging the gap between making a decision and taking an action.
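To make that loop concrete, here is a minimal sketch of the discover-select-use cycle using the official MCP Python SDK. The server command, tool name and arguments are placeholders for illustration, not a specific integration.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder: launch whichever MCP server the model should have access to.
server = StdioServerParameters(command="my-mcp-server", args=[])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover: the model is shown the tools the server exposes.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

            # Select and use: the model picks a tool and the client invokes it.
            # "example_tool" and its arguments are hypothetical.
            result = await session.call_tool("example_tool", arguments={"query": "..."})
            print(result.content)

asyncio.run(main())
```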
Many teams have learned that they don’t need a better reasoning model; they need better context. For engineering agents, this means the context of your IT and application infrastructure, your data and your practices.
Yet agentic AI systems are fundamentally disconnected from the constantly evolving reality of your engineering and IT operations.
Traditional approaches like Retrieval-Augmented Generation (RAG) rely on batch-processed indexing and embedding, which makes them simply too slow. This latency creates a critical gap between an event happening and your agent knowing about it.
Since an agent is only as good as the context it receives, when that context is outdated, the agent becomes unreliable and untrustworthy.
The same real-time context that lets agents act decisively is now transforming how engineers build and operate streaming applications.
AI copilots with real-time context from Kafka are in an exceptional position to help address the complexity of building and managing streaming applications and infrastructure. That context may include:
Metadata and schema of streams
Payloads in streams
Lineage of streaming data
Topic/infrastructure configuration
State (e.g. Consumer group rebalancing)
Metrics (e.g. Under-replicated partitions)
Engineers have long strived to abstract away the complexity of Kafka: managing schemas, configuring connectors and building streaming applications.
Today, I am excited to announce the Lenses MCP Server, our commitment to closing the gap between real-time data, apps and infrastructure for the next generation of engineering and business AI agents.
Lenses.io is already synonymous with a best-in-class developer experience and governance for data streaming. The MCP Server extends this battle-tested capability to your AI layer via natural language queries.
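As a sketch of how an agent might pull the kinds of context listed above from the Lenses MCP Server, the snippet below reuses the client pattern from earlier. The tool names and argument shapes are hypothetical placeholders, not the server's actual interface; the GitHub repository documents the real tool catalogue.

```python
from mcp import ClientSession

# Hypothetical illustration only: the tool names and arguments below are
# invented for this sketch; the Lenses MCP Server defines its own tools.
async def gather_kafka_context(session: ClientSession, topic: str) -> dict:
    """Collect real-time context for an agent, given an initialized
    ClientSession already connected to the Lenses MCP Server."""
    return {
        # Metadata and schema of the stream.
        "schema": await session.call_tool("get_topic_schema", arguments={"topic": topic}),
        # Topic / infrastructure configuration.
        "config": await session.call_tool("get_topic_config", arguments={"topic": topic}),
        # State, e.g. consumer group rebalancing.
        "consumer_groups": await session.call_tool("list_consumer_groups", arguments={"topic": topic}),
        # Metrics, e.g. under-replicated partitions.
        "metrics": await session.call_tool("get_cluster_metrics", arguments={"metric": "UnderReplicatedPartitions"}),
    }
```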

This context allows a copilot to offer all manner of assistance and reduce complexity for Kafka operators and developers alike, turning hours of manual work into minutes of guided collaboration.
But MCP isn't just about context. The tools it exposes give agents the ability to take action, and that action has almost no bounds:
"Build me a stream processing application joining sales orders with customers, filtered to just VIP customers and send it to S3"

When it comes to access and governance, agents should not be treated differently from humans.
The MCP Server is underpinned by Lenses' leading multi-Kafka IAM and governance model, used by thousands of engineers within a single company alone. This includes data masking, so agents are never exposed to PII.
Agent actions, including all data access, are securely audited, ensuring the transparency required by upcoming regulations such as the EU AI Act and enabling a responsible AI rollout from day one.
The true power of the Lenses SQL Snapshot and Processing engines becomes apparent when invoked by agents through MCP.
The Snapshot engine, built on Akka Streams, provides a powerful but lightweight mechanism for an agent to instantly find and explore live data in a Kafka topic, without needing to move it to an external store. This is in contrast to Flink, which works well for high-volume stateful workloads but is usually excessive for agentic workflows.

The SQL Processing engine, built on Kafka Streams and with syntax parity with the Snapshot engine, offers agents an easy way to continuously process streaming data for operational or analytical use cases.
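Seen from the agent's side, the difference is that a Snapshot query is a bounded, on-demand look at live topic data, while a Processing query (like the VIP pipeline sketched earlier) runs continuously. A minimal sketch of the snapshot side follows; the tool name is hypothetical and the query is illustrative rather than verified Lenses SQL syntax.

```python
# Illustrative sketch only: "execute_snapshot_query" is a hypothetical tool
# name and the query text is not verified Lenses SQL syntax.
async def explore_topic(session):
    """Bounded, on-demand exploration of live topic data via the Snapshot engine."""
    return await session.call_tool(
        "execute_snapshot_query",
        arguments={"sql": "SELECT * FROM sales_orders WHERE amount > 1000 LIMIT 50"},
    )
```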
In the next 12 months, we'll see a dramatic shift in how we interact with software. Instead of solely clicking through web interfaces, we'll also be telling AI agents what we want to do. They'll do it for us and they’ll be at their best when they have the latest information.
We invite you to start having conversations with your Kafka environments today and join us in building the future of agentic AI systems fueled by real-time context. The Lenses MCP Server works with existing Lenses v6 installations as well as Lenses Community Edition, so head over to our GitHub repository to get started.