
Lenses MCP for Kafka

By Tun Shwe · October 23, 2025
In this article:
  • 01. What is MCP?
  • 02. The Problem of Stale Context in Engineering
  • 03. Streaming data & infrastructure complexity exhausts engineers
  • 04. Introducing the Lenses MCP Server
  • 05. Use Cases
  • 06. Security and Governance is Paramount
  • 07. SQL for data exploration & processing
  • 08. Get Started with the Lenses MCP Server

If 2024 was the year enterprises adopted Generative AI, 2025 is the year Agentic AI became a reality. 

In the last six months, the conversations I’m having with engineering leaders have quickly shifted from simple chatbots to AI-enabled IDEs, copilots and autonomous systems that take action.

Much of this shift has been enabled by MCP. To date, the community has built more than 10,000 MCP integrations, so to call it a success is an understatement.

What is MCP?

At the heart of it, this leap forward was made possible by the Model Context Protocol (MCP).

Released by Anthropic in November 2024, it’s an open standard that gives Large Language Models (LLMs) and the agents powered by them the ability to discover, select and use tools, bridging the gap between making a decision and taking an action. 
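The discover-then-act loop can be sketched in a few lines. The JSON-RPC method names come from the MCP specification; the tool name and schema below are hypothetical, not a real Lenses tool:

```python
import json

# An MCP client discovers available tools with a JSON-RPC "tools/list" request.
discover = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# The server replies with tool schemas the LLM can choose from.
# This tool definition is an illustrative example only.
tool = {
    "name": "list_topics",
    "description": "List Kafka topics visible to the caller",
    "inputSchema": {"type": "object", "properties": {}},
}

# The model then bridges decision and action with a "tools/call" request.
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": tool["name"], "arguments": {}},
}

print(json.dumps(call, indent=2))
```

The key point is that the model never hard-codes an integration: it reads the advertised schemas at runtime and picks the tool that matches its intent.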

Many teams have learned that they don’t need a better reasoning model; they need better context. For engineering agents, this means the context of your IT and application infrastructure, your data and your practices. 

The Problem of Stale Context in Engineering

Agentic AI systems are fundamentally disconnected from the constantly evolving reality of your engineering and IT operations. 

Traditional approaches like Retrieval-Augmented Generation (RAG) rely on batch-processed indexing and embedding of data, and are simply too slow. That latency creates a critical gap between an event happening and your agent knowing about it.

Since an agent is only as good as the context it receives, when that context is outdated, the agent becomes unreliable and untrustworthy.

Streaming data & infrastructure complexity exhausts engineers

The same real-time context that lets agents act decisively is now transforming how engineers build and operate streaming applications. 

AI copilots with real-time context from Kafka are exceptionally well placed to address the complexity of building and managing streaming applications and infrastructure.

Context may include:

  • Metadata and schema of streams
  • Payloads in streams
  • Lineage of streaming data
  • Topic/infrastructure configuration
  • State (e.g. Consumer group rebalancing)
  • Metrics (e.g. Under-replicated partitions)
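Concretely, a context bundle covering the categories above might look something like this. Every field name here is illustrative, not the Lenses wire format:

```python
# Illustrative real-time context bundle for one topic; the structure is
# hypothetical, mirroring the categories listed above.
context = {
    "topic": "sales-orders",
    "schema": {"format": "AVRO", "fields": ["order_id", "customer_id", "amount"]},
    "config": {"partitions": 12, "retention.ms": 604800000},
    "lineage": {"producers": ["orders-service"], "consumers": ["billing-app"]},
    "state": {"consumer_groups": {"billing-app": "Rebalancing"}},
    "metrics": {"under_replicated_partitions": 2},
}

# With live state in hand, an agent can reason over current conditions
# instead of a stale index:
alerts = []
if context["metrics"]["under_replicated_partitions"] > 0:
    alerts.append("under-replicated partitions")
for group, state in context["state"]["consumer_groups"].items():
    if state == "Rebalancing":
        alerts.append(f"consumer group '{group}' rebalancing")

print(alerts)
```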

Engineers have long strived to abstract away the complexity of Kafka: managing schemas, configuring connectors and building streaming applications.

Introducing the Lenses MCP Server

Today, I am excited to announce the Lenses MCP Server, our commitment to closing the gap between real-time data, apps and infrastructure for the next generation of engineering and business AI agents. 

Lenses.io is already synonymous with best-in-class developer experience and governance for data streaming.

The MCP server extends this battle-tested capability to your AI layer via natural language queries.

Lenses MCP Server architecture


Use Cases

This context allows a copilot to offer all manner of assistance and reduce complexity for Kafka operators and developers alike:

Use cases and Answers

This turns hours of manual work into minutes of guided collaboration. 

But MCP isn’t just about context. Its tools give agents the ability to take action, and that action has almost no bounds:

"Build me a stream processing application joining sales orders with customers, filtered to just VIP customers and send it to S3"
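Under the hood, a request like this decomposes into a sequence of tool calls. The tool names, arguments and SQL below are hypothetical stand-ins for whatever the server actually exposes:

```python
# A hypothetical plan an agent might derive from the prompt above.
# Tool names, arguments and the SQL dialect are illustrative only.
plan = [
    ("get_schema", {"topic": "sales_orders"}),
    ("get_schema", {"topic": "customers"}),
    ("create_processor", {
        "sql": (
            "INSERT INTO vip_orders "
            "SELECT STREAM o.*, c.name "
            "FROM sales_orders o JOIN customers c ON o.customer_id = c.id "
            "WHERE c.tier = 'VIP'"
        )
    }),
    ("create_connector", {"type": "s3-sink", "topic": "vip_orders"}),
]

for name, args in plan:
    print(f"tools/call -> {name}")
```

Each step reuses the context gathered in the previous one: the schemas inform the join, and the processor's output topic feeds the sink.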

Claude example workflow of creating a data pipeline


Security and Governance is Paramount

Agents should not be treated differently from humans.

The MCP Server is underpinned by Lenses' leading multi-Kafka IAM and governance model, used by thousands of engineers within a single company alone.

This also includes data masking, preventing agents from being exposed to PII.
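As a sketch, field-level masking means the agent only ever sees redacted payloads. The policy and field names here are invented for illustration:

```python
# Hypothetical field-level masking applied before data reaches the agent.
PII_FIELDS = {"email", "credit_card"}

def mask(record: dict) -> dict:
    """Return a copy of the record with PII fields redacted."""
    return {k: ("*****" if k in PII_FIELDS else v) for k, v in record.items()}

raw = {"customer_id": 42, "email": "jane@example.com", "credit_card": "4111..."}
print(mask(raw))  # {'customer_id': 42, 'email': '*****', 'credit_card': '*****'}
```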

Agent actions, including all data access, are securely audited, providing the transparency required by upcoming regulations like the EU AI Act and supporting a responsible AI rollout from day one.

SQL for data exploration & processing

The true power of the Lenses SQL Snapshot and Processing engines becomes apparent when invoked by agents through MCP. 

The Snapshot engine, built on Akka Streams, provides a powerful but lightweight mechanism for an agent to instantly find and explore data live in a Kafka topic, without needing to move it to an external store. This is in contrast to Flink, which works well for high-volume stateful data but is usually excessive for agentic workflows.
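A snapshot lookup an agent might issue looks roughly like this. The query and the tool name wrapping it are illustrative; consult the Lenses SQL documentation for the exact dialect:

```python
# Illustrative snapshot-style query: scan a live topic in place,
# with no export to an external store. Dialect details are approximate.
query = """
SELECT order_id, amount
FROM sales_orders
WHERE customer_id = 'C-1042' AND amount > 500
LIMIT 10
"""

# A hypothetical MCP tool call wrapping the query:
request = {"name": "execute_snapshot_sql", "arguments": {"sql": query}}
print(request["name"])
```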

Find a message in Kafka with Lenses MCP and Claude

The SQL Processing engine, built on Kafka Streams but with syntax parity with the Snapshot engine, offers agents an easy way to stream-process data to address operational or analytical use cases.
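The parity means the same SELECT shape an agent uses for exploration can be registered as a long-running processor. Again, the syntax and tool name are approximations, not the exact Lenses API:

```python
# Illustrative continuous-processing statement: the same SELECT shape as a
# snapshot query, but deployed as a long-running processor.
# Syntax is approximate; consult the Lenses SQL reference for specifics.
processor_sql = """
INSERT INTO high_value_orders
SELECT STREAM order_id, customer_id, amount
FROM sales_orders
WHERE amount > 500
"""

# A hypothetical MCP tool call deploying the processor:
request = {"name": "create_sql_processor", "arguments": {"sql": processor_sql}}
print(request["name"])
```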

Get Started with the Lenses MCP Server

In the next 12 months, we'll see a dramatic shift in how we interact with software. Instead of solely clicking through web interfaces, we'll also be telling AI agents what we want to do. They'll do it for us and they’ll be at their best when they have the latest information.

We invite you to start having conversations with your Kafka environments today and join us in building the future of agentic AI systems fueled with real-time context. The Lenses MCP server can be used with existing Lenses v6 installations and can also be used with Lenses Community Edition, so head over to our GitHub repository to get started.
