Drew Oetzel
Free Kafka tooling: 6 annoying tasks to offload
Lenses Community Edition is a premium Kafka tool that helps developers blaze a trail to production.

You didn’t become a developer to spend hours hunting down missing messages or debugging consumer issues. Yet here we are. Valuable dev time evaporates as you wrestle with Apache Kafka, or wait for a central team to unblock you, when you should be finding, prepping, and shipping streaming data in minutes.
Lenses Community Edition tackles these everyday frustrations. We’re proud to offer the latest Lenses 6, a premium tool designed to work easily across multiple Kafka clusters, completely free – forever.
Here, we will explain how you can use Lenses Community Edition as the Kafka tool that takes you from dev all the way to production.
The Kafka tooling space is crowded with options, ranging from CLI tools to enterprise platforms. Most fall into one of three categories:
Administration Kafka tools for cluster and topic management
Kafka monitoring tools for metrics and performance tracking
Kafka stream processing tools for preparing and working with streaming data
Lenses Community Edition combines all three of these use cases into a single Kafka GUI, providing a top-notch developer experience. It’s free for up to two Kafka clusters (which can be different flavors) and two users, with a clear upgrade path when you’re ready to grow beyond this.
It is also the only Kafka tool that lets you explore and work with multiple different Kafka clusters at the same time. Whether you’re on open-source Kafka, Amazon MSK, Confluent, or a mixture of many, Lenses lets you operate data streams across them all.
We love Apache Kafka. But when you love it, and your teams do too, it gets out of control fast. Onboarding dozens, then hundreds of new users and Kafka clusters can quickly become a time-drain:
Finding specific Kafka messages in topics is a real struggle. You’re stuck digging through configuration files, checking consumer offsets, and feeling your productivity draining away.
Lenses Community Edition includes a data catalog of your clusters and their topics. On top of that, it provides a global SQL query interface in the Kafka UI to find exactly what you need. Here is an example of a SQL query to find failed order messages, with no need to resort to a CLI:
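A query along these lines could surface failed orders directly in the UI – the topic name (`orders`) and the `status` field are illustrative assumptions, not taken from a real cluster:

```sql
-- Hypothetical example: assumes an `orders` topic whose payload has a `status` field
SELECT *
FROM orders
WHERE status = 'FAILED'
LIMIT 100;
```

Because Lenses understands your topics’ schemas, you can filter on payload fields directly rather than scanning raw bytes from a console consumer.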
Your app stops, customers complain, and you’re the last to know. There’s a moment of dread when customer support pings you about an issue, and you realize your app has been failing for hours.
Lenses Community Edition surfaces issues before they become incidents. Set Kafka alerts, monitor trends, and spot problems before your users do.
Your entire consumer goes down, thanks to one bad message. Somewhere in your topic, something is broken, and now you are stuck manually inspecting messages while the pressure builds to fix it fast.
Lenses is the Kafka tool to help: filter messages by timestamp or offset using SQL.
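For instance, a snapshot query like the following narrows the search to an offset window on a single partition. The topic name is an assumption, and the `_meta.*` field names may vary by Lenses version, so treat this as a sketch:

```sql
-- Inspect a suspect offset range on one partition of an assumed `orders` topic
SELECT *
FROM orders
WHERE _meta.partition = 2
  AND _meta.offset BETWEEN 1000 AND 1100;
```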
Or:
Create a stream processor* (Lenses SQL processor) that reads from the orders topic
Filter for Kafka messages that failed to decode properly
Write those problematic messages to a dead letter queue topic called orders_dlq
Set this up once, then spend your time fixing the root cause, not searching for the symptom.
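As a rough sketch, a Lenses SQL processor for the steps above might look like this. The topic names and the null-value check for failed decodes are assumptions – adapt the predicate to how your deployment surfaces undecodable records:

```sql
-- Hypothetical DLQ processor: routes records whose value failed to decode
SET defaults.topic.autocreate = true;

INSERT INTO orders_dlq
SELECT STREAM *
FROM orders
WHERE _value IS NULL; -- assumes failed decodes surface as null payloads
```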
Schema changes should be routine, but if you miss a single compatibility check, you risk urgent rollbacks. Without the right Kafka tool, each schema change becomes a white-knuckle deployment, where compatibility problems can cascade into production incidents.
Lenses lets you browse your schema registry, compare versions, and test compatibility before you ship – without a single CLI command.
Your consumer is falling behind, and you have no idea why. Consumer lag shouldn’t be something you discover through customer complaints. But tracking offsets manually across partitions is a tedious task nobody has time for.
Lenses Community Edition shows you exactly what’s happening with your consumers, in real time:
Which consumers are falling behind
How fast messages are being processed
When rebalancing events occur
Where your bottlenecks are
You need to join two streams, but that usually means writing a Kafka Streams application with hundreds of lines of Java, or setting up a complex Flink job that requires specialist knowledge. Instead, you can filter, join, or enrich your streams with a simple transformation, saving precious development time.
Lenses makes this accessible with SQL:
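A join could be sketched like this – the stream names, field names, and join window are all illustrative assumptions:

```sql
-- Illustrative stream-stream join producing an enriched output topic
SET defaults.topic.autocreate = true;

INSERT INTO orders_enriched
SELECT STREAM
    o.orderId,
    o.amount,
    c.name
FROM orders AS o
INNER JOIN customers AS c
    ON o.customerId = c.customerId
WITHIN 1h;  -- window within which records from the two streams are matched
```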
This takes raw data from one stream and customer data from another, combines them on the fly, and produces a new enriched stream with just the information needed.
Write once, deploy, and move onto more interesting problems.
*To run Lenses SQL processors in examples 3 and 6, you’ll need to install Lenses Community Edition on Kubernetes using Helm. Read on for deployment tips.
Ultimately, our goal with Lenses Community Edition is to give developers the tools they need to go big with data streaming. No depending on platform teams for the basics, and no grepping your way through CLIs to find data.
We hope it removes friction from everyday tasks, makes troubleshooting faster and less painful, and opens up streaming data access for you and your team.
Great for trying out Lenses, or for a dev environment.
1. Pull the Docker image
2. Connect the Kafka cluster included in the Docker setup (up to two clusters in total)
3. Start exploring data.
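As a rough sketch, the local setup can be as simple as the following commands – the image name, tag, and port are assumptions, so check the Lenses documentation for the exact values:

```bash
# Image name, tag, and port are assumptions -- verify against the Lenses docs
docker pull lensesio/lenses:latest
docker run --rm -p 9991:9991 lensesio/lenses:latest
# Then open http://localhost:9991 in a browser and connect your cluster
```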
Run Lenses Community Edition locally →
Great for using Lenses on production deployments across multiple hosts and regions.
Deploy Lenses Community Edition on Helm →