Lenses DataOps docker box

Learn DataOps in the Lenses Kafka Docker Box

Lenses Box is a complete container solution for building applications against a local Apache Kafka environment in Docker.

Enjoy a 3-min tour!



What is included

Apache Kafka v2.5.1, Elasticsearch v6.8.7, an ecosystem of open-source Kafka tools, and real-time example data.

Here are some of the things you’ll be able to do:

Use the real-time data & application catalog
Develop real-time applications
Monitor Kafka and app flows
Run admin Kafka operations
Explore streaming data with SQL
Protect HIPAA, GDPR, PCI data
Use streaming SQL applications
Manage schemas, consumers and more



First get your key here

docker run -e ADV_HOST= \
       -e EULA="https://licenses.lenses.io/d/?id=REGISTER_FOR_KEY" \
       --rm -p 3030:3030 -p 9092:9092 lensesio/box:latest

Note: Make sure you have received your free access key in your email
Note: The Kafka Docker container requires 4GB RAM. See how to increase Docker's RAM

If you run on a VM and want Kafka to be accessible over the Internet, set ADV_HOST to your public IP address.


Access Lenses from your browser at http://localhost:3030


The default username and password are
admin / admin



Explore real-time data

1. Select sea_vessel_position_reports
2. Run the following query:

SELECT Speed, Latitude, Longitude
FROM sea_vessel_position_reports
WHERE Speed > 10

Learn real-time SQL



Create a new Kafka topic

1. Explore data
2. Use the New Topic button


You can also automate via GitOps



Create a data pipeline

Stream real-time events into the Kafka topic (using Kafka Connect)

1. Navigate to Connectors
2. Click New Connector and select the file source connector
3. Use the following configuration


This will continuously move log events from /var/log/broker.log into the Kafka topic.
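A minimal configuration for this step might look like the following sketch. It uses Kafka Connect's built-in FileStreamSource connector; the connector name and target topic name are assumptions, so substitute your own:

```properties
name=broker-log-source
# Kafka Connect's bundled file source connector
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
# file to tail continuously
file=/var/log/broker.log
# target Kafka topic (illustrative name -- use the topic you created)
topic=broker_logs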


To produce messages with a Kafka producer, use your ADV_HOST and port 9092 as the bootstrap broker, e.g. PLAINTEXT://
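As a sketch, assuming ADV_HOST is 127.0.0.1 and you are using the console producer shipped with Apache Kafka (the topic name here is illustrative), the broker exposes a PLAINTEXT listener so no security settings are needed:

```shell
# pipe a test message into an example topic
echo '{"hello":"world"}' | kafka-console-producer.sh \
  --bootstrap-server 127.0.0.1:9092 \
  --topic my_topic
```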


Stream processing

Use streaming SQL to continuously process real-time data.

1. Click SQL Processors and then New
2. Set a name and use streaming SQL

SET defaults.topic.autocreate=true;
INSERT INTO speed_boats
  SELECT STREAM MMSI, Speed, Longitude AS Long, Latitude AS Lat, `Timestamp`
  FROM sea_vessel_position_reports
  WHERE Speed > 10;

This will continuously filter messages from the source topic into the speed_boats topic (when the Speed of a boat is greater than 10).
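You can inspect the derived topic in SQL Studio, or, as a sketch, read it with the console consumer bundled with Apache Kafka (assuming the local broker on port 9092):

```shell
# read all records filtered into the output topic so far
kafka-console-consumer.sh \
  --bootstrap-server 127.0.0.1:9092 \
  --topic speed_boats \
  --from-beginning
```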




Learn streaming SQL →


Consumer Monitor

Now with streaming applications running, access the Consumers link to monitor the status and consumer lag of your real-time apps.
Note: You can also set up alerts:
DataOps alerts with Slack
DataOps alerts with Microsoft teams
DataOps alerts with PagerDuty
DataOps alerts with Datadog
DataOps alerts with Prometheus


When a Consumer is not running you can also change the offsets!
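Lenses lets you change offsets from the UI; for reference, the equivalent with Apache Kafka's own CLI would be something like the sketch below (the group name is illustrative, and the group must be stopped first):

```shell
# rewind a stopped consumer group to the beginning of a topic
kafka-consumer-groups.sh \
  --bootstrap-server 127.0.0.1:9092 \
  --group my-consumer-group \
  --topic speed_boats \
  --reset-offsets --to-earliest --execute
```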



Monitor real-time flows

Access the Topology to view a real-time map of how applications, connectors, and streaming SQL interact with Kafka topics.

Click on a node to see the details, metrics and health statuses.


You can add your applications, data pipelines and also microservices in the topology map.



Setup alerting

Real-time monitoring and notifications for:

Infrastructure health
Producer SLAs
Consumer SLAs

Note: Learn more about monitoring integrations

Explore advanced options

Data centric security
Single Sign On authentication
HIPAA, GDPR, PCI data policies
Automate Flows with GitOps
Manage Kafka ACLs & Quotas
AWS MSK, Azure and other Cloud integrations

Check the docs


Want to try on your cluster?

Talk with us

Frequently Asked Questions


Lenses Box is optimized for development and includes a single-broker Kafka environment. However, you can evaluate Lenses against your existing setup. Lenses works with all popular Kafka distributions and clouds. Contact our team for help.

Lenses Box is Free for ALL and forever.

To use the Lenses Box Docker image you need a free access key. You may need to refresh your key from time to time for security reasons.

You will need a paid subscription to connect Lenses to your own cluster or production.

See pricing

It's not recommended.

Lenses Box is a single-node setup. For production environments a multi-node setup is recommended for scalability and failover.

For production or multi-node setups of your Kafka cluster contact our team

Yes. Lenses works with AWS MSK, Azure HDInsight, Aiven, Confluent Cloud and other managed cloud services.

Lenses also provides deployment templates tailored for each Cloud provider.

Check Deployment Options

Do you have more questions?
Let us help