Lenses DataOps docker box

Learn DataOps in a Local Kafka Box

Lenses Box is a complete container solution for building applications against a local Apache Kafka installation.

Enjoy a 3-min tour!


01.

What is included

Apache Kafka v2.5.1, Elasticsearch v6.8.7, an ecosystem of open-source Kafka tools, and real-time example data.

Here are some of the things you’ll be able to do:

Use the real-time data & application catalog
Develop real-time applications
Monitor Kafka and app flows
Run admin Kafka operations
Explore streaming data with SQL
Protect HIPAA, GDPR, PCI data
Use streaming SQL applications
Manage schemas, consumers and more

02.

Start

First get your key here

docker run -e ADV_HOST=127.0.0.1 \
       -e EULA="https://dl.lenses.io/d/?id=REGISTER_FOR_KEY" \
       --rm -p 3030:3030 -p 9092:9092 lensesio/box:latest


Note: Make sure you receive your free access key in your email
Note: Docker requires at least 4GB of RAM. See how to increase Docker's RAM


TIP
If you run on a VM and want Kafka to be accessible over the Internet, set ADV_HOST to your public IP address.

03.

Access from your browser

http://localhost:3030

The default username and password are
admin / admin


04.

Explore real-time data

1. Select sea_vessel_position_reports
2. Run the following query:

SELECT Speed, Latitude, Longitude
FROM sea_vessel_position_reports
WHERE Speed > 10
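For intuition, the query above is a projection plus a filter over the topic's records. A minimal Python sketch of the same logic (the sample records below are made up for illustration):

```python
# Hypothetical sample records mimicking sea_vessel_position_reports messages.
records = [
    {"Speed": 14.2, "Latitude": 37.97, "Longitude": 23.72, "MMSI": 237228000},
    {"Speed": 3.5,  "Latitude": 40.63, "Longitude": 22.94, "MMSI": 239923000},
    {"Speed": 11.8, "Latitude": 35.33, "Longitude": 25.14, "MMSI": 237302000},
]

# SELECT Speed, Latitude, Longitude ... WHERE Speed > 10
result = [
    {k: r[k] for k in ("Speed", "Latitude", "Longitude")}
    for r in records
    if r["Speed"] > 10
]

for row in result:
    print(row)
```

Only the selected fields survive the projection, and only records with Speed above 10 pass the filter.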

Learn real-time SQL


05.

Create a new Kafka topic

1. Explore data
2. Use the New Topic button

TIP
You can also automate via GitOps


06.

Create a data pipeline

Stream real-time events into the Kafka topic (using Kafka Connect)

1. Navigate to Connectors
2. Click New Connector and select the file source connector
3. Use the following configuration

name=my-real-time-file-connector
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
topic=broker_logs
file=/var/log/broker.log
tasks.max=1


This will continuously move log events from /var/log/broker.log into the broker_logs Kafka topic.
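Conceptually, a file source connector tails the file and publishes each complete new line as a record. A rough Python sketch of that behavior (this is an illustration, not the actual connector code):

```python
import io

def tail_lines(fileobj):
    """Yield complete lines appended to a file-like object,
    the way a file source connector turns log lines into records."""
    buffer = ""
    while True:
        chunk = fileobj.read()
        if not chunk:
            break  # a real connector would poll and continue instead of stopping
        buffer += chunk
        while "\n" in buffer:
            line, buffer = buffer.split("\n", 1)
            yield line

# Simulate log lines arriving in the source file.
log = io.StringIO("INFO broker started\nWARN disk usage high\n")
produced = list(tail_lines(log))  # records that would land in the broker_logs topic
print(produced)
```

Partial lines stay in the buffer until a newline arrives, so only complete log events are published.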

KAFKA PRODUCER
To produce messages with a Kafka producer, use your ADV_HOST and port 9092 as the bootstrap broker, e.g. PLAINTEXT://127.0.0.1:9092

07.

Stream processing

Use streaming SQL to continuously process real-time data.

1. Click SQL Processors and then New
2. Set a name and use streaming SQL

SET defaults.topic.autocreate=true;
INSERT INTO speed_boats
  SELECT STREAM MMSI, Speed, Longitude AS Long, Latitude AS Lat, `Timestamp`
  FROM sea_vessel_position_reports
  WHERE Speed > 10;

This will continuously filter messages from the source topic into the speed_boats topic whenever the Speed of a boat is greater than 10.
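The processor is effectively a never-ending loop over the source stream: each record is filtered, reshaped per the aliases in the SQL, and written to the sink topic. A simplified Python sketch of those semantics (field names come from the SQL above; the stream here is a finite list for illustration):

```python
def speed_boats_processor(stream):
    """Continuously filter and reshape records, as the streaming SQL does."""
    for record in stream:  # in reality this loop never ends
        if record["Speed"] > 10:
            yield {
                "MMSI": record["MMSI"],
                "Speed": record["Speed"],
                "Long": record["Longitude"],   # Longitude AS Long
                "Lat": record["Latitude"],     # Latitude AS Lat
                "Timestamp": record["Timestamp"],
            }

source = [
    {"MMSI": 1, "Speed": 12.0, "Longitude": 23.7, "Latitude": 37.9, "Timestamp": 1000},
    {"MMSI": 2, "Speed": 4.0,  "Longitude": 22.9, "Latitude": 40.6, "Timestamp": 1001},
]
speed_boats = list(speed_boats_processor(source))  # records landing in the speed_boats topic
print(speed_boats)
```

With defaults.topic.autocreate=true, the speed_boats sink topic is created automatically on first write.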


08.

Consumer Monitor

Now with streaming applications running, access the Consumers link to monitor the status and consumer lag of your real-time apps.
Note: You can also set up DataOps alerts to PagerDuty, Datadog, and Prometheus.
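Consumer lag is simply the distance between a partition's latest (log-end) offset and the offset the consumer group has committed; Lenses computes and charts this per group. A sketch of the arithmetic (the offset numbers below are made up):

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition lag = log-end offset minus the group's committed offset."""
    return {
        partition: end_offsets[partition] - committed_offsets.get(partition, 0)
        for partition in end_offsets
    }

# Hypothetical offsets for a 3-partition topic.
end_offsets = {0: 1500, 1: 1420, 2: 1510}
committed   = {0: 1500, 1: 1400, 2: 1310}

lag = consumer_lag(end_offsets, committed)
print(lag)                # per-partition lag
print(sum(lag.values()))  # total lag for the group
```

A steadily growing total lag means the application is consuming slower than producers are writing.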

TIP
When a consumer is not running, you can also change its offsets!


09.

Monitor real-time flows

Access the Topology to view a real-time map of how applications, connectors, and streaming SQL interact with Kafka topics.

Click on a node to see the details, metrics and health statuses.

TIP
You can add your applications, data pipelines, and microservices to the topology map.


10.

Set up alerting

Real-time monitoring and notifications for:

Infrastructure health
Producer SLAs
Consumer SLAs

Note: Learn more about monitoring integrations

Explore advanced options

RBAC
Data centric security
Single Sign On authentication
HIPAA, GDPR, PCI data policies
Automate Flows with GitOps
Manage Kafka ACLs & Quotas
AWS MSK, Azure and other Cloud integrations

Check the docs


Want to try on your cluster?

Frequently Asked Questions

Can I evaluate Lenses against my existing Kafka cluster?

Yes.

Lenses Box is optimized for development and includes a single-broker Kafka environment. However, you can evaluate Lenses against your existing setup; Lenses works with all popular Kafka distributions and clouds. Contact our team for help.

Is Lenses Box free?

Lenses Box is free for all, forever.

To use the Lenses Box Docker image you need a free access key. You may need to refresh your key from time to time for security reasons.

You will need a paid subscription to connect Lenses to your own cluster or production.

See pricing

Can I use Lenses Box in production?

It's not recommended.

Lenses Box is a single-node setup. For production environments, a multi-node setup is recommended for scalability and failover.

For production or multi-node deployments of your Kafka cluster, contact our team.

Does Lenses work with managed cloud Kafka services?

Yes. Lenses works with AWS MSK, Azure HDInsight, Aiven, Confluent Cloud, and other managed cloud services.

Lenses also provides deployment templates tailored for each Cloud provider.

Check Deployment Options

Do you have more questions?
Let us help