Data discovery for Elasticsearch with SSO & alerting

By Christina Daskalaki

Mar 25, 2020

Building and operating a data platform requires a multitude of different technologies.

It’s easy to pick the latest and greatest technologies. The difficulty is delivering that infrastructure as a single, safe and productive experience that allows anyone to build and operate real-time data applications. Our mission is to make this happen.

You focus on selecting the data infrastructure you want. We’ll bind it together and integrate your corporate tools to create a real-time application and data operations portal. No need for deep infrastructure expertise for every single operation. Enable your teams to focus on delivering value with real-time data, building data intensity.

After 1000s of engineering hours, we're thrilled to announce release 3.1. As well as laying the foundations for some big things to come (more on that later...), we've packed this release with capabilities to help you in the following areas.

Productivity

You may know us for our association with Apache Kafka.

Many teams that operate Kafka also run Elasticsearch. Like two peas in a pod, the two technologies are always close.

We’ve heard from you that having a single application and data operations portal across both technologies would simplify your operations.

Self service data discovery in Elasticsearch

Self-service data discoverability & observability across your organisation is a vital part of promoting DataOps.

Developers require a global view of schemas and metadata across all data technologies to be efficient. 

In 3.1, we enable you to create secure and centralised connections into Elasticsearch to look deep inside. 

Not only does this give you visibility of the indexes, metadata and entities (including sharding and replication information) in your ES environment but it also provides data observability, enabling you to explore data using the same SQL engine that our customers love for Kafka.

Screenshot: Elasticsearch index metadata, schema and entities in Lenses 3.1
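
For a sense of what this consolidates, here is a minimal sketch of the same inspection done by hand with the official Elasticsearch Python client (7.x) against a hypothetical orders index: pulling shard and replica settings, the field mappings (the schema) and a peek at the documents themselves.

```python
# Minimal sketch (assumes elasticsearch-py 7.x and a hypothetical "orders" index):
# the manual version of the index metadata and data exploration described above.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Index-level metadata: settings (shards, replicas) and mappings (the schema).
meta = es.indices.get(index="orders")["orders"]
index_settings = meta["settings"]["index"]
print("shards:", index_settings["number_of_shards"],
      "replicas:", index_settings["number_of_replicas"])
print("fields:", list(meta["mappings"].get("properties", {}).keys()))

# Peek at a handful of documents to understand the data itself.
resp = es.search(index="orders", body={"query": {"match_all": {}}, "size": 5})
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"])
```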

As you would expect, the experience between Kafka and Elasticsearch is seamless: data is protected with unified data policies (to anonymise or discover sensitive data) and role-based security backed by namespaces.

Screenshot: Elasticsearch data policies and data masking in Lenses 3.1
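
As a purely illustrative sketch (not how Lenses implements its data policies), field-level masking boils down to redacting sensitive values before they are shown to a user, for example:

```python
# Illustration only: the kind of field-level masking a data policy applies.
# The record and field names here are hypothetical.
def mask_card_number(card: str) -> str:
    """Keep the last four digits, redact the rest."""
    return "*" * (len(card) - 4) + card[-4:]

record = {"order_id": 1042, "customer": "alice", "card_number": "4111111111111111"}
masked = {**record, "card_number": mask_card_number(record["card_number"])}
print(masked)  # {'order_id': 1042, 'customer': 'alice', 'card_number': '************1111'}
```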

For a developer, this provides numerous productivity benefits.

Take an example: you have a dataset in Kafka and an analytics team wants you to stream it into Elasticsearch.

You explore the ES indexes and their metadata, use SQL to inspect the data and work out how to map it to the team's schema, then create a sink connector and finally validate that the data is mapped correctly in Elasticsearch. You've avoided the endless back and forth between different teams and tools while at the same time ensuring compliance.
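
The sink itself is ultimately just a connector registered with Kafka Connect. As a rough sketch of that last step, here is the REST call from Python; the worker URL, connector class and config keys are illustrative (shown with Confluent's Elasticsearch sink), and your own connector and settings will differ.

```python
# Rough sketch: register an Elasticsearch sink via the Kafka Connect REST API.
# The worker URL, connector class and config keys are illustrative assumptions.
import requests

CONNECT_URL = "http://localhost:8083/connectors"  # assumed Kafka Connect worker

connector = {
    "name": "orders-to-elasticsearch",
    "config": {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "topics": "orders",                         # Kafka topic to sink
        "connection.url": "http://localhost:9200",  # Elasticsearch endpoint
        "key.ignore": "true",
        "schema.ignore": "true",
    },
}

resp = requests.post(CONNECT_URL, json=connector)
resp.raise_for_status()
print("created connector:", resp.json()["name"])
```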

Support for Elasticsearch connections is part of a large pluggable framework that our engineering teams have been busy working on.

Connections, with security, to many different data stores will arrive in the coming releases. These connections will open up more possibilities and help us on our mission to support you whatever underlying infrastructure you deploy. No spoilers, so stay tuned!

Screenshot: Lenses.io 3.1 connections manager with Elasticsearch, DataDog, Slack and PagerDuty

Manage Kafka consumers at scale

Managing Apache Kafka consumers has always been a challenge for operations.  Doing so in a safe, self-service, efficient and audited fashion is even more difficult. 

We already allowed individual consumer instance offsets to be changed, for example to skip past a particular corrupted message.

With Lenses 3.1, you can now manage consumer offsets easily, dynamically and at scale.

Move an entire consumer group across all (or some) of its Kafka topics to a specific common offset or a common point in time. If you had 100+ instances of your application and wanted to replay messages by resetting the offsets to last Tuesday, you could do it with a couple of clicks.

Screenshot: editing consumer group offsets for multiple consumers of Apache Kafka in Lenses.io

Like all features in Lenses, this is available via a web interface or our APIs, and fully audited of course.
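
Conceptually, a point-in-time reset boils down to looking up each partition's offset for a timestamp and committing those offsets for the group. The sketch below shows that mechanic with the kafka-python client, against a hypothetical orders topic and orders-service group, and assumes the group's consumers are stopped while the offsets are rewritten; it illustrates the underlying Kafka operation, not how Lenses performs it.

```python
# Minimal sketch: reset a consumer group's offsets to a point in time (kafka-python).
# Topic, group and timestamp are hypothetical; stop the group's consumers first.
from datetime import datetime, timezone
from kafka import KafkaConsumer, TopicPartition

GROUP, TOPIC = "orders-service", "orders"
target = datetime(2020, 3, 17, tzinfo=timezone.utc)   # e.g. "last Tuesday"
target_ms = int(target.timestamp() * 1000)            # Kafka expects epoch millis

consumer = KafkaConsumer(bootstrap_servers="localhost:9092",
                         group_id=GROUP, enable_auto_commit=False)

partitions = [TopicPartition(TOPIC, p) for p in consumer.partitions_for_topic(TOPIC)]
consumer.assign(partitions)

# Earliest offset at or after the target timestamp, per partition
# (partitions with no data after that time come back as None and are skipped).
offsets = consumer.offsets_for_times({tp: target_ms for tp in partitions})
for tp, ot in offsets.items():
    if ot is not None:
        consumer.seek(tp, ot.offset)

consumer.commit()   # commit the sought positions as the group's new offsets
consumer.close()
```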

Compliance

We hear that, whilst building a data platform, complying with internal IT and security policies can be one of the biggest challenges.

Teams and projects may not want to onboard and go live onto a data platform without meeting the key requirements.  

Alerting 

Alerts about the state or health of the platform, flows and microservices should be triggered and routed correctly to the different teams within your business.

In a large organisation, different tenants of the data platform (such as product teams) may have different internal solutions and processes to manage alerts.  

So in this release, teams can now create different alert channels to popular alert-management solutions such as Slack, PagerDuty, DataDog, Prometheus and AWS CloudWatch.

For example, you could have high consumer group lag alerts for different flows routed to different Slack channels.
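
To give a feel for what lands in the channel, here is an illustrative sketch of posting a consumer-lag style message to a Slack Incoming Webhook from Python. The webhook URL is a placeholder and the message wording is made up; it is not the exact payload Lenses sends.

```python
# Illustration only: delivering an alert to a Slack channel via an Incoming Webhook.
# The webhook URL is a placeholder; the text is not the exact format Lenses emits.
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

alert = {
    "text": (":warning: High consumer lag on group `orders-service` "
             "(topic `orders`) in cluster `prod-eu`")
}

resp = requests.post(SLACK_WEBHOOK, json=alert, timeout=5)
resp.raise_for_status()
```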

Screenshot: custom alert rules and channels in Lenses.io

SSO with Azure AD 

A successful data platform grows with the number of teams and users that use it.  

Beyond a certain size or criticality of service, data project teams need to comply with stricter corporate identity management policies ensuring employees have a single identity across all business applications.

Lenses already supports a number of different authentication strategies, including LDAP, AD and Kerberos (and the use of multiple strategies simultaneously). With this release we have added support for Azure AD SSO over SAML 2.0. Stay tuned for upcoming support for Okta, OneLogin, KeyCloak & Google!

Screenshot: Lenses SSO authentication with Azure AD login screen

Multi-Cluster & Multi-Cloud visibility

Data infrastructure is moving more into the cloud via managed service providers such as AWS, Azure and GCP. Regulators are increasingly instructing businesses to adopt a multi-cloud strategy.

This means data platform teams now have the challenge of managing multiple different Kafka distributions across different clouds and data centers. 

Lenses 3.1 introduces the first public release of multi-cluster capabilities. 

Screenshot: Lenses multi-cluster and multi-cloud portal for monitoring the health of any Apache Kafka deployment

From a single global portal, you can now access and monitor the health of your entire estate of Kafka deployments across any environment.

You can download the portal now for free and connect it to your existing Lenses instance from here.

Links

Explore all the features of Lenses on our features page, or download the all-in-one free Kafka + Lenses development environment to try out the release for yourself!

Read the full release notes of these features and more!

Enjoy the release!