
JSON schemas to control your Lenses & infrastructure provisioning

By Andrew Stevenson, December 1, 2025
In this article:
  • 01. Managing connections as-code
  • 02. Lenses 6 HQ & Agents
  • 03. Not just a spec
  • 04. Some examples
  • 05. Default snippets
  • 06. IDE Integration

When we talk about JSON schema in the world of Kafka and streaming, you probably think of the schema for the events or messages themselves.

But how many times have you fumbled with configuration while trying to get an application deployed?

Schemas that describe how to configure and deploy applications (applications-as-code) are just as important, allowing us to automate application landscapes. Especially as we will soon be awash with catalogs for AI agents, MCP servers and more.

Note: I recently created JSON Schemas for creating Kafka Connectors. 

Lenses is no different. It's an application like any other, one that can also deploy other applications (Kafka Connectors, Stream Processors, Kafka-to-Kafka replicators, …).

In this blog, I'll explain the importance of JSON schemas for configuration and show how they can be integrated into your IDE, using the management of connections to Kafka environments with Lenses as an example.

Managing connections as-code

Lenses connects to Kafka "environments". Historically this meant a single environment (an environment includes complementary services: Kafka Connect clusters, Schema Registries, and auditing/alerting services such as Datadog and Splunk).

But as the adoption of streaming has grown, so has the size of the Kafka estate. A business having hundreds of Kafka clusters, of different flavours (Confluent Cloud, Redpanda, open source, MSK, …), is not uncommon.

This has made managing connections as-code more important, and it also leaves more room for mistakes, as you may be juggling many different types of connections.


Lenses 6 HQ & Agents

Lenses connects to Kafka environments through Lenses Agents (not AI, although we do have them) that are deployed close to the Kafka cluster. Each Agent also connects to a central Lenses HQ, which provides the unified developer experience across the Kafka estate.

The Agent depends on a magical YAML file holding the connection details. This means you can version control it and automate deployment through your CI toolchain. 
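To make this concrete, here is a minimal sketch of what such a connections-as-code YAML might contain. The key names are illustrative, modelled on the schema fragments shown later in this post; the real provisioning format is defined in the Lenses docs.

```yaml
# Illustrative sketch only - key names follow the schema fragments in
# this post, not the exact Lenses provisioning format.
kafka:
  - name: my-kafka
    configuration:
      protocol:
        value: SASL_SSL
      saslMechanism:
        value: PLAIN
      saslJaasConfig:
        value: org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="pass";
```

Because the file is plain YAML, it slots straight into version control and a CI pipeline, and a schema-aware editor can validate it before it is ever applied.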

Not just a spec

JSON schemas provide the same benefits for application configuration that they are touted for with data:

  1. type definitions (e.g. primitives, complex objects, enum support, …)

  2. metadata support, to help users by describing what each field is for

  3. mandatory and optional fields, to catch errors before deployment

  4. validation rules; my favourite. Rules can validate a field's value format (e.g. with a regex pattern), but can also cross-validate across fields: if you set the protocol to SASL_SSL, you can validate that your bootstrapBrokers have the right protocol set.
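The cross-field rule in point 4 can be sketched in plain Python. The field names (saslMechanism, saslJaasConfig) follow the schema examples in this post; the function itself is illustrative, not the Lenses implementation.

```python
import re

# Illustrative sketch of the cross-field checks a JSON Schema
# "if/then" rule expresses - not the Lenses validator itself.
def validate_kafka_connection(cfg: dict) -> list[str]:
    errors = []

    # Field-level check: protocol must be one of the known values.
    if cfg.get("protocol") not in {"SASL_PLAINTEXT", "SASL_SSL", "PLAINTEXT", "SSL"}:
        errors.append(f"unknown protocol: {cfg.get('protocol')!r}")

    # Cross-field check: when saslMechanism is PLAIN, the JAAS config
    # must use the PlainLoginModule with a username and password.
    if cfg.get("saslMechanism") == "PLAIN":
        pattern = (
            r'^org\.apache\.kafka\.common\.security\.plain\.PlainLoginModule'
            r' required username="[^"]+" password="[^"]+";$'
        )
        if not re.fullmatch(pattern, cfg.get("saslJaasConfig", "")):
            errors.append("saslMechanism PLAIN requires a PlainLoginModule saslJaasConfig")

    return errors
```

The same logic expressed declaratively in a schema means every IDE and CI pipeline gets it for free, without anyone writing imperative validation code.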

This has allowed us to create JSON schemas for managing different Lenses components, including the Lenses K2K replicator and the Lenses Agent provisioning file. 

Some examples

These examples are based around Lenses provisioning, but the provisioning schema covers a lot more.

Check that, if the saslMechanism is PLAIN, we have the right JAAS config module:


"if": {
  "properties": {
    "saslMechanism": {
      "properties": {
        "value": { "const": "PLAIN" }
      }
    }
  },
  "required": ["saslMechanism", "saslJaasConfig"]
},
"then": {
  "properties": {
    "saslJaasConfig": {
      "properties": {
        "value": {
          "pattern": "^org\\.apache\\.kafka\\.common\\.security\\.plain\\.PlainLoginModule required username=\\\"[^\\\"]+\\\" password=\\\"[^\\\"]+\\\";$"
        }
      }
    }
  },
  "required": ["saslJaasConfig"]
}

Check the format of the protocol (field-level validation):


"protocol": {
  "type": "object",
  "description": "Kafka protocol.",
  "properties": {
    "value": {
      "type": "string",
      "description": "Enter the Protocol value (e.g. SASL_SSL, PLAINTEXT, etc.)",
      "enum": ["SASL_PLAINTEXT", "SASL_SSL", "PLAINTEXT", "SSL"]
    }
  },
  "required": ["value"]
}

Default snippets

This is powerful. Ship a set of preconfigured templates that follow the JSON schema. Tweaking the values triggers the validation rules: any IDE that supports JSON Schema will automatically highlight issues.

For Lenses, this means templates for all the common Kafka providers (Confluent Cloud, Redpanda, …) and configurations.
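JSON Schema itself has no snippet keyword, but the VS Code YAML extension (yaml-language-server) supports a defaultSnippets extension inside schemas. A sketch of how a template could be attached to the protocol property from the example above (the label and body are illustrative):

```json
{
  "protocol": {
    "type": "object",
    "defaultSnippets": [
      {
        "label": "Confluent Cloud (SASL_SSL)",
        "body": { "value": "SASL_SSL" }
      }
    ],
    "properties": {
      "value": { "type": "string" }
    }
  }
}
```

In the editor, these snippets show up as completion items, so a developer starts from a known-good template instead of a blank file.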

IDE Integration

When combined with your IDE, not only do you get validation and auto-completion, but also hooks that help fix issues before any risk of a failed deployment.


The editor (in this case VS Code) highlights mistakes that can be "quick fixed", or GitHub Copilot can fix them for you.
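To get this experience in VS Code, you associate the schema with your provisioning files. With the Red Hat YAML extension this is done via the yaml.schemas setting; the path and glob below are illustrative:

```json
{
  "yaml.schemas": {
    "./schemas/lenses-provisioning.schema.json": ["provisioning*.yaml"]
  }
}
```

Once the association is in place, every file matching the glob gets validation, hover documentation, and completion driven entirely by the schema.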


What does it look like in Lenses?

Lenses isn't VS Code (but stay tuned); we do, however, run the Monaco Editor (the heart of VS Code) in the product. This is part of our code-first approach to managing streaming data & apps.


It's not just provisioning: Lenses IAM and our SQL Studio are also advancing towards an IDE experience.


Give it a shot with Lenses

Lenses Community Edition is the free version of Lenses, allowing you to connect up to two Kafka environments with access to all features, including our powerful SQL Snapshot engine.

You can download it here:

This is all powering the next wave of innovation at Lenses, as we continue to provide the best developer experience for streaming data.

The provisioning schema with examples of the validation can be found here.
