Connect Kafka to Elastic

Simplify and speed up your Kafka to Elastic sink with a Kafka-compatible connector, via the Lenses UI/CLI, a native plugin, or Helm charts for Kubernetes deployments


About the Kafka to Elastic connector

License: Apache 2.0

This Elastic sink allows you to write data from Kafka to Elastic. The connector converts the value of each Kafka Connect SinkRecord to JSON and uses Elastic's JSON insert functionality to index the document.

The sink creates an index and type corresponding to the topic name and uses the JSON insert functionality from Elastic's client. This is an open-source project and is therefore not covered by Lenses support SLAs.
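As a rough illustration of how the sink is wired up, the sketch below registers a connector instance against a Kafka Connect worker's REST API. The connector class and the property names (connect.elastic.url, connect.elastic.sink.kcql) follow the Stream Reactor naming but may differ between releases, so check the connector's own documentation for the exact keys; the worker URL, topic, index and Elastic host used here are only placeholders.

    import json
    import urllib.request

    # Hypothetical example values; replace with your own worker, broker and Elastic addresses.
    CONNECT_URL = "http://localhost:8083/connectors"

    connector = {
        "name": "kafka-to-elastic-sink",
        "config": {
            # Connector class as published in the Stream Reactor project (verify for your version).
            "connector.class": "com.datamountaineer.streamreactor.connect.elastic.ElasticSinkConnector",
            "tasks.max": "1",
            "topics": "orders",
            # Elastic address and KCQL mapping from topic to index; property names are assumptions.
            "connect.elastic.url": "localhost:9300",
            "connect.elastic.sink.kcql": "INSERT INTO orders-index SELECT * FROM orders",
        },
    }

    # Register the connector with the Kafka Connect worker.
    request = urllib.request.Request(
        CONNECT_URL,
        data=json.dumps(connector).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

    with urllib.request.urlopen(request) as response:
        print(response.status, response.read().decode("utf-8"))

The same configuration can equally be deployed through the Lenses UI or CLI, or dropped into a Helm values file for Kubernetes deployments.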

Connector options for Kafka to Elastic sink

Docker to test the connector

Test in our pre-configured Lenses & Kafka sandbox packed with connectors

Use Lenses with your Kafka

Manage the connector in Lenses against your Kafka and data.

Or use the Kafka to Elastic GitHub connector

Download the connector directly from GitHub

Connector benefits

  • Flexible Deployment
  • Powerful Integration Syntax
  • Monitoring & Alerting
  • Integrate with your GitOps

Why use Lenses.io to connect Kafka to Elastic?

This connector saves you from terminal commands and endless back-and-forth when sinking data from Kafka to Elastic. Manage the Elastic Stream Reactor connector (and all the other connectors on your Kafka Connect cluster) through Lenses.io, which lets you freely monitor, process and deploy data with the following features:

  • Error handling
  • Fine-grained permissions
  • Data observability

How to push data from Kafka to Elastic

This connector is suited for Elasticsearch 2.x. For Elasticsearch 5.x, look here.
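To check the end-to-end flow, one option is to produce a JSON record to the source topic and then query Elasticsearch for the document the sink indexed. The sketch below is only an assumption-laden example: it uses the third-party kafka-python client, a local broker and Elasticsearch node, and the placeholder topic and index names from the configuration sketch above.

    import json
    import urllib.request

    from kafka import KafkaProducer  # third-party client: pip install kafka-python

    # Produce a JSON-valued record to the topic the sink reads from
    # ("orders" is just the placeholder topic used in the example config).
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("orders", {"order_id": 1, "product": "keyboard", "price": 27.99})
    producer.flush()

    # Once the sink has committed the record, the document should be searchable in the
    # target index (here assumed to be "orders-index" from the example KCQL).
    url = "http://localhost:9200/orders-index/_search?q=order_id:1"
    with urllib.request.urlopen(url) as resp:
        print(json.loads(resp.read().decode("utf-8")))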


Experience all features in a single instance, always free for developers

Pull the Docker image for free