Connect Kafka to InfluxDB

Simplify and speed up your Kafka to InfluxDB sink with a Kafka-compatible connector via the Lenses UI/CLI, a native plugin, or Helm charts for Kubernetes deployments


About the Kafka to InfluxDB connector

License: Apache 2.0

The Influx Sink allows you to write data from Kafka to InfluxDB. The connector takes the value from each Kafka Connect SinkRecord and inserts a new entry into InfluxDB.
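A single KCQL statement defines how a topic's record values map onto an InfluxDB measurement. As a minimal sketch (the topic name device-telemetry and the measurement name deviceMeasure are hypothetical, and the exact KCQL grammar depends on the Stream Reactor release you use):

    INSERT INTO deviceMeasure SELECT * FROM device-telemetry

Instead of SELECT *, you can also pick and rename individual fields, for example SELECT temperature, humidity AS hum FROM device-telemetry.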

Connector options for Kafka to InfluxDB sink

Connect Kafka to InfluxDB in a box

Test the deployment & management of the connector through Lenses Box.

This is a single-broker Kafka environment with Lenses, packaged in a Docker container.

Try now in a pre-configured Docker container

Manage and push data from Kafka to InfluxDB with Lenses

Handle the deployment & management of the InfluxDB connector and your other Kafka connectors through Lenses.

Create account

Download just the Kafka to InfluxDB connector from GitHub

Manage the deployment of the connector through your own standard process.

Try now on GitHub

Connector benefits

  • Flexible Deployment
  • Powerful Integration Syntax
  • Monitoring & Alerting
  • Integrate with your GitOps

Why use Lenses.io to connect Kafka to InfluxDB?

Lenses.io saves you from learning terminal commands and the endless back-and-forth of sinking data from Kafka to InfluxDB by managing the InfluxDB Stream Reactor connector (and all the other connectors on your Kafka Connect cluster) for you, letting you freely monitor, process and deploy data with the following features:

  • Error handling
  • Fine-grained permissions
  • Data observability

How to push data from Kafka to InfluxDB

InfluxDB allows the client API to attach a set of key-value tags to each point written. The current connector version lets you provide these tags via KCQL.
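As a sketch, assuming hypothetical field and topic names (and noting that the WITHTAG/WITHTIMESTAMP grammar may differ between connector versions), a KCQL statement can tag every point both with the value of a record field and with a constant:

    INSERT INTO sensorMeasure SELECT temperature, humidity FROM sensor-readings WITHTIMESTAMP ts WITHTAG (location, source=kafka)

Here the location tag is taken from each record, while source=kafka is written as a constant tag on every point, and the ts field supplies the point's timestamp.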

  1. Launch the stack
  2. Prepare the target system
  3. Start the connector (a sample configuration sketch follows this list)
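A minimal configuration sketch for step 3, submitted through the Lenses UI or the Kafka Connect API, might look like the following. The connector class, property names, credentials, and connection details are illustrative and should be checked against the Stream Reactor release you are running:

    # Hypothetical example configuration for the InfluxDB sink
    name=influxdb-sink
    connector.class=com.datamountaineer.streamreactor.connect.influx.InfluxSinkConnector
    tasks.max=1
    topics=sensor-readings

    # Target InfluxDB instance and database (adjust to your environment)
    connect.influx.url=http://influxdb:8086
    connect.influx.db=metrics
    connect.influx.username=admin
    connect.influx.password=changeme

    # KCQL mapping from the Kafka topic to an InfluxDB measurement
    connect.influx.kcql=INSERT INTO sensorMeasure SELECT * FROM sensor-readings WITHTIMESTAMP sys_time()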


Experience all features in a single instance, always free for developers

Pull the Docker image for free