Sebastian Dragomir
More and more companies are embracing the data streaming revolution to gain better insight into their business and drive growth. With Apache Kafka gaining momentum in this space, it is natural to want to connect a JavaScript UI application to it. This JavaScript library, built on Redux, brings the best practices of real-time data streaming to the browser and provides a new way to build solutions that read data from and send data to Apache Kafka.
Exposing Kafka through REST endpoints is one way to connect to a backend, but there are now new ways to achieve low-latency updates from Kafka in the browser. Lenses offers a data streaming platform for Apache Kafka and exposes a rich set of endpoints that let a JavaScript application use the Lenses SQL engine to stream messages from a Kafka topic to the browser using consumer group semantics. The communication is bi-directional: from your web app you can also push records back to Kafka.
Consider an FX trading application: to get a list of currency exchange tickers, it is now as easy as firing a request like the following:
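As an illustrative sketch (the topic name `fx-tickers` and the exact payload shape are assumptions; the real subscribe action creator from the library is covered later in this guide):

```javascript
// Hypothetical subscribe request for an FX tickers topic.
// The topic name and action shape are illustrative only; in a real
// app you would dispatch the subscribe action creator provided by
// redux-lenses-streaming with a payload like this one.
const tickerRequest = {
  sqls: "SELECT * FROM `fx-tickers`",
};

// e.g. store.dispatch(subscribe(tickerRequest));
console.log(tickerRequest.sqls);
```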
In this tutorial we will show you how to set up a React/Redux application and retrieve data from Kafka via SQL, or how to connect an existing Redux application and stream data in a matter of minutes.
For this we will be using redux-lenses-streaming, a Redux middleware built to make the process as fast and easy as possible.
This guide assumes some basic knowledge of React and Redux. The demo project has the minimal setup needed to connect, login, publish, subscribe and view streaming Kafka messages via the Lenses WebSocket connection.
If you already have an existing Redux project which you want to update with the library, you will need to install it alongside its peer dependency, RxJS. (Although no knowledge of RxJS is required, we highly recommend looking into it, especially if your application deals with large streams of data, events, or messages.)
npm install rxjs redux-lenses-streaming --save
We start off by adding the Lenses reducer to our root reducer.
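A sketch of this step; the reducer export name and the `lenses` state key are assumptions, so check the redux-lenses-streaming docs for the exact export:

```javascript
// Root reducer sketch (export names are assumptions).
import { combineReducers } from 'redux';
import { reducer as lensesReducer } from 'redux-lenses-streaming';
import { sessionReducer } from './session';

export const rootReducer = combineReducers({
  lenses: lensesReducer,   // holds the connection object and subscriptions
  session: sessionReducer, // our own application state
});
```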
Next we’ll create the lenses middleware and add it in our Redux store; while doing so we can customize the connection options (this way the library will attempt to connect on initialization).
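The store setup might look like the sketch below. The middleware factory name and the option keys (host, user, password) are assumptions based on the options mentioned in this guide; consult the library docs for the exact API:

```javascript
// Store setup sketch: create the Lenses middleware with connection
// options and add it to the Redux store. Names are assumptions.
import { createStore, applyMiddleware } from 'redux';
import { createMiddleware } from 'redux-lenses-streaming';
import { rootReducer } from './rootReducer';

const lensesMiddleware = createMiddleware({
  host: 'ws://localhost:3030', // Lenses WebSocket endpoint (example value)
  user: 'admin',
  password: 'admin',
});

export const store = createStore(rootReducer, applyMiddleware(lensesMiddleware));
```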
Currently the following options are supported.
In case you need to wait for data or user input, leave the options empty when setting up the middleware and dispatch a connect action with the options as payload:
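A sketch of a deferred connect, using a plain-object stand-in for the library's connect action creator (the type string and option keys here are illustrative assumptions; in a real app, import the connect action creator from the library):

```javascript
// Stand-in for the library's connect action creator; the type string
// and option keys are illustrative assumptions.
function connect(options) {
  return { type: 'KAFKA_CONNECT', payload: options };
}

// Dispatch once the user has supplied credentials, e.g.:
// store.dispatch(connect({ host, user, password }));
const action = connect({ host: 'ws://localhost:3030', user: 'admin', password: 'admin' });
console.log(action.type); // 'KAFKA_CONNECT'
```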
On successful connection, an action will be dispatched that you can listen to and respond in your reducers/middleware.
import { Type as KafkaType } from 'redux-lenses-streaming';
Once KafkaType.CONNECT_SUCCESS has been received, you will be able to access the connection via the lenses reducer, whose state holds a connection object and a subscriptions array (empty at this point).
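A minimal sketch of reacting to the success action in our own reducer. The string constant below is a stand-in for `KafkaType.CONNECT_SUCCESS` (its real value is internal to the library, so always compare against the imported constant in real code):

```javascript
// Assumed stand-in for KafkaType.CONNECT_SUCCESS from the library.
const CONNECT_SUCCESS = 'KAFKA_CONNECT_SUCCESS';

// Session reducer sketch: track whether we are connected.
function sessionReducer(state = { connected: false }, action) {
  switch (action.type) {
    case CONNECT_SUCCESS:
      return { ...state, connected: true };
    default:
      return state;
  }
}

const next = sessionReducer(undefined, { type: CONNECT_SUCCESS });
console.log(next.connected); // true
```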
As you may have noticed, the options payload includes user and password properties. If they are provided at the connect stage, the library will automatically attempt to log in. Another way to achieve this is to dispatch a LOGIN action later using the provided action creator (see the docs for the full list of action creators).
By this point we also see KafkaType.KAFKA_HEARTBEAT actions, which we can track in our reducers.
The next step is to subscribe to a topic. For this we will use the subscribe action creator, imported the same way as connect, and send as the payload an object with an sqls property (a String).
An example SQL query would look like this:
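For instance (the topic name `fx-tickers` and field name `ask` are illustrative; substitute your own topic and filter):

```javascript
// Subscribe payload sketch with a Lenses SQL query.
// Topic and field names are illustrative assumptions.
const payload = {
  sqls: "SELECT * FROM `fx-tickers` WHERE ask > 1.0",
};

// e.g. store.dispatch(subscribe(payload));
```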
On a successful subscription, a SUBSCRIBE_SUCCESS action will be dispatched and the subscriptions array in the lenses reducer will be updated with the new subscription.
To listen to Kafka messages, we'll add an action handler for KafkaType.KAFKA_MESSAGE in our application reducer (named sessionReducer in the example). The payload has a content property holding the array of messages, which we can visualize in a data grid or chart.
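A sketch of such a handler, accumulating streamed messages in state (the string constant stands in for `KafkaType.KAFKA_MESSAGE`, and the sample ticker record is made up for illustration):

```javascript
// Assumed stand-in for KafkaType.KAFKA_MESSAGE from the library.
const KAFKA_MESSAGE = 'KAFKA_MESSAGE';

// Session reducer sketch: append incoming messages to state.
function sessionReducer(state = { messages: [] }, action) {
  switch (action.type) {
    case KAFKA_MESSAGE:
      // action.payload.content carries the array of new messages
      return { ...state, messages: state.messages.concat(action.payload.content) };
    default:
      return state;
  }
}

const state = sessionReducer(undefined, {
  type: KAFKA_MESSAGE,
  payload: { content: [{ ticker: 'EUR/USD', ask: 1.17 }] }, // made-up sample record
});
console.log(state.messages.length); // 1
```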
In order to unsubscribe, all we have to do is dispatch an unsubscribe action with the following payload:
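A plausible shape for that payload is sketched below; the exact key is an assumption, so check the library docs (it likely mirrors the subscribe payload):

```javascript
// Unsubscribe payload sketch; the key name and topic are assumptions.
const payload = { topic: 'fx-tickers' };

// e.g. store.dispatch(unsubscribe(payload));
```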
In a similar way to subscribe, for publishing we will dispatch a publish action with the following payload:
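A sketch of a publish payload; the key names (topic, key, value) and the sample record are assumptions for illustration, so verify the exact shape against the library docs:

```javascript
// Publish payload sketch; key names and values are assumptions.
const payload = {
  topic: 'fx-orders',
  key: 'order-1',
  value: JSON.stringify({ pair: 'EUR/USD', amount: 1000 }),
};

// e.g. store.dispatch(publish(payload));
```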
The acknowledgement will be of PUBLISH_SUCCESS type. If you have already subscribed to that topic, you should see the message streamed back to the client.
We are working on adding further features to the library, such as message batching and customizable delays and timeouts.
We recommend trying out the demo project and letting us know what features you would like to be implemented next.
The easiest way to start with Lenses for Apache Kafka is to get the Development Environment, a Docker image that contains Kafka, ZooKeeper, Schema Registry, Kafka Connect, Lenses, and a collection of 25+ Kafka Connectors with example streaming data.