Tutorial - Using D3 to visualize data in Apache Kafka

By Mihalis Tsoukalos

Nov 20, 2019

This blog is a small tutorial on how you can export data that contains coordinates from a Kafka topic as JSON (irrespective of the original serialisation of the data in the topic - AVRO, Protobuf, etc.) and visualize it with D3.js.

To simplify the tutorial, we export the data as JSON before visualising it. However, as my Kafka Data Access blog shows, you can also programmatically stream data live off a Kafka topic using SQL, via a secure data access layer.

For those who don't know Lenses: it provides an enterprise-grade data access, monitoring, administration and governance portal for Apache Kafka, which saves teams from having to build custom tooling and accelerates Kafka adoption.

Pre-requisites

To make it easier to follow this guide, just download the free all-in-one Lenses Box Docker image, which also includes a working Kafka environment.

Why Visualize

There are many reasons for visualizing data, including the following:

  • You get a higher level overview of your data.

  • You might be able to identify patterns in your data.

  • You might understand your data better, especially when you are dealing with lots of it.

  • Beautiful visualizations can be put into company reports.

About Spatial Data

Spatial data contains longitude and latitude information, which is what gets plotted on a map. When visualizing spatial data, you also get the following benefits:

  • You can see the current position of your objects.

  • You can verify the route of your objects.

  • You can detect delays or other kinds of problems.

What is D3.js

D3.js (Data-Driven Documents) is a powerful JavaScript library for visualization. Despite its simple name, D3.js is a professional tool that offers functions that, when applied to DOM elements, can transform the resulting HTML, SVG or CSS documents.

You can learn more about D3.js at d3js.org.

The first thing you should do is run the Lenses Box Docker image:
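The exact command (image name, ports and license flag) comes from the instructions you receive when registering for Box, so the invocation below is only a typical sketch, with “xyz” standing in for the license key:

# run Lenses Box (Kafka + Lenses) and expose the Lenses UI on port 3030
docker run --rm -p 3030:3030 -p 9092:9092 \
  -e ADV_HOST=127.0.0.1 \
  -e EULA="https://licenses.lenses.io/download/lensesdl?id=xyz" \
  lensesio/box:latest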

(NOTE: If you register for Box, you will get an email with a unique free license key. The example above uses “xyz” as the license key, which will not work.)

Then, you will have to log in to Lenses Box from a browser on port 3030, using admin as both the username and password.

The Kafka topic that will be used is sea_vessel_position_reports. The format of the data inside is:
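A (truncated) record from that topic looks roughly like the following; the field names follow the AIS demo data that ships with Lenses Box, the values are illustrative, and only the coordinates and the vessel identifier (MMSI) matter for this tutorial:

  {
    "MMSI": 247039300,
    "Speed": 0.1,
    "Longitude": 15.4415,
    "Latitude": 42.7519,
    "Course": 212.0,
    "Heading": 45,
    "Timestamp": "2019-11-18T12:34:56Z"
  }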

You will often be required to preprocess your data.

In this tutorial, we are just going to extract the fields that interest us and put them into a new Kafka topic. To keep things simple, we are only going to extract the data for a single sea vessel using SQL Studio. The code that will be executed in SQL Studio is the following:
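A Lenses SQL sketch of the extraction described above (the MMSI value identifies a single vessel and is illustrative) looks like this:

  INSERT INTO spatial_data
  SELECT Longitude, Latitude
  FROM sea_vessel_position_reports
  WHERE MMSI = 247039300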

[Image: the query executed in SQL Studio]

The data in spatial_data Kafka topic has the following format:
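Each record in spatial_data now keeps only the coordinates, roughly like this (values illustrative):

  {
    "Longitude": 15.4415,
    "Latitude": 42.7519
  }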

Exporting Data

You will need to export the data in JSON format. As we are using a plain text format, we can export the contents of the spatial_data topic from the Lenses UI. This is illustrated in the image below:

[Image: downloading the spatial_data topic as JSON from the Lenses UI]

As I hinted at the beginning of this blog, you can also access data via CLI and API clients.

Once you press the Download button, you will be prompted to save the data as a JSON file on your local filesystem – in this case as /tmp/exported_data.json.

Note that if you have lots of data in your Kafka topic, you might need to adjust the value of the Records Limit field.

The format of the JSON records contained in /tmp/exported_data.json is the following:
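Broadly, the export is a JSON array in which every element wraps the record value together with the key and some Kafka metadata; the exact envelope shown below is an assumption, but the important point is that the coordinates live under value:

  [
    {
      "key": null,
      "value": {
        "Longitude": 15.4415,
        "Latitude": 42.7519
      },
      "metadata": {
        "partition": 0,
        "offset": 42,
        "timestamp": 1574241296000
      }
    }
  ]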

As the exported records contain more data than needed, we are going to process them with the jq(1) tool, which you might need to install on your operating system:

cat /tmp/exported_data.json | jq '.[] | {value}' > /tmp/1.json

The new format will be as follows:
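Each record in /tmp/1.json now contains only the value object, i.e. it has this shape (coordinates illustrative):

  {
    "value": {
      "Longitude": 15.4415,
      "Latitude": 42.7519
    }
  }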

Then you will need to process the output file a little more, in order to add a , character after each JSON record and embed the entire file in [], as the result should be a JSON array. So, you will need to execute the following commands:
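A sketch that performs the two steps just described, assuming the pretty-printed jq output above in which each record's closing brace sits alone on its own line, is:

  sed 's/^}$/},/' /tmp/1.json > /tmp/2.json                  # add a comma after each record
  sed '$ s/},$/}/' /tmp/2.json > /tmp/3.json                 # drop the comma after the last record
  { echo '['; cat /tmp/3.json; echo ']'; } > /tmp/data.json  # wrap everything in [ ]

Alternatively, jq -s '.' /tmp/1.json > /tmp/data.json collects the records into a single JSON array in one step.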

The last thing that you will need to do with the JSON file is put it on the Internet, so that it can be read from JavaScript. The reason for this is that, for security reasons, JavaScript does not allow you to read files located on the local machine's filesystem.

In this case, the JSON file will be put in a GitHub repository, stored at github.com/mactsouk/datasets/blob/master/data.json and accessed as raw.githubusercontent.com/mactsouk/datasets/master/data.json.

You can put your JSON files in any place you want as long as they are accessible from the JavaScript code.

Visualizing Data

Now that you have your data in JSON format, you will use D3.js in order to visualize it. As JavaScript code is embedded in HTML files, the final version of the D3.js code can be found in visualize-spatial.html, which contains the following code:
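The original visualize-spatial.html is not reproduced here; the sketch below shows the same idea under a few assumptions: D3 v5, the value.Longitude / value.Latitude fields produced earlier, and no base map (the original presumably draws the points over a map background, which this sketch omits). It fetches data.json, fits a Mercator projection to the points, draws one circle per position report, and wires up zoom and pan:

  <!DOCTYPE html>
  <html>
  <head>
    <meta charset="utf-8">
    <title>Sea vessel positions</title>
    <script src="https://d3js.org/d3.v5.min.js"></script>
  </head>
  <body>
  <svg width="960" height="600"></svg>
  <script>
  // A minimal sketch, not the original visualize-spatial.html.
  var svg = d3.select("svg"),
      width = +svg.attr("width"),
      height = +svg.attr("height"),
      g = svg.append("g");  // everything drawn in here gets zoomed/panned

  d3.json("https://raw.githubusercontent.com/mactsouk/datasets/master/data.json")
    .then(function(data) {
      // Keep only the coordinates of each record (assumed field names).
      var points = data.map(function(d) {
        return [d.value.Longitude, d.value.Latitude];
      });

      // Fit a Mercator projection to the extent of the points.
      var projection = d3.geoMercator()
          .fitSize([width, height], { type: "MultiPoint", coordinates: points });

      // One circle per position report.
      g.selectAll("circle")
        .data(points)
        .enter().append("circle")
          .attr("cx", function(d) { return projection(d)[0]; })
          .attr("cy", function(d) { return projection(d)[1]; })
          .attr("r", 3)
          .attr("fill", "steelblue");

      // Zoom and pan.
      svg.call(d3.zoom().on("zoom", function() {
        g.attr("transform", d3.event.transform);
      }));
    });
  </script>
  </body>
  </html>

Open the file in a browser and it will fetch the data and render the positions; scroll to zoom and drag to pan.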

If you are not familiar with D3.js, you might find the JavaScript code quite complex. However, if your spatial data is in the same format as data.json, you can visualize it without any code changes. Additionally, visualize-spatial.html contains extra code that allows you to zoom and pan your maps!

The output of visualize-spatial.html can be seen in the image below:

[Image: the vessel positions plotted on a map by visualize-spatial.html]

Conclusions

Once your Kafka data is accessible through Lenses, you can do whatever you want with it.

Discover the power of Lenses, described by these four use cases for Apache Kafka & Kubernetes.

Additional Resources