Evan Rosenstein
I ordered a rideshare recently from a beach; the app struggled to find a car, so I had to make several requests. After the fourth or fifth attempt, my bank alerted me via SMS to possible fraudulent activity on my credit card. Each time I ordered a ride, the service put a pending charge on my card. After I texted back that it was just me, the bank reactivated my account.
Though the process was annoying, I was reassured that my bank could detect possible fraud that quickly.
Fraud alerts are an example of the diminishing value of aging data; the Forrester graph highlights how time-critical decisions must be made within minutes at most. If my card information had been compromised, any delay in detection would only have led to more headaches and a higher cost to rectify.
This scenario highlights customer expectations in 2020. I expect my bank to alert me and protect me from fraud in a matter of moments. While it may seem like table stakes to users, the truth is that flagging a purchase as fraudulent in real time is not simple.
For the reasons documented above, companies of all sorts turn to Apache Kafka to stream real-time data between their various business applications. However, Kafka is a beast. The model below shows a simplified Kafka architecture that most developers would still find daunting to manage.
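To make "streaming real-time data between applications" concrete, here is a minimal sketch of one application publishing a payment event onto Kafka. The topic name, card ID, and broker address are illustrative assumptions, not details from any particular setup.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; with a managed service like MSK,
        // this would be your cluster's bootstrap string
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each ride request becomes a pending-charge event, keyed by card ID
            producer.send(new ProducerRecord<>("payments", "card-123", "pending-charge:14.50"));
            producer.flush();
        }
    }
}
```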
Not only does Kafka involve multiple distributed technologies, but you also need infrastructure to run and host your pipelines and microservices. And with that come challenges around observability, governance, security, and deployment.
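The consuming side is a microservice in its own right. Below is a hedged sketch of a fraud-flagging service using Kafka Streams: it counts pending charges per card in one-minute windows and emits an alert when a card is charged suspiciously often. The topic names and the threshold are assumptions for illustration, not a production fraud model.

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class FraudFlagger {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-flagger");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Pending-charge events keyed by card ID, produced by apps like the one sketched above
        KStream<String, String> payments = builder.stream("payments");

        payments
            .groupByKey()
            // Tumbling one-minute windows of charge attempts per card
            .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
            .count()
            .toStream()
            // A naive rule: more than three charges on one card in a minute looks suspicious
            .filter((windowedCard, count) -> count > 3)
            .map((windowedCard, count) -> KeyValue.pair(
                windowedCard.key(),
                "possible fraud: " + count + " charges within one minute"))
            .to("fraud-alerts", Produced.with(Serdes.String(), Serdes.Long()).withValueSerde(Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Even this toy version drags in windowed state stores, serdes, and internal changelog topics, which is exactly the operational surface area described above.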
More and more Kafka users turn to a managed cloud offering to host their Kafka, which provides a fully managed, secure, turn-key experience.
It would be wonderful if this were the end of your Kafka struggles but, as a famous college football announcer says, "not so fast, my friends." Kafka is also a black box, offering next to zero visibility into the data moving through it. Imagine a team trying to prevent a breach of your bank account while wearing a blindfold.
Enter DataOps, to observe, secure, build, and deploy the flows and services running on Kafka. Adding a layer of DataOps lets you spend less time stressing about a project meant to solve a business problem. Time you could be spending at the beach.
This leads us to a webinar we developed with our partners at AWS MSK, The Power of Two:
Watch the video to learn how to optimize your Kafka data streams with MSK and Lenses, with several use cases demoed by our CTO, Andrew Stevenson. If you want to explore Kafka + DataOps for yourself, sign up for our portal.