
How Article is winning market share using Apache Kafka

A steep demand spike during the pandemic drove this digital retail leader to break up their monolith in weeks and win market share from their competitors.


  • 220k new customers across North America drove teams to protect customer experience
  • 70% YoY revenue growth saw engineering optimizing order processing & fulfillment
  • Weeks instead of years to break up a monolith
  • 25 software engineers access Kafka across 13 microservices without needing Kafka training

Article is focused on building the easiest way for people to furnish their spaces. They’re unique because they build software not only for themselves but also for their suppliers, orchestrating the entire supply chain from customer order to final-mile delivery.

Back in 2019, the engineering team launched a 2-5 year project to accelerate development by breaking up their monolith into domain applications and modernizing their systems. They initially looked at MuleSoft, but as their requirements grew, they decided to use Apache Kafka as part of their event-driven architecture.

No more sitting comfortably on streaming data

Then overnight, the world changed: They had to break up the monolith the next morning

COVID-19 forced the whole world to go digital, and the eCommerce share of the furniture market significantly increased. During lockdown, people wanted to adorn their homes with new furniture.

Luckily for Article, they were a digital-only retailer. But were they ready for the tsunami of orders to follow?

No one could ever be ready for three months’ worth of Black Fridays.

Engineering priorities had to change to protect customer experience at all costs. They had to optimize order processing, tracking and fulfillment with new urgency, and improve communication services so that customers were kept constantly up to date on the progress of their orders.

Developers could no longer afford to waste time answering these kinds of questions about Kafka (the sketch after this list shows what answering just one of them by hand can involve):

  • Why is there no consumer group for this topic?
  • Has my event been published to this Kafka topic?
  • Is my CDC Kafka Connect connector working correctly?
  • What is the lag on this consumer?
  • What is the schema for this topic?
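
Answering even one of these questions by hand typically meant scripting against the Kafka client APIs. As a rough illustration only (the topic name, consumer group name and broker address below are placeholders, not Article's actual setup), a manual consumer-lag check with the confluent-kafka Python client might look something like this:

  # Minimal sketch of a manual consumer-lag check; topic, group and broker
  # address are hypothetical placeholders.
  from confluent_kafka import Consumer, TopicPartition

  consumer = Consumer({
      "bootstrap.servers": "localhost:9092",
      "group.id": "order-fulfillment-service",
      "enable.auto.commit": False,
  })

  # Look up the topic's partitions, then compare each committed offset
  # against the latest offset in that partition to compute the lag.
  metadata = consumer.list_topics("orders", timeout=10)
  partitions = [TopicPartition("orders", p) for p in metadata.topics["orders"].partitions]

  for tp in consumer.committed(partitions, timeout=10):
      low, high = consumer.get_watermark_offsets(tp, timeout=10)
      committed = tp.offset if tp.offset >= 0 else low  # negative offset means nothing committed yet
      print(f"partition {tp.partition}: committed={committed}, end={high}, lag={high - committed}")

  consumer.close()

Lenses surfaces the same information (consumers, lag, topic data and schemas) in a UI, so engineers don't have to write or run checks like this themselves.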

Luckily, Laurent and his team of 25 engineers were well prepared. With Lenses, they could focus on developing more than 13 microservices, because they had immediate self-service access to operate Apache Kafka without needing a deep understanding of the technology.

“We knew that Kafka was critical for our business. But we also knew it would be difficult to operate. Using Lenses helps us know where to look for the data we need so we can see what’s working across systems and apps.”


Laurent Pernot, Software Engineering Manager - Article

Confidently optimizing truck routing, fulfillment sources & purchase orders

Lenses gave the team a way to visualize and operate their streaming data landscape:

  • Exploring Kafka topics: Navigating Kafka is made far easier by using Lenses’ real-time data catalog and SQL, instead of command-line guesswork, to troubleshoot their applications.

  • Seeing Kafka consumers: Lenses allows the team to immediately see consumers, view the lag and check whether data is moving through their applications.

  • Viewing & evolving schemas: This can be done from a UI with full security and auditing, and without needing to understand a new set of APIs.

  • Deploying Kafka Connect flows: Engineers can use Lenses to view, alert on, and deploy CDC or other Kafka Connect connectors (a rough sketch of what such a deployment involves under the hood follows this list).
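
Under the hood, deploying a connector means submitting a configuration to the Kafka Connect REST API, which Lenses wraps with a UI, alerting and auditing. As a rough illustration only (the connector name, database details and Debezium settings below are placeholders, not Article's actual configuration), registering a CDC connector directly against that API might look something like this:

  # Minimal sketch of registering a CDC connector via the Kafka Connect REST API.
  # All names, hosts and credentials below are illustrative placeholders.
  import requests

  connector = {
      "name": "orders-db-cdc",
      "config": {
          "connector.class": "io.debezium.connector.mysql.MySqlConnector",
          "database.hostname": "orders-db",
          "database.port": "3306",
          "database.user": "cdc_user",
          "database.password": "********",
          "database.server.id": "184054",
          "topic.prefix": "article.orders",  # Debezium 2.x; older releases use database.server.name
          "table.include.list": "shop.orders,shop.shipments",
          "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
          "schema.history.internal.kafka.topic": "schema-history.orders",
      },
  }

  # Kafka Connect workers expose a REST API (port 8083 by default);
  # POST /connectors creates the connector and returns its definition.
  resp = requests.post("http://connect:8083/connectors", json=connector, timeout=10)
  resp.raise_for_status()
  print(resp.json())

With Lenses, the team gets the same result from the UI, plus visibility and alerting on the connector once it is running.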

Saving time spent searching for streaming data

The software engineers at Article can now spend their time more wisely. Lenses has helped them shift from having to learn and operate complex technologies to focusing on increasing the velocity and quality of event-driven applications.

Ultimately, it means the Article engineering team is doing more of what they do best: building the best applications for ecommerce. This way, Article not only reacts in real time to demand during a global crisis, but also maintains a market-leading customer experience for decades to come.

Get your DataOps workspace for Apache Kafka:
