What’s remarkable is how long Confluent competed at the highest level. Creating a category and a new type of application is hard; transitioning to the cloud and surviving against hyperscalers is even harder. That alone is a huge achievement.
Some see this as a pressured exit. But another way to look at it is as a strategic purchase by IBM to strengthen its position in enterprise data movement and integration.
Let’s not underplay the significance of this: for IBM, a 100+ year old company, it is the second-biggest acquisition of all time.
IBM’s official messaging frames the deal as part of its broader data and AI strategy, emphasizing how Confluent’s platform helps connect applications and environments across complex enterprise landscapes. The SeekingAlpha analysis adds a commercial lens, highlighting the distribution and sales benefits IBM gains by bringing Confluent into its software portfolio.
But underneath the narratives, the real question is what this means for the streaming world.
Confluent’s Real Impact
Confluent’s achievements are undeniable.
They built a category and a global streaming community from scratch.
They invested heavily in Kafka and Flink, enabling their competitors.
They scaled a business to more than a billion dollars in revenue, something few companies with open source tech have pulled off.
Kafka changed how modern applications are architected. It powered large-scale, real-time operations at companies like Uber, Netflix, and Airbnb, and later became part of the architecture for most Fortune 500 enterprises.
At the same time, cloud providers built sizable managed Kafka businesses with far smaller engineering investments. Confluent created much of the value but captured a limited share of it. Combined with slowing growth and rising pressure to turn profitable, the conditions for a potential sale slowly took shape.
Where Confluent struggled
Confluent was a victim of its own success and, in the end, of trying to do too much.
This became more apparent after its IPO.
It was under pressure from the market to be the next Databricks. Not being cloud-native, it faced the hard work of “encouraging” on-prem customers and contracts to migrate to its Cloud offering with new consumption models. It reduced investment in its on-prem products, leading to a lack of feature parity and alienating many customers that needed infrastructure on-prem or in their own cloud.
Reducing investment in the Kafka Streams project to back Apache Flink was a big moment. Their Flink service would take a very, very long time to gain the adoption the market expected.
They were fighting battles on all fronts: a price war with AWS and Redpanda on infrastructure; competition with open-source Flink, Ververica and countless other managed services on processing; and competition with the likes of Lenses on Developer Experience, governance and data replication.
And they had to support both on-prem and cloud products and try to offer feature parity on all fronts. And they had to carry the open-source Apache Kafka project almost on their own.
It was just all too much. And time ran out.
Why IBM Is a Logical Buyer
IBM has a long history in application integration and middleware. MQ, WebSphere, webMethods, StreamSets and the Event Automation suite still underpin critical processes in industries from banking to manufacturing. Adding Confluent reinforces IBM’s role in the systems-integration layer, where enterprises continue to invest heavily.
Confluent provided great technical enablement but struggled to articulate a business-modernisation message. This was apparent at its most recent conferences, which tried to marry both and did neither well. IBM will likely help with this.
There are also clear go-to-market synergies. IBM’s enterprise reach gives Confluent access to customers who often rely on established vendors to guide them through big modernisation projects. This alone could expand the footprint of streaming into parts of the market where adoption has been slower.
What This Means for Streaming
The acquisition marks an inflection point. Streaming is no longer viewed as niche infrastructure but as core enterprise data plumbing. The deal suggests a few broader industry effects:
Streaming will fold deeper into established integration and middleware ecosystems.
Competition with cloud-native services will intensify as IBM invests in Confluent as a first-class product.
The level of commitment from IBM on this deal signals that streaming will play a very important role in next-generation agentic enterprise architectures.
But we know what an acquisition of this type means: lost momentum for perhaps a year. This is a good opportunity for the likes of AWS, Aiven, Redpanda, Google and Oracle to take market share.
What This Means for Open Source
This is the question on most people’s minds. Since the birth of Confluent Cloud, Confluent have been cautious about their contributions to the Apache project, putting some enterprise features behind closed source. This is also noticeable with the Kafka connectors.
IBM has over the decades built open source into its core, most notably with the acquisition of Red Hat. They understand open source; it’s critical to their business of supporting hybrid customers.
And although IBM is not putting Confluent in their Red Hat org, the chances are that this acquisition is good for the open-source project.
What This Means for AI
Confluent have been aggressively marketing Kafka as a strategic component of an agentic architecture. Whilst we too very much believe in this, the way they have done it, heavily promoting their managed Flink service, lacks some credibility.
Their marketing and DevRel assets and positioning seemed a little gimmicky, not speaking well to the AI or ML community. And rather than focusing on helping engineers build AI apps more easily with better DevX on their platform, they wanted to go for gold and be part of the agentic fabric.
It was striking that IBM’s press release did not mention Confluent’s product announcements in AI and Agents.
IBM, whilst behind its direct competitors, is nonetheless a powerhouse AI player, notably with its watsonx suite, and will give Confluent more credibility. This can only be good for the overall market of AI agents accessing real-time information.
What This Means for Lenses
The streaming market needs a mixture of different Kafka services: IBM/Confluent Kafka for large, critical, real-time hybrid connectivity backbones; AWS MSK to support cloud-native services; Google’s managed Kafka for analytics stacks alongside BigQuery; and open-source Kafka for edge infrastructure.
We see enterprises adopting a multi-Kafka and vendor strategy.
But the operating plane and Developer Experience need to be unified and decoupled from the infrastructure vendor, avoiding vendor lock-in. AI agents need safe and well-governed access to both the infrastructure and the streaming data (without having to move the data). This is why many of Confluent’s customers select Lenses.






