Confluent has just announced a fresh shot of $250 million in new venture capital, almost doubling its total funding to $456 million and lifting its valuation to $4.5 billion. The round was arguably the world's worst-kept secret, given a Bloomberg report last month that flagged the likelihood of a forthcoming raise.
While later-stage venture rounds like Confluent's typically funnel more resources into the go-to-market side (sales, marketing, and customer support), Confluent is earmarking much of the new round for product development, specifically "Project Metamorphosis," an initiative aimed at building up event streaming capabilities in Confluent Cloud.
The funding caps a year in which annual recurring revenue roughly doubled and revenue for Confluent's managed cloud service more than quadrupled. The company also entered formal partnerships with Google Cloud and AWS to host its managed service and, last fall, became available through the Azure marketplace. With its managed cloud running on each of the major cloud platforms, Confluent is also promoting the service as a multi-cloud platform that can act as a PubSub bridge spanning all of them.
And a few months back, Confluent released version 5.4 of its enterprise platform, which addressed four key areas. Among them was a new multi-region cluster capability that supports asynchronous replication across different data centers, promoting more reliable failover. Security was beefed up with role-based access control in place of the original access control lists, plus new higher-level structured audit logs that replace a more primitive construct with a format readable by Elasticsearch and Splunk; both changes reflect Confluent's goal of elevating its Kafka environment from its origins as a low-level tool for developers building PubSub data pipelines. Finally, schema validation performed at the topic level inside the Kafka broker, rather than in an external schema registry, should harden schema enforcement and result in fewer broken applications.
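As a rough illustration, Confluent exposes broker-side schema validation as a per-topic configuration on Confluent Server. A hedged sketch of what enabling it might look like (the topic name is hypothetical, and this assumes a Confluent Server broker with a Schema Registry already configured):

```shell
# Hypothetical topic; assumes a Confluent Server broker pointed at a
# Schema Registry. Validation is toggled per topic, so the broker
# rejects messages whose payloads don't match a registered schema.
kafka-topics --create \
  --bootstrap-server localhost:9092 \
  --topic orders \
  --partitions 3 --replication-factor 1 \
  --config confluent.value.schema.validation=true
```

The appeal is that enforcement moves from a convention clients are trusted to follow into a gate the broker itself applies.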
Confluent is also reaching out beyond its core audience of Java and Python developers to engage SQL professionals by expanding KSQL, its capability to run SQL queries against streaming data, with a preview of ksqlDB, a new event streaming database. It expands the footprint from streaming data to cached data at rest, in many ways mimicking the role of in-memory databases. With data persisted comes the ability to build materialized views that provide more focused query and analytic capabilities on data that would normally be in motion. Conceivably, you could get the same capability by attaching an in-memory database, but Confluent promotes its alternative as a simpler architecture.
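Conceptually, a materialized view over a stream is just state that is updated incrementally as each event arrives. A minimal Python sketch of that idea (the event schema and the per-user count are hypothetical illustrations, not ksqlDB's actual API):

```python
from collections import defaultdict

def update_view(view, event):
    """Fold one stream event into the 'materialized view':
    here, a running count of pageviews per user."""
    view[event["user"]] += 1
    return view

# A simulated event stream; in a system like ksqlDB these would
# arrive continuously from a Kafka topic rather than a list.
events = [
    {"user": "alice", "page": "/home"},
    {"user": "bob",   "page": "/docs"},
    {"user": "alice", "page": "/pricing"},
]

view = defaultdict(int)
for event in events:
    update_view(view, event)

print(dict(view))  # the view is queryable at any point, mid-stream
```

The point is that queries hit the continuously maintained view rather than replaying the whole stream, which is what lets a streaming system answer lookups on data that would normally be in motion.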
And that’s where Project Metamorphosis kicks in. Admittedly, it’s a play on words, and a bit of a stretch at that. Project Metamorphosis is supposed to transform Kafka event streaming into a cloud-native service that serves as the glue with on-premises data sources. More specifically, the goal is to develop capabilities in Kafka that treat the origins of streams as events that can then be parsed, manipulated, and queried. Hold that thought, as it’s going to be the central theme of Confluent’s technology roadmap for the rest of the year.
Project Metamorphosis was named after the Franz Kafka tale about a poor fellow who, after a rough night, is transformed into an insect, is ostracized and then grudgingly accepted by his family, and, several months later, passes away. You can always count on the open source crowd to tap obscure references to make their projects memorable, although the choice of this one positively has us scratching our heads a bit.
Over the balance of the year, Confluent is promising a wave of monthly announcements for its serpentine Project Metamorphosis initiative, which its press release describes as “event streaming’s next act.” As Confluent drew a quarter of a billion dollars on the strength of this vision, we’re assuming the plan is for results that will be a lot less ghastly.