Informatica’s launch of a cloud service under the new brand Cloud Lakehouse is more than a name change from its previous cloud service. “Lakehouse” introduces a new pricing model for Informatica’s second-generation cloud services, one that moves away from the bundled pricing schemes common in the on-premises software world.
Lakehouse is the latest chapter in the company’s pivot from a traditional on-premises software provider to the cloud. One of the rationales for the company’s 2015 deal to go private was to give it the financial space to refactor its portfolio to cloud-native microservices and transition the revenue model from perpetual licensing to subscriptions. Both were necessary before Informatica could entertain revamping pricing.
The Lakehouse pricing model is based on “Informatica Processing Units” (IPUs). While IPUs sound like a term borrowed from the craft brewing world, they provide a way to quantify the compute and functionality of each service.
For instance, a single “agent,” or instance, of Cloud Data Integration may be rated at 28 units, while an instance of Cloud Data Quality comes in at 67 units (these ratings are subject to change). Pricing in IPUs frees Informatica cloud customers to order tiers of service from which they can mix and match functionality on demand, rather than having to commit to buying specific bundles ahead of time. The pricing also includes an “elastic” option that is, essentially, serverless.
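To make the mix-and-match idea concrete, here is a minimal sketch of how fungible IPU credits might work in principle. The per-instance rates are the illustrative figures quoted above (and subject to change); the function names and the consumption model itself are our assumptions, not Informatica's actual billing logic.

```python
# Hypothetical per-instance IPU rates, taken from the illustrative
# figures in this article (subject to change).
IPU_RATES = {
    "cloud_data_integration": 28,
    "cloud_data_quality": 67,
}

def ipus_needed(instances):
    """Total IPUs consumed by a given mix of service instances.

    `instances` maps a service name to the number of instances running.
    """
    return sum(IPU_RATES[service] * count for service, count in instances.items())

# Because credits are fungible, the same IPU pool can cover different
# mixes of services as needs shift -- no new purchasing cycle required.
mix_a = {"cloud_data_integration": 3}                           # 3 x 28 = 84 IPUs
mix_b = {"cloud_data_integration": 1, "cloud_data_quality": 1}  # 28 + 67 = 95 IPUs

print(ipus_needed(mix_a))  # 84
print(ipus_needed(mix_b))  # 95
```

The point of the abstraction is that the customer commits to a pool of units, not to a particular product: either mix above draws from the same pool, which is what distinguishes this from the old bundle model.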
The new pricing for data quality and data integration is just the first step toward rolling out the scheme to the rest of the Informatica cloud portfolio, with Data Governance set to be the next service to move to this pricing structure.
Rewinding the tape, Informatica’s cloud transition didn’t happen overnight, and it went through several iterations along the way. Way back in prehistoric times – 2008 to be specific – Informatica introduced a multi-tenant Data Integration cloud that also included wizard-driven synchronization and replication cloud services. Predating microservices, this was based on a classic, application-centric monolithic architecture.
The next step in Informatica’s cloud journey was a temporary detour: unifying its core product portfolio with a common metadata and security foundation. Like most enterprise software providers, Informatica’s portfolio came together over the years through a combination of organic development and acquisitions. While it internally developed ETL and much of the data quality portfolio, a major piece – MDM – didn’t come until an acquisition in 2010. It took Informatica five years to integrate the metadata from MDM into the flagship, but in so doing, it took a leap ahead by becoming one of the first enterprise software providers to incorporate graph databases as the underlying glue.
So Informatica’s next chapter in the cloud began almost a decade after its debut, in 2017. Once the portfolio got unified, Informatica refactored the cloud offering into bundles of multiple products, starting with integration cloud and followed a year later by MDM cloud, data governance cloud, and data security cloud. While the services resembled conventional software bundles externally, under the hood they were built in a cloud-native fashion as microservices. Nonetheless, pricing still adhered to a more traditional bundled model, although as cloud services, they were now offered as subscriptions rather than perpetual licenses. That set the stage for the current round, where Informatica is taking the next logical step: refactoring the pricing to better align with modern cloud architecture and with customers’ demands for the flexibility to adjust what they consume as business requirements dictate.
To us, the branding is a bit confusing. If you’re a regular Big on Data reader, “Lakehouse” or “Data Lakehouse” may already sound familiar. A year ago, Databricks introduced this term, and we’ve more recently seen it adopted by Snowflake. Data warehousing veteran Dave Wells views Data Lakehouse as “an interesting idea” that is “thought-provoking,” but “not ready for prime time.” Whatever. Yes, Informatica has also blogged about Cloud Lakehouse as a combo of data warehouse and data lake. But applying this label to the pricing of cloud services doesn’t sound like the most logical brand extension. Could they have chosen branding that’s less ambiguous? That’s above our paygrade.
We expect that Lakehouse pricing for Informatica will be a learning experience.
For the customer, the prime advantage will be flexibility and agility – if you are going to use more than one Informatica tool, this makes credits truly fungible and you don’t have to go through new purchasing cycles when your needs shift. And that’s very much in sync with how modern cloud services are being priced.
But for now, Lakehouse pricing is still capacity-based, not usage-based. Yes, customers can “burn down” IPUs and then redirect them, but this is still equivalent to reserved rather than on-demand capacity. This is something that we expect Informatica will get a better handle on once it gets a clearer picture of the actual consumption patterns for its services. That could open a number of options beyond reserved capacity – such as on-demand or spot-market pricing – that could give Informatica cloud customers even more flexibility.
Regardless of what Informatica calls it, the new pricing is starting to gain customer traction. In its earliest rollout, with Data Integration, Informatica has already signed up a couple dozen customers on the new model and found that, freed from having to make ironclad capacity commitments for specific tools or services, early customers have sprung for contracts that are, on average, 30% to 50% larger than what was customary before. Informatica expects to have the new pricing rolled out to the bulk of its cloud portfolio by year end.