I am a believer that the pathway to innovation must end with putting a product or service into the market that provides economic value to customers. Economics is the branch of knowledge concerned with the production, consumption, and transfer of wealth. Innovation is about economics, and economics is about value creation. It’s about the 4M’s…Make Me More Money!
Products or services that never reach the market may have been great ideas, but you can’t pay the mortgage or buy your Venti Chai Latte with great ideas. Yes, I believe that for something to truly be an innovation, it must achieve market success; it must deliver economic value. I know that probably puts me in conflict with some traditional innovation thinkers, but that’s what makes my role as Chief Innovation Officer at Hitachi Vantara so…interesting (see Figure 1).
Figure 1: “Is Analytics-driven Innovation the Ultimate Oxymoron?”
In my “Analytics-driven Pathway to Innovation” framework, an idea must pass through 3 stages:
- Stage 1: Curiosity – Couple an inquisitive demeanor about how things work with Descriptive Analytics to better understand, track and measure what customers are trying to accomplish.
- Stage 2: Creativity – Integrate imagination with Predictive Analytics to create new, insights-driven products or services that address customers’ decisions, pains and gains.
- Stage 3: Innovation – Integrate AI into new “intelligent” products or services that continuously learn and evolve through usage…delivering economic value to customers and markets.
Stage 3 is where the resulting product or service provides economic value to customers and the market.
So, with such a strong fascination with economics, let me use this blog to bring together all of the different blogs (with their corresponding infographics) that I’ve written on the power of economics in the areas of big data, data science, artificial intelligence and digital transformation. Hey, it beats asking you to buy yet another book!!
This joint research paper with Professor Mouwafac Sidaoui at the University of San Francisco School of Management launched my passion into further understanding and codifying the role that economics plays in deriving and driving new sources of customer, product and operational value with data.
Most organizations make business and operational decisions based upon Accounting GAAP rules, a retrospective methodology for determining valuation (value in exchange). Economics, on the other hand, brings a forward-looking perspective to determining valuation. Organizations that use an economics frame to measure and manage their business operations focus on the value or wealth that an asset can create (value in use). If one wants to exploit data and analytics to enable “doing more with less”, then one must embrace an economics mentality.
A key inflection point for the University of San Francisco “Economic Value of Data” research turned on this observation:
Data is an unusual currency. Most currencies exhibit a one-to-one transactional relationship. For example, the quantifiable value of a dollar is considered to be finite – it can only be used to buy one item or service at a time, or a person can only do one paid job at a time. But measuring the value of data is not constrained by those transactional limitations. In fact, data currency exhibits a network (or multiplier) effect, where data can be used at the same time across multiple use cases thereby increasing its value to the organization. This makes data a powerful currency in which to invest.
The Multiplier Effect is one of the most important concepts in economics. The multiplier effect refers to the increase in final income arising from any new injection of spending. The size of the multiplier depends upon a household’s marginal propensity to consume (MPC). The Multiplier Effect is used to explain the cumulative upward and downward swings of value that occur in a free market system. When investment in an economy rises, it can have a multiple and cumulative effect on national income, output and employment.
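The textbook spending multiplier described above can be sketched in a few lines of Python. This is a hypothetical illustration of the standard formula, multiplier = 1 / (1 − MPC); the dollar figures are made-up examples, not data from the research:

```python
def spending_multiplier(mpc: float) -> float:
    """Simple Keynesian spending multiplier for a given marginal
    propensity to consume (MPC): multiplier = 1 / (1 - MPC)."""
    if not 0 <= mpc < 1:
        raise ValueError("MPC must be in [0, 1)")
    return 1 / (1 - mpc)

# Illustrative (made-up) numbers: a $100M injection with an MPC of 0.8
# produces a multiplier of 5, i.e. roughly $500M of total new income.
injection = 100.0          # $M of new spending
mpc = 0.8                  # households spend 80 cents of each new dollar
total_income = injection * spending_multiplier(mpc)
```

The higher the MPC, the larger the multiplier — which parallels the data argument: the more use cases that re-spend the same data asset, the larger its cumulative value.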
The USF “Economic Value of Data” research work continued to evolve, leading to these three theorems on the economic value of data:
- Economic Value of Data Theorem #1: It isn’t the data that’s valuable; it’s the relationships and patterns (insights) gleaned from the data that are valuable.
- Economic Value of Data Theorem #2: It is from the quantification of the relationships and patterns that we can make predictions about what is likely to happen.
- Economic Value of Data Theorem #3: Predictions drive monetization opportunities through improved (optimized) strategic and operational use cases.
I fully expect the number of “Economic Value of Data” theorems to grow as the concepts mature, especially as organizations seek out new means to exploit the economics of data and analytics to fuel their digital transformation initiatives.
This is the blog that first introduced Schmarzo’s “Economic Digital Asset Valuation Theorem”, and my journey to a Nobel Prize in Economics.
Economies of scale have historically given large enterprises insurmountable market advantages through the exploitation of mass production, distribution and marketing. However, in Digital Transformation, “Economies of Learning” are more powerful than “Economies of Scale” because of the ability to learn and deploy those learnings within digital assets faster.
This blog furthered the discussion on Schmarzo’s Economic Digital Asset Valuation Theorem by drilling into the unique characteristics of digital assets, and how these digital assets manifest themselves at the macro-economic level in three unique “effects”:
- Effect #1: Economic Costs Flatten. The cumulative costs of the data and analytic digital assets flatten as the Marginal Cost of re-using those assets approaches zero.
- Effect #2: Economic Value Grows. Re-use of the data and analytics across future use cases accelerates time-to-value and de-risks those use cases.
- Effect #3: Economic Value Accelerates. The cumulative Economic Value of the digital assets eventually accelerates through the refinement of the digital asset. The analytic modules become more accurate through reuse, which drives improvements in predictive model effectiveness.
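These three effects can be sketched numerically. The following is a minimal Python illustration; the build cost, marginal re-use cost, and refinement rate are made-up assumptions chosen only to show the shapes of the curves, not a model of any real deployment:

```python
def economics_of_reuse(n_use_cases, build_cost=100.0, marginal_cost=2.0,
                       base_value=50.0, refinement_rate=0.1):
    """Cumulative cost and value of a digital asset re-used across use cases.

    Effect #1: marginal cost of each re-use shrinks toward zero, so
               cumulative cost flattens.
    Effects #2 and #3: each re-use refines the asset, so value per use
               case grows and cumulative value accelerates.
    """
    cum_cost, cum_value = [], []
    total_cost = total_value = 0.0
    for k in range(n_use_cases):
        # First use case bears the build cost; re-use cost halves each time
        total_cost += build_cost if k == 0 else marginal_cost * (0.5 ** k)
        # Value per use case compounds as the asset gets more accurate
        total_value += base_value * (1 + refinement_rate) ** k
        cum_cost.append(total_cost)
        cum_value.append(total_value)
    return cum_cost, cum_value
```

Plotting the two lists for, say, ten use cases shows the cost curve flattening while the value curve bends upward — the visual signature of all three effects.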
This blog highlighted Google’s audacious strategy to open-source TensorFlow in order to gain tens of thousands more use cases to improve the predictive effectiveness of the platform that runs Google’s business. So why would Google open source TensorFlow and make it accessible to everyone, even its competitors? The Forbes article “Reasons Why Google’s Latest AI-TensorFlow is Open Sourced” gives us a glimpse into the answer:
“In order to keep up with this influx of data and expedite the evolution of its machine learning engine, Google has open sourced its engine TensorFlow.”
Google is using an open source strategy to get more folks to test and refine TensorFlow in new ways that will ultimately improve its predictive effectiveness and make the products on which TensorFlow is based even more effective.
“If you buy a Tesla today, I believe you’re buying an appreciating asset, not a depreciating asset.” – Elon Musk
It may be one of the most provocative and powerful statements in the nascent years of Artificial Intelligence (AI). An asset that appreciates in value through usage and learning is yet another example of how a leading organization can exploit the unique characteristics of digital assets.
The Elon Musk quote shows a company exploiting the unique economic characteristics of data and analytics – assets that never deplete or wear out, that can be used across an unlimited number of use cases at near-zero marginal cost, and that acquire more value – becoming more predictive, accurate and complete – through use!
The Economic Value Curve is a measure of the relationship between independent variables and a dependent variable in achieving a particular outcome, such as retaining customers, increasing operational uptime, or optimizing inventory. For example, the Economic Value Curve can measure the impact that an independent variable (Maintenance Spend) has on a dependent variable (Uptime %).
The challenge with the Economic Value Curve is the Law of Diminishing Returns. The Law of Diminishing Returns is a measure of the decrease in the marginal (incremental) output of a production process as the amount of a single factor of production is incrementally increased, while the amounts of all other factors of production stay constant.
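The Law of Diminishing Returns on such a curve can be illustrated with a simple concave function. This is a hypothetical sketch: the saturating-exponential form, the 99.9% ceiling, and the sensitivity parameter are all assumptions for illustration, not the actual Economic Value Curve methodology:

```python
import math

def uptime_pct(maintenance_spend: float, max_uptime=99.9, k=0.05) -> float:
    """Uptime % as a concave (saturating) function of maintenance spend.

    Each additional dollar of spend buys less incremental uptime than
    the dollar before it -- the Law of Diminishing Returns.
    """
    return max_uptime * (1 - math.exp(-k * maintenance_spend))

# Marginal uptime gained by each successive $10 increment of spend
# shrinks as total spend grows (illustrative units).
marginal_gains = [uptime_pct(s + 10) - uptime_pct(s) for s in (0, 10, 20, 30)]
```

This is exactly why simply spending more on a single lever stalls out — and why the blog argues for using data and analytics to shift the whole curve rather than crawl along it.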
Transforming one’s Economic Value Curve is an area ripe for exploiting the economic value of data and analytics – to “do more with less” with digital assets that not only never wear out, but actually become more valuable through usage.
There it is. All of my work on understanding and exploiting the single most powerful concept in business…economics. And trust me when I say that I’ve only just gotten started, as there is so much more to explore, learn and exploit with digital assets that never wear out, but actually become more valuable through the learnings gleaned from usage.