Credit: Google News
The financial industry is experiencing technological disruption on a scale never seen before. Access to data underpins this rapid transformation. The volume, complexity, and diversity of data available to decision-makers have grown exponentially in recent years – but the data landscape has also become more disjointed. According to IBM, 90% of the data in the world today has been created in the last two years alone.
Three factors – data (and alternative data), machine learning and natural language processing – are converging to fundamentally change the way investors across global capital markets derive, consume and analyze the information available to them. In turn, this is reshaping investment firms’ trading strategies and their approach to differentiating business intelligence.
The data challenge – more diverse but more disjointed
The pool of data for financial institutions is vast and fragmented. It includes not only the classic fundamental data most are familiar with (financial results, securities prices) but data generated by business processes (such as commercial transactions), machine-generated data (like satellite information) and data from less traditional sources such as social media.
Investors are also turning to these alternative data sets, applying new techniques to find relevant investment signals and capture alpha. Today’s investor may need to make sense not only of security pricing and financial performance but also of satellite images, supply chain information, ESG factors, and even tweets.
While alternative data can add depth to an investment decision, history and common identifiers are essential context for translating the data into relevant information for trading books and portfolios.
The finance industry generates vast amounts of data. Bloomberg receives 100 billion market data messages a day and ingests two million new stories a day from 125,000 news sources. Predictive analytical tools powered by machine learning algorithms and natural language processing sift through all this data to find and deliver the most critical information to investors.
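The kind of filtering described above can be illustrated with a deliberately minimal sketch: rank incoming stories by how many of their terms appear on an investor's watchlist. The headlines, the watchlist, and the scoring function are all hypothetical stand-ins – production systems use far richer language models – but the shape of the workflow (tokenize, score, rank) is the same.

```python
import math

# Toy corpus: a real system ingests millions of stories a day.
# These headlines are invented for illustration only.
stories = [
    "Central bank raises rates amid inflation concerns",
    "Local bakery wins regional pastry award",
    "Chipmaker earnings beat estimates, shares rally",
    "Weather forecast: sunny weekend ahead",
]

# Terms an investor's watchlist might flag (assumed, for illustration).
watchlist = {"rates", "inflation", "earnings", "estimates", "shares"}

def relevance(story: str) -> float:
    """Score a story by its count of watchlist terms, damped by
    story length so long stories don't dominate on raw counts."""
    tokens = [t.strip(",.").lower() for t in story.split()]
    hits = sum(1 for t in tokens if t in watchlist)
    return hits / math.sqrt(len(tokens)) if tokens else 0.0

# Rank stories so the most market-relevant surface first.
ranked = sorted(stories, key=relevance, reverse=True)
```

With these toy inputs the two market-moving headlines rank above the bakery and the weather; a real pipeline would replace the keyword score with a trained language model but keep the same rank-and-deliver structure.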
The greatest challenge for market participants and financial institutions today is to identify what data sets to use; how to ensure the data sets are high quality, consistent, linked and ready to use; and how to quickly make sense of that data to inform critical decisions.
Ready-to-use data enabling automation
The explosion in data is just a baseline driver of change and complexity. Given the size of the task at hand, it is no surprise that financial institutions are looking to technology to automate processes to help manage data better, and to generate alpha.
Machine learning has rapidly emerged at the forefront of those technologies. Advancements in natural language processing and data collection are driving trade automation across electronic execution platforms. Firms are tapping big data and machine learning to anticipate client demand and price swings.
On the buy-side, hedge funds and asset managers are using predictive analytics to assess risk based on market liquidity. As more workflows become automated, financial professionals can focus more on the cognitive side, including strategy, portfolio selection and formulating investment theses. These methods are being applied to a wide range of problems in finance because they allow more sophisticated intelligence to be built into trading and client-facing workflows.
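One simple, well-known way to quantify the liquidity risk mentioned above is the Amihud illiquidity ratio: the average of absolute daily return divided by dollar volume. This is a sketch of that measure, not the specific analytics any firm in the article uses; the return and volume figures are hypothetical.

```python
def amihud_illiquidity(returns, dollar_volumes):
    """Mean of |daily return| / dollar volume. Higher values mean
    a given trade moves the price more, i.e. the security is less
    liquid and riskier to exit in size."""
    ratios = [abs(r) / v for r, v in zip(returns, dollar_volumes) if v > 0]
    return sum(ratios) / len(ratios)

# Same price moves, but the second security trades 100x less volume
# (hypothetical numbers), so its illiquidity score is far higher.
liquid = amihud_illiquidity([0.01, -0.02, 0.015], [5e6, 6e6, 5.5e6])
thin = amihud_illiquidity([0.01, -0.02, 0.015], [5e4, 6e4, 5.5e4])
```

A predictive risk model might use a score like this as one input feature among many, flagging positions whose exit cost is likely to spike.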
In the Bloomberg ‘Machine Learning Decoded’ roadshow across four cities in the Asia Pacific region – Sydney, Singapore, Tokyo and Mumbai – we found the majority of financial professionals are using machine learning to generate signals and factors, as well as to optimize their execution strategies for greater trading efficiency. Of those markets, Japan was by far the most advanced in applying machine learning in trading models, while leveraging the most diverse range of data sets in investment strategies.
Data becoming a service
In the machine learning age, financial data and information will become not only richer, but more predictive and better organized. For example, at Bloomberg we currently standardize data through ‘tidy formats,’ which increase efficiency and cross-referencing by providing standardized historical data that facilitates analytics.
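The idea behind a tidy format is that each variable gets a column and each observation gets a row, which makes joining and analyzing data sets straightforward. Here is a minimal sketch of reshaping a 'wide' price snapshot into tidy rows; the tickers, dates, field name and values are all invented for illustration and do not reflect any actual data schema.

```python
# A 'wide' snapshot: one row per ticker, one column per date
# (hypothetical tickers and prices).
wide = {
    "ABC": {"2024-01-02": 101.5, "2024-01-03": 102.0},
    "XYZ": {"2024-01-02": 55.2, "2024-01-03": 54.8},
}

def to_tidy(wide_table, field="px_last"):
    """Flatten to one observation per row: (ticker, date, field, value).
    Tidy rows join cleanly against other standardized data sets keyed
    on the same identifiers."""
    rows = []
    for ticker, series in wide_table.items():
        for date, value in sorted(series.items()):
            rows.append({"ticker": ticker, "date": date,
                         "field": field, "value": value})
    return rows

tidy = to_tidy(wide)
```

Once in this shape, the same history can be filtered, aggregated, or cross-referenced against other tidy data sets with generic tools instead of format-specific code.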
In view of the increasing complexity of information, data platforms are becoming more popular, and customers are looking for data to be delivered in more innovative ways. Web-based data platforms, for example, help customers explore, purchase and interact with bulk data sets in ways that are easier to manipulate, model and visualize. The non-traditional data sets we’ve included in our web-based platforms – insights on metals inventory, equities blogger sentiment, drug approvals, parking lot activity, construction permits, geopolitical risk, and app utilization – are indicative of the growing demand for the data being made available today.
Users want flexibility in when and where to use data applications, and we see data soon becoming a service, with future platforms becoming lighter and more mobile.
Machine learning is fundamentally data-driven and can help investors capture very complicated relationships quickly. This allows them to approach problems that were heretofore intractable due to complicated interactions in the data, the complexity of the problems, or the limited availability of data and computational resources.
The techniques and technologies available now are getting more sophisticated, but underpinning all this, and the success of machine learning strategies, is the growing importance of high-quality, linked and actionable data. Companies that understand this now and put in place a data-driven enterprise strategy will be the eventual winners in this new machine learning age.