To the average person, artificial intelligence (AI) is more science fiction than science fact. While people use virtual assistants like Siri and Cortana daily, most American workers don't believe that automated systems could replicate their decision-making. Yet advanced AI is already replacing writers, actuaries, and network administrators, and that growth is expected to continue: by 2025, the AI market will exceed $165 billion, according to a report by Research and Markets.
A decade ago, first movers in the AI space were companies doing their own advanced algorithmic research. Today, however, innovators bringing AI to new markets are not performing advanced computer science research or developing new parallel computers. Instead, innovators are typically firms that have decades of well-organized legacy data. Somehow, these innovators are turning dusty files and archival tapes into systems that can match the intelligence of experienced humans.
Case studies: Automating the insurance industry and smart agriculture
IBM's Watson system made national headlines in 2011 when it beat Jeopardy champions Ken Jennings and Brad Rutter. Watson is a question-answering system that uses multiple techniques to analyze language, interpret questions, sort and classify data, and score potential responses. To prepare for Jeopardy, IBM scientists put Watson through an extensive training regimen, giving it access to the collected works of Shakespeare and more than 25,000 previous questions from the game show.
Yet Watson is not simply a quiz show gimmick. It is a modular system that IBM uses as the core of its services business. When Fukoku Mutual Life Insurance wanted a system that could automate the work of several insurance analysts, it partnered with IBM, training Watson on historical claims data, payments made to policyholders, and other relevant information. The result was a single computer system that replaced a staff of 34. Fukoku cut payroll expenses by 90% while boosting productivity by 30%. Notably, Fukoku didn't need to perform AI research of its own. The company only needed to work with IBM to train an existing system on data it already had in house.
Small companies can achieve similar results. Tractable, a company based in the United Kingdom, has released an app that can automatically process automotive and personal property damage claims. By feeding its “patented machine learning technology” information about past claims, the mobile app is able to visually appraise claim damage and increase processing speeds for policyholders by a factor of 10. Other conventional insurance companies are implementing similar AI features in their own mobile apps.
Interestingly, while Tractable developed its own AI to solve this problem, replicating its success in other spaces would not require solving a "hard" computer science problem. Neural networks and AI systems have become fully developed modules offered by leading cloud service providers: IBM enables users to write software that calls upon Watson, Google makes TensorFlow available to its customers, and Amazon has developed SageMaker to perform similar functions. Training these systems for a specific task is possible with minimal programming experience, plus a large quantity of historical data. Tractable fed its neural network millions of historical claims and millions of images in order to produce a workable product.
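To make the idea concrete, here is a minimal sketch of what "training a model on historical claims data" means. This is not Tractable's or IBM's actual pipeline; the field names and values are invented, and a real project would hand this step to a library such as TensorFlow or SageMaker rather than writing gradient descent by hand. The workflow is the same, though: digitized records in, a trained predictor out.

```python
import math

# Toy stand-in for digitized legacy claims: (repair cost in thousands,
# vehicle age in years, label), where label 1 means the claim was
# approved as filed. All fields and values here are hypothetical.
claims = [
    (1.2, 2, 1), (0.8, 1, 1), (2.0, 3, 1), (1.5, 4, 1),
    (9.5, 12, 0), (8.0, 10, 0), (7.2, 15, 0), (11.0, 9, 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit a tiny logistic-regression classifier by gradient descent --
# conceptually what a cloud ML service does at far larger scale.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(2000):
    for cost, age, label in claims:
        p = sigmoid(w[0] * cost + w[1] * age + b)
        err = p - label
        w[0] -= lr * err * cost
        w[1] -= lr * err * age
        b -= lr * err

def approve_probability(cost, age):
    """Score a new claim against the patterns learned from history."""
    return sigmoid(w[0] * cost + w[1] * age + b)

print(approve_probability(1.0, 2))    # small, recent-vehicle claim: high
print(approve_probability(10.0, 11))  # large, old-vehicle claim: low
```

The point of the sketch is that none of this is research-grade computer science; the scarce ingredient is the labeled historical data at the top of the file.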
This represents a significant shift in the market for AI. Now that the computational side of these systems is well established, possessing data to train them has become the most important factor in automation. Do you want to be the first to automate your industry? Start digitizing your legacy data, applying labels and categories as you go. Then feed that data into an off-the-shelf system before your competitors do.
Many sectors of agriculture have been slow to adopt digitization. Animal husbandry has been notoriously difficult, as each animal can have significant biological differences from its peers. Yet, Connecterra has developed one of the first AI systems that uses machine learning to diagnose illness in cows early on by looking for anomalies in behavior, among other factors. The company created a wearable similar to a Fitbit, but was able to use Google’s TensorFlow to rapidly aggregate all of the data collected by those sensors and make accurate predictions: If a cow has not eaten for 12 hours, should it be treated for a mouth infection?
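Connecterra's production models are machine-learning systems built on TensorFlow, but the underlying idea of flagging behavioral anomalies can be sketched with something as simple as a z-score over one sensor stream. The readings below are invented, and this is an illustration of the concept, not the company's method:

```python
import statistics

# Hypothetical readings from one cow's wearable: hours between feedings
# over recent days. A healthy animal shows a fairly stable pattern.
feeding_gaps = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3, 3.2, 2.7]

mean = statistics.mean(feeding_gaps)
stdev = statistics.stdev(feeding_gaps)

def is_anomalous(gap_hours, threshold=3.0):
    """Flag a feeding gap far outside this cow's normal pattern."""
    return abs(gap_hours - mean) / stdev > threshold

print(is_anomalous(3.2))   # a typical gap is not flagged
print(is_anomalous(12.0))  # 12 hours without eating is flagged
```

A real system learns each animal's baseline from many signals at once, which is where a framework like TensorFlow earns its keep, but the flagged anomaly is still what triggers the "should this cow be treated?" question.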
While the innovators behind Connecterra were certainly technically adept, they were also able to stand on the shoulders of giants and go to market quickly by not reinventing everything themselves. By creating an innovative sensor package and letting the open-source TensorFlow form the core of the machine learning system, Connecterra shaved years off development time and used a Google-optimized algorithm rather than relying on an unproven system fraught with bugs.
What can you do with your legacy data?
What business processes could your company be automating? Could you be the one responsible for millions in cost savings? PreScouter’s Research Support Service can help connect you with Open Innovation and Advanced Degree Researchers who know the ins and outs of technologies like TensorFlow. Get started today!