AI Bouncing Off the Walls as Growing Models Max Out Hardware

December 20, 2019
in Artificial Intelligence

The growing size of AI models is bumping into the limits of the hardware needed to process them, meaning current AI may be hitting the wall. (GETTY IMAGES)

By John P. Desmond, AI Trends Editor

Has AI hit the wall? Recent evidence suggests it might be the case.

At the recent NeurIPS conference in Vancouver, software engineer Blaise Aguera y Arcas, the head of AI for Google, recognized the progress in using deep learning techniques to get smartphones to recognize faces and voices, but he also called attention to the limitations of deep learning.

Blaise Aguera y Arcas, the head of AI for Google

“We’re kind of like the dog who caught the car,” Aguera y Arcas said in an account reported in Wired. Problems that involve more reasoning or social intelligence, like sizing up a potential hire, may be out of reach of today’s AI. “All of the models that we have learned how to train are about passing a test or winning a game with a score, [but] so many things that intelligences do aren’t covered by that rubric at all,” he stated.

A similar theme was struck in an address by Yoshua Bengio, director of Mila, an AI institute in Montreal, known for his work in artificial neural networks and deep learning. He noted how today’s deep learning systems yield highly specialized results. “We have machines that learn in a very narrow way,” Bengio said. “They need much more data to learn a task than human examples of intelligence, and they still make stupid mistakes.”

Both speakers recommended that AI developers seek inspiration from the biological roots of natural intelligence, so that, for example, deep learning systems could be flexible enough to handle situations different from those they were trained on.

A similar alarm was sounded by Jerome Pesenti, VP of AI at Facebook, in another recent account in Wired on AI hitting the wall. Pesenti joined Facebook in January 2018, inheriting a research lab created by Yann LeCun, a French-American computer scientist known for his work on machine learning and computer vision. Before Facebook, Pesenti had worked on IBM’s Watson AI platform and at BenevolentAI, a company applying the technology to medicine.

Jerome Pesenti, VP of AI at Facebook

“Deep learning and current AI, if you are really honest, has a lot of limitations. We are very very far from human intelligence, and there are some criticisms that are valid: It can propagate human biases, it’s not easy to explain, it doesn’t have common sense, it’s more on the level of pattern matching than robust semantic understanding. But we’re making progress in addressing some of these, and the field is still progressing pretty fast. You can apply deep learning to mathematics, to understanding proteins, there are so many things you can do with it,” Pesenti stated in the interview.

The compute power required for advanced AI, the sheer volume of hardware needed, continues to grow, and a continuation of this growth rate appears unrealistic. “Clearly the rate of progress is not sustainable. If you look at top experiments, each year the cost is going up 10-fold. Right now, an experiment might be in seven figures, but it’s not going to go to nine or ten figures, it’s not possible, nobody can afford that,” Pesenti stated. “It means that at some point we’re going to hit the wall. In many ways we already have.”
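
As a rough illustration of the arithmetic behind Pesenti’s point, the sketch below projects experiment costs under the quoted 10-fold annual growth. The $1 million starting point (the low end of “seven figures”) and the $1 billion ceiling (ten figures) are assumptions chosen for illustration; only the 10x growth rate comes from the interview.

```python
# Minimal sketch of the cost trajectory Pesenti describes: costs growing
# 10-fold per year. The starting cost and ceiling are assumed for illustration.

start_cost = 1_000_000       # assumed: low end of "seven figures"
growth_per_year = 10         # quoted: "each year the cost is going up 10-fold"
ceiling = 1_000_000_000      # assumed: "ten figures", i.e. $1B

cost, years = start_cost, 0
while cost < ceiling:
    cost *= growth_per_year
    years += 1

print(f"At 10x per year, a ${start_cost:,} experiment reaches ${cost:,} in {years} years.")
# -> At 10x per year, a $1,000,000 experiment reaches $1,000,000,000 in 3 years.
```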

The way forward is to work on optimization, getting the most out of the available compute power.
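
One common form of such optimization, offered here only as an illustrative sketch and not as the specific approach Pesenti describes, is mixed-precision training, which trades lower-precision arithmetic for higher throughput on the same hardware. The sketch assumes PyTorch and uses a toy model and random data as stand-ins.

```python
# Illustrative sketch only: mixed-precision training with PyTorch's automatic
# mixed precision (AMP), one common way to get more out of existing GPUs.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"   # AMP pays off on GPUs; fall back to fp32 on CPU

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)
loss_fn = nn.CrossEntropyLoss()

for step in range(10):
    x = torch.randn(64, 512, device=device)          # toy batch
    y = torch.randint(0, 10, (64,), device=device)   # toy labels
    optimizer.zero_grad()
    with torch.cuda.amp.autocast(enabled=use_amp):   # run the forward pass in fp16 where safe
        loss = loss_fn(model(x), y)
    scaler.scale(loss).backward()                    # scale the loss to avoid fp16 underflow
    scaler.step(optimizer)
    scaler.update()
```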

Similar observations are being made by Naveen Rao, VP and general manager of Intel’s AI Products Group. Speaking at the company’s recent AI Summit, according to an account in Datanami, he suggested that the growth in the size of neural networks is outpacing the ability of the hardware to keep up. Solving the problem will require new thinking about how processing, network, and memory work together.

Naveen Rao, VP and general manager of Intel’s AI Products Group

“Over the last 20 years we’ve gotten a lot better at storing data,” Rao stated. “We have bigger datasets than ever before. Moore’s Law has led to much greater compute capability in a single place. And that allowed us to build better and bigger neural network models. This is kind of a virtuous cycle and it’s opened up new capabilities.”

More data translates to better deep learning models for recognizing speech, text, and images. Computers that can accurately identify images and chatbots that can carry on fairly natural conversations are prime examples of how deep learning is affecting daily life. However, this cutting-edge AI is available only to the biggest tech firms, such as Google, Facebook, Amazon, and Microsoft. Even so, we may be nearing the limit.

Application-specific integrated circuits (ASICs) could help move more AI processing to the edge. Intel is also planning discrete graphics processing units (GPUs) and recently unveiled a vision processing unit (VPU) chip.

“There’s a clear trend where the industry is headed to build ASICs for AI,” Rao stated. “It’s because the growth of demand is actually outpacing what we can build in some of our other product lines.”

Facebook AI researchers recently published a report on their XLM-R project, a natural language model based on the Transformer model from Google. XLM-R is engineered to be able to perform translations between 100 different languages, according to an account in ZDNet.

XLM-R runs on 500 of NVIDIA’s V100 GPUs, and it is hitting the wall, running into resource constraints. The model has 24 layers, 16 “attention heads,” and 500 million parameters; even so, it has a finite capacity and reaches its limit.

“Model capacity (i.e. the number of parameters in the model) is constrained due to practical considerations such as memory and speed during training and inference,” the authors wrote.
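
To make the memory constraint concrete, the back-of-the-envelope sketch below estimates the training footprint of a model in XLM-R’s reported size class. Only the roughly 500 million parameter figure comes from the article; the fp32 storage assumption and the extra copies for gradients and Adam optimizer state are common rules of thumb, not numbers from the Facebook paper.

```python
# Back-of-the-envelope sketch: why parameter count runs into memory limits.
params = 500_000_000             # size class reported for XLM-R
bytes_per_param = 4              # assumed fp32 storage

weights_gb = params * bytes_per_param / 1e9
grads_gb = weights_gb            # one gradient value per parameter
adam_state_gb = 2 * weights_gb   # Adam keeps two moment estimates per parameter

total_gb = weights_gb + grads_gb + adam_state_gb
print(f"weights ~{weights_gb:.0f} GB, gradients ~{grads_gb:.0f} GB, "
      f"optimizer state ~{adam_state_gb:.0f} GB, total ~{total_gb:.0f} GB")
# ~2 GB of weights alone and ~8 GB before counting activations: a large slice of
# a single V100's 16-32 GB, which is one reason training spans hundreds of GPUs.
```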

The experience exemplifies two trends in AI on a collision course: the drive of scientists to build bigger and bigger models to get better results, and the roadblock of limited computing capacity.

Read the source articles in Wired, Datanami, and ZDNet.

Credit: AI Trends By: Benjamin Ross
