
Research Into Hardware Aims to Lower Demands and Expense of AI Software

May 22, 2020
in Artificial Intelligence

Research into new hardware aims to make the high demands AI makes on software and energy consumption more practical. (Credit: Getty Images)

By AI Trends Staff  


With the energy and compute demands of AI machine learning models growing at what appears to be an unsustainable rate, researchers at Purdue University are experimenting with specialized hardware aimed at offloading some of AI's demands from software. 

The approach exploits properties of quantum materials, in particular proton transport. 

“Software is taking on most of the challenges in AI. If you could incorporate intelligence into the circuit components in addition to what is happening in software, you could do things that simply cannot be done today,” stated Shriram Ramanathan, a professor of materials engineering at Purdue University, in an account from Purdue University published on sciencesprings. 

Shriram Ramanathan, Professor of Materials Engineering, Purdue University

Relying on software with massive energy needs to make AI work is not sustainable, Ramanathan suggested. If hardware and software could share intelligence features, the silicon might be able to achieve more with a given input of energy.  

The hardware the Purdue team is developing is made of a quantum material with properties the team is working to understand and apply to problems in electronics. Software organizes information into tree-like memory with various "branches," much as the human brain categorizes information and makes decisions. 

“Humans memorize things in a tree structure of categories. We memorize ‘apple’ under the category of ‘fruit’ and ‘elephant’ under the category of ‘animal,’ for example,” said Hai-Tian Zhang, a Lillian Gilbreth postdoctoral fellow in Purdue’s College of Engineering. “Mimicking these features in hardware is potentially interesting for brain-inspired computing.” 
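
To make the analogy concrete, here is a minimal software sketch of that tree-structured memory. The category names come from Zhang's example; everything else is illustrative and is not a model of the Purdue device.

```python
# Illustrative only: a software analogue of the tree-structured memory
# described above. Categories are branches; memorized items are leaves.
memory = {
    "fruit": ["apple", "banana"],
    "animal": ["elephant", "dog"],
}

def recall_category(item, tree):
    """Return the branch (category) under which an item was memorized."""
    for category, members in tree.items():
        if item in members:
            return category
    return None

print(recall_category("apple", memory))     # fruit
print(recall_category("elephant", memory))  # animal
```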

The team introduced a proton into a quantum material called neodymium nickel oxide. They discovered that applying an electric pulse to the material moves the proton around. Each new position of the proton creates a memory state; multiple electric pulses create a branch made up of memory states. 

“We can build up many thousands of memory states in the material by taking advantage of quantum mechanical effects. The material stays the same. We are simply shuffling around protons,” Ramanathan said. 
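
A toy simulation can illustrate the scheme Ramanathan describes. This is an assumption-laden sketch, with the site count, step sizes, and wrap-around rule invented for illustration; it is not a physical model of neodymium nickel oxide.

```python
# Toy model (not physics): each electric pulse nudges a single proton to a
# new lattice position, and each distinct position acts as a memory state.
class ProtonMemory:
    def __init__(self, num_sites=1000):
        self.num_sites = num_sites   # available lattice positions
        self.position = 0            # current proton position
        self.branch = [0]            # sequence of states forms one "branch"

    def pulse(self, step):
        """Apply an electric pulse; the proton shifts to a new site."""
        self.position = (self.position + step) % self.num_sites
        self.branch.append(self.position)
        return self.position

mem = ProtonMemory()
for step in (3, 7, 2):   # three pulses build a branch of memory states
    mem.pulse(step)
print(mem.branch)         # [0, 3, 10, 12]
```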

The team showed that the material is capable of learning the numbers 0 through 9, a baseline test of AI. Demonstrating this task at room temperature in a material is a step toward showing that hardware could offload tasks from software.  
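
For comparison, here is how that same digits task is conventionally handled in software, using scikit-learn's bundled 8x8 digits dataset. This is the kind of workload the Purdue hardware would aim to offload; the choice of classifier here is an assumption, not part of the study.

```python
# The digit-recognition task described above, run the conventional way:
# entirely in software, on a general-purpose processor.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)   # 8x8 grayscale images of digits 0-9
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")  # roughly 0.96
```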

“This discovery opens up new frontiers for AI that have been largely ignored because implementing this kind of intelligence into electronic hardware didn’t exist,” Ramanathan said. 

The results of this study are published in the journal Nature Communications. 

Whether AI machine learning is making unreasonable power demands is also being investigated by John Naughton, professor of the public understanding of technology at the Open University, the largest university in the UK for undergraduate education. He is the author of "From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet." 

John Naughton, Professor of the Public Understanding of Technology, the Open University.

Researchers at Nvidia, the manufacturer of the GPUs (graphics processing units) now used in most machine-learning systems, developed a natural language model that was 24 times bigger than its predecessor and 34 percent better at its learning task, Naughton wrote in a recent account in The Guardian. Training the final model took 512 V100 GPUs running continuously for 9.2 days; one expert calculated that the run consumed three times the yearly energy consumption of the average American. 
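
A back-of-the-envelope check makes the scale of that claim concrete. The 300 W figure (the V100's rated power) and the roughly 10,650 kWh yearly electricity use of an average US household are assumptions not given in the article, with "average American" read as an average US household.

```python
# Back-of-the-envelope check of the training-energy figure above.
# Assumptions (not from the article): 300 W per V100 (its rated power),
# ~10,650 kWh/year for an average US household (EIA-style estimate).
gpus = 512
watts_per_gpu = 300          # assumed V100 power draw
days = 9.2

kwh = gpus * watts_per_gpu * 24 * days / 1000
print(f"training energy: {kwh:,.0f} kWh")        # ~33,900 kWh
print(f"vs. household/yr: {kwh / 10_650:.1f}x")  # ~3.2x, matching the claim
```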

“You don’t have to be Einstein to realize that machine learning can’t continue on its present path, especially given the industry’s frenetic assurances that tech giants are heading for an ‘AI everywhere’ future,” Naughton stated. 

He, too, suggests that advances in hardware could help make the demands of AI more practical, noting that Apple's iPhone 11 includes the A13 chip, which incorporates dedicated neural network hardware behind recent advances in natural language and image processing. 

IBM Experimenting with the Neural Computer  

Meanwhile, researchers at IBM are investigating the "neural computer," a new type of computer designed to develop AI algorithms and assist in computational neuroscience. The Neural Computer is a deep "neuroevolution" system that combines a hardware implementation of an Atari 2600, image preprocessing, and AI algorithms in an optimized pipeline, according to a recent account in VentureBeat. (The Atari 2600, originally branded as the Atari Video Computer System, was introduced in 1977.) 

The research team reports, in an IBM technical research paper released earlier this year, that the system has achieved a record training throughput of 1.2 million image frames per second. 

Video games are a well-established platform for AI and machine learning research. In reinforcement learning, the AI learns optimal behaviors by interacting with its environment in pursuit of rewards, such as good game scores. AI algorithms developed within games have been shown to be adaptable to practical uses, including protein folding prediction. If the results from IBM's Neural Computer prove to be repeatable, the system could be used to accelerate the development of AI algorithms. 
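
For readers unfamiliar with the technique, the sketch below shows that reward-driven loop in its simplest tabular form. The 5-state corridor environment is invented for illustration; IBM's Neural Computer trains on full Atari games via neuroevolution, not Q-learning.

```python
import random

# Minimal tabular Q-learning sketch of the reward-driven loop described
# above: act, observe a reward, and update value estimates.
N_STATES, N_ACTIONS = 5, 2          # states 0..4; action 1 moves right
q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]

def step(state, action):
    """Toy environment: moving right toward state 4 earns a reward of 1."""
    nxt = min(state + action, N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration
for _ in range(2000):               # training episodes
    s = 0
    while s != N_STATES - 1:
        if random.random() < eps:   # explore occasionally
            a = random.randrange(N_ACTIONS)
        else:                       # otherwise exploit current estimates
            a = max(range(N_ACTIONS), key=lambda a: q[s][a])
        s2, r = step(s, a)
        q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
        s = s2

print(q)  # in every non-terminal state, action 1 (move right) scores higher
```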

Over the course of five experiments, IBM researchers ran 59 Atari 2600 games on the Neural Computer. Training required 6 billion game frames in total, and the system failed at challenging exploration games like Montezuma's Revenge and Pitfall. But it managed to outperform a popular baseline, a Deep Q-network (an architecture pioneered by DeepMind), in 30 out of 59 games after 6 minutes of training (200 million training frames), compared with the Deep Q-network's 10 days of training. With 6 billion training frames, it surpassed the Deep Q-network in 36 games while taking two orders of magnitude less training time (2 hours and 30 minutes). 
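
A quick calculation confirms the rough size of that speedup.

```python
# Quick check of the "two orders of magnitude" claim above.
dqn_hours = 10 * 24              # Deep Q-network baseline: 10 days
neural_computer_hours = 2.5      # IBM Neural Computer: 2 hours 30 minutes
print(f"speedup: {dqn_hours / neural_computer_hours:.0f}x")  # 96x, ~10^2
```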

Read the source articles in sciencesprings, The Guardian, VentureBeat and at IBM Research. 

Credit: AI Trends. By Allison Proffitt
