Google experiments with AI to design its in-house computer chips

February 19, 2020, in Machine Learning

Alphabet’s Google unit is trying out artificial intelligence programs to advance its internal development of dedicated chips to accelerate its software, according to Google’s head of AI research, Jeff Dean. 

“We are using it internally for a few chip design projects,” said Dean in an interview with ZDNet Monday, following a keynote talk he gave at the International Solid-State Circuits Conference, an annual technical symposium held in San Francisco. 

Google has, over the course of several years, developed a family of AI hardware, the Tensor Processing Unit, or TPU, for processing AI in its server computers. 

Using AI to design those chips would represent a kind of virtuous cycle, where AI makes chips better, and then those improved chips boost the power of the AI algorithms, and so on. 

During his keynote, Dean described to the audience how a machine learning program can be used to make some decisions about how to lay out the circuits of a computer chip, with the resulting design matching or exceeding the work of a human chip designer. 

Also: AI on steroids: Much bigger neural nets to come with new hardware, say Bengio, Hinton, and LeCun

In the traditional “place and route” task, chip designers use software to determine the layout in a chip of the circuits that form the chip’s operations, analogous to designing the floor plan of a building. A number of variables come into play to find an optimal layout that fulfills several objectives, including delivering chip performance, but also avoiding unnecessary complexity that can drive up the cost to manufacture the chip. That balancing act requires a lot of human heuristics about how best to pursue design. Now, AI algorithms may be able to experiment in ways that can be competitive with those heuristics. 
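
To make those competing objectives concrete, here is a toy scoring function of the kind a placement tool might try to minimize. It is a hypothetical sketch, not Google's method: it combines a standard half-perimeter estimate of wirelength with a crude congestion penalty, and the blocks, nets, and weights are made up for illustration.

```python
# Hypothetical placement score (illustrative only, not Google's method).
# A "placement" maps each block to an (x, y) grid cell; a "net" is a group
# of blocks that must be wired together.
from itertools import combinations

def half_perimeter_wirelength(placement, nets):
    """Estimate wiring as the half-perimeter of each net's bounding box."""
    total = 0
    for net in nets:
        xs = [placement[b][0] for b in net]
        ys = [placement[b][1] for b in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def congestion_penalty(placement):
    """Penalize blocks stacked on the same cell, a crude manufacturability proxy."""
    cells = list(placement.values())
    return sum(1 for a, b in combinations(cells, 2) if a == b)

def placement_cost(placement, nets, w_wire=1.0, w_congest=10.0):
    """Lower is better: trade wirelength off against congestion."""
    return (w_wire * half_perimeter_wirelength(placement, nets)
            + w_congest * congestion_penalty(placement))

# Three blocks joined by a single net: wirelength 5, no congestion.
print(placement_cost({"A": (0, 0), "B": (2, 0), "C": (1, 3)}, [("A", "B", "C")]))
```

Real tools juggle many more terms (timing, power, routability), but the shape of the problem, one number summarizing several competing objectives, is the same.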

In one example, Dean told the audience that a deep learning neural network, after only twenty-four hours on the problem, found a better solution than human designers who had spent six to eight weeks on it. The resulting design reduced the total wiring needed in the chip, an improvement. 

The deep learning program is akin to the AlphaZero program developed by Google’s DeepMind unit to conquer the game of Go. Like AlphaZero, the chip design program is a form of what’s called reinforcement learning. In order to achieve a goal, the program tries various steps to see which ones lead to better results. Rather than pieces on a game board, the moves are choices of how to place the right circuit layout in the total chip design. 

Unlike in Go, however, the solution “space,” the number of possible circuit layouts, is vastly larger. And, as mentioned above, numerous objectives have to be accommodated, rather than the single objective in Go of winning the game. 
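
A bare-bones stand-in for that trial-and-reward loop is sketched below. It is illustrative only: random placement proposals play the role of the learned policy, and the reward is simply the negative of an estimated wirelength (a richer, multi-objective cost such as the one sketched earlier could be substituted).

```python
# Illustrative stand-in for the reinforcement-learning search described above:
# random proposals replace the learned policy; reward = -estimated wirelength.
import random

def wirelength(placement, nets):
    """Sum of half-perimeter bounding boxes, one per net (lower is better)."""
    total = 0
    for net in nets:
        xs = [placement[b][0] for b in net]
        ys = [placement[b][1] for b in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def search(blocks, nets, grid_size=8, episodes=2000, seed=0):
    rng = random.Random(seed)
    cells = [(x, y) for x in range(grid_size) for y in range(grid_size)]
    best, best_cost = None, float("inf")
    for _ in range(episodes):
        # One "move" sequence: assign every block its own grid cell.
        candidate = dict(zip(blocks, rng.sample(cells, len(blocks))))
        cost = wirelength(candidate, nets)   # higher reward = lower cost
        if cost < best_cost:                 # keep the best layout seen so far
            best, best_cost = candidate, cost
    return best, best_cost

layout, cost = search(["A", "B", "C", "D"], [("A", "B"), ("B", "C", "D")])
print(cost, layout)
```

A real reinforcement-learning placer learns which kinds of proposals tend to score well rather than sampling blindly; that learned guidance is what makes searching such a large space tractable.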

Dean, talking with ZDNet, described the internal efforts as being in the early stages of understanding the utility of the technology. “We’re getting our designers to experiment with it and see how they start to make use of it in their workflows,” said Dean. 

Also: AI is changing the entire nature of compute

“We’re trying to understand how it’s useful, and what areas does it improve on.” 

Google’s foray into AI-assisted chip design comes amidst a renaissance in chip production, as companies large and small design dedicated silicon to run machine learning faster. Dedicated AI hardware can lead to larger and more efficient machine learning software projects, according to some machine learning scientists. 

The diversity created by AI hardware startup companies, such as Cerebras Systems and Graphcore, can be expected to continue apace, said Dean, even as Google expands its own efforts. 

Dean said the variety that’s emerging is intriguing. 

“I’m not sure if they’re all going to survive, but it’s pretty interesting because many of them are taking very different design points in the design space,” Dean said of the startups. “Just as one distinction, some are accelerating models that are very small, that can fit in on-chip SRAM,” he said, meaning, the size of the machine learning model is so small it doesn’t need external memory. 

“And if your model fits in SRAM, those things are going to be very effective, but if your model doesn’t, that’s not the chip for you.”
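
As a back-of-the-envelope illustration of that distinction, the sketch below checks whether a model's weights would fit in a given amount of on-chip SRAM. The numbers are made up for illustration and are not the specifications of any particular accelerator.

```python
# Does a model's parameter memory fit in on-chip SRAM? (Illustrative sizes only.)

def model_bytes(num_params, bytes_per_param=1):
    """Rough weight footprint, e.g. 1 byte per parameter for int8 weights."""
    return num_params * bytes_per_param

def fits_in_sram(num_params, sram_mib, bytes_per_param=1):
    return model_bytes(num_params, bytes_per_param) <= sram_mib * 2**20

# A small keyword-spotting-sized model vs. a large vision model, against 24 MiB of SRAM.
print(fits_in_sram(2_000_000, sram_mib=24))    # True: ~2 MB of int8 weights fits
print(fits_in_sram(100_000_000, sram_mib=24))  # False: ~100 MB needs external memory
```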

Google said the machine learning program produced some novel circuit designs not conceived of by human designers. (Image: Google)

Asked if the chips will converge on some standard design, Dean suggested diversity is more likely, at least for the time being. 

“I do think there’s going to be more heterogeneity in the kind of approaches used, not less,” he said, “because if you look at the explosion in machine learning research, and uses of machine learning in lots of different kinds of problems, it’s going to be a large enough set of things in the world that you’re not going to want just one design, you’re going to want five or six — not a thousand, but five or six different design points.”

Added Dean, “It’ll be interesting to see which ones hold up, in terms of, are they generally useful for a lot of things, or are they very specialized and accelerate one kind of thing but don’t do well on others.”

As for Google’s own efforts beyond the TPU, Dean indicated there’s an appetite for more and more dedicated silicon at Google. Asked if the trend to AI hardware at Google “has legs,” meaning it can extend beyond its current offerings, Dean replied, “Oh, yeah.”

“Definitely there’s growing use of machine learning across Google products, both data-center-based services, but also much more of our stuff is running on device on the phone,” said Dean. The Google Translate application is an example of a sophisticated program, now covering seventy different languages, that can run on a phone even in airplane mode, when there’s no connection back to the data center, he noted.  

The family of Google silicon for AI has already broadened, he indicated. The “Edge TPU,” for example, is a designation that covers “different design points,” said Dean, including low-power applications on the one hand and high-performance applications at the heart of the data center on the other. Asked if the variety could broaden still further, Dean replied, “I think it could.”

“Even within non-data-center things, you’re already seeing a distinction of higher power environments like autonomous vehicles, things that don’t have to be at the 1-watt level, they can be fifty or a hundred watts,” he said. “So you want different parts for that versus something on a phone.” At the same time, there will be ultra-low-power applications like sensors in agriculture that do some AI processing without sending any data to the cloud. Equipped with AI, such a sensor can assess whether there is any data of interest being picked up, say, via a camera, and stream those individual data points back to the cloud for analysis. 
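
That kind of on-device gating might be structured roughly like the sketch below. The model, threshold, and upload call are hypothetical placeholders, not any particular Google API; the point is simply that inference runs locally and only "interesting" readings leave the device.

```python
# Hypothetical on-device gating for a low-power sensor: run a small model locally
# and stream only frames judged interesting. All names here are placeholders.

INTEREST_THRESHOLD = 0.8  # confidence above which a frame is worth streaming

def process_frame(frame, tiny_model, upload_to_cloud):
    score = tiny_model(frame)            # e.g. probability that the crop looks diseased
    if score >= INTEREST_THRESHOLD:
        upload_to_cloud(frame, score)    # send only the interesting data points
        return True
    return False                         # otherwise discard locally; nothing is sent

# Demo with stand-ins: the "frame" is just a number and the uploader prints.
frames = [0.10, 0.95, 0.40, 0.85]
sent = sum(process_frame(f, lambda x: x, lambda f, s: print("upload", s)) for f in frames)
print(sent, "of", len(frames), "frames streamed")
```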

Credit: Google News
