
Federated Learning Explained Simply – Nanalyze

January 28, 2020
in Machine Learning

Having held Google (GOOG) shares since shortly after their IPO, we’ve always taken a “set it and forget it” approach to our investment in the company which acts as a gatekeeper to the world’s information. Lately though, we can’t help but notice Google is letting a minuscule number of employees dictate who they ought to be doing business with. Instead of focusing on doing their jobs, some of Google’s employees think they’re being paid to be activists. It’s a sad state of affairs. Google is no longer the great company it used to be since they started letting the stench of politics permeate their organization. Focus seems to have switched from execution to pacification. Everything seems to be getting dumber by the day.

Maybe you’ve heard the new term being thrown around lately by tech pundits – “federated learning.” Instead of just having one of their engineers explain the concept in a simple and concise manner, Google decided to squander our future dividend payments on hiring an “Adventure Cartoonist” who some poor engineer was forced to work with. The resulting comic strip is supposed to explain the concept of “federated learning,” yet it mainly just succeeds in placating Gwyneth in Human Resources who thinks inclusive comic strips are what engineering needs more of. Today, we’re going to explain federated learning to the adults in the room with help from some of the adults left at Google who published a great blog post titled “Federated Learning: Collaborative Machine Learning without Centralized Training Data.”

How Machines Learn

Standard machine learning models become intelligent when you train them using big data. Typically, this involves centralizing the training data on one machine or distributing the data evenly across multiple machines in a datacenter. Give an algorithm enough cat pictures and it will soon be able to identify what a cat looks like, what breed a cat is, and perhaps even the cat’s age. The traditional process of training a model meant that all the big data would be uploaded to a single location, a process that’s bandwidth intensive and comes with certain privacy implications. If you consider an Internet of Things (IoT) use case such as creating a digital twin, then all that big data would come from sensors out in the field that would all need to maintain connectivity with the cloud. This need to send all your data to a central location so that your algorithms can be trained is cumbersome. Now, there’s a new way of doing things that doesn’t require all the data to be transferred to a central location.

Federated Learning vs. Distributed Machine Learning

Distributed machine learning is the notion of breaking down the training workload across multiple machines, typically with each machine in a datacenter handling about the same amount of training data. Federated learning is a subset of distributed machine learning in which training happens on the devices that generate the data, so each device holds a different, unevenly sized slice of it. It’s an approach that decouples the ability to do machine learning from the need to store the data in the cloud. From Google’s blog post:

It works like this: your device downloads the current model, improves it by learning from data on your phone, and then summarizes the changes as a small focused update. Only this update to the model is sent to the cloud, using encrypted communication, where it is immediately averaged with other user updates to improve the shared model. All the training data remains on your device, and no individual updates are stored in the cloud.
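The round Google describes can be sketched in a few lines. Below is a minimal toy sketch, not Google's implementation: three hypothetical "devices" each hold a different amount of private data for the function y = 2x, each runs a few local gradient-descent steps on a single shared weight, and only the resulting weight deltas are averaged on the server.

```python
# Toy sketch of one federated averaging round (hypothetical model:
# a single weight fit to y = 2x by local gradient descent on each device).

def local_update(weight, data, lr=0.1, steps=5):
    """Train on one device's private data; return only the weight delta."""
    w = weight
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w - weight  # a small, focused update; the raw data never leaves

def federated_round(global_weight, device_datasets):
    """Server side: average the per-device updates into the shared model."""
    updates = [local_update(global_weight, d) for d in device_datasets]
    return global_weight + sum(updates) / len(updates)

# Three devices, each holding a different amount of private (x, y) data.
devices = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(0.5, 1.0), (1.5, 3.0), (2.5, 5.0)],
]
w = 0.0
for _ in range(30):
    w = federated_round(w, devices)
print(round(w, 2))  # converges toward 2.0
```

Note that the server only ever sees the averaged deltas, which is exactly the decoupling described in the quote above.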

It’s not just about running AI algorithms “on the edge,” it’s about training them “on the edge.” In a previous article, we talked about Fog Computing vs. Cloud Computing vs. Edge Computing. Essentially, edge computing is just about moving portions of your cloud-based applications closer to the devices that use them. Our recent article on Ambarella talked about how they’ve developed better AI chips that consume less power and can now enable security cameras with computer vision capabilities. While security cameras could be edge devices that use federated learning, the more common edge devices for federated learning are the smartphones we carry around with us. There are a few implications to using smartphones as edge devices for federated learning.

Smartphones as Edge Devices

In most countries around the world, internet bandwidth – or “data” as it’s often called – is expensive and unreliable. In order to not piss people off, federated learning needs to consume as little bandwidth as possible when communicating with the cloud while also handling the unpredictability of connectivity. “Federated learning applies best in situations where the on-device data is more relevant than the data that exists on servers (e.g., the devices generate the data in the first place), is privacy-sensitive, or otherwise undesirable or infeasible to transmit to servers,” says Google. They’re now using it for things like content suggestions for on-device keyboards. In a March 2019 paper, Google researchers stated:

We have reached a state of maturity sufficient to deploy the system in production and solve applied learning problems over tens of millions of real-world devices; we anticipate uses where the number of devices reaches billions.
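Keeping bandwidth low at that scale usually means compressing the update before upload. One common technique in the federated learning literature (an assumption here, not necessarily what Google ships) is top-k sparsification: send only the largest-magnitude entries of the update as (index, value) pairs instead of the full dense vector.

```python
# Hypothetical sketch: shrink a model update before upload by keeping only
# its k largest-magnitude entries (top-k sparsification).

def sparsify(update, k):
    """Device side: keep the k largest-magnitude (index, value) pairs."""
    ranked = sorted(range(len(update)), key=lambda i: abs(update[i]), reverse=True)
    return [(i, update[i]) for i in sorted(ranked[:k])]

def densify(pairs, size):
    """Server side: rebuild a dense vector, zeros where nothing was sent."""
    dense = [0.0] * size
    for i, v in pairs:
        dense[i] = v
    return dense

update = [0.01, -0.8, 0.0, 0.3, -0.02, 0.5]
compressed = sparsify(update, k=3)
print(compressed)  # [(1, -0.8), (3, 0.3), (5, 0.5)]
restored = densify(compressed, len(update))
```

Halving (or better) what each device uploads matters far more across tens of millions of devices than it would for a single server-to-server link.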

One of the things developers find most exciting about federated learning is that it addresses the fundamental problems of privacy, ownership, and locality of data. Let’s say you developed a smartphone app for distinguishing between cancerous and benign skin spots. Using federated learning means you won’t have to send the user’s image to the cloud. More importantly, you’ll be able to improve your algorithms without actually needing to look at all the images being analyzed across your smartphone population. The appeal to smartphone makers is evident in Apple’s recent decision to acquire some federated learning technology for their own devices.
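The privacy story can be pushed further with secure aggregation, so the server can only read the sum of updates, never any individual one. The toy sketch below illustrates the core idea with simplified pairwise masks (real protocols use cryptographic key exchange and tolerate devices dropping out): each pair of devices agrees on a random mask that one adds and the other subtracts, so each upload looks like noise while the masks cancel exactly in the aggregate.

```python
import random

# Toy sketch of the secure-aggregation idea: pairwise random masks make each
# device's upload unreadable on its own, yet cancel in the server's sum.
# (Illustrative only; real protocols handle dropouts and use real crypto.)

def masked_uploads(updates, rng):
    n = len(updates)
    masked = list(updates)
    for i in range(n):
        for j in range(i + 1, n):
            mask = rng.uniform(-100, 100)
            masked[i] += mask  # device i adds the shared pairwise mask
            masked[j] -= mask  # device j subtracts the same mask
    return masked

updates = [0.2, -0.1, 0.5]           # true per-device updates (scalars here)
uploads = masked_uploads(updates, random.Random(0))
total = sum(uploads)                 # masks cancel in the aggregate
print(round(total, 6), round(sum(updates), 6))
```

Each individual upload is dominated by masks an observer cannot remove, but the server still recovers the exact total it needs for averaging.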

Federated Learning Startups

A few weeks ago, Apple acquired a Seattle startup called Xnor.ai which has developed a platform that “allows companies to run complex deep learning algorithms, formerly restricted to the cloud, locally on a range of devices including mobile phones, drones, and wearables.” An article by GeekWire discusses some of Xnor’s notable accomplishments in 2019 like a standalone AI chip that could run on solar power for years or an edge-based person recognition technology built into low-cost security cameras which is reminiscent of the technology Ambarella is deploying right now.

There are also other startups out there bringing federated learning to the masses. Snips is developing federated learning for voice platforms, XAIN is applying it to automated invoice processing, Owkin is working on federated learning for medical research, and S20 is talking about how multiple third parties – like banks – can work together to train algorithms for applications like fraud detection without having to exchange data. In future articles, we might dive deeper into some of these applications to try and separate the hype from the substance.

Conclusion

Federated learning, edge computing, distributed machine learning, fog computing, it’s hard to keep up with the constant barrage of nomenclature from the tech world. For decades now, engineers have been moving software and hardware further apart, then closer together, then further apart. The advantage of moving everything to a central server is access to abundant processing power with no battery or storage constraints. When you do things on smartphone edge devices, you are limited by battery power, bandwidth, and processing speed. Each of these three parameters is being improved as we develop better lithium batteries and more sophisticated AI processors for mobile, and as 5G ushers in an era of unprecedented bandwidth. That supercomputer you keep at your side every waking hour is becoming an indispensable extension of your physical body. And it just got a whole lot smarter.

If you enjoyed this article, then sign up for our free newsletter – Nanalyze Weekly. About every week, we’ll send you a simple summary of all our new articles. If you didn’t enjoy this article, share it on Twitter and tell everyone how much you hated it.



Credit: Google News
