1. Federated Learning
A new framework for Artificial Intelligence (AI)
model development that is distributed over
millions of mobile devices.
W.S. Shashipraba Perera
2. Introduction
• Standard machine learning approaches require
training data to be stored on a single machine or in a
datacenter. Federated Learning (FL) is a new
approach for training models from user interaction
with mobile devices.
• FL allows mobile phones to learn a shared prediction
model collaboratively while keeping all training data on the
device, decoupling machine learning from the need to
store data in the cloud.
• This goes beyond the use of local models on mobile
devices to make predictions by bringing model training to
the device as well.
3. How Does It Work?
It works as follows:
1. The devices will be given a training model which is usually only a few
megabytes in size.
2. The devices train on the local data.
3. The devices send encrypted updates on the parameters to the server.
4. The server groups the devices. For each group, the server aggregates
the updates it received from the group of devices to perform one
update to the current model.
5. After rounds of training, the new updated model is sent to the
devices for on-device testing (again, the theme of decentralization is
at play here) and a new round of training.
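The server-side aggregation in step 4 can be sketched as a weighted average of device updates, in the style of federated averaging (FedAvg). This is an illustrative sketch, not the exact algorithm from the slides: the `(weights, n_samples)` update format and the `aggregate_round` function are assumptions for the example.

```python
# Hypothetical sketch of one server-side aggregation round (FedAvg-style).
# Each device is assumed to report (updated_weights, local_sample_count).
from typing import List, Tuple

def aggregate_round(updates: List[Tuple[List[float], int]]) -> List[float]:
    """Combine device updates into one model update, weighting each
    device's parameters by how much local data it trained on."""
    total = sum(n for _, n in updates)
    dim = len(updates[0][0])
    new_model = [0.0] * dim
    for weights, n in updates:
        for i, w in enumerate(weights):
            new_model[i] += w * (n / total)
    return new_model

# Three devices report updated parameters trained on 10, 30, and 60 samples.
updates = [([1.0, 2.0], 10), ([3.0, 4.0], 30), ([5.0, 6.0], 60)]
print(aggregate_round(updates))
```

Weighting by sample count means a device that trained on more data pulls the shared model further toward its local update.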
4. • Rather than sending user data to the
server, we move parts of training and
prediction to the user's device.
• Beyond the privacy benefits, on-device
inference improves latency, works offline,
and conserves battery life.
5. • To ensure that an application's user
experience is not affected, model training
runs only while the user's device is plugged
into power, connected to an unmetered
Wi-Fi network, and idle.
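The training conditions above amount to a client-side eligibility check. A minimal sketch of such a check is below; the `Device` fields and the idle-time threshold are illustrative assumptions, not a real mobile API.

```python
# Hypothetical client-side eligibility check for federated training.
# Field names and the idle threshold are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Device:
    charging: bool            # plugged into power
    on_unmetered_wifi: bool   # connected to a free/unmetered Wi-Fi network
    idle_seconds: int         # how long the device has been idle

def eligible_for_training(d: Device, min_idle_seconds: int = 300) -> bool:
    """Allow training only when all three conditions from the slide hold."""
    return d.charging and d.on_unmetered_wifi and d.idle_seconds >= min_idle_seconds

print(eligible_for_training(Device(True, True, 600)))   # eligible
print(eligible_for_training(Device(True, False, 600)))  # blocked: metered network
```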
https://youtu.be/gbRJPa9d-VU
6. When compared to centralized machine learning, federated learning has a
few distinct advantages:
• Because the data remains on the user's
device, privacy is ensured.
• Reduced latency because the updated
model can make predictions on the
user's device.
• Smarter models, thanks to the
collaborative training process.
• Less power is consumed because
models are trained on the user's device.
7. Where Did It Come
From?
• FL is a relatively new type of learning that Google introduced
in 2016.
• Google set out to develop a technique that trains a
model on each individual device and then combines
the results from many devices.
8. Disadvantages
1. Implementation cost is higher than collecting
the data and processing it centrally.
•Especially during the early phases of R&D, while the
training method and process are still being iterated on.
2. Convergence of the federated learning
process may be slower than in centralized training.
•There may be billions of devices trying to train the
same model at the same time.
•It also depends on wireless connectivity.
9. What Is FL Capable of in the
Future? How Can It Be Used?
• Federated learning has the potential to disrupt
cloud computing, which is currently the dominant
computing paradigm.
• Machine learning models can be trained without
relying on the compute resources owned by large
AI firms, and users will not have to give up their
privacy in exchange for better services.