YannFra/FedAvg_MNIST_working_example

Federated Learning simple implementation

Working implementation of FedAvg and FedProx using PyTorch (FedAvg is FedProx with mu = 0, and is the default solver). No client sampling is implemented: every client participates in every iteration.
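The relation between the two solvers is just a proximal term in the local objective: each FedProx client minimizes its local loss plus (mu/2)·||w − w_global||², which with mu = 0 reduces to plain local SGD, i.e. FedAvg. A minimal pure-Python sketch of the resulting local update (flat lists stand in for PyTorch tensors; the function names are illustrative, not the repository's API):

```python
def fedprox_gradient(local_grad, w_local, w_global, mu):
    """Gradient of the FedProx local objective: the local loss gradient
    plus the gradient of the proximal term (mu/2) * ||w_local - w_global||^2.
    With mu = 0 this is exactly the FedAvg (plain local SGD) gradient."""
    return [g + mu * (wl - wg)
            for g, wl, wg in zip(local_grad, w_local, w_global)]


def local_step(w_local, w_global, local_grad, mu, lr=0.1):
    """One SGD step on a client with learning rate lr."""
    g = fedprox_gradient(local_grad, w_local, w_global, mu)
    return [wl - lr * gi for wl, gi in zip(w_local, g)]
```

The proximal term pulls each client's iterate back towards the current global model, which limits client drift when local datasets are heterogeneous.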

Two examples on which to build are considered:

  • FL_MNIST.py: a fully connected two-layer neural network tested on MNIST.
  • FL_MNIST_custom.py: a neural network tested on what we call custom MNIST.
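Both examples share the same server step: FedAvg's weighted average of the client models. A minimal sketch, with plain lists standing in for PyTorch parameter tensors (the function name and weights are illustrative):

```python
def fedavg_aggregate(client_models, client_weights):
    """Weighted average of client parameter vectors.
    client_models: one parameter list per client.
    client_weights: relative importance of each client (e.g. its number
    of local samples); normalized here so the weights sum to 1."""
    total = sum(client_weights)
    n_params = len(client_models[0])
    return [
        sum(w / total * model[i]
            for model, w in zip(client_models, client_weights))
        for i in range(n_params)
    ]


# Two clients with equal weight: the global model is the plain mean.
global_model = fedavg_aggregate([[1.0, 3.0], [3.0, 5.0]], [1, 1])
```

Since no sampling is used here, every client contributes to this average at every iteration.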

By default, the federated training saves several artifacts in saved_exp_info:

  • the accuracy of every participant at every iteration, in acc. The saved object is a list of lists where the first index is the training iteration and the second index is the client.
  • the loss of every participant at every iteration, in loss.
  • the local models of all the clients at every iteration, in local_model_history.
  • the global model obtained at the end of the training, in final_model.
  • the global model at every iteration, in server_history.
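The acc and loss objects are plain lists of lists indexed as [iteration][client], so post-processing needs no special tooling. A short sketch, with toy values in place of a real saved run (the commented load path is illustrative, not the exact file name the code produces):

```python
import pickle  # the saved objects can be deserialized with pickle

# Illustrative path; adjust to the file actually written by your run:
# with open("saved_exp_info/acc/my_experiment.pkl", "rb") as f:
#     acc = pickle.load(f)

# Structure: acc[iteration][client]
acc = [[0.61, 0.58], [0.74, 0.70], [0.83, 0.81]]  # toy values

# Mean accuracy across clients at each training iteration:
mean_acc = [sum(clients) / len(clients) for clients in acc]

# Accuracy trajectory of a single client (client 0):
client0 = [clients[0] for clients in acc]
```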

Custom MNIST makes it possible to create, for each client, an MNIST dataset with a client-specific font and/or rotation angle. FL_MNIST_custom.py gives a simple FedAvg example with two clients holding the same digits, rotated by 30 degrees for one of them. More info about how the font impacts the obtained dataset can be found here.
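To picture what a per-client rotation does to the pixel grid, here is a self-contained nearest-neighbour rotation of a 2-D image in pure Python. This is only an illustration of the transform's effect; the repository builds its custom datasets through its own pipeline:

```python
import math


def rotate_image(img, angle_deg):
    """Rotate a 2-D list-of-lists image by angle_deg about its centre,
    using nearest-neighbour sampling via the inverse rotation map.
    Pixels whose source falls outside the grid are filled with 0."""
    n, m = len(img), len(img[0])
    theta = math.radians(angle_deg)
    cy, cx = (n - 1) / 2.0, (m - 1) / 2.0
    out = [[0] * m for _ in range(n)]
    for y in range(n):
        for x in range(m):
            # Inverse map: find which source pixel lands at (x, y).
            sx = cx + (x - cx) * math.cos(theta) + (y - cy) * math.sin(theta)
            sy = cy - (x - cx) * math.sin(theta) + (y - cy) * math.cos(theta)
            sxi, syi = round(sx), round(sy)
            if 0 <= sxi < m and 0 <= syi < n:
                out[y][x] = img[syi][sxi]
    return out
```

In the two-client example, one client would keep its digits as-is while the other sees every image passed through such a 30-degree rotation.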

Many fonts are available; we recommend 'InconsolataN' and 'jsMath-cmti10'. 'InconsolataN' has continuous 4s and a bar on the 0, while 'jsMath-cmti10' has discontinuous 4s and no bar on the 0.
