John Martinsson

Specialized federated learning using a mixture of experts

Authors: Edvin Listo Zec, Olof Mogren, John Martinsson, Leon René Sütfeld, Daniel Gillblad

Published in: arXiv

Year: 2020

Abstract

In federated learning, clients share a global model that has been trained on decentralized local client data. Although federated learning shows significant promise as a key approach when data cannot be shared or centralized, current methods show limited privacy properties and have shortcomings when applied to common real-world scenarios, especially when client data is heterogeneous. In this paper, we propose an alternative method to learn a personalized model for each client in a federated setting, with greater generalization abilities than previous methods. To achieve this personalization we propose a federated learning framework using a mixture of experts to combine the specialist nature of a locally trained model with the generalist knowledge of a global model. We evaluate our method on a variety of datasets with different levels of data heterogeneity, and our results show that the mixture of experts model is better suited as a personalized model for devices in these settings, outperforming both fine-tuned global models and local specialists.
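
To make the framework concrete, the sketch below illustrates the core idea in PyTorch: a learned gate that blends a shared global model with a locally trained specialist. The module names, the simple MLP architectures, and the per-example sigmoid gate are illustrative assumptions for this page, not the paper's exact architectures or training procedure.

# Minimal sketch of per-client personalization via a mixture of experts,
# assuming PyTorch. Architectures and names (GlobalExpert/LocalExpert/gate)
# are hypothetical; the paper defines its own models and training setup.
import torch
import torch.nn as nn

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, out_dim))

class MixtureOfExperts(nn.Module):
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.global_expert = mlp(in_dim, num_classes)  # shared across clients
        self.local_expert = mlp(in_dim, num_classes)   # trained only on-device
        self.gate = nn.Sequential(mlp(in_dim, 1), nn.Sigmoid())  # mixing weight

    def forward(self, x):
        w = self.gate(x)  # per-example weight in (0, 1)
        # Blend the local specialist and the global generalist.
        return w * self.local_expert(x) + (1 - w) * self.global_expert(x)

model = MixtureOfExperts(in_dim=32, num_classes=10)
logits = model(torch.randn(8, 32))
print(logits.shape)  # torch.Size([8, 10])

In a federated setting of this kind, the global expert's weights would be aggregated across clients (e.g. with federated averaging), while the local expert and the gate remain private to each device, which is what yields a personalized model per client.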

BibTeX

@article{ListoZec2020,
  author       = {Edvin Listo Zec and
                  Olof Mogren and
                  John Martinsson and
                  Leon Ren{\'{e}} S{\"{u}}tfeld and
                  Daniel Gillblad},
  title        = {Federated learning using a mixture of experts},
  journal      = {CoRR},
  volume       = {abs/2010.02056},
  year         = {2020},
  url          = {https://arxiv.org/abs/2010.02056},
  eprinttype   = {arXiv},
  eprint       = {2010.02056},
  timestamp    = {Mon, 12 Oct 2020 17:53:10 +0200},
  biburl       = {https://dblp.org/rec/journals/corr/abs-2010-02056.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}