Pretrained Models for Multilingual Federated Learning

Orion Weller, Marc Marone, Vladimir Braverman, Dawn Lawrie, Benjamin Van Durme


Abstract
Since the advent of Federated Learning (FL), researchers have applied FL methods to natural language processing (NLP) tasks. Despite a plethora of papers on FL for NLP, no previous work has studied how multilingual text impacts FL algorithms. Furthermore, multilingual text provides an interesting avenue to examine the impact of non-IID text (e.g., different languages) on FL in naturally occurring data. We explore three multilingual language tasks (language modeling, machine translation, and text classification) using different federated and non-federated learning algorithms. Our results show that using pretrained models reduces the negative effects of FL, helping federated models perform close to or better than centralized (no privacy) learning, even when using non-IID partitioning.
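To make the federated setup concrete, below is a minimal FedAvg-style sketch (after McMahan et al., 2017) over a non-IID, per-language client partition. It is an illustration only, not the authors' implementation (see the linked repository): the linear-regression clients, the learning rate and round counts, and the simulated per-language feature shifts are hypothetical stand-ins for the paper's pretrained-model fine-tuning. Weighting each client's update by its dataset size is the standard FedAvg aggregation rule.

# Minimal FedAvg sketch: each client holds data for one simulated
# "language" (a non-IID partition), runs a few epochs of local gradient
# descent, and the server averages the resulting weights, weighted by
# client dataset size. Linear regression keeps the example self-contained;
# the paper fine-tunes pretrained models instead.
import numpy as np

rng = np.random.default_rng(0)

def local_step(weights, X, y, lr=0.02, epochs=5):
    """Train on one client's private data; raw examples never leave it."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def fedavg(clients, dim, rounds=30):
    """Server loop: broadcast global weights, collect local updates, average."""
    w_global = np.zeros(dim)
    total = sum(len(y) for _, y in clients)
    for _ in range(rounds):
        updates = [local_step(w_global, X, y) for X, y in clients]
        w_global = sum(len(y) / total * w
                       for (_, y), w in zip(clients, updates))
    return w_global

# Hypothetical non-IID split: each "language" client draws inputs from a
# different region of feature space, so local gradients disagree.
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for shift in (-3.0, 0.0, 3.0):  # one client per simulated language
    X = rng.normal(loc=shift, scale=1.0, size=(100, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    clients.append((X, y))

print("recovered weights:", np.round(fedavg(clients, dim=3), 2))
# -> approximately [ 1.  -2.   0.5]

Even with this disagreement between clients, the averaged model recovers the shared underlying weights, which mirrors the paper's finding that a good shared starting point mitigates the cost of non-IID partitions.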
Anthology ID:
2022.naacl-main.101
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1413–1421
URL:
https://aclanthology.org/2022.naacl-main.101
DOI:
10.18653/v1/2022.naacl-main.101
Cite (ACL):
Orion Weller, Marc Marone, Vladimir Braverman, Dawn Lawrie, and Benjamin Van Durme. 2022. Pretrained Models for Multilingual Federated Learning. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1413–1421, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Pretrained Models for Multilingual Federated Learning (Weller et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.naacl-main.101.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2022.naacl-main.101.mp4
Code:
orionw/multilingual-federated-learning
Data:
MTNT