Load What You Need: Smaller Versions of Multilingual BERT

Amine Abdaoui, Camille Pradel, Grégoire Sigel


Abstract
Pre-trained Transformer-based models achieve state-of-the-art results on a variety of Natural Language Processing data sets. However, the size of these models is often a drawback for their deployment in real production applications. In the case of multilingual models, most of the parameters are located in the embedding layer, so reducing the vocabulary size should have a significant impact on the total number of parameters. In this paper, we propose to extract smaller models that handle fewer languages, according to the targeted corpora. We present an evaluation of smaller versions of multilingual BERT on the XNLI data set, but we believe that this method can be applied to other multilingual transformers. The results confirm that we can generate smaller models with comparable performance while reducing the total number of parameters by up to 45%. We also compared our models with DistilmBERT (a distilled version of multilingual BERT) and showed that, unlike language reduction, distillation induces a 1.7% to 6% drop in overall accuracy on the XNLI data set. The presented models and code are publicly available.
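The abstract describes reducing the vocabulary (and thus the embedding layer) of multilingual BERT to the tokens actually needed by the target languages. The sketch below illustrates that idea, not the authors' exact code: it keeps only the token ids observed in a small target-language corpus, slices the word-embedding matrix accordingly, and writes a matching reduced vocabulary file. It assumes the HuggingFace transformers library; the in-memory `corpus` variable and the output file name are illustrative.

```python
# Minimal sketch of vocabulary/embedding reduction for multilingual BERT
# (illustrative, not the authors' published implementation).
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")

# Illustrative target-language corpus; in practice this would be a large
# corpus covering only the languages you want to keep.
corpus = ["Example sentence in English.", "Phrase d'exemple en français."]

# 1. Collect the token ids actually used by the target-language corpus,
#    always keeping BERT's special tokens ([CLS], [SEP], [PAD], ...).
kept_ids = set(tokenizer.all_special_ids)
for sentence in corpus:
    kept_ids.update(tokenizer.encode(sentence, add_special_tokens=False))
kept_ids = sorted(kept_ids)

# 2. Slice the word-embedding matrix so it only covers the kept tokens.
old_embeddings = model.get_input_embeddings().weight.data  # shape (119547, 768)
new_embeddings = torch.nn.Embedding(len(kept_ids), old_embeddings.size(1))
new_embeddings.weight.data = old_embeddings[kept_ids].clone()
model.set_input_embeddings(new_embeddings)
model.config.vocab_size = len(kept_ids)

# 3. Write a reduced vocabulary file so a matching tokenizer can be rebuilt;
#    the new token ids follow the row order of the sliced embedding matrix.
id_to_token = {i: t for t, i in tokenizer.get_vocab().items()}
with open("vocab.txt", "w", encoding="utf-8") as f:
    for old_id in kept_ids:
        f.write(id_to_token[old_id] + "\n")
```

Because the embedding layer accounts for most of multilingual BERT's parameters, shrinking the vocabulary in this way is what yields the parameter reduction reported in the paper; the Transformer layers themselves are left untouched.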
Anthology ID:
2020.sustainlp-1.16
Volume:
Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing
Month:
November
Year:
2020
Address:
Online
Editors:
Nafise Sadat Moosavi, Angela Fan, Vered Shwartz, Goran Glavaš, Shafiq Joty, Alex Wang, Thomas Wolf
Venue:
sustainlp
Publisher:
Association for Computational Linguistics
Pages:
119–123
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2020.sustainlp-1.16/
DOI:
10.18653/v1/2020.sustainlp-1.16
Cite (ACL):
Amine Abdaoui, Camille Pradel, and Grégoire Sigel. 2020. Load What You Need: Smaller Versions of Multilingual BERT. In Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing, pages 119–123, Online. Association for Computational Linguistics.
Cite (Informal):
Load What You Need: Smaller Versions of Multilingual BERT (Abdaoui et al., sustainlp 2020)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2020.sustainlp-1.16.pdf
Video:
 https://slideslive.com/38939438
Code:
Geotrend-research/smaller-transformers
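The repository above hosts the reduction code, and the resulting smaller models are distributed as standard checkpoints that can be loaded with HuggingFace transformers. The snippet below is a usage sketch; the model name shown (an English+French variant) is one example, and the available language combinations are listed in the repository.

```python
# Usage sketch: load one of the released smaller multilingual BERT models.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-cased")

inputs = tokenizer("Load what you need.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, 768)
```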