Adaptation of Deep Bidirectional Transformers for Afrikaans Language

Sello Ralethe


Abstract
The recent success of pretrained language models in Natural Language Processing has sparked interest in training such models for languages other than English. Currently, training of these models can be either monolingual or multilingual. In the multilingual case, models are trained on the concatenated data of multiple languages. We introduce AfriBERT, a language model for Afrikaans based on Bidirectional Encoder Representations from Transformers (BERT). We compare the performance of AfriBERT against multilingual BERT on multiple downstream tasks, namely part-of-speech tagging, named-entity recognition, and dependency parsing. Our results show that AfriBERT improves on the current state of the art in most of the tasks we considered, and that transfer learning from a multilingual to a monolingual model can yield significant performance improvements on downstream tasks. We release the pretrained AfriBERT model.
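
As a rough illustration of the downstream setup described in the abstract, the Python sketch below loads a BERT-style checkpoint for token classification (the part-of-speech tagging setting) with the Hugging Face Transformers library. The checkpoint name, tag set, and example sentence are illustrative assumptions; the paper releases a pretrained AfriBERT model but does not specify a model-hub identifier here.

# Minimal sketch of the token-classification setup used for POS tagging.
# "bert-base-multilingual-cased" stands in for the multilingual baseline;
# swap in the released AfriBERT checkpoint once its identifier is known.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "bert-base-multilingual-cased"  # placeholder checkpoint
POS_TAGS = ["ADJ", "ADP", "ADV", "NOUN", "VERB", "PUNCT"]  # illustrative subset

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=len(POS_TAGS)
)

# One Afrikaans sentence; in practice the classification head is fine-tuned
# on a labelled corpus before its predictions are meaningful.
inputs = tokenizer("Die kat slaap op die mat.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, num_labels)

print(logits.argmax(dim=-1))  # one predicted tag index per subword token

The same encoder-plus-classification-head pattern applies to named-entity recognition, with entity labels replacing the POS tag set.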
Anthology ID:
2020.lrec-1.301
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Note:
Pages:
2475–2478
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.301
Cite (ACL):
Sello Ralethe. 2020. Adaptation of Deep Bidirectional Transformers for Afrikaans Language. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 2475–2478, Marseille, France. European Language Resources Association.
Cite (Informal):
Adaptation of Deep Bidirectional Transformers for Afrikaans Language (Ralethe, LREC 2020)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2020.lrec-1.301.pdf