Comparing Pre-Training Schemes for Luxembourgish BERT Models

Cedric Lothritz, Saad Ezzini, Christoph Purschke, Tegawendé Bissyandé, Jacques Klein, Isabella Olariu, Andrey Boytsov, Clément LeFebvre, Anne Goujon
Anthology ID:
2023.konvens-main.2
Volume:
Proceedings of the 19th Conference on Natural Language Processing (KONVENS 2023)
Month:
September
Year:
2023
Address:
Ingolstadt, Germany
Editors:
Munir Georges, Aaricia Herygers, Annemarie Friedrich, Benjamin Roth
Venue:
KONVENS
Publisher:
Association for Computational Linguistics
Pages:
17–27
URL:
https://aclanthology.org/2023.konvens-main.2
Cite (ACL):
Cedric Lothritz, Saad Ezzini, Christoph Purschke, Tegawendé Bissyandé, Jacques Klein, Isabella Olariu, Andrey Boytsov, Clément LeFebvre, and Anne Goujon. 2023. Comparing Pre-Training Schemes for Luxembourgish BERT Models. In Proceedings of the 19th Conference on Natural Language Processing (KONVENS 2023), pages 17–27, Ingolstadt, Germany. Association for Computational Linguistics.
Cite (Informal):
Comparing Pre-Training Schemes for Luxembourgish BERT Models (Lothritz et al., KONVENS 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.konvens-main.2.pdf