BERToldo, the Historical BERT for Italian

Alessio Palmero Aprosio, Stefano Menini, Sara Tonelli


Abstract
Recent work in historical language processing has shown that transformer-based models can be successfully trained on historical corpora, and that using them to analyse and classify data from the past can be beneficial compared to standard transformer models. This has led to the creation of BERT-like models for different languages, trained on digital repositories of texts from the past. In this work we introduce the Italian version of historical BERT, which we call BERToldo. We evaluate the model on the task of PoS-tagging Dante Alighieri’s works, considering not only the tagger performance but also the model size and the time needed to train it. We also address the problem of duplicated data, which is rather common for languages with a limited availability of historical corpora. We show that deduplication reduces training time without affecting performance. The model and its smaller versions are all made available to the research community.
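The deduplication step mentioned in the abstract can be sketched as follows. This is a minimal illustration assuming sentence-level exact-match deduplication that keeps the first occurrence; the paper's actual pipeline may use a different granularity or matching strategy, and the example corpus lines are chosen purely for illustration.

```python
# Minimal sketch of corpus deduplication before language-model pre-training.
# Assumption: the corpus is a list of sentences (one per line); exact
# duplicates are dropped, keeping the first occurrence in order.

def deduplicate(lines):
    """Return the input lines with exact duplicates and blanks removed,
    preserving the order of first occurrence."""
    seen = set()
    unique = []
    for line in lines:
        key = line.strip()
        if key and key not in seen:
            seen.add(key)
            unique.append(key)
    return unique

corpus = [
    "Nel mezzo del cammin di nostra vita",
    "mi ritrovai per una selva oscura",
    "Nel mezzo del cammin di nostra vita",  # duplicate, will be dropped
]
print(deduplicate(corpus))
```

Since the deduplicated corpus is smaller, each pre-training epoch sees fewer tokens, which is consistent with the reported reduction in training time.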
Anthology ID:
2022.lt4hala-1.10
Volume:
Proceedings of the Second Workshop on Language Technologies for Historical and Ancient Languages
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Rachele Sprugnoli, Marco Passarotti
Venue:
LT4HALA
Publisher:
European Language Resources Association
Pages:
68–72
URL:
https://aclanthology.org/2022.lt4hala-1.10
Cite (ACL):
Alessio Palmero Aprosio, Stefano Menini, and Sara Tonelli. 2022. BERToldo, the Historical BERT for Italian. In Proceedings of the Second Workshop on Language Technologies for Historical and Ancient Languages, pages 68–72, Marseille, France. European Language Resources Association.
Cite (Informal):
BERToldo, the Historical BERT for Italian (Palmero Aprosio et al., LT4HALA 2022)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2022.lt4hala-1.10.pdf
Code
 dhfbk/historical-bert