PortBERT: Navigating the Depths of Portuguese Language Models

Raphael Scheible-Schmitt, Henry He, Armando B. Mendes


Abstract
Transformer models dominate modern NLP, but efficient, language-specific models remain scarce. For Portuguese, most efforts focus on scale or accuracy, often neglecting training and deployment efficiency. In this work, we introduce PortBERT, a family of RoBERTa-based language models for Portuguese designed to balance performance and efficiency. Trained from scratch with fairseq on over 450 GB of deduplicated and filtered mC4 and OSCAR23 data from CulturaX, PortBERT uses byte-level BPE tokenization and stable pre-training routines on both GPUs and TPUs. We release two variants, PortBERT base and PortBERT large, and evaluate them on ExtraGLUE, a suite of translated GLUE and SuperGLUE tasks. Both models perform competitively, matching or surpassing existing monolingual and multilingual models. Beyond accuracy, we report training and inference times as well as fine-tuning throughput, providing practical insights into model efficiency. PortBERT thus complements prior work by addressing the underexplored dimension of compute-performance tradeoffs in Portuguese NLP. We release all models on Hugging Face and provide fairseq checkpoints to support further research and applications.
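Since the abstract states that the models are released on Hugging Face, a minimal sketch of loading a PortBERT checkpoint with the transformers library follows. The repository id "portbert-base" is a hypothetical placeholder; the abstract does not state the exact model names, so substitute the actual id from the authors' Hugging Face page.

from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hypothetical repository id; replace with the released PortBERT model name.
MODEL_ID = "portbert-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# RoBERTa-style masked-token prediction on a Portuguese sentence.
text = "Lisboa é a capital de <mask>."
inputs = tokenizer(text, return_tensors="pt")
logits = model(**inputs).logits

# Locate the <mask> position and decode the highest-scoring token.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = logits[0, mask_index].argmax(-1)
print(tokenizer.decode(predicted_id))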
Anthology ID:
2025.globalnlp-1.8
Volume:
Proceedings of the Workshop on Beyond English: Natural Language Processing for all Languages in an Era of Large Language Models
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Sudhansu Bala Das, Pruthwik Mishra, Alok Singh, Shamsuddeen Hassan Muhammad, Asif Ekbal, Uday Kumar Das
Venues:
GlobalNLP | WS
Publisher:
INCOMA Ltd., Shoumen, BULGARIA
Pages:
59–71
URL:
https://preview.aclanthology.org/corrections-2026-01/2025.globalnlp-1.8/
Cite (ACL):
Raphael Scheible-Schmitt, Henry He, and Armando B. Mendes. 2025. PortBERT: Navigating the Depths of Portuguese Language Models. In Proceedings of the Workshop on Beyond English: Natural Language Processing for all Languages in an Era of Large Language Models, pages 59–71, Varna, Bulgaria. INCOMA Ltd., Shoumen, BULGARIA.
Cite (Informal):
PortBERT: Navigating the Depths of Portuguese Language Models (Scheible-Schmitt et al., GlobalNLP 2025)
PDF:
https://preview.aclanthology.org/corrections-2026-01/2025.globalnlp-1.8.pdf