Towards compact and efficient Slovak summarization models

Sebastian Petrik, Giang Nguyen


Abstract
Language models, especially LLMs, are often limited in practice by their high resource demands. While various model compression methods have emerged, their application to smaller models in multilingual and low-resource settings remains understudied. Our work evaluates selected decoder and embedding pruning methods on T5-based models for abstractive summarization in English and Slovak, using a parallel dataset. The results reveal differences in how model performance degrades under pruning, and our models expand the limited set of Slovak summarization resources.
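
For readers who want to experiment, below is a minimal sketch of decoder-layer pruning on a T5 model with Hugging Face Transformers. It is not the paper's procedure: the t5-small checkpoint, the keep-every-other-layer schedule, and the prompt are illustrative assumptions only.

import torch
from transformers import T5ForConditionalGeneration, T5TokenizerFast

# Hypothetical setup: t5-small stands in for the paper's T5-based models.
model_name = "t5-small"
tokenizer = T5TokenizerFast.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Drop every second decoder block (an assumed uniform schedule, not the
# paper's criterion). Keeping block 0 matters: in T5, only the first block
# holds the relative position bias that the remaining blocks reuse.
keep = list(range(0, model.config.num_decoder_layers, 2))
model.decoder.block = torch.nn.ModuleList([model.decoder.block[i] for i in keep])
model.config.num_decoder_layers = len(keep)

# The pruned model still generates; summary quality typically needs
# fine-tuning on the target data to recover.
inputs = tokenizer("summarize: <article text>", return_tensors="pt")
ids = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(ids[0], skip_special_tokens=True))

Embedding pruning would analogously shrink the shared embedding matrix (model.shared) to a reduced vocabulary; in both cases, fine-tuning is usually needed to recover summarization quality.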
Anthology ID: 2025.bsnlp-1.7
Volume: Proceedings of the 10th Workshop on Slavic Natural Language Processing (Slavic NLP 2025)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Jakub Piskorski, Pavel Přibáň, Preslav Nakov, Roman Yangarber, Michał Marcińczuk
Venues: BSNLP | WS
Publisher: Association for Computational Linguistics
Pages: 58–68
URL: https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bsnlp-1.7/
Cite (ACL): Sebastian Petrik and Giang Nguyen. 2025. Towards compact and efficient Slovak summarization models. In Proceedings of the 10th Workshop on Slavic Natural Language Processing (Slavic NLP 2025), pages 58–68, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Towards compact and efficient Slovak summarization models (Petrik & Nguyen, BSNLP 2025)
PDF: https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bsnlp-1.7.pdf