@inproceedings{petrik-nguyen-2025-towards,
    title = "Towards compact and efficient {S}lovak summarization models",
    author = "Petrik, Sebastian  and
      Nguyen, Giang",
    editor = "Piskorski, Jakub  and
      P{\v{r}}ib{\'a}{\v{n}}, Pavel  and
      Nakov, Preslav  and
      Yangarber, Roman  and
      Marci{\'n}czuk, Micha{\l}",
    booktitle = "Proceedings of the 10th Workshop on Slavic Natural Language Processing (Slavic NLP 2025)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.bsnlp-1.7/",
    doi = "10.18653/v1/2025.bsnlp-1.7",
    pages = "58--68",
    ISBN = "978-1-959429-57-9",
    abstract = "Language models, especially LLMs, often face significant limitations due to their high resource demands. While various model compression methods have emerged, their application to smaller models in multilingual and low-resource settings remains understudied. Our work evaluates selected decoder and embedding pruning methods on T5-based models for abstractive summarization in English and Slovak using a parallel dataset. The results reveal differences in model performance degradation and expand the limited Slovak summarization resources and models."
}