Train More Parameters But Mind Their Placement: Insights into Language Adaptation with PEFT

Jenny Kunz


Abstract
Smaller LLMs still face significant challenges even in medium-resourced languages, particularly when it comes to language-specific knowledge – a problem not easily resolved with machine-translated data. In this case study on Icelandic, we aim to enhance the generation performance of an LLM by specialising it using unstructured text corpora. A key focus is on preventing interference with the model's capability to handle longer context during this adaptation. Through ablation studies using various parameter-efficient fine-tuning (PEFT) methods and setups, we find that increasing the number of trainable parameters leads to better and more robust language adaptation. LoRAs placed in the feed-forward layers and bottleneck adapters show promising results with sufficient parameters, while prefix tuning and (IA)³ are not suitable. Although improvements are consistent in 0-shot summarisation, some adapted models struggle with longer context lengths, an issue that can be mitigated by adapting only the final layers.
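The paper's exact training configurations are given in the full text; as a rough sketch of the kind of setup the abstract describes (LoRA modules placed in the feed-forward layers, optionally restricted to the final transformer layers), the snippet below uses the Hugging Face `peft` library. The base-model name, target module names, layer indices, rank, and other hyperparameters are illustrative assumptions, not the settings reported in the paper.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Placeholder base model; the paper's actual model and size may differ.
model = AutoModelForCausalLM.from_pretrained("your-base-model")

lora_config = LoraConfig(
    r=64,                      # assumed rank; larger r means more trainable parameters
    lora_alpha=128,            # assumed scaling factor
    lora_dropout=0.05,
    # Feed-forward projection names for LLaMA-style blocks; other
    # architectures use different module names (e.g. "fc_in"/"fc_out").
    target_modules=["gate_proj", "up_proj", "down_proj"],
    # Adapt only the final layers, which the abstract reports can mitigate
    # degradation on longer contexts (indices here assume a 24-layer model).
    layers_to_transform=list(range(20, 24)),
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # inspect how many parameters are actually trained
```

The adapted model can then be trained on unstructured Icelandic text with a standard causal language-modelling objective, for example via `transformers.Trainer`.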
Anthology ID:
2025.nodalida-1.35
Volume:
Proceedings of the Joint 25th Nordic Conference on Computational Linguistics and 11th Baltic Conference on Human Language Technologies (NoDaLiDa/Baltic-HLT 2025)
Month:
March
Year:
2025
Address:
Tallinn, Estonia
Editors:
Richard Johansson, Sara Stymne
Venue:
NoDaLiDa
Publisher:
University of Tartu Library
Pages:
323–330
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.nodalida-1.35/
Cite (ACL):
Jenny Kunz. 2025. Train More Parameters But Mind Their Placement: Insights into Language Adaptation with PEFT. In Proceedings of the Joint 25th Nordic Conference on Computational Linguistics and 11th Baltic Conference on Human Language Technologies (NoDaLiDa/Baltic-HLT 2025), pages 323–330, Tallinn, Estonia. University of Tartu Library.
Cite (Informal):
Train More Parameters But Mind Their Placement: Insights into Language Adaptation with PEFT (Kunz, NoDaLiDa 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.nodalida-1.35.pdf