PEFT-Factory: Unified Parameter-Efficient Fine-Tuning of Autoregressive Large Language Models

Robert Belanec, Ivan Srba, Maria Bielikova


Abstract
Parameter-Efficient Fine-Tuning (PEFT) methods address the increasing size of Large Language Models (LLMs). Currently, many newly introduced PEFT methods are challenging to replicate, deploy, or compare with one another. To address this, we introduce PEFT-Factory, a unified framework for efficient fine-tuning of LLMs using both off-the-shelf and custom PEFT methods. While its modular design supports extensibility, it natively provides a representative set of 19 PEFT methods, 27 classification and text-generation datasets covering 12 tasks, and both standard and PEFT-specific evaluation metrics. As a result, PEFT-Factory offers a ready-to-use, controlled, and stable environment that improves the replicability and benchmarking of PEFT methods. PEFT-Factory is a downstream framework derived from the popular LLaMA-Factory, and is publicly available at https://github.com/kinit-sk/PEFT-Factory.
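
To ground the term "parameter-efficient fine-tuning" for readers new to it, the minimal sketch below shows the core idea using LoRA from the Hugging Face peft library, one of the method families frameworks like PEFT-Factory typically wrap. This is an illustration of PEFT in general, not PEFT-Factory's own API; the model name, rank, and target modules are arbitrary choices for the example.

# Minimal sketch of parameter-efficient fine-tuning (LoRA) with the Hugging
# Face `peft` library. NOT PEFT-Factory's API; it only illustrates the core
# idea: freeze the base model and train a small set of adapter weights.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Base model choice is arbitrary for this example.
model = AutoModelForCausalLM.from_pretrained("gpt2")

# LoRA injects trainable low-rank matrices into selected weight matrices;
# `r` is the rank, and "c_attn" is GPT-2's fused attention projection.
config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["c_attn"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, config)
# Reports the trainable-parameter count, typically well under 1% of the
# full model, which is what makes PEFT methods cheap to train and store.
model.print_trainable_parameters()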
Anthology ID:
2026.eacl-demo.15
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 3: System Demonstrations)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Danilo Croce, Jochen Leidner, Nafise Sadat Moosavi
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
188–202
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-demo.15/
Cite (ACL):
Robert Belanec, Ivan Srba, and Maria Bielikova. 2026. PEFT-Factory: Unified Parameter-Efficient Fine-Tuning of Autoregressive Large Language Models. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 3: System Demonstrations), pages 188–202, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
PEFT-Factory: Unified Parameter-Efficient Fine-Tuning of Autoregressive Large Language Models (Belanec et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-demo.15.pdf