PLEX: Adaptive Parameter-Efficient Fine-Tuning for Code LLMs using Lottery-Tickets

Jaeseong Lee, Hojae Han, Jongyoon Kim, Seung-won Hwang, Naun Kang, KyungJun An, Sungho Jang


Abstract
Fine-tuning large language models (LLMs) for code generation is challenging due to computational costs and the underrepresentation of some programming languages (PLs) in pre-training. We propose PLEX, a lottery-ticket based parameter-efficient fine-tuning (PEFT) method that adapts LLMs to both well-supported and underrepresented PLs. During lottery ticket selection, PLEX employs a dual strategy: for well-represented PLs, it leverages the LLM's full parametric knowledge by selecting from full layers, while for underrepresented PLs, it narrows the selection scope to dense layers, prioritizing the most influential parameters. Additionally, PLEX-E, a low-rank extension of PLEX, further reduces computational costs by limiting the scope of fine-tuning. On MultiPL-E benchmarks, PLEX achieves state-of-the-art performance among PEFT methods, while PLEX-E maintains competitive results with reduced computational overhead. Both variants demonstrate effective adaptation across diverse programming languages, particularly those underrepresented in pre-training.
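The dual selection strategy described in the abstract can be sketched in miniature: restrict the candidate pool to dense layers for underrepresented PLs, then pick the highest-magnitude parameters as the "lottery ticket" to fine-tune. This is an illustrative sketch only; the function name, the toy data layout, the magnitude criterion, and the `"dense"`-substring heuristic are assumptions for exposition, not the paper's actual implementation.

```python
# Illustrative sketch of a dual lottery-ticket selection strategy
# (assumed magnitude-based scoring; not the paper's exact method).

def select_ticket(params, is_underrepresented, k):
    """Pick the k highest-magnitude parameters to fine-tune.

    params: dict mapping layer name -> list of (index, weight) pairs.
    For underrepresented PLs, restrict the candidate pool to dense
    (feed-forward) layers; otherwise consider all layers.
    """
    if is_underrepresented:
        pool = {name: ws for name, ws in params.items() if "dense" in name}
    else:
        pool = params
    # Flatten to (layer, index, weight) triples and rank by magnitude.
    flat = [(name, i, w) for name, ws in pool.items() for i, w in ws]
    flat.sort(key=lambda t: abs(t[2]), reverse=True)
    return [(name, i) for name, i, _ in flat[:k]]

# Toy model with one attention layer and two dense layers.
model = {
    "attn.q": [(0, 0.9), (1, -0.1)],
    "dense.fc1": [(0, -0.8), (1, 0.05)],
    "dense.fc2": [(0, 0.3), (1, 0.7)],
}

# Well-represented PL: select from all layers.
print(select_ticket(model, is_underrepresented=False, k=2))
# Underrepresented PL: select only from dense layers.
print(select_ticket(model, is_underrepresented=True, k=2))
```

In this toy setup, the well-represented case may pick attention weights, while the underrepresented case is forced to concentrate its budget on the dense layers, mirroring the scope-narrowing the abstract describes.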
Anthology ID:
2025.naacl-industry.60
Volume:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 3: Industry Track)
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Weizhu Chen, Yi Yang, Mohammad Kachuee, Xue-Yong Fu
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
784–793
URL:
https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2025.naacl-industry.60/
Cite (ACL):
Jaeseong Lee, Hojae Han, Jongyoon Kim, Seung-won Hwang, Naun Kang, KyungJun An, and Sungho Jang. 2025. PLEX: Adaptive Parameter-Efficient Fine-Tuning for Code LLMs using Lottery-Tickets. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 3: Industry Track), pages 784–793, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
PLEX: Adaptive Parameter-Efficient Fine-Tuning for Code LLMs using Lottery-Tickets (Lee et al., NAACL 2025)
PDF:
https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2025.naacl-industry.60.pdf