Tiny Budgets, Big Gains: Parameter Placement Strategy in Parameter Super-Efficient Fine-Tuning
Jinman Zhao | Xueyan Zhang | Jiaru Li | Jingcheng Niu | Yulan Hu | Erxue Min | Gerald Penn
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
In this work, we propose FoRA-UA, a novel method that, using only 1–5% of the standard LoRA's parameters, achieves state-of-the-art performance across a wide range of tasks. Specifically, we explore scenarios with extremely limited parameter budgets and derive two key insights: (1) fixed-size sparse frequency representations approximate small matrices more accurately; and (2) with a fixed number of trainable parameters, introducing a smaller intermediate representation to approximate larger matrices results in lower reconstruction error. These findings form the foundation of our FoRA-UA method. By inserting a small intermediate parameter set, we achieve greater model compression without sacrificing performance. We evaluate FoRA-UA across diverse tasks, including natural language understanding (NLU), natural language generation (NLG), instruction tuning, and image classification, demonstrating strong generalisation and robustness under extreme compression.
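As a toy illustration of insight (1) only, not the authors' actual FoRA-UA algorithm: a small matrix can be represented by keeping just its largest-magnitude coefficients in a frequency basis (here a 2-D DCT, an assumed stand-in for whatever transform the paper uses) and reconstructing from that sparse set. The function name and parameter choices below are hypothetical, for demonstration.

```python
import numpy as np
from scipy.fft import dctn, idctn

def sparse_freq_approx(W, k):
    """Approximate W by keeping only its k largest-magnitude 2-D DCT
    coefficients and inverting the (orthonormal) transform."""
    C = dctn(W, norm="ortho")
    # indices of the k largest coefficients by magnitude
    keep = np.argsort(np.abs(C).ravel())[::-1][:k]
    mask = np.zeros(C.size, dtype=bool)
    mask[keep] = True
    C_sparse = np.where(mask.reshape(C.shape), C, 0.0)
    return idctn(C_sparse, norm="ortho")

rng = np.random.default_rng(0)
W = rng.standard_normal((16, 16))  # a "small" weight-update matrix

for k in (8, 32, 128):
    err = np.linalg.norm(W - sparse_freq_approx(W, k), "fro")
    print(f"k={k:3d}  Frobenius reconstruction error {err:.3f}")
```

Because the DCT is orthonormal, the Frobenius error equals the norm of the dropped coefficients, so the error shrinks monotonically as the per-matrix budget k grows; the paper's claim concerns how efficiently such sparse frequency budgets approximate small matrices relative to other parameterisations.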