Parameter-Efficient Fine-Tuning via Circular Convolution
Aochuan Chen, Jiashun Cheng, Zijing Liu, Ziqi Gao, Fugee Tsung, Yu Li, Jia Li
Abstract
Low-Rank Adaptation (LoRA) has gained popularity for fine-tuning large foundation models, leveraging low-rank matrices A and B to represent weight changes (i.e., ΔW = BA). This method reduces trainable parameters and mitigates the heavy memory consumption associated with full delta matrices by sequentially multiplying the activation with A and B. Despite its success, the intrinsic low-rank characteristic may limit its performance. Although several variants have been proposed to address this issue, they often overlook the computational and memory efficiency that makes LoRA attractive in the first place. In this paper, we propose Circular Convolution Adaptation (C3A), which not only achieves high-rank adaptation with enhanced performance but also excels in both computational and memory efficiency. Extensive experiments demonstrate that C3A consistently outperforms LoRA and its variants across various fine-tuning tasks.
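The abstract contrasts LoRA's low-rank update ΔW = BA with an update built from circular convolution. Below is a minimal NumPy sketch of the two rules applied to a single activation vector; every name in it (d, r, w, lora_delta, c3a_delta) is an illustrative assumption, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                          # feature dimension; LoRA rank (r << d)
x = rng.standard_normal(d)           # an activation vector

# LoRA: Delta W = B A, applied as B (A x); rank(Delta W) <= r.
A = rng.standard_normal((r, d))
B = rng.standard_normal((d, r))
lora_delta = B @ (A @ x)

# Circular convolution: the weight change is an implicit circulant matrix
# defined by a single length-d kernel w; applying it costs O(d log d) via
# the FFT, and the circulant can be full rank while storing only d numbers.
w = rng.standard_normal(d)
c3a_delta = np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real

# Sanity check: the FFT route equals an explicit circulant matrix-vector
# product with C[i, j] = w[(i - j) mod d].
w_rev = np.concatenate(([w[0]], w[1:][::-1]))    # w_rev[j] = w[(-j) mod d]
C = np.stack([np.roll(w_rev, i) for i in range(d)])
assert np.allclose(C @ x, c3a_delta)

Because the DFT diagonalizes circulant matrices, the implicit ΔW defined by w has rank equal to the number of nonzero DFT coefficients of w, which is generically the full dimension d, while storing only d parameters; this is the sense in which high-rank adaptation can coexist with a low parameter count.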
- Anthology ID:
- 2025.findings-acl.102
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2025
- Month:
- July
- Year:
- 2025
- Address:
- Vienna, Austria
- Editors:
- Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2004–2019
- URL:
- https://preview.aclanthology.org/landing_page/2025.findings-acl.102/
- Cite (ACL):
- Aochuan Chen, Jiashun Cheng, Zijing Liu, Ziqi Gao, Fugee Tsung, Yu Li, and Jia Li. 2025. Parameter-Efficient Fine-Tuning via Circular Convolution. In Findings of the Association for Computational Linguistics: ACL 2025, pages 2004–2019, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal):
- Parameter-Efficient Fine-Tuning via Circular Convolution (Chen et al., Findings 2025)
- PDF:
- https://preview.aclanthology.org/landing_page/2025.findings-acl.102.pdf