Towards More Efficient Post-training via Fourier Domain Adapter Framework

Yijia Fan, Jusheng Zhang, Keze Wang


Abstract
We introduce Fourier Domain Adapter (FDA), a novel and parameter-efficient framework for fine-tuning large-scale pre-trained language models. FDA reparameterizes the core projection operation of the adapter module directly in the Fourier domain: the input features are transformed via the discrete Fourier transform (DFT), sparse learnable complex modulations are applied in frequency space, and the result is transformed back via the inverse DFT, supplemented by highly compact auxiliary linear layers. This approach significantly reduces the number of trainable parameters while enhancing the model’s ability to capture salient frequency-based semantic information. Comprehensive experiments on GLUE, E2E NLG, and instruction-tuning benchmarks show that FDA consistently outperforms existing parameter-efficient fine-tuning (PEFT) methods, achieving better performance with nearly 100x fewer trainable parameters than established methods such as LoRA and AdapterH. Our results demonstrate that FDA is a robust and efficient solution for developing powerful language models.
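
To make the mechanism described in the abstract concrete, the sketch below shows a minimal Fourier-domain adapter in PyTorch. It is an illustration under several assumptions, not the authors' released implementation: the DFT is taken along the feature dimension, only the lowest num_modulated frequency bins receive a learnable complex modulation (the abstract only says the modulations are sparse), and the "highly compact auxiliary linear layers" are modeled as a small bottleneck pair. The class and parameter names (FourierDomainAdapter, num_modulated, bottleneck) are hypothetical.

import torch
import torch.nn as nn

class FourierDomainAdapter(nn.Module):
    """Illustrative sketch: rFFT -> sparse complex modulation -> inverse rFFT,
    plus a compact bottleneck of auxiliary linear layers (assumed design)."""

    def __init__(self, hidden_dim: int, num_modulated: int = 64, bottleneck: int = 8):
        super().__init__()
        freq_dim = hidden_dim // 2 + 1                      # rFFT length along features
        self.num_modulated = min(num_modulated, freq_dim)   # only a few bins are learnable
        # Learnable complex modulation, stored as real/imaginary parts.
        self.mod_real = nn.Parameter(torch.ones(self.num_modulated))
        self.mod_imag = nn.Parameter(torch.zeros(self.num_modulated))
        # Highly compact auxiliary linear layers (bottleneck-sized).
        self.down = nn.Linear(hidden_dim, bottleneck, bias=False)
        self.up = nn.Linear(bottleneck, hidden_dim, bias=False)
        nn.init.zeros_(self.up.weight)                      # module starts as an identity map

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim); transform along the feature dimension.
        freq = torch.fft.rfft(x, dim=-1)
        mod = torch.complex(self.mod_real, self.mod_imag)
        freq = freq.clone()
        freq[..., : self.num_modulated] = freq[..., : self.num_modulated] * mod
        y = torch.fft.irfft(freq, n=x.shape[-1], dim=-1)
        return y + self.up(self.down(x))                    # modulated signal + bottleneck branch

Under these assumptions, the only trainable tensors are the two modulation vectors and the two bottleneck matrices, which is the kind of parameter count that could sit well below a LoRA-style update of the same hidden dimension; at initialization the module reduces to the identity, so fine-tuning starts from the frozen pre-trained behavior.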
Anthology ID:
2025.findings-emnlp.328
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6175–6193
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.328/
DOI:
10.18653/v1/2025.findings-emnlp.328
Cite (ACL):
Yijia Fan, Jusheng Zhang, and Keze Wang. 2025. Towards More Efficient Post-training via Fourier Domain Adapter Framework. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 6175–6193, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Towards More Efficient Post-training via Fourier Domain Adapter Framework (Fan et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.328.pdf
Checklist:
 2025.findings-emnlp.328.checklist.pdf