ResLoRA: Identity Residual Mapping in Low-Rank Adaption

Shuhua Shi, Shaohan Huang, Minghui Song, Zhoujun Li, Zihan Zhang, Haizhen Huang, Furu Wei, Weiwei Deng, Feng Sun, Qi Zhang


Abstract
As one of the most popular parameter-efficient fine-tuning (PEFT) methods, low-rank adaptation (LoRA) is commonly applied to fine-tune large language models (LLMs). However, updating the weights of LoRA blocks effectively and efficiently is challenging due to the long calculation path through the original model. To address this, we propose ResLoRA, an improved LoRA framework. By adding residual paths during training and using merging approaches to eliminate these extra paths during inference, our method achieves better results in fewer training steps without any extra trainable parameters or inference cost compared to LoRA. Experiments on natural language generation (NLG), natural language understanding (NLU), and text-to-image tasks demonstrate the effectiveness of our method. To the best of our knowledge, ResLoRA is the first work to combine residual paths with LoRA. The code for our method is available at [this url](https://github.com/microsoft/LMOps/tree/main/reslora).
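To make the idea concrete, below is a minimal sketch of one plausible ResLoRA-style layer. It assumes an input-shortcut variant in which the low-rank branch also receives the input of the previous LoRA block; the class name `ResLoRALinear`, the `prev_x` argument, and the initialization and scaling conventions are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class ResLoRALinear(nn.Module):
    """Hypothetical sketch of a LoRA layer with an extra residual path.

    Assumes an "input-shortcut" variant: the low-rank branch also sees
    the input of the previous LoRA block (`prev_x`). Names, init, and
    scaling follow common LoRA conventions, not the paper's exact code.
    """

    def __init__(self, in_features, out_features, rank=8, alpha=16):
        super().__init__()
        # Frozen pretrained weight (stand-in for the original layer).
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.weight)
        self.weight.requires_grad_(False)
        # Trainable low-rank factors, as in standard LoRA.
        self.lora_A = nn.Parameter(torch.zeros(rank, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        nn.init.normal_(self.lora_A, std=0.02)
        self.scaling = alpha / rank

    def forward(self, x, prev_x=None):
        base = x @ self.weight.T
        # Residual path: also feed the previous block's input into the
        # low-rank branch, shortening the path gradients must traverse.
        h = x if prev_x is None else x + prev_x
        return base + (h @ self.lora_A.T @ self.lora_B.T) * self.scaling

# Usage: during training, pass the previous block's input along.
layer = ResLoRALinear(768, 768)
x_prev, x = torch.randn(2, 768), torch.randn(2, 768)
y = layer(x, prev_x=x_prev)
```

At inference time, the paper's merging approaches eliminate the extra path so the deployed layer has the same cost as plain LoRA; the sketch above covers only the training-time forward pass.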
Anthology ID: 2024.findings-acl.525
Volume: Findings of the Association for Computational Linguistics: ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 8870–8884
URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-acl.525/
DOI: 10.18653/v1/2024.findings-acl.525
Cite (ACL): Shuhua Shi, Shaohan Huang, Minghui Song, Zhoujun Li, Zihan Zhang, Haizhen Huang, Furu Wei, Weiwei Deng, Feng Sun, and Qi Zhang. 2024. ResLoRA: Identity Residual Mapping in Low-Rank Adaption. In Findings of the Association for Computational Linguistics: ACL 2024, pages 8870–8884, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): ResLoRA: Identity Residual Mapping in Low-Rank Adaption (Shi et al., Findings 2024)
PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-acl.525.pdf