@inproceedings{qiu-etal-2025-orthogonal,
    title = "Orthogonal Finetuning Made Scalable",
    author = {Qiu, Zeju  and
      Liu, Weiyang  and
      Weller, Adrian  and
      Sch{\"o}lkopf, Bernhard},
    editor = "Christodoulopoulos, Christos  and
      Chakraborty, Tanmoy  and
      Rose, Carolyn  and
      Peng, Violet",
    booktitle = "Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1627/",
    pages = "31934--31951",
    ISBN = "979-8-89176-332-6",
    abstract = "Orthogonal finetuning (OFT) offers highly parameter-efficient adaptation while preventing catastrophic forgetting, but its high runtime and memory demands limit practical deployment. We identify the core computational bottleneck in OFT as its weight-centric implementation, which relies on costly matrix-matrix multiplications with cubic complexity. To overcome this, we propose OFTv2, an input-centric reformulation that instead uses matrix-vector multiplications (i.e., matrix-free computation), reducing the computational cost to quadratic. We further introduce the Cayley{--}Neumann parameterization, an efficient orthogonal parameterization that approximates the matrix inversion in the Cayley transform via a truncated Neumann series. These modifications allow OFTv2 to achieve up to 10x faster training and 3x lower GPU memory usage without compromising performance. In addition, we extend OFTv2 to support finetuning quantized foundation models and show that it outperforms the popular QLoRA in training stability, efficiency, and memory usage."
}

Markdown (Informal)
[Orthogonal Finetuning Made Scalable](https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1627/) (Qiu et al., EMNLP 2025)
ACL
Zeju Qiu, Weiyang Liu, Adrian Weller, and Bernhard Schölkopf. 2025. Orthogonal Finetuning Made Scalable. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 31934–31951, Suzhou, China. Association for Computational Linguistics.
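
The abstract summarizes two ideas: replacing the exact matrix inverse in the Cayley transform with a truncated Neumann series, and applying the orthogonal transform with matrix-vector products instead of materializing a rotated weight matrix. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration under assumed conventions (the side on which the rotation is applied, the number of Neumann terms, and the function names `cayley_neumann` and `oftv2_forward` are all placeholders for illustration).

```python
# Minimal sketch (assumptions, not the paper's code) of the Cayley–Neumann
# parameterization and the input-centric (matrix-free) forward pass.
import torch


def cayley_neumann(Q: torch.Tensor, num_terms: int = 4) -> torch.Tensor:
    """Approximate the Cayley transform R = (I + A)(I - A)^{-1} with
    A = Q - Q^T skew-symmetric, replacing the inverse by a truncated
    Neumann series (I - A)^{-1} ≈ I + A + A^2 + ... (num_terms is a guess)."""
    A = Q - Q.T                                            # skew-symmetric generator
    I = torch.eye(Q.shape[0], dtype=Q.dtype, device=Q.device)
    inv_approx = I.clone()                                 # truncated Neumann series
    A_power = I.clone()
    for _ in range(1, num_terms):
        A_power = A_power @ A
        inv_approx = inv_approx + A_power
    return (I + A) @ inv_approx                            # approximately orthogonal R


def oftv2_forward(x: torch.Tensor, W: torch.Tensor, Q: torch.Tensor) -> torch.Tensor:
    """Input-centric forward pass: apply the frozen weight, then the rotation,
    as two matrix-vector style products, instead of forming the rotated weight
    R @ W (a cubic-cost matrix-matrix product in the hidden dimension)."""
    R = cayley_neumann(Q)
    h = x @ W.T          # frozen pretrained projection: (batch, d_out)
    return h @ R.T       # equivalent to x @ (R @ W).T without building R @ W


# Tiny usage example with random tensors (shapes chosen arbitrarily).
d_in, d_out, batch = 8, 4, 2
W = torch.randn(d_out, d_in)          # frozen pretrained weight
Q = 0.01 * torch.randn(d_out, d_out)  # small trainable OFT parameters
x = torch.randn(batch, d_in)
y = oftv2_forward(x, W, Q)
print(y.shape)  # torch.Size([2, 4])
```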