Ekaterina Grishina


2025

ProcrustesGPT: Compressing LLMs with Structured Matrices and Orthogonal Transformations
Ekaterina Grishina | Mikhail Gorbunov | Maxim Rakhuba
Findings of the Association for Computational Linguistics: ACL 2025

Large language models (LLMs) demonstrate impressive results on natural language processing tasks but require significant computational and memory resources. Structured matrix representations are a promising way to reduce the number of parameters in these models. However, it is unrealistic to expect that the weight matrices of pretrained models can be accurately represented by structured matrices without any fine-tuning. To overcome this issue, we utilize the fact that LLM output is invariant under certain orthogonal transformations of weight matrices. This insight can be leveraged to identify transformations that significantly improve the compressibility of weights within structured classes. The proposed approach is applicable to various types of structured matrices that support efficient projection operations. Code is available at: https://github.com/GrishKate/ProcrustesGPT.
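The following is a minimal single-layer sketch in NumPy of the two ideas the abstract names: the invariance of a linear layer's output when an orthogonal factor is folded into its weights, and an alternating scheme that pairs a structured projection with an orthogonal Procrustes solve. The choice of low-rank matrices as the structured class, the rank r, and the alternating loop itself are illustrative assumptions, not the paper's actual algorithm; in a full network the transformations would also have to be applied consistently across adjacent layers.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64
W = rng.standard_normal((d, d))   # hypothetical pretrained weight matrix
x = rng.standard_normal(d)        # an input activation

# Invariance: for orthogonal Q, (W Q)(Q^T x) = W x, so folding Q into
# the weights leaves the layer's output unchanged.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
assert np.allclose((W @ Q) @ (Q.T @ x), W @ x)

# Toy structured class (assumed here): rank-r matrices, which support an
# efficient projection via truncated SVD.
def project_low_rank(A, r=8):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Alternate between projecting W Q onto the structured class and solving
# the orthogonal Procrustes problem  min_Q ||W Q - S||_F,  whose solution
# is Q = U V^T for the SVD  W^T S = U Sigma V^T.
S = project_low_rank(W)
for _ in range(20):
    U, _, Vt = np.linalg.svd(W.T @ S)
    Q = U @ Vt                    # best orthogonal fit to the current S
    S = project_low_rank(W @ Q)   # best structured fit to the rotated W

print("relative fit error:", np.linalg.norm(W @ Q - S) / np.linalg.norm(W))
```

Each iteration cannot increase the objective, since both steps are exact minimizers over their own variable, so the rotated weights become strictly easier to approximate within the structured class than the original ones.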