On Pruning State-Space LLMs

Tamer Ghattas, Michael Hassid, Roy Schwartz


Abstract
Recent work has proposed state-space models (SSMs) as an efficient alternative to transformer-based LLMs. Can these models be pruned to further reduce their computation costs? We adapt several pruning methods to the SSM structure and apply them to four SSM-based LLMs across multiple tasks. We find that such models are quite robust to some pruning methods (e.g., WANDA), while other methods lead to rapid performance degradation.
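For context, WANDA scores each weight by its magnitude multiplied by the L2 norm of the corresponding input activation over a calibration set, and zeroes the lowest-scoring weights within each output row. The sketch below illustrates that scoring rule on a generic linear layer; it is a minimal illustration under assumed settings (function name, sparsity level, random calibration data), not the paper's SSM-specific adaptation.

```python
import torch

def wanda_prune_linear(weight: torch.Tensor,
                       activations: torch.Tensor,
                       sparsity: float = 0.5) -> torch.Tensor:
    """WANDA-style pruning of a linear layer's weight matrix.

    weight:      (out_features, in_features) weight matrix.
    activations: (num_tokens, in_features) calibration inputs to the layer.
    sparsity:    fraction of weights to zero out in each output row.
    (Illustrative sketch only; not the paper's SSM-specific adaptation.)
    """
    # Per-input-channel L2 norm of the calibration activations.
    act_norm = activations.norm(p=2, dim=0)           # (in_features,)

    # WANDA importance score: |W_ij| * ||X_j||_2.
    scores = weight.abs() * act_norm.unsqueeze(0)     # (out, in)

    # Zero out the lowest-scoring weights within each output row.
    k = int(weight.shape[1] * sparsity)
    pruned = weight.clone()
    if k > 0:
        _, idx = torch.topk(scores, k, dim=1, largest=False)
        pruned.scatter_(1, idx, 0.0)
    return pruned

# Example: prune 50% of a toy layer using random calibration data.
W = torch.randn(8, 16)
X = torch.randn(128, 16)
W_pruned = wanda_prune_linear(W, X, sparsity=0.5)
print((W_pruned == 0).float().mean())  # ~0.5
```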
Anthology ID:
2025.emnlp-main.950
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
18811–18825
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.950/
Cite (ACL):
Tamer Ghattas, Michael Hassid, and Roy Schwartz. 2025. On Pruning State-Space LLMs. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 18811–18825, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
On Pruning State-Space LLMs (Ghattas et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.950.pdf
Checklist:
 2025.emnlp-main.950.checklist.pdf