LaTIM: Measuring Latent Token-to-Token Interactions in Mamba Models

Hugo Pitorro, Marcos Vinicius Treviso

Abstract
State space models (SSMs), such as Mamba, have emerged as an efficient alternative to transformers for long-context sequence modeling. However, despite their growing adoption, SSMs lack the interpretability tools that have been crucial for understanding and improving attention-based architectures. While recent efforts provide insights into Mamba’s internal mechanisms, they struggle to capture precise token-level interactions at the layer level, leaving gaps in understanding how Mamba selectively processes sequences across layers. In this work, we introduce LaTIM, a novel token-level decomposition method for both Mamba-1 and Mamba-2 that enables fine-grained interpretability. We extensively evaluate our method across diverse tasks, including machine translation, copying, and retrieval-based generation, demonstrating its effectiveness in revealing Mamba’s token-to-token interaction patterns.
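
The abstract does not reproduce the decomposition itself, but useful background is that the selective SSM recurrence underlying Mamba can be unrolled into an attention-like form, which is the kind of token-to-token view the paper makes precise. Below is a minimal sketch in standard Mamba-1 notation (per-channel recurrence with discretized parameters \bar{A}_t, \bar{B}_t, C_t); this is a generic unrolling for illustration, not necessarily LaTIM’s exact formulation:

% Hedged sketch: unrolling a selective SSM recurrence into an
% attention-like token-to-token form. Notation is standard Mamba-1
% notation, not necessarily LaTIM's own decomposition.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The per-channel selective SSM recurrence
\begin{align}
  h_t &= \bar{A}_t\, h_{t-1} + \bar{B}_t\, x_t, &
  y_t &= C_t^{\top} h_t
\end{align}
unrolls, for $h_0 = 0$, into an explicit sum over source positions:
\begin{equation}
  y_t \;=\; \sum_{s=1}^{t}
    \underbrace{C_t^{\top}\Bigl(\textstyle\prod_{k=s+1}^{t}\bar{A}_k\Bigr)\bar{B}_s}_{\alpha_{t,s}}
    \, x_s ,
\end{equation}
so each scalar $\alpha_{t,s}$ can be read as the latent contribution of
token $s$ to the output at token $t$, analogous to an attention weight.
\end{document}

It is this kind of attention-style matrix $\alpha_{t,s}$ that enables token-to-token heatmaps for SSM layers; per the abstract, LaTIM contributes a token-level decomposition of this sort for both Mamba-1 and Mamba-2.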
Anthology ID:
2025.acl-long.1194
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
24478–24493
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1194/
Cite (ACL):
Hugo Pitorro and Marcos Vinicius Treviso. 2025. LaTIM: Measuring Latent Token-to-Token Interactions in Mamba Models. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 24478–24493, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
LaTIM: Measuring Latent Token-to-Token Interactions in Mamba Models (Pitorro & Treviso, ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1194.pdf