Revealing and Mitigating the Local Pattern Shortcuts of Mamba

WangJie You, Zecheng Tang, Juntao Li, Lili Yao, Min Zhang


Abstract
Large language models (LLMs) have advanced significantly due to the attention mechanism, but their quadratic complexity and linear memory demands limit their performance on long-context tasks. Recently, researchers introduced Mamba, an advanced model built upon State Space Models (SSMs) that offers linear complexity and constant memory. Although Mamba is reported to match or surpass the performance of attention-based models, our analysis reveals a performance gap: Mamba excels in tasks that involve localized key information but faces challenges with tasks that require handling distributed key information. Our controlled experiments suggest that this inconsistency arises from Mamba’s reliance on local pattern shortcuts across model scales (10M to 1.4B), which enable Mamba to remember local key information within its limited memory but hinder its ability to retain more dispersed information. To address this issue, we introduce a global gate module into the Mamba model. Experiments on extensive synthetic tasks as well as real-world tasks demonstrate the effectiveness of our method. Notably, with the introduction of only 4M extra parameters, our approach enables the Mamba model (130M) to achieve a significant improvement on tasks with distributed information, increasing its performance from below 5% to 80%.
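
The abstract names a global gate module but does not specify its design. Below is a minimal PyTorch sketch of one way such a gate could modulate a Mamba block's output with global context; the class name GlobalGate, the bottleneck width d_gate, and the causal cumulative-mean summary are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class GlobalGate(nn.Module):
    """Hypothetical global gate: builds a causal global summary of the
    sequence and uses it to gate the SSM block's output (sketch only)."""
    def __init__(self, d_model: int, d_gate: int = 64):
        super().__init__()
        # Small bottleneck keeps the number of extra parameters low.
        self.down = nn.Linear(d_model, d_gate)
        self.up = nn.Linear(d_gate, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model), e.g. the output of a Mamba block.
        # Causal running mean gives each position a summary of all earlier tokens.
        counts = torch.arange(
            1, x.size(1) + 1, device=x.device, dtype=x.dtype
        ).view(1, -1, 1)
        summary = torch.cumsum(x, dim=1) / counts
        gate = torch.sigmoid(self.up(torch.tanh(self.down(summary))))
        # Modulate locally-computed features with the global signal.
        return x * gate

if __name__ == "__main__":
    block_out = torch.randn(2, 16, 768)   # stand-in for a Mamba block's output
    gated = GlobalGate(d_model=768)(block_out)
    print(gated.shape)                    # torch.Size([2, 16, 768])
```

With d_model = 768 and d_gate = 64, each gate adds roughly 0.1M parameters per layer, which is consistent in scale with the few million extra parameters reported in the abstract, though the paper's exact placement and sizing may differ.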
Anthology ID:
2025.findings-acl.629
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12156–12178
URL:
https://preview.aclanthology.org/landing_page/2025.findings-acl.629/
Cite (ACL):
WangJie You, Zecheng Tang, Juntao Li, Lili Yao, and Min Zhang. 2025. Revealing and Mitigating the Local Pattern Shortcuts of Mamba. In Findings of the Association for Computational Linguistics: ACL 2025, pages 12156–12178, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Revealing and Mitigating the Local Pattern Shortcuts of Mamba (You et al., Findings 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.findings-acl.629.pdf