State Space Models are Strong Text Rerankers

Zhichao Xu, Jinghua Yan, Ashim Gupta, Vivek Srikumar


Abstract
Transformers dominate NLP and IR, but their inference inefficiencies and their difficulty extrapolating to longer contexts have sparked interest in alternative model architectures. Among these, state space models (SSMs) such as Mamba offer promising advantages, particularly O(1) time complexity in inference. Despite this potential, SSMs’ effectiveness at text reranking, a task requiring fine-grained query-document interaction and long-context understanding, remains underexplored. This study benchmarks SSM-based architectures (specifically, Mamba-1 and Mamba-2) against transformer-based models across various scales, architectures, and pre-training objectives, focusing on performance and efficiency in text reranking tasks. We find that (1) Mamba architectures achieve competitive text ranking performance, comparable to transformer-based models of similar size; (2) they are less efficient in training and inference than transformers with flash attention; and (3) Mamba-2 outperforms Mamba-1 in both performance and efficiency. These results underscore the potential of state space models as a transformer alternative and highlight areas for improvement in future IR applications.
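To make the task concrete: a pointwise reranker scores each (query, document) pair independently and sorts the candidates by score. Below is a minimal sketch of that setup with a Mamba backbone through the Hugging Face transformers API. This is not the pipeline used in the paper; the state-spaces/mamba-130m-hf checkpoint and the MonoT5-style "Relevant: true" scoring prompt are illustrative assumptions.

# Minimal pointwise-reranking sketch with a Mamba language model.
# Assumptions (not from the paper): the checkpoint name and the
# MonoT5-style "Relevant: true" prompt; any causal Mamba LM works.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "state-spaces/mamba-130m-hf"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL).eval()

def relevance_score(query: str, doc: str) -> float:
    """Score one (query, document) pair by the logit of ' true'
    right after a relevance prompt; higher means more relevant."""
    prompt = f"Query: {query}\nDocument: {doc}\nRelevant:"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(input_ids).logits  # (1, seq_len, vocab_size)
    true_id = tokenizer(" true", add_special_tokens=False).input_ids[0]
    return logits[0, -1, true_id].item()

query = "what are state space models"
docs = ["SSMs model sequences with linear recurrences over a hidden state.",
        "The 2024 Summer Olympics were held in Paris."]
ranked = sorted(docs, key=lambda d: relevance_score(query, d), reverse=True)
print(ranked)

In practice one would fine-tune the backbone with a classification head on ranking data; this zero-shot scoring only illustrates the interface.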
Anthology ID: 2025.repl4nlp-1.12
Volume: Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025)
Month: May
Year: 2025
Address: Albuquerque, NM
Editors: Vaibhav Adlakha, Alexandra Chronopoulou, Xiang Lorraine Li, Bodhisattwa Prasad Majumder, Freda Shi, Giorgos Vernikos
Venues: RepL4NLP | WS
Publisher: Association for Computational Linguistics
Pages: 152–169
URL: https://preview.aclanthology.org/fix-sig-urls/2025.repl4nlp-1.12/
Cite (ACL): Zhichao Xu, Jinghua Yan, Ashim Gupta, and Vivek Srikumar. 2025. State Space Models are Strong Text Rerankers. In Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025), pages 152–169, Albuquerque, NM. Association for Computational Linguistics.
Cite (Informal): State Space Models are Strong Text Rerankers (Xu et al., RepL4NLP 2025)
PDF: https://preview.aclanthology.org/fix-sig-urls/2025.repl4nlp-1.12.pdf