Token Pruning for Improving Graph-Generating State Space Model Performance

Monish Beegamudre, Jack Zheng, Margaret Capetz


Abstract
State Space Models (SSMs) have recently emerged as efficient alternatives to Transformers for sequence modeling, yet extending them to two-dimensional vision tasks remains challenging. The Graph-Generating State Space Model (GG-SSM) addresses this challenge by constructing an adaptive graph, achieving competitive performance on vision benchmarks. However, state propagation over the resulting graph introduces substantial inference overhead, limiting scalability to high-resolution inputs. In this work, we introduce a leaf-guided computation pruning strategy that accelerates GG-SSM inference without modifying the underlying graph topology. Rather than removing nodes or edges, our approach selectively scales or bypasses secondary refinement computations associated with high-dissimilarity leaf nodes, while preserving the low-weight MST backbone. Experiments on multiple long-term time series forecasting benchmarks demonstrate consistent throughput improvements with controlled accuracy degradation across a range of pruning ratios. These results indicate that structure-aware computation pruning is an effective mechanism for improving the scalability of graph-based state space models.
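The abstract describes selecting high-dissimilarity leaf nodes of the MST as candidates for bypassing secondary refinement. A minimal sketch of that selection step is below; it is an illustration under stated assumptions, not the authors' implementation — the function name, the Prim's-algorithm MST construction, and the choice of ranking leaves by their incident edge weight are all hypothetical.

```python
import numpy as np

def leaf_guided_prune(dissim: np.ndarray, prune_ratio: float) -> set:
    """Pick MST leaf nodes with the highest incident dissimilarity.

    dissim      -- symmetric pairwise dissimilarity matrix (n x n)
    prune_ratio -- fraction of leaves whose refinement is bypassed
    (Hypothetical sketch; not the paper's actual algorithm.)
    """
    n = dissim.shape[0]
    # Prim's algorithm: grow the MST over the complete dissimilarity graph.
    parent = {0: None}
    best = dissim[0].copy()              # cheapest known edge into the tree
    best_parent = np.zeros(n, dtype=int)
    visited = np.zeros(n, dtype=bool)
    visited[0] = True
    for _ in range(n - 1):
        v = int(np.argmin(np.where(visited, np.inf, best)))
        parent[v] = int(best_parent[v])
        visited[v] = True
        upd = dissim[v] < best           # relax edges through the new node
        best = np.where(upd, dissim[v], best)
        best_parent = np.where(upd, v, best_parent)
    # Leaves are MST nodes of degree 1.
    degree = np.zeros(n, dtype=int)
    for v, p in parent.items():
        if p is not None:
            degree[v] += 1
            degree[p] += 1
    leaves = [v for v in range(n) if degree[v] == 1]
    # Rank leaves by the weight of their single MST edge (dissimilarity).
    def leaf_weight(v):
        p = parent[v]
        if p is None:                    # the root itself can be a leaf
            child = next(c for c, q in parent.items() if q == v)
            return dissim[v, child]
        return dissim[v, p]
    leaves.sort(key=leaf_weight, reverse=True)
    k = int(round(prune_ratio * len(leaves)))
    return set(leaves[:k])               # bypass refinement for these nodes
```

Note that the graph topology is untouched: the MST backbone is built in full, and only the most dissimilar leaves are flagged so their secondary refinement pass can be scaled down or skipped.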
Anthology ID:
2026.eacl-srw.35
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Selene Baez Santamaria, Sai Ashish Somayajula, Atsuki Yamaguchi
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
476–482
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-srw.35/
Cite (ACL):
Monish Beegamudre, Jack Zheng, and Margaret Capetz. 2026. Token Pruning for Improving Graph-Generating State Space Model Performance. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 476–482, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Token Pruning for Improving Graph-Generating State Space Model Performance (Beegamudre et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-srw.35.pdf