ResFormer: All-Time Reservoir Memory for Long Sequence Classification

Hongbo Liu, Jia Xu


Abstract
Sequence classification is essential in NLP for understanding and categorizing language patterns in tasks like sentiment analysis, intent detection, and topic classification. Transformer-based models, despite achieving state-of-the-art performance, have inherent limitations due to quadratic time and memory complexity, restricting their input length. Although extensive efforts have aimed at reducing computational demands, processing extensive contexts remains challenging. To overcome these limitations, we propose ResFormer, a novel neural network architecture designed to model varying context lengths efficiently through a cascaded methodology. ResFormer integrates a reservoir computing network with a nonlinear readout to capture long-term contextual dependencies in linear time. Concurrently, short-term dependencies within sentences are modeled using a conventional Transformer architecture with fixed-length inputs. Experiments demonstrate that ResFormer significantly outperforms the DeepSeek-Qwen and ModernBERT baselines, delivering an accuracy improvement of up to +22.3% on the EmoryNLP dataset and consistent gains on MultiWOZ, MELD, and IEMOCAP. In addition, ResFormer exhibits reduced memory consumption, underscoring its effectiveness and efficiency in modeling extensive contextual information.
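To give a sense of the reservoir-computing component the abstract describes, the sketch below shows a generic echo state network with a nonlinear (MLP) readout processing a token sequence in linear time. This is an illustrative reconstruction, not the paper's implementation: all dimensions, scaling constants, and the ReLU readout are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): 16-dim token embeddings,
# a 128-unit reservoir, and a 2-class readout.
d_in, d_res, n_classes = 16, 128, 2

# Fixed random reservoir weights; rescaling W_res to spectral radius 0.9
# keeps the recurrent dynamics stable (the echo state property).
W_in = rng.normal(0.0, 0.5, (d_res, d_in))
W_res = rng.normal(0.0, 1.0, (d_res, d_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def reservoir_state(x_seq):
    """Fold a length-T sequence into one state vector in O(T) time;
    the reservoir weights stay fixed, only the readout is trained."""
    h = np.zeros(d_res)
    for x_t in x_seq:
        h = np.tanh(W_in @ x_t + W_res @ h)
    return h

# A nonlinear readout: one hidden ReLU layer instead of the
# classic linear map of standard reservoir computing.
W1 = rng.normal(0.0, 0.1, (64, d_res))
W2 = rng.normal(0.0, 0.1, (n_classes, 64))

def readout(h):
    z = np.maximum(0.0, W1 @ h)  # ReLU hidden layer
    return W2 @ z                # class logits

seq = rng.normal(size=(200, d_in))  # a 200-token "long context"
logits = readout(reservoir_state(seq))
print(logits.shape)  # (2,)
```

In the cascaded design the abstract outlines, a state like `h` would summarize long-range context cheaply, while a standard fixed-length Transformer handles local, within-sentence dependencies.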
Anthology ID:
2025.emnlp-main.566
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11255–11267
URL:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.emnlp-main.566/
DOI:
10.18653/v1/2025.emnlp-main.566
Cite (ACL):
Hongbo Liu and Jia Xu. 2025. ResFormer: All-Time Reservoir Memory for Long Sequence Classification. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 11255–11267, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
ResFormer: All-Time Reservoir Memory for Long Sequence Classification (Liu & Xu, EMNLP 2025)
PDF:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.emnlp-main.566.pdf
Checklist:
 2025.emnlp-main.566.checklist.pdf