Efficient Classification of Long Documents via State-Space Models
Peng Lu, Suyuchen Wang, Mehdi Rezagholizadeh, Bang Liu, Ivan Kobyzev
Abstract
Transformer-based models have achieved state-of-the-art performance on numerous NLP applications. However, long documents, which are prevalent in real-world scenarios, cannot be processed efficiently by transformers with the vanilla self-attention module, owing to its quadratic computational complexity and limited length-extrapolation ability. Instead of tackling the computational difficulty of self-attention with sparse or hierarchical structures, in this paper we investigate the use of State-Space Models (SSMs) for long document classification tasks. We conducted extensive experiments on six long document classification datasets, covering binary, multi-class, and multi-label classification, comparing SSMs (with and without pre-training) to self-attention-based models. We also introduce the SSM-pooler model and demonstrate that it achieves comparable performance while being on average 36% more efficient. Additionally, our method exhibits higher robustness to input noise, even in the extreme scenario of 40% noise.
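For readers unfamiliar with State-Space Models, the layers this line of work builds on (e.g., S4) follow the standard discretized linear state-space formulation; the equations below are general background, not taken from the paper itself:

$$x_k = \bar{A}\,x_{k-1} + \bar{B}\,u_k, \qquad y_k = C\,x_k,$$

where, under a zero-order-hold discretization with step size $\Delta$, $\bar{A} = e^{\Delta A}$ and $\bar{B} = (\Delta A)^{-1}\big(e^{\Delta A} - I\big)\,\Delta B$. Because the recurrence is linear and time-invariant, it can equivalently be computed as a convolution over the input, which is what lets SSM layers scale subquadratically with sequence length.

As a concrete illustration, here is a minimal, hypothetical sketch of an SSM-based document classifier. The `SSMLayer` below is a toy per-channel diagonal recurrence, and the mean-pooling head is an assumption made for illustration only; it is not the paper's SSM-pooler architecture.

```python
import torch
import torch.nn as nn

class SSMLayer(nn.Module):
    """Toy diagonal linear SSM, applied independently per channel.

    Implements x_k = a * x_{k-1} + b * u_k, y_k = c * x_k with one learned
    scalar (a, b, c) per channel -- a stand-in for S4-style layers, which
    use structured state matrices and a convolutional evaluation instead.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.a_logit = nn.Parameter(torch.full((dim,), -1.0))  # sigmoid keeps a in (0, 1)
        self.b = nn.Parameter(torch.ones(dim))
        self.c = nn.Parameter(torch.ones(dim))

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, length, dim)
        a = torch.sigmoid(self.a_logit)                 # stable decay per channel
        x = torch.zeros(u.size(0), u.size(2), device=u.device)
        ys = []
        for k in range(u.size(1)):                      # explicit recurrence (clear, not fast)
            x = a * x + self.b * u[:, k]
            ys.append(self.c * x)
        return torch.stack(ys, dim=1)                   # (batch, length, dim)

class SSMClassifier(nn.Module):
    """Embed tokens, run the SSM over the sequence, pool, and classify."""
    def __init__(self, vocab_size: int, dim: int, num_classes: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.ssm = SSMLayer(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        h = self.ssm(self.embed(tokens))                # (batch, length, dim)
        pooled = h.mean(dim=1)                          # mean-pool over the sequence
        return self.head(pooled)                        # (batch, num_classes)

# Usage: classify a batch of 4 documents, each 4,096 tokens long.
model = SSMClassifier(vocab_size=30522, dim=64, num_classes=2)
logits = model(torch.randint(0, 30522, (4, 4096)))
```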
- Anthology ID: 2023.emnlp-main.404
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 6559–6565
- URL: https://preview.aclanthology.org/Author-page-Marten-During-lu/2023.emnlp-main.404/
- DOI: 10.18653/v1/2023.emnlp-main.404
- Cite (ACL): Peng Lu, Suyuchen Wang, Mehdi Rezagholizadeh, Bang Liu, and Ivan Kobyzev. 2023. Efficient Classification of Long Documents via State-Space Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 6559–6565, Singapore. Association for Computational Linguistics.
- Cite (Informal): Efficient Classification of Long Documents via State-Space Models (Lu et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/Author-page-Marten-During-lu/2023.emnlp-main.404.pdf