A Simple but Effective Context Retrieval for Sequential Sentence Classification in Long Legal Documents

Anas Belfathi, Nicolas Hernandez, Laura Monceaux, Richard Dufour

Abstract
Sequential sentence classification extends traditional sentence classification and is especially useful for long documents. However, state-of-the-art approaches face two major challenges: pre-trained language models struggle with input-length constraints, while hierarchical models often introduce irrelevant content. To address these limitations, we propose a simple and effective document-level retrieval approach that extracts only the most relevant context. Specifically, we introduce two heuristic strategies: Sequential, which captures local information, and Selective, which retrieves semantically similar sentences. Experiments on legal-domain datasets show that both heuristics lead to consistent improvements over the baseline, with an average gain of ∼5.5 weighted-F1 points. The Sequential heuristic outperforms hierarchical models on two out of three datasets, with gains of up to ∼1.5 points, demonstrating the benefits of targeted context.
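To make the two heuristics concrete, the sketch below shows one plausible way to select context for a target sentence. This is not the authors' published code: the window size, the top-k value, and the bag-of-words similarity (a stand-in for a real sentence encoder) are all hypothetical choices for illustration only.

    # Illustrative sketch of the two context-retrieval heuristics from the
    # abstract. Window size, k, and the similarity function are assumptions,
    # not the paper's actual configuration.
    from collections import Counter
    import math

    def sequential_context(sentences, i, window=2):
        """Sequential heuristic: keep the `window` sentences on each side
        of the target sentence i, preserving document order (local context)."""
        lo = max(0, i - window)
        hi = min(len(sentences), i + window + 1)
        return [s for j, s in enumerate(sentences[lo:hi], start=lo) if j != i]

    def _bow(sentence):
        # Stand-in for a real sentence encoder: bag-of-words term counts.
        return Counter(sentence.lower().split())

    def _cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def selective_context(sentences, i, k=3):
        """Selective heuristic: retrieve the k sentences from anywhere in
        the document most similar to the target sentence i."""
        query = _bow(sentences[i])
        scored = [(_cosine(query, _bow(s)), j)
                  for j, s in enumerate(sentences) if j != i]
        scored.sort(reverse=True)
        return [sentences[j] for _, j in scored[:k]]

    if __name__ == "__main__":
        doc = [
            "The appellant filed a notice of appeal on 3 May.",
            "The court considered the admissibility of the evidence.",
            "Counsel argued that the evidence was obtained unlawfully.",
            "The court found the evidence admissible.",
            "The appeal was dismissed with costs.",
        ]
        print(sequential_context(doc, 2, window=1))  # the two neighbors of sentence 2
        print(selective_context(doc, 1, k=2))        # two most similar sentences to sentence 1

Either heuristic yields a short, targeted context that can be concatenated with the target sentence and fed to a standard pre-trained classifier, avoiding both the input-length limit and the irrelevant content a full hierarchical encoding would bring in.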
Anthology ID:
2025.argmining-1.15
Volume:
Proceedings of the 12th Argument Mining Workshop
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Elena Chistova, Philipp Cimiano, Shohreh Haddadan, Gabriella Lapesa, Ramon Ruiz-Dolz
Venues:
ArgMining | WS
Publisher:
Association for Computational Linguistics
Pages:
160–167
URL:
https://preview.aclanthology.org/landing_page/2025.argmining-1.15/
DOI:
10.18653/v1/2025.argmining-1.15
Cite (ACL):
Anas Belfathi, Nicolas Hernandez, Laura Monceaux, and Richard Dufour. 2025. A Simple but Effective Context Retrieval for Sequential Sentence Classification in Long Legal Documents. In Proceedings of the 12th Argument Mining Workshop, pages 160–167, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
A Simple but Effective Context Retrieval for Sequential Sentence Classification in Long Legal Documents (Belfathi et al., ArgMining 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.argmining-1.15.pdf