ChuLo: Chunk-Level Key Information Representation for Long Document Understanding

Yan Li, Caren Han, Yue Dai, Feiqi Cao


Abstract
Transformer-based models have achieved remarkable success in various Natural Language Processing (NLP) tasks, yet their ability to handle long documents is constrained by computational limitations. Traditional approaches, such as truncating inputs, sparse self-attention, and chunking, attempt to mitigate these issues, but they often lead to information loss and hinder the model's ability to capture long-range dependencies. In this paper, we introduce ChuLo, a novel chunk representation method for long document understanding that addresses these limitations. ChuLo groups input tokens into chunks and uses unsupervised keyphrase extraction to emphasize semantically important, keyphrase-based chunks, retaining core document content while reducing input length. This approach minimizes information loss and improves the efficiency of Transformer-based models. Preserving all tokens is especially important in long document understanding, and in token classification tasks in particular, where fine-grained annotations depend on the entire sequence context. We evaluate our method on multiple long document classification and long document token classification tasks, demonstrating its effectiveness through comprehensive qualitative and quantitative analysis.
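To make the chunking idea in the abstract concrete, here is a minimal Python sketch of the general approach it describes: tokens are grouped into fixed-size chunks, and an unsupervised keyphrase signal upweights semantically important tokens within each chunk. This is an illustration under stated assumptions, not the authors' implementation: the frequency-based keyphrase scorer, the chunk size, and the alpha weighting knob are all hypothetical stand-ins for the paper's actual components.

```python
# Minimal sketch of chunk-level key-information representation.
# The keyphrase scorer below is a crude term-frequency stand-in for the
# unsupervised keyphrase extraction described in the abstract; all
# parameter names and values are illustrative assumptions.
from collections import Counter

def extract_keyphrase_scores(tokens, top_k=10):
    """Score tokens by normalized frequency as a rough keyphrase proxy."""
    counts = Counter(t.lower() for t in tokens if t.isalpha())
    top = dict(counts.most_common(top_k))
    total = sum(top.values()) or 1
    return {word: count / total for word, count in top.items()}

def chunk_document(tokens, chunk_size=16):
    """Group the token sequence into fixed-size, non-overlapping chunks."""
    return [tokens[i:i + chunk_size] for i in range(0, len(tokens), chunk_size)]

def chunk_representation(chunk, key_scores, alpha=0.8):
    """Weight tokens so keyphrase tokens dominate the chunk's representation.

    A real model would combine token embeddings; here we return
    (token, weight) pairs just to show the weighting scheme.
    """
    pairs = []
    for tok in chunk:
        score = key_scores.get(tok.lower())
        weight = alpha + score if score is not None else 1.0 - alpha
        pairs.append((tok, round(weight, 3)))
    return pairs

if __name__ == "__main__":
    doc = ("ChuLo groups input tokens into chunks and emphasizes keyphrase "
           "tokens so each chunk retains core document content while "
           "reducing input length for Transformer models").split()
    scores = extract_keyphrase_scores(doc, top_k=5)
    for chunk in chunk_document(doc, chunk_size=8):
        print(chunk_representation(chunk, scores))
```

Feeding one weighted representation per chunk, rather than every token, into the Transformer is what shortens the effective input sequence while keeping the key content signal.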
Anthology ID:
2025.findings-acl.762
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14756–14773
URL:
https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.762/
DOI:
10.18653/v1/2025.findings-acl.762
Cite (ACL):
Yan Li, Caren Han, Yue Dai, and Feiqi Cao. 2025. ChuLo: Chunk-Level Key Information Representation for Long Document Understanding. In Findings of the Association for Computational Linguistics: ACL 2025, pages 14756–14773, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
ChuLo: Chunk-Level Key Information Representation for Long Document Understanding (Li et al., Findings 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.762.pdf