LADM: Long-context Training Data Selection with Attention-based Dependency Measurement for LLMs

Jianghao Chen, Junhong Wu, Yangyifan Xu, Jiajun Zhang


Abstract
Long-context modeling has drawn increasing attention in the area of Large Language Models (LLMs). Continual training with long-context data has become the de facto method for equipping LLMs with the ability to process long inputs. However, measuring the quality of long-context training data remains an open challenge. To address this issue, we propose a Long-context data selection framework with Attention-based Dependency Measurement (LADM), which can efficiently identify high-quality long-context data from a large-scale, multi-domain pre-training corpus. LADM leverages the retrieval capabilities of the attention mechanism to capture contextual dependencies, ensuring a comprehensive quality measurement of long-context data. Experimental results show that our LADM framework significantly boosts the performance of LLMs on multiple long-context tasks with only 1B tokens for continual training.
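The abstract only describes the approach at a high level. As an illustration of the general idea, not the paper's exact formulation, the sketch below scores each candidate document by how much attention mass its tokens place on distant context and keeps the highest-scoring documents. The window size, the scoring rule, and the select_long_context_data helper are assumptions made for this sketch.

import numpy as np

def long_range_dependency_score(attn: np.ndarray, window: int = 128) -> float:
    """Illustrative dependency score (not the paper's exact metric): the
    average fraction of attention mass that each query token assigns to
    tokens more than `window` positions earlier, assuming causal attention.
    Higher values indicate stronger reliance on distant context."""
    seq_len = attn.shape[0]
    scores = []
    for i in range(window, seq_len):
        # Attention paid to tokens far behind the current position.
        far_mass = attn[i, : i - window].sum()
        total_mass = attn[i, : i + 1].sum()
        if total_mass > 0:
            scores.append(far_mass / total_mass)
    return float(np.mean(scores)) if scores else 0.0

def select_long_context_data(attn_maps, keep_ratio: float = 0.2, window: int = 128):
    """Rank candidate documents by dependency score and keep the top fraction."""
    scored = [(long_range_dependency_score(a, window), idx)
              for idx, a in enumerate(attn_maps)]
    scored.sort(reverse=True)
    k = max(1, int(len(scored) * keep_ratio))
    return [idx for _, idx in scored[:k]]

# Toy usage: random causal attention maps stand in for attention extracted
# from an LLM forward pass over each candidate document.
rng = np.random.default_rng(0)
docs = []
for _ in range(5):
    n = 512
    a = rng.random((n, n))
    a = np.tril(a)                          # causal mask
    a = a / a.sum(axis=1, keepdims=True)    # normalize rows to distributions
    docs.append(a)
print(select_long_context_data(docs, keep_ratio=0.4))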
Anthology ID: 2025.acl-long.154
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 3076–3090
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.154/
Cite (ACL): Jianghao Chen, Junhong Wu, Yangyifan Xu, and Jiajun Zhang. 2025. LADM: Long-context Training Data Selection with Attention-based Dependency Measurement for LLMs. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3076–3090, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): LADM: Long-context Training Data Selection with Attention-based Dependency Measurement for LLMs (Chen et al., ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.154.pdf