CAPSTONE: Curriculum Sampling for Dense Retrieval with Document Expansion

Xingwei He, Yeyun Gong, A-Long Jin, Hang Zhang, Anlei Dong, Jian Jiao, Siu Ming Yiu, Nan Duan


Abstract
The dual-encoder has become the de facto architecture for dense retrieval. Typically, it computes the latent representations of the query and document independently, thus failing to fully capture the interactions between the query and document. To alleviate this, recent research has focused on obtaining query-informed document representations. During training, it expands the document with a real query, but during inference, it replaces the real query with a generated one. This inconsistency between training and inference causes the dense retrieval model to prioritize query information while disregarding the document when computing the document representation. Consequently, it performs even worse than the vanilla dense retrieval model because its performance heavily relies on the relevance between the generated queries and the real query. In this paper, we propose a curriculum sampling strategy that utilizes pseudo queries during training and progressively enhances the relevance between the generated query and the real query. By doing so, the retrieval model learns to extend its attention from the document alone to both the document and query, resulting in high-quality query-informed document representations. Experimental results on both in-domain and out-of-domain datasets demonstrate that our approach outperforms previous dense retrieval models.
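The curriculum described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name, the relevance scores (assumed to be precomputed similarities between each pseudo query and the real query), and the pool-shrinking schedule are all hypothetical.

```python
import random

def curriculum_sample(pseudo_queries, relevance_scores, step, total_steps,
                      rng=random):
    """Pick one pseudo query to expand a document with during training.

    Early in training the pick is near-uniform over all pseudo queries,
    so the encoder must still rely on the document itself; as training
    progresses, sampling concentrates on the pseudo queries most
    relevant to the real query, closing the train/inference gap.
    """
    progress = min(step / total_steps, 1.0)  # 0 -> 1 over training
    # Rank pseudo queries by (hypothetical) relevance to the real query.
    ranked = sorted(zip(pseudo_queries, relevance_scores),
                    key=lambda qs: -qs[1])
    # Shrink the candidate pool from the full set down to the top-1 query.
    pool_size = max(1, round(len(ranked) * (1.0 - progress)))
    query, _ = rng.choice(ranked[:pool_size])
    return query
```

At inference time there is no real query to score against, so the document is simply expanded with a generated query, which is consistent with the late-training regime above.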
Anthology ID:
2023.emnlp-main.651
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10531–10541
URL:
https://aclanthology.org/2023.emnlp-main.651
DOI:
10.18653/v1/2023.emnlp-main.651
Cite (ACL):
Xingwei He, Yeyun Gong, A-Long Jin, Hang Zhang, Anlei Dong, Jian Jiao, Siu Ming Yiu, and Nan Duan. 2023. CAPSTONE: Curriculum Sampling for Dense Retrieval with Document Expansion. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 10531–10541, Singapore. Association for Computational Linguistics.
Cite (Informal):
CAPSTONE: Curriculum Sampling for Dense Retrieval with Document Expansion (He et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2023.emnlp-main.651.pdf
Video:
https://preview.aclanthology.org/naacl-24-ws-corrections/2023.emnlp-main.651.mp4