Query-as-context Pre-training for Dense Passage Retrieval

Xing Wu, Guangyuan Ma, Wanhui Qian, Zijia Lin, Songlin Hu


Abstract
Recently, methods have been developed to improve the performance of dense passage retrieval by using context-supervised pre-training. These methods simply consider two passages from the same document to be relevant, without taking into account the potential negative impact of weakly correlated pairs. To alleviate this issue, this paper proposes query-as-context pre-training, a simple yet effective pre-training technique. Query-as-context pre-training assumes that a query derived from a passage is more likely to be relevant to that passage, and forms passage-query pairs accordingly. These passage-query pairs are then used in contrastive or generative context-supervised pre-training. The pre-trained models are evaluated on large-scale passage retrieval benchmarks and out-of-domain zero-shot benchmarks. Experimental results show that query-as-context pre-training brings considerable gains in retrieval performance, demonstrating its effectiveness and efficiency.
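To make the idea concrete, below is a minimal sketch of how passage-query pairs might be constructed and used in a contrastive objective. It assumes a doc2query-style T5 model for query generation and a BERT dual encoder trained with an in-batch InfoNCE loss; the checkpoint names, sampling settings, and loss details are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch of query-as-context pair construction and a contrastive step.
# Assumptions: a doc2query-style T5 generator and a BERT encoder; these
# stand in for whatever generator/encoder the actual pipeline uses.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, AutoModel

GEN_NAME = "doc2query/msmarco-t5-base-v1"  # assumed query-generation checkpoint
gen_tok = AutoTokenizer.from_pretrained(GEN_NAME)
gen_model = AutoModelForSeq2SeqLM.from_pretrained(GEN_NAME)

def make_pairs(passages, max_query_len=32):
    """Generate one pseudo-query per passage to form (passage, query) pairs."""
    inputs = gen_tok(passages, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = gen_model.generate(
            **inputs, max_length=max_query_len, do_sample=True, top_k=10
        )
    queries = gen_tok.batch_decode(out, skip_special_tokens=True)
    return list(zip(passages, queries))

enc_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def contrastive_loss(pairs, temperature=0.05):
    """InfoNCE over in-batch negatives: each query's positive is its own passage."""
    passages, queries = zip(*pairs)

    def embed(texts):
        batch = enc_tok(list(texts), padding=True, truncation=True, return_tensors="pt")
        return encoder(**batch).last_hidden_state[:, 0]  # [CLS] embeddings

    p, q = embed(passages), embed(queries)
    sims = F.normalize(p, dim=-1) @ F.normalize(q, dim=-1).T / temperature
    labels = torch.arange(sims.size(0))  # diagonal entries are the positives
    return F.cross_entropy(sims, labels)
```

The key design point the sketch reflects is that the "context" for each passage is a generated query rather than a neighboring passage from the same document, which sidesteps the weakly correlated passage pairs the abstract describes.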
Anthology ID:
2023.emnlp-main.118
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1906–1916
URL:
https://aclanthology.org/2023.emnlp-main.118
DOI:
10.18653/v1/2023.emnlp-main.118
Cite (ACL):
Xing Wu, Guangyuan Ma, Wanhui Qian, Zijia Lin, and Songlin Hu. 2023. Query-as-context Pre-training for Dense Passage Retrieval. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 1906–1916, Singapore. Association for Computational Linguistics.
Cite (Informal):
Query-as-context Pre-training for Dense Passage Retrieval (Wu et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.118.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.118.mp4