Multi-stage Training with Improved Negative Contrast for Neural Passage Retrieval

Jing Lu, Gustavo Hernandez Abrego, Ji Ma, Jianmo Ni, Yinfei Yang


Abstract
In the context of neural passage retrieval, we study three promising techniques: synthetic data generation, negative sampling, and fusion. We systematically investigate how these techniques contribute to the performance of the retrieval system and how they complement each other. We propose a multi-stage framework comprising pre-training with synthetic data, fine-tuning with labeled data, and negative sampling at both stages. We study six negative sampling strategies and apply them to the fine-tuning stage and, as a noteworthy novelty, to the synthetic data that we use for pre-training. We also explore fusion methods that combine negatives from different strategies. We evaluate our system on two open-domain QA passage retrieval tasks and on MS MARCO. Our experiments show that augmenting the negative contrast in both stages is effective in improving passage retrieval accuracy and, importantly, that synthetic data generation and negative sampling have additive benefits. Moreover, fusing negatives of different kinds allows us to reach performance that establishes a new state-of-the-art level in two of the tasks we evaluated.
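To make the "improved negative contrast" idea concrete, the sketch below (not from the paper; all function and tensor names are assumptions) shows one common way a dual-encoder retriever can be trained against both in-batch negatives and explicitly sampled hard negatives, the kind of objective the abstract's negative sampling strategies would plug into.

# Minimal sketch, assuming a PyTorch dual-encoder setup; not the authors' code.
import torch
import torch.nn.functional as F

def contrastive_loss(q_emb, pos_emb, hard_neg_emb, temperature=0.05):
    """q_emb: [B, D] query embeddings
    pos_emb: [B, D] positive-passage embeddings
    hard_neg_emb: [B, K, D] hard negatives sampled per query (any strategy)."""
    B = q_emb.size(0)
    # Scores of each query against every positive in the batch:
    # off-diagonal entries act as in-batch negatives.
    in_batch = q_emb @ pos_emb.t()                            # [B, B]
    # Scores against the K hard negatives sampled for each query.
    hard = torch.einsum('bd,bkd->bk', q_emb, hard_neg_emb)    # [B, K]
    logits = torch.cat([in_batch, hard], dim=1) / temperature
    # The correct passage for query i is column i of the in-batch block.
    labels = torch.arange(B, device=q_emb.device)
    return F.cross_entropy(logits, labels)

Under this framing, a multi-stage recipe like the one the abstract describes would apply the same loss first with synthetically generated query-passage pairs (and negatives mined for them) and then with labeled data, while "fusion" corresponds to filling hard_neg_emb with negatives drawn from several sampling strategies at once.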
Anthology ID:
2021.emnlp-main.492
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6091–6103
URL:
https://aclanthology.org/2021.emnlp-main.492
DOI:
10.18653/v1/2021.emnlp-main.492
Cite (ACL):
Jing Lu, Gustavo Hernandez Abrego, Ji Ma, Jianmo Ni, and Yinfei Yang. 2021. Multi-stage Training with Improved Negative Contrast for Neural Passage Retrieval. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 6091–6103, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Multi-stage Training with Improved Negative Contrast for Neural Passage Retrieval (Lu et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2021.emnlp-main.492.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/2021.emnlp-main.492.mp4
Data
MS MARCO, Natural Questions