Abstract
Prompt-based learning’s efficacy across numerous natural language processing tasks has led to its integration into dense passage retrieval. Prior research has mainly focused on enhancing the semantic understanding of pre-trained language models by optimizing a single vector as a continuous prompt. This approach, however, leads to semantic space collapse: identical semantic information seeps into all representations, causing their distributions to converge in a restricted region. This hinders differentiation between relevant and irrelevant passages during dense retrieval. To tackle this issue, we present Topic-DPR, a dense passage retrieval model that uses topic-based prompts. Unlike the single-prompt method, multiple topic-based prompts are established over a probabilistic simplex and optimized simultaneously through contrastive learning. This encourages representations to align with their topic distributions, improving the uniformity of the representation space. Furthermore, we introduce a novel positive and negative sampling strategy, leveraging semi-structured data to boost dense retrieval efficiency. Experimental results on two datasets confirm that our method surpasses previous state-of-the-art retrieval techniques.
- Anthology ID:
- 2023.findings-emnlp.480
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 7216–7225
- URL:
- https://aclanthology.org/2023.findings-emnlp.480
- DOI:
- 10.18653/v1/2023.findings-emnlp.480
- Cite (ACL):
- Qingfa Xiao, Shuangyin Li, and Lei Chen. 2023. Topic-DPR: Topic-based Prompts for Dense Passage Retrieval. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7216–7225, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Topic-DPR: Topic-based Prompts for Dense Passage Retrieval (Xiao et al., Findings 2023)
- PDF:
- https://aclanthology.org/2023.findings-emnlp.480.pdf
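
To make the abstract's core idea concrete, below is a minimal PyTorch sketch of topic-based prompting with in-batch contrastive learning: K learnable topic prompt vectors, a per-input topic distribution on the probabilistic simplex, and an InfoNCE-style loss. Everything here is an illustrative assumption (the toy encoder, additive prompt fusion, and all names such as `TopicPromptEncoder` are hypothetical), not the authors' implementation.

```python
# Minimal sketch of topic-based prompts + contrastive learning.
# Assumptions: a toy linear "backbone" stands in for the pre-trained LM,
# and prompts are fused additively; the paper's actual design may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicPromptEncoder(nn.Module):
    def __init__(self, hidden=128, num_topics=8):
        super().__init__()
        # K topic prompt vectors, optimized jointly (one vector per topic).
        self.topic_prompts = nn.Parameter(torch.randn(num_topics, hidden) * 0.02)
        # Maps a representation to topic logits; softmax places the
        # resulting distribution on the probabilistic simplex.
        self.topic_head = nn.Linear(hidden, num_topics)
        # Stand-in for a pre-trained language model encoder.
        self.backbone = nn.Linear(hidden, hidden)

    def forward(self, feats):
        h = torch.tanh(self.backbone(feats))          # (B, H) base representation
        theta = F.softmax(self.topic_head(h), dim=-1) # (B, K) point on the simplex
        prompt = theta @ self.topic_prompts           # (B, H) topic-mixed prompt
        # Condition the representation on its topic prompt (additive fusion).
        return F.normalize(h + prompt, dim=-1), theta

def contrastive_loss(q, p, temperature=0.05):
    # In-batch negatives: each query's positive is the same-index passage;
    # every other passage in the batch serves as a negative.
    logits = q @ p.t() / temperature                  # (B, B) similarity matrix
    targets = torch.arange(q.size(0))
    return F.cross_entropy(logits, targets)

if __name__ == "__main__":
    torch.manual_seed(0)
    enc = TopicPromptEncoder()
    queries = torch.randn(4, 128)   # toy query features
    passages = torch.randn(4, 128)  # toy positive-passage features
    q_repr, _ = enc(queries)
    p_repr, _ = enc(passages)
    loss = contrastive_loss(q_repr, p_repr)
    loss.backward()                 # all topic prompts receive gradients jointly
    print(float(loss))
```

Because each representation is conditioned on a mixture of topic prompts rather than one shared prompt vector, inputs with different topic distributions are pushed toward different regions of the space, which is the intuition behind the uniformity claim in the abstract.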