Investigating Multi-layer Representations for Dense Passage Retrieval

Zhongbin Xie, Thomas Lukasiewicz


Abstract
Dense retrieval models usually represent a document with vectors from the last hidden layer of the document encoder, even though representations in different layers of a pre-trained language model are known to capture different kinds of linguistic knowledge and to behave differently during fine-tuning. We therefore propose to investigate building the representation of a document from multiple encoder layers, which we denote Multi-layer Representations (MLR). We first investigate how representations in different layers affect MLR’s performance under the multi-vector retrieval setting, and then propose pooling strategies that reduce multi-vector models to single-vector ones to improve retrieval efficiency. Experiments demonstrate the effectiveness of MLR over the dual encoder, ME-BERT, and ColBERT in the single-vector retrieval setting, and show that it works well with other advanced training techniques such as retrieval-oriented pre-training and hard negative mining.
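To make the idea concrete, the sketch below shows one way to build a single-vector document representation from multiple encoder layers and score it against a query by dot product. It is a minimal illustration, not the authors' exact method: the choice of bert-base-uncased, the layer subset (4, 8, 12), the use of the [CLS] token, and mean pooling across layers are all assumptions made for this example.

```python
# Minimal sketch of multi-layer representations (MLR) with pooling.
# Assumptions (not from the paper): BERT-style encoder, [CLS] vectors,
# layers {4, 8, 12}, and mean pooling across layers.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumption: any BERT-style encoder works
LAYERS = (4, 8, 12)               # hypothetical subset of encoder layers

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
model.eval()

def encode(text: str) -> torch.Tensor:
    """Pool [CLS] vectors from several encoder layers into one vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    # hidden_states[0] is the embedding layer; layer i is hidden_states[i].
    cls_per_layer = [outputs.hidden_states[i][:, 0] for i in LAYERS]
    # Mean pooling reduces the multi-vector representation to a single vector.
    return torch.stack(cls_per_layer, dim=0).mean(dim=0).squeeze(0)

doc_vec = encode("Dense retrieval represents documents as vectors.")
query_vec = encode("What is dense passage retrieval?")
score = torch.dot(query_vec, doc_vec)  # dot-product relevance score
print(score.item())
```

Keeping one pooled vector per document (rather than one per layer) is what preserves the efficiency of standard single-vector retrieval while still drawing on multiple layers.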
Anthology ID:
2025.findings-emnlp.1333
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
24522–24536
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1333/
DOI:
10.18653/v1/2025.findings-emnlp.1333
Cite (ACL):
Zhongbin Xie and Thomas Lukasiewicz. 2025. Investigating Multi-layer Representations for Dense Passage Retrieval. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 24522–24536, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Investigating Multi-layer Representations for Dense Passage Retrieval (Xie & Lukasiewicz, Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1333.pdf
Checklist:
2025.findings-emnlp.1333.checklist.pdf