CoRT: Complementary Rankings from Transformers

Marco Wrzalik, Dirk Krechel


Abstract
Many recent approaches to neural information retrieval mitigate their computational cost with a multi-stage ranking pipeline: in the first stage, a set of potentially relevant candidates is retrieved using an efficient retrieval model such as BM25. Although BM25 performs decently as a first-stage ranker, it tends to miss relevant passages. In this context we propose CoRT, a simple neural first-stage ranking model that leverages contextual representations from pretrained language models such as BERT to complement term-based ranking functions while causing no significant delay at query time. Using the MS MARCO dataset, we show that CoRT significantly increases candidate recall by complementing BM25 with missing candidates. Consequently, we find that subsequent re-rankers achieve superior results with fewer candidates. We further demonstrate that passage retrieval with CoRT can be realized with surprisingly low latencies.
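The "complementing" step described in the abstract can be pictured as merging the BM25 candidate list with the neural candidate list before re-ranking. The sketch below uses a simple alternating interleave with de-duplication; the function name and the exact merging policy are illustrative assumptions, not the paper's implementation:

```python
from itertools import zip_longest

def complement_rankings(bm25_ranking, cort_ranking, k=10):
    """Illustrative merge of two ranked candidate lists (hypothetical helper).

    Alternately takes the next unseen passage id from each ranking
    until k unique candidates are collected.
    """
    merged, seen = [], set()
    for a, b in zip_longest(bm25_ranking, cort_ranking):
        for pid in (a, b):
            if pid is not None and pid not in seen:
                seen.add(pid)
                merged.append(pid)
                if len(merged) == k:
                    return merged
    return merged

# Example: a passage missed by BM25 ("p9") enters the candidate set early.
print(complement_rankings(["p1", "p2", "p3"], ["p9", "p2", "p7"], k=4))
# → ['p1', 'p9', 'p2', 'p3']
```

Because the merged list contains the union of both rankings' top results, a downstream re-ranker can reach the same recall with a smaller candidate budget, which matches the abstract's observation.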
Anthology ID:
2021.naacl-main.331
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4194–4204
URL:
https://aclanthology.org/2021.naacl-main.331
DOI:
10.18653/v1/2021.naacl-main.331
Cite (ACL):
Marco Wrzalik and Dirk Krechel. 2021. CoRT: Complementary Rankings from Transformers. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4194–4204, Online. Association for Computational Linguistics.
Cite (Informal):
CoRT: Complementary Rankings from Transformers (Wrzalik & Krechel, NAACL 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.naacl-main.331.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2021.naacl-main.331.mp4
Code:
lavis-nlp/CoRT
Data:
MS MARCO