BERT meets Cranfield: Uncovering the Properties of Full Ranking on Fully Labeled Data

Negin Ghasemi, Djoerd Hiemstra


Abstract
Recently, various information retrieval models based on pre-trained BERT models have been proposed, achieving outstanding performance. The majority of such models have been evaluated on collections with partial relevance labels, where many potentially relevant documents were never shown to the annotators. Evaluating BERT-based rankers on such collections may therefore yield biased and unfair results, simply because a relevant document was not exposed to the annotators while the collection was created. In this work, we aim to better understand the strengths of a BERT-based full ranker compared to a BERT-based re-ranker and the initial ranker. To this end, we investigate the performance of BERT-based rankers on the Cranfield collection, which comes with full relevance judgments for all documents in the collection. Our results demonstrate the effectiveness of the BERT-based full ranker over both the BERT-based re-ranker and BM25. Moreover, our analysis shows that the BERT-based full ranker finds relevant documents that the initial ranker misses.
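To make the contrast in the abstract concrete, the sketch below illustrates the difference between re-ranking (BERT scores only the initial ranker's top-k candidates) and full ranking (BERT scores every document in the collection). It is a minimal illustration, not the paper's exact setup: it assumes the rank_bm25 and sentence-transformers packages, and the toy corpus and the cross-encoder/ms-marco-MiniLM-L-6-v2 model are illustrative placeholders.

```python
# Minimal sketch of re-ranking vs. full ranking with an initial BM25 ranker.
# Assumes rank_bm25 and sentence-transformers are installed; the corpus and
# model name are illustrative, not the paper's exact configuration.
from rank_bm25 import BM25Okapi
from sentence_transformers import CrossEncoder

corpus = [
    "experimental investigation of the aerodynamics of a wing",
    "boundary layer control on swept wings",
    "heat transfer in turbulent pipe flow",
]
query = "aerodynamics of swept wings"

# Initial ranker: BM25 over whitespace-tokenized documents.
bm25 = BM25Okapi([doc.split() for doc in corpus])
bm25_scores = bm25.get_scores(query.split())

# A BERT cross-encoder scores each (query, document) pair directly.
encoder = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

# Re-ranker: apply BERT only to BM25's top-k candidates (here k=2).
top_k = sorted(range(len(corpus)), key=lambda i: -bm25_scores[i])[:2]
rerank_scores = encoder.predict([(query, corpus[i]) for i in top_k])

# Full ranker: apply BERT to every document in the collection, so relevant
# documents that BM25 misses can still surface in the ranking.
full_scores = encoder.predict([(query, doc) for doc in corpus])

print("BM25 top-k indices:", top_k)
print("Re-ranked candidate scores:", dict(zip(top_k, rerank_scores)))
print("Full-ranking scores:", list(enumerate(full_scores)))
```

The trade-off shown here is the one the paper probes: the re-ranker is bounded by the initial ranker's recall, while the full ranker pays the cost of scoring the whole collection in exchange for the chance to retrieve documents the initial ranker never surfaced.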
Anthology ID:
2021.eacl-srw.9
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
April
Year:
2021
Address:
Online
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
58–64
URL:
https://aclanthology.org/2021.eacl-srw.9
DOI:
10.18653/v1/2021.eacl-srw.9
Cite (ACL):
Negin Ghasemi and Djoerd Hiemstra. 2021. BERT meets Cranfield: Uncovering the Properties of Full Ranking on Fully Labeled Data. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 58–64, Online. Association for Computational Linguistics.
Cite (Informal):
BERT meets Cranfield: Uncovering the Properties of Full Ranking on Fully Labeled Data (Ghasemi & Hiemstra, EACL 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.eacl-srw.9.pdf
Data
MS MARCO