Quantified Reproducibility Assessment of NLP Results

Anya Belz, Maja Popović, Simon Mille


Abstract
This paper describes and tests a method for carrying out quantified reproducibility assessment (QRA) that is based on concepts and definitions from metrology. QRA produces a single score estimating the degree of reproducibility of a given system and evaluation measure, on the basis of the scores from, and differences between, different reproductions. We test QRA on 18 different system and evaluation measure combinations (involving diverse NLP tasks and types of evaluation), for each of which we have the original results and one to seven reproduction results. The proposed QRA method produces degree-of-reproducibility scores that are comparable across multiple reproductions not only of the same, but also of different, original studies. We find that the proposed method facilitates insights into causes of variation between reproductions, and as a result, allows conclusions to be drawn about what aspects of system and/or evaluation design need to be changed in order to improve reproducibility.
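The abstract describes the degree-of-reproducibility score only at a high level: a single value computed from the original score and the scores of its reproductions. As a minimal illustrative sketch (not the paper's verbatim formula), the Python snippet below computes a coefficient-of-variation-style precision score over such a set of scores; the use of an unbiased standard deviation, the small-sample correction factor, and the example numbers are assumptions made here for illustration.

```python
from statistics import mean
import math

def reproducibility_cv(scores: list[float]) -> float:
    """CV-style precision score (%) over an original score plus its reproductions.

    Lower values mean the measurements agree more closely, i.e. the result is
    more reproducible for that system and evaluation measure.
    """
    n = len(scores)
    if n < 2:
        raise ValueError("Need the original score and at least one reproduction.")
    m = mean(scores)
    if m == 0:
        raise ValueError("Coefficient of variation is undefined for a zero mean.")
    # Unbiased sample standard deviation (divides by n - 1).
    sd = math.sqrt(sum((x - m) ** 2 for x in scores) / (n - 1))
    # Small-sample correction (1 + 1/(4n)); whether the paper applies exactly
    # this correction is an assumption, not taken from the abstract.
    return 100.0 * (1 + 1.0 / (4 * n)) * sd / abs(m)

# Hypothetical example: an original metric score and three reproduction scores.
print(round(reproducibility_cv([27.3, 26.9, 27.8, 26.5]), 2))
```

As the abstract notes, such a score would be reported per system and evaluation measure combination, so that values are comparable across reproductions of the same and of different original studies.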
Anthology ID: 2022.acl-long.2
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 16–28
URL: https://aclanthology.org/2022.acl-long.2
DOI: 10.18653/v1/2022.acl-long.2
Cite (ACL): Anya Belz, Maja Popović, and Simon Mille. 2022. Quantified Reproducibility Assessment of NLP Results. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 16–28, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Quantified Reproducibility Assessment of NLP Results (Belz et al., ACL 2022)
PDF: https://preview.aclanthology.org/naacl24-info/2022.acl-long.2.pdf
Video: https://preview.aclanthology.org/naacl24-info/2022.acl-long.2.mp4