NLP Reproducibility For All: Understanding Experiences of Beginners

Shane Storks, Keunwoo Yu, Ziqiao Ma, Joyce Chai


Abstract
As natural language processing (NLP) has recently seen an unprecedented level of excitement, and more people are eager to enter the field, it is unclear whether current research reproducibility efforts are sufficient for this group of beginners to apply the latest developments. To understand their needs, we conducted a study with 93 students in an introductory NLP course, where students reproduced the results of recent NLP papers. Surprisingly, we find that their programming skill and comprehension of research papers have a limited impact on their effort spent completing the exercise. Instead, we find accessibility efforts by research authors to be the key to success, including complete documentation, better coding practice, and easier access to data files. Going forward, we recommend that NLP researchers pay close attention to these simple aspects of open-sourcing their work, and we draw on beginners' feedback to provide actionable ideas for better supporting them.
Anthology ID:
2023.acl-long.568
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
10199–10219
URL:
https://aclanthology.org/2023.acl-long.568
DOI:
10.18653/v1/2023.acl-long.568
Cite (ACL):
Shane Storks, Keunwoo Yu, Ziqiao Ma, and Joyce Chai. 2023. NLP Reproducibility For All: Understanding Experiences of Beginners. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10199–10219, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
NLP Reproducibility For All: Understanding Experiences of Beginners (Storks et al., ACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.568.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.568.mp4