Annotation Curricula to Implicitly Train Non-Expert Annotators

Ji-Ung Lee, Jan-Christoph Klie, Iryna Gurevych


Abstract
Annotation studies often require annotators to familiarize themselves with the task, its annotation scheme, and the data domain. This can be overwhelming at the start, mentally taxing, and can introduce errors into the resulting annotations, especially in citizen science or crowdsourcing scenarios where domain expertise is not required. To alleviate these issues, this work proposes annotation curricula, a novel approach to implicitly train annotators. The goal is to gradually introduce annotators to the task by ordering the instances to be annotated according to a learning curriculum. To this end, this work formalizes annotation curricula for sentence- and paragraph-level annotation tasks, defines an ordering strategy, and identifies well-performing heuristics and interactively trained models on three existing English datasets. Finally, we provide a proof of concept for annotation curricula in a carefully designed user study with 40 voluntary participants who are asked to identify the most fitting misconception for English tweets about the Covid-19 pandemic. The results indicate that using a simple heuristic to order instances can already significantly reduce total annotation time while preserving high annotation quality. Annotation curricula are thus a promising research direction for improving data collection. To facilitate future research (for instance, adapting annotation curricula to specific tasks and expert annotation scenarios), all code and data from the user study, consisting of 2,400 annotations, are made available.
Anthology ID:
2022.cl-2.4
Volume:
Computational Linguistics, Volume 48, Issue 2 - June 2022
Month:
June
Year:
2022
Address:
Cambridge, MA
Venue:
CL
Publisher:
MIT Press
Pages:
343–373
URL:
https://aclanthology.org/2022.cl-2.4
DOI:
10.1162/coli_a_00436
Cite (ACL):
Ji-Ung Lee, Jan-Christoph Klie, and Iryna Gurevych. 2022. Annotation Curricula to Implicitly Train Non-Expert Annotators. Computational Linguistics, 48(2):343–373.
Cite (Informal):
Annotation Curricula to Implicitly Train Non-Expert Annotators (Lee et al., CL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.cl-2.4.pdf
Code
 ukplab/annotation-curriculum