CoRefi: A Crowd Sourcing Suite for Coreference Annotation

Ari Bornstein, Arie Cattan, Ido Dagan


Abstract
Coreference annotation is an important yet expensive and time-consuming task that often involves expert annotators trained on complex decision guidelines. To enable cheaper and more efficient annotation, we present CoRefi, a web-based coreference annotation suite oriented toward crowdsourcing. Beyond the core coreference annotation tool, CoRefi provides guided onboarding for the task as well as a novel algorithm for a reviewing phase. CoRefi is open source and embeds directly into any website, including popular crowdsourcing platforms.
CoRefi Demo: aka.ms/corefi
Video Tour: aka.ms/corefivideo
Github Repo: https://github.com/aribornstein/corefi
Anthology ID:
2020.emnlp-demos.27
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month:
October
Year:
2020
Address:
Online
Editors:
Qun Liu, David Schlangen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Note:
Pages:
205–215
URL:
https://aclanthology.org/2020.emnlp-demos.27
DOI:
10.18653/v1/2020.emnlp-demos.27
Cite (ACL):
Ari Bornstein, Arie Cattan, and Ido Dagan. 2020. CoRefi: A Crowd Sourcing Suite for Coreference Annotation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 205–215, Online. Association for Computational Linguistics.
Cite (Informal):
CoRefi: A Crowd Sourcing Suite for Coreference Annotation (Bornstein et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/landing_page/2020.emnlp-demos.27.pdf
Code
aribornstein/corefi + additional community code
Data
ECB+