Textinator: an Internationalized Tool for Annotation and Human Evaluation in Natural Language Processing and Generation

Dmytro Kalpakchi, Johan Boye


Abstract
We release an internationalized annotation and human evaluation bundle, called Textinator, along with documentation and video tutorials. Textinator allows annotating data for a wide variety of NLP tasks, and its user interface is offered in multiple languages, lowering the entry threshold for domain experts. The latter is, in fact, quite a rare feature among annotation tools, and it makes it possible to control for unintended biases that might be introduced by hiring only English-speaking annotators. We illustrate the rarity of this feature by presenting a thorough systematic comparison of Textinator to previously published annotation tools along 9 different axes (with internationalization being one of them). To encourage researchers to design their human evaluation before starting to annotate data, Textinator offers an easy-to-use tool for human evaluation that allows importing surveys with potentially hundreds of evaluation items in one click. We finish by presenting several use cases of annotation and evaluation projects conducted using pre-release versions of Textinator. These use cases do not represent Textinator’s full annotation or evaluation capabilities; interested readers are referred to the online documentation for more information.
Anthology ID:
2022.lrec-1.90
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
856–866
URL:
https://aclanthology.org/2022.lrec-1.90
Cite (ACL):
Dmytro Kalpakchi and Johan Boye. 2022. Textinator: an Internationalized Tool for Annotation and Human Evaluation in Natural Language Processing and Generation. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 856–866, Marseille, France. European Language Resources Association.
Cite (Informal):
Textinator: an Internationalized Tool for Annotation and Human Evaluation in Natural Language Processing and Generation (Kalpakchi & Boye, LREC 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2022.lrec-1.90.pdf
Code
 dkalpakchi/textinator