Sensing and Learning Human Annotators Engaged in Narrative Sensemaking

McKenna Tornblad, Luke Lapresi, Christopher Homan, Raymond Ptucha, Cecilia Ovesdotter Alm


Abstract
While labor issues and quality assurance in crowdwork are increasingly studied, how annotators make sense of texts and how they are personally impacted by doing so are not. We study these questions via a narrative-sorting annotation task, where collections of tweets, carefully selected by sequentiality, topic, emotional content, and length, serve as examples of everyday storytelling. As readers process these narratives, we measure their facial expressions, galvanic skin response, and self-reported reactions. From the perspective of annotator well-being, a reassuring outcome was that the sorting task did not cause a measurable stress response; however, readers did react to humor. In terms of sensemaking, readers were more confident when sorting sequential, target-topical, and highly emotional tweets. As crowdsourcing becomes more common, this research sheds light on the perceptive capabilities and emotional impact of human readers.
Anthology ID:
N18-4019
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
June
Year:
2018
Address:
New Orleans, Louisiana, USA
Editors:
Silvio Ricardo Cordeiro, Shereen Oraby, Umashanthi Pavalanathan, Kyeongmin Rim
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
136–143
URL:
https://aclanthology.org/N18-4019
DOI:
10.18653/v1/N18-4019
Cite (ACL):
McKenna Tornblad, Luke Lapresi, Christopher Homan, Raymond Ptucha, and Cecilia Ovesdotter Alm. 2018. Sensing and Learning Human Annotators Engaged in Narrative Sensemaking. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 136–143, New Orleans, Louisiana, USA. Association for Computational Linguistics.
Cite (Informal):
Sensing and Learning Human Annotators Engaged in Narrative Sensemaking (Tornblad et al., NAACL 2018)
PDF:
https://preview.aclanthology.org/ingest-bitext-workshop/N18-4019.pdf