Another Approach to Agreement Measurement and Prediction with Emotion Annotations

Quanqi Du, Veronique Hoste


Abstract
Emotion annotation, as an inherently subjective task, often suffers from significant inter-annotator disagreement when evaluated with traditional metrics such as kappa or alpha. These metrics frequently fall short of capturing the nuanced nature of disagreement, especially in multimodal settings. This study introduces Absolute Annotation Difference (AAD), a novel metric offering a complementary perspective on inter- and intra-annotator agreement across different modalities. Our analysis reveals that AAD not only identifies overall agreement levels but also uncovers fine-grained disagreement patterns across modalities that conventional metrics often overlook. Furthermore, we propose an AAD-based RMSE variant for predicting annotation disagreement. Through extensive experiments on the large-scale DynaSent corpus, we demonstrate that our approach significantly improves disagreement prediction accuracy, raising it from 41.71% to 51.64% and outperforming existing methods. Cross-dataset prediction results suggest good generalization. These findings underscore AAD's potential to enhance annotation agreement analysis and provide deeper insights into subjective NLP tasks. Future work will investigate its applicability to broader emotion-related tasks and other subjective annotation scenarios.
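
Note: the abstract does not reproduce the formal definitions of AAD or of the AAD-based RMSE variant (these appear in the full paper, pp. 87–102). The sketch below is only one plausible reading, assuming AAD is the mean absolute difference between two annotators' numeric emotion ratings on a shared scale, and that the RMSE variant scores predicted per-item disagreement against the observed absolute differences. Function names and example values are illustrative, not the authors' own.

# Minimal sketch under the assumptions stated above; not the authors' exact formulation.
import numpy as np

def absolute_annotation_difference(ratings_a, ratings_b):
    """Mean absolute difference between two annotators' item-level ratings."""
    a = np.asarray(ratings_a, dtype=float)
    b = np.asarray(ratings_b, dtype=float)
    return float(np.mean(np.abs(a - b)))

def aad_rmse(predicted_disagreement, observed_abs_diff):
    """RMSE between predicted per-item disagreement and observed |a_i - b_i|."""
    pred = np.asarray(predicted_disagreement, dtype=float)
    obs = np.asarray(observed_abs_diff, dtype=float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Illustrative example: valence ratings on a 1-5 scale for five items.
ann_a = np.array([3, 4, 2, 5, 1])
ann_b = np.array([3, 5, 1, 4, 2])
print(absolute_annotation_difference(ann_a, ann_b))          # 0.8
print(aad_rmse([0.5, 1.0, 1.0, 1.0, 1.0], np.abs(ann_a - ann_b)))
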
Anthology ID:
2025.law-1.7
Volume:
Proceedings of the 19th Linguistic Annotation Workshop (LAW-XIX-2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Siyao Peng, Ines Rehbein
Venues:
LAW | WS
Publisher:
Association for Computational Linguistics
Pages:
87–102
URL:
https://preview.aclanthology.org/landing_page/2025.law-1.7/
DOI:
10.18653/v1/2025.law-1.7
Cite (ACL):
Quanqi Du and Veronique Hoste. 2025. Another Approach to Agreement Measurement and Prediction with Emotion Annotations. In Proceedings of the 19th Linguistic Annotation Workshop (LAW-XIX-2025), pages 87–102, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Another Approach to Agreement Measurement and Prediction with Emotion Annotations (Du & Hoste, LAW 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.law-1.7.pdf