twinhter at LeWiDi-2025: Integrating Annotator Perspectives into BERT for Learning with Disagreements

Nguyen Huu Dang Nguyen, Dang Van Thin


Abstract
Annotator-provided information during labeling can reflect differences in how texts are understood and interpreted, though such variation may also arise from inconsistencies or errors. To make use of this information, we build a BERT-based model that integrates annotator perspectives and evaluate it on four datasets from the third edition of the Learning With Disagreements (LeWiDi) shared task. For each original data point, we create a new (text, annotator) pair, optionally modifying the text to reflect the annotator’s perspective when additional information is available. The text and annotator features are embedded separately and concatenated before classification, enabling the model to capture individual interpretations of the same input. Our model achieves first place on both tasks for the Par and VariErrNLI datasets. More broadly, it performs very well on datasets where annotators provide rich information and the number of annotators is relatively small, while still maintaining competitive results on datasets with limited annotator information and a larger annotator pool.
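The abstract describes the core architecture: each (text, annotator) pair is encoded by embedding the text and the annotator separately and concatenating the two vectors before classification. The following is a minimal sketch of that idea, not the authors' code; it assumes a PyTorch / Hugging Face setup, and the model name, embedding sizes, annotator count, label count, and use of the [CLS] vector are illustrative choices rather than details taken from the paper.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TextAnnotatorClassifier(nn.Module):
    # Hypothetical defaults; the paper does not specify these values.
    def __init__(self, model_name="bert-base-uncased",
                 num_annotators=50, annotator_dim=64, num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        # Learned per-annotator embedding, one row per annotator ID.
        self.annotator_emb = nn.Embedding(num_annotators, annotator_dim)
        hidden = self.encoder.config.hidden_size
        # Classifier over the concatenated text + annotator features.
        self.classifier = nn.Linear(hidden + annotator_dim, num_labels)

    def forward(self, input_ids, attention_mask, annotator_ids):
        # Encode the (possibly annotator-augmented) text with BERT.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        text_vec = out.last_hidden_state[:, 0]       # [CLS] representation
        ann_vec = self.annotator_emb(annotator_ids)  # annotator embedding
        # Concatenate the two views and classify.
        return self.classifier(torch.cat([text_vec, ann_vec], dim=-1))

# Usage with a single (text, annotator) pair.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TextAnnotatorClassifier()
enc = tokenizer("An example sentence.", return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"],
               annotator_ids=torch.tensor([3]))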
Anthology ID:
2025.nlperspectives-1.22
Volume:
Proceedings of the 4th Workshop on Perspectivist Approaches to NLP
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Gavin Abercrombie, Valerio Basile, Simona Frenda, Sara Tonelli, Shiran Dudy
Venues:
NLPerspectives | WS
Publisher:
Association for Computational Linguistics
Pages:
249–255
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.nlperspectives-1.22/
Cite (ACL):
Nguyen Huu Dang Nguyen and Dang Van Thin. 2025. twinhter at LeWiDi-2025: Integrating Annotator Perspectives into BERT for Learning with Disagreements. In Proceedings of the 4th Workshop on Perspectivist Approaches to NLP, pages 249–255, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
twinhter at LeWiDi-2025: Integrating Annotator Perspectives into BERT for Learning with Disagreements (Nguyen & Thin, NLPerspectives 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.nlperspectives-1.22.pdf