BERT, are you paying attention? Attention regularization with human-annotated rationales

Elize Herrewijnen, Dong Nguyen, Floris Bex, Albert Gatt


Abstract
Attention regularisation aims to supervise the attention patterns in language models like BERT. Various studies have shown that using human-annotated rationales, in the form of highlights that explain why a text has a specific label, can have positive effects on model generalisability. In this work, we ask to what extent attention regularisation with human-annotated rationales improves model performance and robustness, and reduces susceptibility to spurious correlations. We compare regularisation on human rationales with regularisation on randomly selected tokens, a baseline that has hitherto remained unexplored. Our results suggest that attention regularisation with randomly selected tokens often yields improvements similar to those obtained with human-annotated rationales. Nevertheless, we find that human-annotated rationales surpass randomly selected tokens when it comes to reducing model sensitivity to strong spurious correlations.
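To make the idea concrete, below is a minimal sketch of an attention-regularisation term of the kind the abstract describes: the attention a BERT-style classifier places on input tokens (here, the [CLS] row of one layer's attention) is pushed towards the human-annotated rationale highlights. This is a generic KL-divergence formulation for illustration only; the function name, the choice of the [CLS] row, and the `lambda_reg` weight are assumptions, not the paper's exact method.

```python
import torch
import torch.nn.functional as F


def attention_regularisation_loss(attentions, rationale_mask, attention_mask):
    """Generic attention-regularisation term (illustrative, not the paper's exact loss).

    attentions:     (batch, num_heads, seq_len, seq_len) attention weights from one layer.
    rationale_mask: (batch, seq_len) binary human highlights (1 = rationale token).
    attention_mask: (batch, seq_len) 1 for real tokens, 0 for padding.
    """
    # Attention from the [CLS] position (index 0) to all tokens, averaged over heads.
    cls_attention = attentions[:, :, 0, :].mean(dim=1)                      # (batch, seq_len)

    # Zero out padding positions and renormalise to a probability distribution.
    cls_attention = cls_attention * attention_mask
    cls_attention = cls_attention / cls_attention.sum(dim=-1, keepdim=True).clamp_min(1e-12)

    # Turn the binary rationale highlights into a target distribution over tokens.
    target = rationale_mask.float() * attention_mask
    target = target / target.sum(dim=-1, keepdim=True).clamp_min(1e-12)

    # KL(target || model attention); kl_div expects log-probabilities as input.
    return F.kl_div(cls_attention.clamp_min(1e-12).log(), target, reduction="batchmean")


if __name__ == "__main__":
    # Toy example with random attention weights and random rationale highlights.
    batch, heads, seq = 2, 12, 8
    attentions = torch.softmax(torch.randn(batch, heads, seq, seq), dim=-1)
    rationales = torch.randint(0, 2, (batch, seq))
    attn_mask = torch.ones(batch, seq)

    reg = attention_regularisation_loss(attentions, rationales, attn_mask)
    # In training, this would be added to the task loss, e.g.
    # total_loss = task_loss + lambda_reg * reg   (lambda_reg is a hypothetical tuning knob)
    print(reg.item())
```

Replacing `rationale_mask` with a randomly sampled binary mask of the same sparsity gives the random-token baseline the abstract compares against.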
Anthology ID:
2026.eacl-long.31
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
720–751
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.31/
Cite (ACL):
Elize Herrewijnen, Dong Nguyen, Floris Bex, and Albert Gatt. 2026. BERT, are you paying attention? Attention regularization with human-annotated rationales. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 720–751, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
BERT, are you paying attention? Attention regularization with human-annotated rationales (Herrewijnen et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.31.pdf