Relation Classification with Cognitive Attention Supervision

Erik McGuire, Noriko Tomuro


Abstract
Many current language models, such as BERT, use attention mechanisms to transform sequence representations. We ask whether BERT's attention can be influenced by human reading patterns, using eye-tracking and brain imaging data. We fine-tune BERT for relation extraction with an auxiliary attention-supervision objective in which BERT's attention weights are supervised by cognitive data. Across a variety of metrics, we find that this supervision increases the similarity between the model's attention distributions over sequences and the cognitive data without significantly affecting classification performance, while producing errors distinct from the baseline's. In particular, models with cognitive attention supervision more often correctly classified samples that the baseline misclassified.
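The auxiliary attention supervision described in the abstract can be illustrated with a minimal sketch: normalize a human cognitive signal (e.g., per-token fixation durations) into a target distribution, and add a divergence term that pulls the model's attention distribution toward it. The function names, the KL-divergence choice, and the weighting factor `lam` below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def softmax(logits):
    """Convert raw attention logits into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q, eps=1e-9):
    """KL(p || q): divergence of model attention q from the human target p."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def attention_supervision_loss(attn_logits, fixation_durations, task_loss, lam=0.5):
    """Combine the relation-classification task loss with an auxiliary term
    that supervises attention with cognitive data.

    attn_logits        -- model attention logits over the token sequence
    fixation_durations -- per-token eye-tracking signal (arbitrary units)
    task_loss          -- classification loss already computed elsewhere
    lam                -- weight of the auxiliary term (hypothetical value)
    """
    total = sum(fixation_durations)
    target = [d / total for d in fixation_durations]  # normalized human signal
    attn = softmax(attn_logits)                       # model attention distribution
    return task_loss + lam * kl_div(target, attn)
```

When the model's attention already matches the normalized fixation pattern, the auxiliary term vanishes and only the task loss remains; mismatched attention is penalized in proportion to the divergence.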
Anthology ID:
2021.cmcl-1.26
Volume:
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month:
June
Year:
2021
Address:
Online
Editors:
Emmanuele Chersoni, Nora Hollenstein, Cassandra Jacobs, Yohei Oseki, Laurent Prévot, Enrico Santus
Venue:
CMCL
Publisher:
Association for Computational Linguistics
Pages:
222–232
URL:
https://aclanthology.org/2021.cmcl-1.26
DOI:
10.18653/v1/2021.cmcl-1.26
Cite (ACL):
Erik McGuire and Noriko Tomuro. 2021. Relation Classification with Cognitive Attention Supervision. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 222–232, Online. Association for Computational Linguistics.
Cite (Informal):
Relation Classification with Cognitive Attention Supervision (McGuire & Tomuro, CMCL 2021)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2021.cmcl-1.26.pdf
Optional supplementary material:
2021.cmcl-1.26.OptionalSupplementaryMaterial.zip