Telling BERT’s Full Story: from Local Attention to Global Aggregation

Damian Pascual, Gino Brunner, Roger Wattenhofer


Abstract
We take a deep look into the behaviour of self-attention heads in the transformer architecture. In light of recent work discouraging the use of attention distributions for explaining a model’s behaviour, we show that attention distributions can nevertheless provide insights into the local behaviour of attention heads. On this basis, we propose a distinction between local patterns revealed by attention and global patterns that refer back to the input, and analyze BERT from both angles. We use gradient attribution to analyze how the output of an attention head depends on the input tokens, effectively extending the local attention-based analysis to account for the mixing of information throughout the transformer layers. We find that there is a significant mismatch between attention and attribution distributions, caused by the mixing of context inside the model. We quantify this discrepancy and observe that, interestingly, some patterns persist across all layers despite the mixing.
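The gradient-attribution analysis described in the abstract can be approximated with off-the-shelf tooling. Below is a minimal sketch (not the authors' code) using HuggingFace transformers: it back-propagates the norm of one attention head's output at a target position to the input embeddings and reports a per-token gradient norm as the attribution score. The model name, the choice of layer, head, and target position, and the gradient-norm aggregation are illustrative assumptions, not details taken from the paper.

```python
from transformers import BertModel, BertTokenizer

LAYER, HEAD, TARGET = 4, 7, 3  # hypothetical layer/head/target position, chosen for illustration

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("The quick brown fox jumps over the lazy dog.", return_tensors="pt")

# Embed the tokens ourselves so gradients can flow back to the input embeddings.
embeds = model.embeddings.word_embeddings(inputs["input_ids"]).detach().requires_grad_(True)

# Capture the concatenated per-head context vectors of the chosen layer via a forward hook.
captured = {}
def save_context(module, inp, out):
    # (batch, seq_len, hidden): head outputs concatenated, before the output projection
    captured["ctx"] = out[0]

handle = model.encoder.layer[LAYER].attention.self.register_forward_hook(save_context)
model(inputs_embeds=embeds, attention_mask=inputs["attention_mask"])
handle.remove()

# Slice out the chosen head's output at the target position.
head_size = model.config.hidden_size // model.config.num_attention_heads
head_out = captured["ctx"][0, TARGET, HEAD * head_size:(HEAD + 1) * head_size]

# Back-propagate its norm and aggregate the gradient per input token (L2 norm over the embedding dim).
head_out.norm().backward()
scores = embeds.grad[0].norm(dim=-1)

for token, score in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), scores.tolist()):
    print(f"{token:>12s}  {score:.4f}")
```

Comparing such per-token attribution scores with the same head's attention weights (available via output_attentions=True) is one way to probe the attention–attribution mismatch the paper reports.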
Anthology ID:
2021.eacl-main.9
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
105–124
URL:
https://aclanthology.org/2021.eacl-main.9
DOI:
10.18653/v1/2021.eacl-main.9
Cite (ACL):
Damian Pascual, Gino Brunner, and Roger Wattenhofer. 2021. Telling BERT’s Full Story: from Local Attention to Global Aggregation. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 105–124, Online. Association for Computational Linguistics.
Cite (Informal):
Telling BERT’s Full Story: from Local Attention to Global Aggregation (Pascual et al., EACL 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.eacl-main.9.pdf