Have Attention Heads in BERT Learned Constituency Grammar?

Ziyang Luo


Abstract
With the success of pre-trained language models in recent years, more and more researchers have focused on opening the “black box” of these models. Following this interest, we carry out a qualitative and quantitative analysis of constituency grammar in the attention heads of BERT and RoBERTa. We employ the syntactic distance method to extract implicit constituency grammar from the attention weights of each head. Our results show that some heads induce certain grammar types much better than baselines, suggesting that these heads act as a proxy for constituency grammar. We also analyze how the constituency grammar inducing (CGI) ability of attention heads changes after fine-tuning on two kinds of tasks: sentence meaning similarity (SMS) tasks and natural language inference (NLI) tasks. Our results suggest that SMS tasks decrease the average CGI ability of the upper layers, while NLI tasks increase it. Lastly, we investigate the connection between CGI ability and natural language understanding ability on the QQP and MNLI tasks.
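The syntactic distance method mentioned in the abstract converts a head's attention matrix into distances between adjacent words and then builds a binary constituency tree by splitting the sentence greedily at the largest distance. The sketch below illustrates that general idea only; the Jensen–Shannon distance between adjacent attention rows and the top-down greedy split are assumptions for illustration, not the paper's exact procedure.

```python
# Illustrative sketch: induce a binary constituency tree from one attention head.
# Assumed choices (not taken from the paper): Jensen-Shannon distance between
# adjacent attention rows, and a greedy top-down split at the largest distance.
import numpy as np
from scipy.spatial.distance import jensenshannon

def syntactic_distances(attn):
    """attn: (seq_len, seq_len) attention weights of a single head.
    Returns d[i], a distance between adjacent words i and i+1."""
    return np.array([jensenshannon(attn[i], attn[i + 1])
                     for i in range(len(attn) - 1)])

def build_tree(words, dists):
    """Greedy top-down induction: split the span where the distance is largest."""
    if len(words) <= 1:
        return words[0] if words else None
    k = int(np.argmax(dists))  # split between words[k] and words[k+1]
    left = build_tree(words[:k + 1], dists[:k])
    right = build_tree(words[k + 1:], dists[k + 1:])
    return (left, right)

# Toy example for the sentence "the cat sat" with a made-up attention matrix:
attn = np.array([[0.6, 0.3, 0.1],
                 [0.3, 0.5, 0.2],
                 [0.1, 0.2, 0.7]])
print(build_tree(["the", "cat", "sat"], syntactic_distances(attn)))
# e.g. (('the', 'cat'), 'sat')
```

A head whose induced trees match gold constituency brackets well for a given phrase type would, under this view, be said to have high CGI ability for that type.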
Anthology ID:
2021.eacl-srw.2
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
April
Year:
2021
Address:
Online
Editors:
Ionut-Teodor Sorodoc, Madhumita Sushil, Ece Takmaz, Eneko Agirre
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
8–15
URL:
https://aclanthology.org/2021.eacl-srw.2
DOI:
10.18653/v1/2021.eacl-srw.2
Cite (ACL):
Ziyang Luo. 2021. Have Attention Heads in BERT Learned Constituency Grammar?. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 8–15, Online. Association for Computational Linguistics.
Cite (Informal):
Have Attention Heads in BERT Learned Constituency Grammar? (Luo, EACL 2021)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2021.eacl-srw.2.pdf
Data
GLUE, MultiNLI, QNLI