What Does BERT Learn about the Structure of Language?

Ganesh Jawahar, Benoît Sagot, Djamé Seddah


Abstract
BERT is a recent language representation model that has performed surprisingly well on diverse language understanding benchmarks. This result suggests that BERT networks capture structural information about language. In this work, we provide novel support for this claim by performing a series of experiments to unpack the elements of English language structure learned by BERT. Our findings are fourfold. BERT’s phrasal representation captures phrase-level information in the lower layers. The intermediate layers of BERT compose a rich hierarchy of linguistic information, starting with surface features at the bottom, syntactic features in the middle, and semantic features at the top. BERT requires deeper layers when tracking subject-verb agreement to handle the long-distance dependency problem. Finally, the compositional scheme underlying BERT mimics classical, tree-like structures.
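To make the layer-wise analysis concrete, the following is a minimal sketch (assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; it is not the authors' original probing code, which built on the released BERT implementation and SentEval) of how per-layer hidden states can be extracted and mean-pooled over a phrase span, yielding the kind of layer-wise phrase representation the paper probes.

```python
# Illustrative sketch: extract per-layer hidden states from BERT and pool a phrase span.
# Assumptions: Hugging Face `transformers`, bert-base-uncased; the sentence and span are
# hypothetical examples, not taken from the paper's data.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

sentence = "the keys to the cabinet are on the table"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states is a tuple: the embedding output plus one tensor per layer,
# each of shape (batch, seq_len, hidden_size).
hidden_states = outputs.hidden_states

# Mean-pool a phrase span ("the keys to the cabinet", token positions 1-5 after [CLS])
# in every layer; probing classifiers would be trained on these layer-wise vectors.
span = slice(1, 6)
phrase_vectors = [layer[0, span].mean(dim=0) for layer in hidden_states[1:]]
print(len(phrase_vectors), phrase_vectors[0].shape)  # 12 layers, 768-dim each
```

Comparing how well such vectors from different layers support phrase-level, syntactic, and semantic probing tasks is one simple way to reproduce the paper's observation that lower layers favor surface and phrasal information while higher layers favor semantics.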
Anthology ID:
P19-1356
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3651–3657
URL:
https://aclanthology.org/P19-1356
DOI:
10.18653/v1/P19-1356
Cite (ACL):
Ganesh Jawahar, Benoît Sagot, and Djamé Seddah. 2019. What Does BERT Learn about the Structure of Language?. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3651–3657, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
What Does BERT Learn about the Structure of Language? (Jawahar et al., ACL 2019)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/P19-1356.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/P19-1356.mp4
Data
SNLI, SentEval