Data-driven Cross-lingual Syntax: An Agreement Study with Massively Multilingual Models

Andrea Gregor de Varda, Marco Marelli


Abstract
Massively multilingual models such as mBERT and XLM-R are increasingly valued in Natural Language Processing research and applications for their ability to tackle the uneven distribution of resources across languages. The fact that these models process multiple languages with a shared set of parameters raises the question of whether the grammatical knowledge they extract during pre-training can be considered a data-driven cross-lingual grammar. The present work studies the inner workings of mBERT and XLM-R to test the cross-lingual consistency of the individual neural units that respond to a precise syntactic phenomenon, namely number agreement, in five languages (English, German, French, Hebrew, Russian). We found a significant overlap in the latent dimensions that encode agreement across the languages we considered. This overlap is larger (a) for long- as opposed to short-distance agreement and (b) for XLM-R as compared to mBERT, and it peaks in the intermediate layers of the network. We further show that a small set of syntax-sensitive neurons can capture agreement violations across languages; however, their contribution is not decisive in agreement processing.
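The overlap analysis the abstract describes can be illustrated with a minimal sketch: for each language, rank the hidden units of a given layer by how sensitive they are to agreement violations, keep the top-k, and compute the pairwise overlap of those sets across languages. The sensitivity scores below are random stand-ins (the paper derives them from the models' activations on grammatical vs. ungrammatical sentences); the hidden size, k, and language codes are illustrative assumptions, not the paper's exact setup.

```python
# Hypothetical sketch of a cross-lingual neuron-overlap measure.
# NOTE: `scores` holds random placeholder values, not real probing results.
import numpy as np

rng = np.random.default_rng(0)
languages = ["en", "de", "fr", "he", "ru"]
n_units = 768   # hidden size of an mBERT/XLM-R base layer (assumption)
top_k = 50      # number of most agreement-sensitive units to keep (assumption)

# scores[lang][i]: stand-in sensitivity of unit i to agreement violations
scores = {lang: rng.random(n_units) for lang in languages}

# Top-k most agreement-sensitive units per language
top_units = {lang: set(np.argsort(s)[-top_k:]) for lang, s in scores.items()}

def overlap(a, b):
    """Fraction of the top-k units shared between languages a and b."""
    return len(top_units[a] & top_units[b]) / top_k

# All unordered language pairs (5 languages -> 10 pairs)
pairwise = [overlap(a, b) for i, a in enumerate(languages)
            for b in languages[i + 1:]]
print(f"mean pairwise overlap: {np.mean(pairwise):.3f}")
```

With random scores the expected overlap is roughly k divided by the number of units; the paper's finding is that the observed overlap for agreement-sensitive units exceeds such a chance baseline, especially in intermediate layers.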
Anthology ID:
2023.cl-2.1
Volume:
Computational Linguistics, Volume 49, Issue 2 - June 2023
Month:
June
Year:
2023
Address:
Cambridge, MA
Venue:
CL
Publisher:
MIT Press
Pages:
261–299
URL:
https://aclanthology.org/2023.cl-2.1
DOI:
10.1162/coli_a_00472
Cite (ACL):
Andrea Gregor de Varda and Marco Marelli. 2023. Data-driven Cross-lingual Syntax: An Agreement Study with Massively Multilingual Models. Computational Linguistics, 49(2):261–299.
Cite (Informal):
Data-driven Cross-lingual Syntax: An Agreement Study with Massively Multilingual Models (de Varda & Marelli, CL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2023.cl-2.1.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2023.cl-2.1.mp4