Vector Norms as an Approximation of Syntactic Complexity

Adam Ek, Nikolai Ilinykh


Abstract
Internal representations in transformer models can encode useful linguistic knowledge about syntax. Such knowledge could help optimise the data annotation process. However, identifying and extracting such representations from large language models is challenging. In this paper, we probe two multilingual transformers for knowledge about the syntactic complexity of sentences by examining different vector norms. We provide a fine-grained evaluation of different norms in different layers and for different languages. Our results suggest that no single part of the models serves as the primary source of knowledge about syntactic complexity, but some norms are more sensitive to syntactic complexity than others, depending on the language and model used.
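
As a rough illustration of the kind of measurement the abstract describes, the sketch below extracts per-layer hidden states from a multilingual transformer and computes several vector norms over them. This is a minimal sketch, not the paper's exact setup: the choice of mBERT, the specific norms (L1, L2, L-infinity), and the per-token averaging are all illustrative assumptions.

import torch
from transformers import AutoTokenizer, AutoModel

# Hypothetical sketch: mBERT and the L1/L2/L-inf norms are illustrative
# assumptions, not necessarily the models or norms used in the paper.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased",
                                  output_hidden_states=True)
model.eval()

sentence = "The report the committee submitted was rejected."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states: tuple of (num_layers + 1) tensors,
# each of shape (1, seq_len, hidden_dim)
for layer, states in enumerate(outputs.hidden_states):
    token_vectors = states.squeeze(0)  # (seq_len, hidden_dim)
    for p in (1, 2, float("inf")):
        # Per-token vector norm, averaged over the sentence
        norm = torch.linalg.norm(token_vectors, ord=p, dim=-1).mean().item()
        print(f"layer {layer:2d}  L{p} norm  {norm:.2f}")

Norm profiles like these, computed per layer and per language, could then be correlated with a measure of syntactic complexity (e.g., dependency tree depth) to test how sensitive each norm is.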
Anthology ID:
2023.resourceful-1.15
Volume:
Proceedings of the Second Workshop on Resources and Representations for Under-Resourced Languages and Domains (RESOURCEFUL-2023)
Month:
May
Year:
2023
Address:
Tórshavn, the Faroe Islands
Editors:
Nikolai Ilinykh, Felix Morger, Dana Dannélls, Simon Dobnik, Beáta Megyesi, Joakim Nivre
Venue:
RESOURCEFUL
Publisher:
Association for Computational Linguistics
Pages:
121–131
URL:
https://aclanthology.org/2023.resourceful-1.15
Cite (ACL):
Adam Ek and Nikolai Ilinykh. 2023. Vector Norms as an Approximation of Syntactic Complexity. In Proceedings of the Second Workshop on Resources and Representations for Under-Resourced Languages and Domains (RESOURCEFUL-2023), pages 121–131, Tórshavn, the Faroe Islands. Association for Computational Linguistics.
Cite (Informal):
Vector Norms as an Approximation of Syntactic Complexity (Ek & Ilinykh, RESOURCEFUL 2023)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2023.resourceful-1.15.pdf