Entrenchment Matters: Investigating Positional and Constructional Sensitivity in Small and Large Language Models

Bastian Bunzeck, Sina Zarrieß

Abstract
The success of large language models (LMs) has also prompted a push towards smaller models, but the differences in functionality and encodings between these two types of models are not yet well understood. In this paper, we employ a perturbed masking approach to investigate differences in token influence patterns on the sequence embeddings of larger and smaller RoBERTa models. Specifically, we explore how token properties like position, length, or part of speech influence their sequence embeddings. We find a general tendency for sequence-final tokens to exert higher influence. Among part-of-speech tags, nouns, numerals, and punctuation marks are the most influential, with minor deviations for individual models. These findings also align with usage-based linguistic evidence on the effect of entrenchment. Finally, we show that the relationship between data size and model size influences the variability and brittleness of these effects, hinting at a need for holistically balanced models.
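For readers unfamiliar with the perturbation setup, the sketch below illustrates the general idea of perturbed masking applied to sequence embeddings: each token is replaced by the mask token in turn, and the resulting shift of the sequence embedding is taken as that token's influence score. The pooling choice (the initial <s> token embedding), the Euclidean distance, and the roberta-base checkpoint are illustrative assumptions, not necessarily the paper's exact configuration.

```python
# Minimal sketch of perturbed masking over sequence embeddings.
# Assumptions: roberta-base checkpoint, <s>-token pooling, Euclidean
# distance; the paper's actual setup may differ.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "roberta-base"  # any RoBERTa-family checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def sequence_embedding(input_ids: torch.Tensor) -> torch.Tensor:
    """Return the <s> (first-token) hidden state as the sequence embedding."""
    with torch.no_grad():
        hidden = model(input_ids.unsqueeze(0)).last_hidden_state
    return hidden[0, 0]

def token_influences(sentence: str) -> list[tuple[str, float]]:
    """Mask each token in turn and measure how far the sequence
    embedding moves from the unperturbed baseline."""
    input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    base = sequence_embedding(input_ids)
    scores = []
    # Skip the special <s> and </s> tokens at the sequence edges.
    for i in range(1, input_ids.size(0) - 1):
        perturbed = input_ids.clone()
        perturbed[i] = tokenizer.mask_token_id
        dist = torch.dist(base, sequence_embedding(perturbed)).item()
        scores.append((tokenizer.decode(input_ids[i]), dist))
    return scores

if __name__ == "__main__":
    for tok, score in token_influences("The cat sat on the mat."):
        print(f"{tok!r}: {score:.4f}")
```

Ranking tokens by these scores makes positional and part-of-speech effects directly inspectable, e.g. whether sequence-final tokens or nouns tend to shift the embedding most.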
Anthology ID: 2023.clasp-1.3
Volume: Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD)
Month: September
Year: 2023
Address: Gothenburg, Sweden
Editors: Ellen Breitholtz, Shalom Lappin, Sharid Loaiciga, Nikolai Ilinykh, Simon Dobnik
Venue: CLASP
Publisher: Association for Computational Linguistics
Pages: 25–37
URL: https://aclanthology.org/2023.clasp-1.3
Cite (ACL): Bastian Bunzeck and Sina Zarrieß. 2023. Entrenchment Matters: Investigating Positional and Constructional Sensitivity in Small and Large Language Models. In Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD), pages 25–37, Gothenburg, Sweden. Association for Computational Linguistics.
Cite (Informal): Entrenchment Matters: Investigating Positional and Constructional Sensitivity in Small and Large Language Models (Bunzeck & Zarrieß, CLASP 2023)
PDF: https://preview.aclanthology.org/emnlp-22-attachments/2023.clasp-1.3.pdf