Linguistically Grounded Analysis of Language Models using Shapley Head Values

Marcell Fekete, Johannes Bjerva


Abstract
Understanding how linguistic knowledge is encoded in language models is crucial for improving their generalisation capabilities. In this paper, we investigate the processing of morphosyntactic phenomena by leveraging a recently proposed method for probing language models via Shapley Head Values (SHVs). Using the English BLiMP dataset, we test our approach on two widely used models, BERT and RoBERTa, and compare how linguistic constructions such as anaphor agreement and filler-gap dependencies are handled. Through quantitative pruning and qualitative clustering analyses, we demonstrate that attention heads responsible for processing related linguistic phenomena cluster together. Our results show that SHV-based attributions reveal distinct patterns across both models, providing insights into how language models organise and process linguistic information. These findings support the hypothesis that language models learn subnetworks corresponding to linguistic theory, with potential implications for cross-linguistic model analysis and interpretability in Natural Language Processing (NLP).
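
The attribution idea behind SHVs can be illustrated with a short sketch. The code below is not the authors' implementation: it estimates per-head Shapley values by standard Monte Carlo permutation sampling, and the utility function `v`, the head identifiers, and the toy "important heads" set are all hypothetical stand-ins for the paper's actual setup (e.g. accuracy on BLiMP minimal pairs with only a given subset of heads left unmasked).

```python
# Minimal sketch of Monte Carlo Shapley value estimation over attention heads.
# Assumption: `v` is a hypothetical utility mapping a set of active heads to a
# scalar (in the paper's setting this would be task performance with the
# remaining heads masked); it is NOT the authors' actual value function.
import random

def shapley_head_values(heads, v, num_permutations=200, seed=0):
    """Estimate the Shapley value of each attention head.

    heads: list of head identifiers, e.g. [(layer, head_index), ...]
    v: callable mapping a frozenset of active heads to a scalar utility
    """
    rng = random.Random(seed)
    shv = {h: 0.0 for h in heads}
    for _ in range(num_permutations):
        order = heads[:]
        rng.shuffle(order)            # random coalition-building order
        active = frozenset()
        prev_value = v(active)
        for h in order:
            active = active | {h}
            value = v(active)
            shv[h] += value - prev_value  # marginal contribution of h
            prev_value = value
    # Average marginal contributions over sampled permutations
    return {h: total / num_permutations for h, total in shv.items()}

if __name__ == "__main__":
    # Toy usage with a hypothetical utility: two "important" heads dominate.
    heads = [(layer, head) for layer in range(2) for head in range(4)]
    important = {(0, 1), (1, 2)}
    v = lambda subset: sum(1.0 for h in subset if h in important)
    values = shapley_head_values(heads, v)
    print(sorted(values.items(), key=lambda kv: -kv[1])[:3])
```

With this toy utility, the estimator correctly assigns high values to the two designated heads and near-zero values to the rest; the paper's analysis then prunes and clusters heads based on such attributions.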
Anthology ID:
2025.findings-naacl.49
Volume:
Findings of the Association for Computational Linguistics: NAACL 2025
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
850–865
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.49/
Cite (ACL):
Marcell Fekete and Johannes Bjerva. 2025. Linguistically Grounded Analysis of Language Models using Shapley Head Values. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 850–865, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Linguistically Grounded Analysis of Language Models using Shapley Head Values (Fekete & Bjerva, Findings 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.49.pdf