Where Does Linguistic Information Emerge in Neural Language Models? Measuring Gains and Contributions across Layers

Jenny Kunz, Marco Kuhlmann


Abstract
Probing studies have extensively explored where in neural language models linguistic information is located. The standard approach to interpreting the results of a probing classifier is to focus on the layers whose representations give the highest performance on the probing task. We propose an alternative method that asks where the task-relevant information emerges in the model. Our framework consists of a family of metrics that explicitly model local information gain relative to the previous layer and each layer’s contribution to the model’s overall performance. We apply the new metrics to two pairs of syntactic probing tasks with different degrees of complexity and find that the metrics confirm the expected ordering only for one of the pairs. Our local metrics show a massive dominance of the first layers, indicating that the features that contribute the most to our probing tasks are not as high-level as global metrics suggest.
Anthology ID: 2022.coling-1.413
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Editors: Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 4664–4676
URL: https://aclanthology.org/2022.coling-1.413
Cite (ACL): Jenny Kunz and Marco Kuhlmann. 2022. Where Does Linguistic Information Emerge in Neural Language Models? Measuring Gains and Contributions across Layers. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4664–4676, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal): Where Does Linguistic Information Emerge in Neural Language Models? Measuring Gains and Contributions across Layers (Kunz & Kuhlmann, COLING 2022)
PDF: https://preview.aclanthology.org/emnlp-22-attachments/2022.coling-1.413.pdf
Code: jekunz/emergent_info
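
The abstract's two key quantities, a layer's local gain relative to the previous layer and its contribution to the model's overall probing performance, can be illustrated with a minimal Python sketch. This is one plausible reading of the idea, not the authors' implementation (see the jekunz/emergent_info repository for that); the function names and accuracy values below are invented for illustration.

# Minimal sketch of layer-wise "gain" and "contribution" metrics as
# described in the abstract. NOT the paper's exact definitions; all
# names and numbers are hypothetical.

def local_gains(layer_scores):
    """Probing-performance gain of each layer over the previous one."""
    return [curr - prev for prev, curr in zip(layer_scores, layer_scores[1:])]

def contributions(layer_scores):
    """Each layer's share of the total gain over the embedding layer.

    This normalisation is an assumption for illustration; the paper
    defines its own family of contribution metrics.
    """
    gains = local_gains(layer_scores)
    total = layer_scores[-1] - layer_scores[0]
    return [g / total for g in gains] if total else [0.0 for _ in gains]

# Hypothetical probe accuracies: embedding layer followed by the 12
# transformer layers of a BERT-sized model.
scores = [0.55, 0.78, 0.83, 0.85, 0.86, 0.86, 0.87,
          0.87, 0.88, 0.88, 0.88, 0.87, 0.87]

print(local_gains(scores))    # gains concentrated in the early layers
print(contributions(scores))  # early layers dominate the total gain

Under these made-up scores, a global "best layer" view would point at layers 8 to 10, where absolute accuracy peaks, while the local view shows that almost all of the usable information already emerges in the first few layers, which is the contrast the abstract draws.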