What Has Been Enhanced in my Knowledge-Enhanced Language Model?

Yifan Hou, Guoji Fu, Mrinmaya Sachan


Abstract
A number of knowledge integration (KI) methods have recently been proposed to incorporate external knowledge into pretrained language models (LMs). Even though knowledge-enhanced LMs (KELMs) outperform base LMs on knowledge-intensive tasks, the inner workings of these KI methods are not well understood. For instance, it is unclear which knowledge is effectively integrated into KELMs and which is not, and whether such integration leads to catastrophic forgetting of already learned knowledge. We show that existing model interpretation methods, such as linear probes and prompts, have key limitations in answering these questions. We then revisit KI from an information-theoretic view and propose a new, theoretically sound probe model called Graph Convolution Simulator (GCS) for KI interpretation. GCS is ultimately quite simple: it uses graph attention on the corresponding knowledge graph for interpretation. We conduct various experiments to verify that GCS provides reasonable interpretation results for two well-known KELMs: ERNIE and K-Adapter. Our experiments reveal that only a small amount of knowledge is successfully integrated into these models, and that simply increasing the size of the KI corpus may not lead to better KELMs.
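As a rough illustration of the graph-attention idea the abstract mentions, below is a minimal, hypothetical sketch of a single graph-attention layer over a knowledge graph, written in PyTorch. The class name, dimensions, and toy graph are assumptions made for illustration; the actual GCS formulation is given in the paper. The point of the sketch is that the learned per-edge attention weights are the interpretable quantity: they indicate which knowledge-graph edges the probe relies on.

# Minimal, hypothetical sketch of a graph-attention layer over a knowledge graph.
# Entity embeddings, dimensions, and the toy graph below are illustrative only;
# the actual GCS probe is defined in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor):
        # x: (num_nodes, in_dim) entity representations (e.g., taken from a KELM)
        # edge_index: (2, num_edges) source/target indices of knowledge-graph triples
        h = self.proj(x)
        src, dst = edge_index
        # One attention logit per edge, computed from the concatenated endpoint features
        e = F.leaky_relu(self.attn(torch.cat([h[src], h[dst]], dim=-1))).squeeze(-1)
        # Normalize logits over the incoming edges of each target node
        alpha = torch.zeros_like(e)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(e[mask], dim=0)
        # Weighted aggregation; `alpha` is the per-edge weight one can inspect
        # to see which knowledge-graph edges the probe relies on
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out, alpha


# Toy usage: 4 entities, 3 triples, 16-dimensional embeddings (all made up)
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
layer = GraphAttentionLayer(16, 8)
out, edge_weights = layer(x, edge_index)
print(edge_weights)  # higher weight = edge the probe attends to more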
Anthology ID: 2022.findings-emnlp.102
Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1417–1438
URL: https://aclanthology.org/2022.findings-emnlp.102
DOI: 10.18653/v1/2022.findings-emnlp.102
Cite (ACL): Yifan Hou, Guoji Fu, and Mrinmaya Sachan. 2022. What Has Been Enhanced in my Knowledge-Enhanced Language Model?. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 1417–1438, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): What Has Been Enhanced in my Knowledge-Enhanced Language Model? (Hou et al., Findings 2022)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/2022.findings-emnlp.102.pdf