KALA: Knowledge-Augmented Language Model Adaptation

Minki Kang, Jinheon Baek, Sung Ju Hwang


Abstract
Pre-trained language models (PLMs) have achieved remarkable success on various natural language understanding tasks. Simple fine-tuning of PLMs, on the other hand, might be suboptimal for domain-specific tasks because they cannot possibly cover knowledge from all domains. While adaptive pre-training of PLMs can help them obtain domain-specific knowledge, it incurs a large training cost. Moreover, adaptive pre-training can harm the PLM's performance on the downstream task by causing catastrophic forgetting of its general knowledge. To overcome such limitations of adaptive pre-training for PLM adaptation, we propose a novel domain adaptation framework for PLMs coined Knowledge-Augmented Language model Adaptation (KALA), which modulates the intermediate hidden representations of PLMs with domain knowledge, consisting of entities and their relational facts. We validate the performance of KALA on question answering and named entity recognition tasks on multiple datasets across various domains. The results show that, despite being computationally efficient, KALA largely outperforms adaptive pre-training.
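The abstract describes modulating a PLM's intermediate hidden representations with entity knowledge. A minimal sketch of one way such modulation can work is a scale-and-shift conditioned on entity embeddings; the function and tensor names below are illustrative assumptions, not the paper's actual API or architecture.

```python
import torch

def modulate(h: torch.Tensor, e: torch.Tensor,
             gamma: torch.nn.Linear, beta: torch.nn.Linear) -> torch.Tensor:
    """Scale and shift hidden states h using parameters predicted
    from entity embeddings e (hypothetical names and shapes)."""
    return (1 + gamma(e)) * h + beta(e)

hidden = torch.randn(2, 5, 16)   # (batch, seq_len, hidden_dim) from a PLM layer
entity = torch.randn(2, 5, 8)    # token-aligned entity embeddings (assumed)
gamma = torch.nn.Linear(8, 16)   # predicts per-dimension scale offsets
beta = torch.nn.Linear(8, 16)    # predicts per-dimension shifts
out = modulate(hidden, entity, gamma, beta)  # same shape as hidden
```

The `(1 + gamma(e))` form keeps the modulation close to identity when the predicted offsets are near zero, so tokens without meaningful entity signal pass through largely unchanged.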
Anthology ID:
2022.naacl-main.379
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5144–5167
URL:
https://aclanthology.org/2022.naacl-main.379
DOI:
10.18653/v1/2022.naacl-main.379
Cite (ACL):
Minki Kang, Jinheon Baek, and Sung Ju Hwang. 2022. KALA: Knowledge-Augmented Language Model Adaptation. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5144–5167, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
KALA: Knowledge-Augmented Language Model Adaptation (Kang et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2022.naacl-main.379.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2022.naacl-main.379.mp4
Code
nardien/kala
Data
CoNLL 2003, NCBI Disease, NewsQA, WNUT 2017