Efficient Entity Embedding Construction from Type Knowledge for BERT

Yukun Feng, Amir Fayazi, Abhinav Rastogi, Manabu Okumura


Abstract
Recent work has shown advantages of incorporating knowledge graphs (KGs) into BERT for various NLP tasks. One common way is to feed entity embeddings as an additional input during pre-training. There are two limitations to such a method. First, training entity embeddings that capture rich factual knowledge typically requires access to the entire KG. This is challenging for KGs that change daily (e.g., Wikidata). Second, it requires a large-scale pre-training corpus with entity annotations and incurs a high computational cost during pre-training. In this work, we efficiently construct entity embeddings only from type knowledge, which does not require access to the entire KG. Although the entity embeddings contain only local information, they perform very well when combined with context. Furthermore, we show that our entity embeddings, constructed from BERT’s input embeddings, can be directly incorporated into the fine-tuning phase without requiring any specialized pre-training. In addition, these entity embeddings can be constructed on the fly, without requiring a large memory footprint to store them. Finally, we propose task-specific models that incorporate our entity embeddings for entity linking, entity typing, and relation classification. Experiments show that our models have comparable or superior performance to existing models while being more resource efficient.
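As a rough illustration of the idea described in the abstract, the sketch below builds an entity embedding on the fly by pooling BERT's input (wordpiece) embeddings over the entity's KG type labels, so no full-KG access or extra pre-training is needed. This is a minimal, hypothetical reading of the abstract, not the authors' released code: the function name `entity_embedding_from_types`, the choice of mean pooling, and the example type labels are assumptions; the paper specifies the exact construction.

```python
# Hypothetical sketch: entity embedding from type labels via BERT's input embeddings.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
wordpiece_emb = model.get_input_embeddings()  # nn.Embedding over BERT's wordpiece vocab


def entity_embedding_from_types(type_labels):
    """Average the wordpiece embeddings of an entity's type labels (assumed pooling)."""
    ids = []
    for label in type_labels:
        ids.extend(tokenizer(label, add_special_tokens=False)["input_ids"])
    with torch.no_grad():
        vecs = wordpiece_emb(torch.tensor(ids))  # (num_wordpieces, hidden_size)
    return vecs.mean(dim=0)                      # (hidden_size,)


# Example: an entity whose KG types are "human" and "politician" (illustrative labels).
emb = entity_embedding_from_types(["human", "politician"])
print(emb.shape)  # torch.Size([768]) for bert-base-uncased
```

Because the embedding is a simple function of the entity's type labels and BERT's existing input embeddings, it can be recomputed per batch rather than stored, which is consistent with the abstract's claim of a small memory footprint.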
Anthology ID:
2022.findings-aacl.1
Volume:
Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022
Month:
November
Year:
2022
Address:
Online only
Editors:
Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1–10
URL:
https://aclanthology.org/2022.findings-aacl.1
Cite (ACL):
Yukun Feng, Amir Fayazi, Abhinav Rastogi, and Manabu Okumura. 2022. Efficient Entity Embedding Construction from Type Knowledge for BERT. In Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022, pages 1–10, Online only. Association for Computational Linguistics.
Cite (Informal):
Efficient Entity Embedding Construction from Type Knowledge for BERT (Feng et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2022.findings-aacl.1.pdf