Transformer-based Entity Typing in Knowledge Graphs

Zhiwei Hu, Victor Gutierrez-Basulto, Zhiliang Xiang, Ru Li, Jeff Pan


Abstract
We investigate the knowledge graph entity typing task, which aims at inferring plausible entity types. In this paper, we propose a novel Transformer-based Entity Typing (TET) approach that effectively encodes the content of an entity's neighbours by means of a transformer mechanism. More precisely, TET is composed of three different mechanisms: a local transformer that infers missing entity types by independently encoding the information provided by each neighbour; a global transformer that aggregates the information of all neighbours of an entity into a single long sequence to reason about more complex entity types; and a context transformer that integrates neighbour content in a differentiated way through information exchange between neighbour pairs, while preserving the graph structure. Furthermore, TET uses information about the class membership of types to semantically strengthen the representation of an entity. Experiments on two real-world datasets demonstrate the superior performance of TET compared to the state of the art.
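The three aggregation modes described in the abstract can be illustrated with a toy numpy sketch. This is a simplified, hypothetical reading of the abstract only, not the authors' implementation: learned projection matrices, type embeddings, and the class-membership signal are all omitted, and the pooling choices (mean-pooling, enumerating neighbour pairs) are assumptions made purely for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(seq):
    """Single-head scaled dot-product self-attention over a (n, d) sequence.
    Projection weights are omitted for brevity."""
    d = seq.shape[-1]
    scores = seq @ seq.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ seq

def encode_entity(neighbours):
    """Toy versions of the three aggregation modes over neighbour embeddings.

    neighbours: (n, d) array, one embedding per neighbour of the entity.
    Returns three (d,) vectors: local, global, and context representations.
    """
    # Local: encode each neighbour independently, then mean-pool.
    local = np.stack([self_attention(v[None, :])[0] for v in neighbours]).mean(axis=0)

    # Global: treat all neighbours as one long sequence and pool.
    global_ = self_attention(neighbours).mean(axis=0)

    # Context: exchange information between neighbour pairs, then pool
    # (a crude stand-in for structure-preserving pairwise interaction).
    n = len(neighbours)
    pair_states = [self_attention(neighbours[[i, j]]).mean(axis=0)
                   for i in range(n) for j in range(i + 1, n)]
    context = np.mean(pair_states, axis=0)

    return local, global_, context

rng = np.random.default_rng(0)
local, glob, ctx = encode_entity(rng.normal(size=(4, 8)))
```

In the paper these three representations are produced by separate transformer modules and combined to score candidate types; here they are simply returned side by side to make the different sequence layouts (per-neighbour, all-neighbours, per-pair) concrete.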
Anthology ID:
2022.emnlp-main.402
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5988–6001
URL:
https://aclanthology.org/2022.emnlp-main.402
DOI:
10.18653/v1/2022.emnlp-main.402
Cite (ACL):
Zhiwei Hu, Victor Gutierrez-Basulto, Zhiliang Xiang, Ru Li, and Jeff Pan. 2022. Transformer-based Entity Typing in Knowledge Graphs. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5988–6001, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Transformer-based Entity Typing in Knowledge Graphs (Hu et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.emnlp-main.402.pdf