AprilE: Attention with Pseudo Residual Connection for Knowledge Graph Embedding

Yuzhang Liu, Peng Wang, Yingtai Li, Yizhan Shao, Zhongkai Xu


Abstract
Knowledge graph embedding maps entities and relations into a low-dimensional vector space. However, many existing methods still struggle to model diverse relational patterns, especially symmetric and antisymmetric relations. To address this issue, we propose a novel model, AprilE, which employs triple-level self-attention and a pseudo residual connection to model relational patterns. The triple-level self-attention treats the head entity, relation, and tail entity as a sequence and captures the dependencies within a triple, while the pseudo residual connection retains primitive semantic features. Furthermore, to deal with symmetric and antisymmetric relations, two score-function schemas are designed via a position-adaptive mechanism. Experimental results on public datasets demonstrate that our model produces expressive knowledge embeddings and significantly outperforms most state-of-the-art methods.
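To make the abstract's architecture concrete, below is a minimal PyTorch sketch of how a triple-level self-attention layer with a pseudo residual connection might be wired. The additive form of the residual, the embedding sizes, and the placeholder DistMult-style scorer (standing in for the paper's two position-adaptive score-function schemas) are all assumptions for illustration, and the class name AprilESketch is hypothetical; consult the paper for the exact design.

    # Illustrative sketch only; not the authors' implementation.
    import torch
    import torch.nn as nn

    class AprilESketch(nn.Module):
        def __init__(self, num_entities, num_relations, dim=200, heads=4):
            super().__init__()
            self.ent = nn.Embedding(num_entities, dim)
            self.rel = nn.Embedding(num_relations, dim)
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

        def forward(self, h_idx, r_idx, t_idx):
            # Treat (head, relation, tail) as a length-3 sequence.
            seq = torch.stack(
                [self.ent(h_idx), self.rel(r_idx), self.ent(t_idx)], dim=1
            )  # shape: (batch, 3, dim)
            attended, _ = self.attn(seq, seq, seq)
            # Pseudo residual connection: retain the primitive semantic
            # features by adding the original embeddings back (assumed form).
            fused = attended + seq
            h, r, t = fused[:, 0], fused[:, 1], fused[:, 2]
            # Placeholder DistMult-style score; the paper instead designs
            # two position-adaptive score-function schemas.
            return (h * r * t).sum(dim=-1)

    # Usage: score a small batch of triples.
    model = AprilESketch(num_entities=1000, num_relations=50)
    scores = model(torch.tensor([0, 1]), torch.tensor([3, 4]), torch.tensor([7, 8]))

Treating the triple as a sequence lets every element attend to the other two, which is how the model captures dependencies within a triple; the residual term keeps the raw embeddings from being washed out by attention.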
Anthology ID: 2020.coling-main.44
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 508–518
URL: https://aclanthology.org/2020.coling-main.44
DOI: 10.18653/v1/2020.coling-main.44
Cite (ACL): Yuzhang Liu, Peng Wang, Yingtai Li, Yizhan Shao, and Zhongkai Xu. 2020. AprilE: Attention with Pseudo Residual Connection for Knowledge Graph Embedding. In Proceedings of the 28th International Conference on Computational Linguistics, pages 508–518, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): AprilE: Attention with Pseudo Residual Connection for Knowledge Graph Embedding (Liu et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.44.pdf