Relation Specific Transformations for Open World Knowledge Graph Completion

Haseeb Shah, Johannes Villmow, Adrian Ulges


Abstract
We propose an open-world knowledge graph completion model that can be combined with common closed-world approaches (such as ComplEx) and enhance them to exploit text-based representations for entities unseen in training. Our model learns relation-specific transformation functions from text-based to graph-based embedding space, where the closed-world link prediction model can be applied. We demonstrate state-of-the-art results on common open-world benchmarks and show that our approach benefits from relation-specific transformation functions (RST), giving substantial improvements over a relation-agnostic approach.
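To make the idea concrete, below is a minimal illustrative sketch (not the authors' implementation) of a relation-specific transformation combined with ComplEx-style scoring, written in Python with PyTorch. All class, parameter, and variable names here are hypothetical; in the paper's setting the graph-side entity and relation embeddings would come from a pretrained closed-world model rather than fresh embedding tables.

    import torch
    import torch.nn as nn

    class RelationSpecificTransform(nn.Module):
        """Sketch: project a text-based embedding of an unseen head entity into
        the graph embedding space with one affine map per relation, then score
        the triple with a ComplEx-style function."""

        def __init__(self, num_relations: int, num_entities: int,
                     text_dim: int, graph_dim: int):
            super().__init__()
            # One transformation (weight matrix plus bias) per relation.
            self.transforms = nn.ModuleList(
                [nn.Linear(text_dim, graph_dim) for _ in range(num_relations)]
            )
            # Graph-side embeddings; in practice these would be taken from a
            # pretrained closed-world model such as ComplEx and kept fixed.
            # graph_dim is assumed even: first half real part, second half imaginary.
            self.rel_emb = nn.Embedding(num_relations, graph_dim)
            self.tail_emb = nn.Embedding(num_entities, graph_dim)

        def complex_score(self, head, rel, tail):
            # ComplEx score: Re(<h, r, conj(t)>)
            d = head.shape[-1] // 2
            h_re, h_im = head[..., :d], head[..., d:]
            r_re, r_im = rel[..., :d], rel[..., d:]
            t_re, t_im = tail[..., :d], tail[..., d:]
            return (h_re * r_re * t_re
                    + h_im * r_re * t_im
                    + h_re * r_im * t_im
                    - h_im * r_im * t_re).sum(-1)

        def forward(self, text_emb, relation_id: int, tail_id: int):
            # Select the transformation belonging to this relation and map the
            # text-based head representation into graph embedding space.
            head_graph = self.transforms[relation_id](text_emb)
            rel = self.rel_emb(torch.tensor(relation_id))
            tail = self.tail_emb(torch.tensor(tail_id))
            return self.complex_score(head_graph, rel, tail)

    # Usage sketch: score a triple whose head entity is unseen in the graph.
    model = RelationSpecificTransform(num_relations=5, num_entities=1000,
                                      text_dim=300, graph_dim=200)
    text_emb = torch.randn(300)  # e.g. an encoding of the entity's description
    score = model(text_emb, relation_id=2, tail_id=42)

The point of the per-relation maps is that different relations may require different projections of the textual representation; a single relation-agnostic map would force one projection to serve all relations, which is the baseline the paper's RST approach improves on.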
Anthology ID:
2020.textgraphs-1.9
Volume:
Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs)
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Dmitry Ustalov, Swapna Somasundaran, Alexander Panchenko, Fragkiskos D. Malliaros, Ioana Hulpuș, Peter Jansen, Abhik Jana
Venue:
TextGraphs
Publisher:
Association for Computational Linguistics
Pages:
79–84
URL:
https://aclanthology.org/2020.textgraphs-1.9
DOI:
10.18653/v1/2020.textgraphs-1.9
Cite (ACL):
Haseeb Shah, Johannes Villmow, and Adrian Ulges. 2020. Relation Specific Transformations for Open World Knowledge Graph Completion. In Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs), pages 79–84, Barcelona, Spain (Online). Association for Computational Linguistics.
Cite (Informal):
Relation Specific Transformations for Open World Knowledge Graph Completion (Shah et al., TextGraphs 2020)
PDF:
https://aclanthology.org/2020.textgraphs-1.9.pdf
Optional supplementary material:
 2020.textgraphs-1.9.OptionalSupplementaryMaterial.pdf
Code
 haseebs/rst-owe
Data
FB15k
FB15k-237