Abstract
Recent studies on knowledge base completion, the task of recovering missing relationships from recorded relations, demonstrate the importance of learning embeddings from multi-step relations. However, due to the size of knowledge bases, learning multi-step relations directly on top of observed triplets can be costly. Hence, a manually designed inference procedure is often used when training such models. In this paper, we propose Implicit ReasoNets (IRNs), which are designed to perform multi-step inference implicitly through a controller and shared memory. Without a human-designed inference procedure, IRNs learn from training data to perform multi-step inference in a neural embedding space through the shared memory and controller. While the inference procedure does not explicitly operate on observed triplets, our proposed model outperforms all previous approaches on the popular FB15k benchmark by more than 5.7%.
- Anthology ID:
- W17-2608
- Volume:
- Proceedings of the 2nd Workshop on Representation Learning for NLP
- Month:
- August
- Year:
- 2017
- Address:
- Vancouver, Canada
- Editors:
- Phil Blunsom, Antoine Bordes, Kyunghyun Cho, Shay Cohen, Chris Dyer, Edward Grefenstette, Karl Moritz Hermann, Laura Rimell, Jason Weston, Scott Yih
- Venue:
- RepL4NLP
- SIG:
- SIGREP
- Publisher:
- Association for Computational Linguistics
- Note:
- Pages:
- 57–68
- Language:
- URL:
- https://aclanthology.org/W17-2608
- DOI:
- 10.18653/v1/W17-2608
- Cite (ACL):
- Yelong Shen, Po-Sen Huang, Ming-Wei Chang, and Jianfeng Gao. 2017. Modeling Large-Scale Structured Relationships with Shared Memory for Knowledge Base Completion. In Proceedings of the 2nd Workshop on Representation Learning for NLP, pages 57–68, Vancouver, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Modeling Large-Scale Structured Relationships with Shared Memory for Knowledge Base Completion (Shen et al., RepL4NLP 2017)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-3/W17-2608.pdf
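The abstract describes a controller that iteratively reads from a shared memory to perform multi-step inference in embedding space. The following is a minimal illustrative sketch of that loop, not the paper's implementation: the dimensions, the initial-state rule, and the weight matrix `W_s` are all hypothetical, and the attention/read/update structure is only a plausible instantiation of "controller plus shared memory".

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): embedding dim d, memory slots M, inference steps T.
d, M, T = 8, 16, 3

memory = rng.standard_normal((M, d))          # shared memory (learned jointly during training)
W_s = rng.standard_normal((2 * d, d)) * 0.1   # illustrative controller update weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def irn_step(state, memory):
    """One implicit inference step: attend over shared memory, then update controller state."""
    attn = softmax(memory @ state)            # attention weights over memory slots
    read = attn @ memory                      # read vector: attention-weighted memory content
    return np.tanh(np.concatenate([state, read]) @ W_s)

def infer(head, relation, steps=T):
    """Multi-step inference in embedding space, starting from a (head, relation) query."""
    state = np.tanh(head + relation)          # simplistic initial controller state (assumption)
    for _ in range(steps):
        state = irn_step(state, memory)
    return state                              # compared against candidate tail embeddings

head, rel = rng.standard_normal(d), rng.standard_normal(d)
tail_pred = infer(head, rel)                  # a d-dimensional prediction vector
```

The key point the sketch conveys is that no observed triplet is consulted during inference: each step only mixes the controller state with content read from the shared memory, so the multi-step reasoning happens entirely in the embedding space.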