@inproceedings{shen-etal-2017-modeling,
    title = "Modeling Large-Scale Structured Relationships with Shared Memory for Knowledge Base Completion",
    author = "Shen, Yelong  and
      Huang, Po-Sen  and
      Chang, Ming-Wei  and
      Gao, Jianfeng",
    editor = "Blunsom, Phil  and
      Bordes, Antoine  and
      Cho, Kyunghyun  and
      Cohen, Shay  and
      Dyer, Chris  and
      Grefenstette, Edward  and
      Hermann, Karl Moritz  and
      Rimell, Laura  and
      Weston, Jason  and
      Yih, Scott",
    booktitle = "Proceedings of the 2nd Workshop on Representation Learning for {NLP}",
    month = aug,
    year = "2017",
    address = "Vancouver, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/W17-2608/",
    doi = "10.18653/v1/W17-2608",
    pages = "57--68",
    abstract = "Recent studies on knowledge base completion, the task of recovering missing relationships based on recorded relations, demonstrate the importance of learning embeddings from multi-step relations. However, due to the size of knowledge bases, learning multi-step relations directly on top of observed triplets could be costly. Hence, a manually designed procedure is often used when training the models. In this paper, we propose Implicit ReasoNets (IRNs), which is designed to perform multi-step inference implicitly through a controller and shared memory. Without a human-designed inference procedure, IRNs use training data to learn to perform multi-step inference in an embedding neural space through the shared memory and controller. While the inference procedure does not explicitly operate on top of observed triplets, our proposed model outperforms all previous approaches on the popular FB15k benchmark by more than 5.7{\%}."
}