Implicit Relation Linking for Question Answering over Knowledge Graph
Yao Zhao | Jiacheng Huang | Wei Hu | Qijin Chen | XiaoXia Qiu | Chengfu Huo | Weijun Ren
Findings of the Association for Computational Linguistics: ACL 2022
Relation linking (RL) is a vital module in knowledge-based question answering (KBQA) systems. It aims to link the relations expressed in natural language (NL) to the corresponding ones in a knowledge graph (KG). Existing methods mainly rely on the textual similarities between NL and KG to build relation links. Due to the ambiguity of NL and the incompleteness of KG, many relations in NL are implicitly expressed and may not link to a single relation in KG, which challenges the current methods. In this paper, we propose an implicit RL method called ImRL, which links relation phrases in NL to relation paths in KG. To find proper relation paths, we propose a novel path ranking model that aligns not only textual information in the word embedding space but also structural information in the KG embedding space between relation phrases in NL and relation paths in KG. In addition, we leverage a gated mechanism with attention to inject prior knowledge from external paraphrase dictionaries to handle relation phrases with vague meanings. Our experiments on two benchmark datasets and a newly-created dataset show that ImRL significantly outperforms several state-of-the-art methods, especially for implicit RL.
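The abstract describes two components: a path-ranking score that combines similarity in the word embedding space with similarity in the KG embedding space, and a gated attention mechanism that mixes in prior knowledge from a paraphrase dictionary. The sketch below is only an illustration of those two ideas, not the paper's actual model: the cosine-based scoring, the scalar sigmoid gate, and all function names (`inject_prior`, `rank_paths`, `W_g`, `alpha`) are assumptions introduced here for clarity.

```python
import numpy as np

def cos(a, b):
    # cosine similarity between two embedding vectors
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def inject_prior(phrase_vec, paraphrase_vecs, W_g):
    """Gated attention over paraphrase-dictionary entries (illustrative only).

    Attention pools the paraphrase embeddings most similar to the phrase,
    and a sigmoid gate decides how much of that prior to mix in.
    """
    weights = softmax(paraphrase_vecs @ phrase_vec)   # attention over dictionary entries
    prior = weights @ paraphrase_vecs                 # attention-pooled prior vector
    gate = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([phrase_vec, prior]))))
    return gate * phrase_vec + (1.0 - gate) * prior   # gated combination

def rank_paths(phrase_text_vec, phrase_kg_vec, candidate_paths, alpha=0.5):
    """Score candidate relation paths by textual + structural similarity.

    Each candidate is a (text_embedding, kg_embedding) pair; `alpha` is a
    hypothetical weight balancing the two embedding spaces.
    """
    scores = [alpha * cos(phrase_text_vec, t) + (1 - alpha) * cos(phrase_kg_vec, k)
              for t, k in candidate_paths]
    return int(np.argmax(scores))                     # index of the best-ranked path
```

In this toy setup, a relation phrase whose text and KG embeddings both closely match a candidate path would rank that path first; the gate lets vague phrases lean more heavily on dictionary paraphrases.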