Abstract
Conversational KBQA is the task of answering a sequence of questions related to a knowledge base (KB). Follow-up questions in conversational KBQA often omit information and implicitly refer to entities from the conversation history. In this paper, we propose to model these implied entities, which we refer to as the focal entities of the conversation. We propose a novel graph-based model to capture the transitions of focal entities and apply a graph neural network to derive a probability distribution of focal entities for each question, which is then combined with a standard KBQA module to perform answer ranking. Our experiments on two datasets demonstrate the effectiveness of our proposed method.
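The abstract only outlines the approach at a high level. The following is a minimal sketch of that idea, assuming a single propagation step over a row-normalized entity transition graph; all function names, array shapes, and the additive way current-turn evidence is combined are illustrative assumptions rather than the authors' implementation (see the linked repository lanyunshi/conversationalkbqa for that).

```python
# Hypothetical sketch, not the paper's actual model: propagate focal-entity
# mass over an entity transition graph, renormalize into a distribution for
# the current question, then combine it with per-entity answer scores from a
# standard KBQA module to rank answers.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def focal_entity_distribution(prev_dist, adjacency, question_entity_scores):
    """One propagation step over the entity transition graph.

    prev_dist:  (n,) focal-entity distribution from the previous turn
    adjacency:  (n, n) row-normalized transitions between candidate entities
    question_entity_scores: (n,) how strongly the current question mentions
                            each entity (e.g. from entity linking)
    """
    propagated = adjacency.T @ prev_dist          # shift mass along transitions
    scores = propagated + question_entity_scores  # mix in current-turn evidence
    return softmax(scores)

def rank_answers(focal_dist, answer_scores_per_entity):
    """Weight per-entity answer scores by the focal-entity distribution."""
    # answer_scores_per_entity: (n, m) score of answer j given focal entity i
    return focal_dist @ answer_scores_per_entity  # (m,) final answer scores

# Toy usage with 3 candidate entities and 2 candidate answers
prev = np.array([0.7, 0.2, 0.1])
adj = np.array([[0.5, 0.5, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
q_scores = np.array([0.0, 1.0, 0.0])
dist = focal_entity_distribution(prev, adj, q_scores)
print(rank_answers(dist, np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]])))
```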
- Anthology ID: 2021.acl-long.255
- Volume: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
- Month: August
- Year: 2021
- Address: Online
- Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
- Venues: ACL | IJCNLP
- Publisher: Association for Computational Linguistics
- Pages: 3288–3297
- URL: https://aclanthology.org/2021.acl-long.255
- DOI: 10.18653/v1/2021.acl-long.255
- Cite (ACL): Yunshi Lan and Jing Jiang. 2021. Modeling Transitions of Focal Entities for Conversational Knowledge Base Question Answering. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3288–3297, Online. Association for Computational Linguistics.
- Cite (Informal): Modeling Transitions of Focal Entities for Conversational Knowledge Base Question Answering (Lan & Jiang, ACL-IJCNLP 2021)
- PDF: https://aclanthology.org/2021.acl-long.255.pdf
- Code: lanyunshi/conversationalkbqa
- Data: CSQA, ConvQuestions