Knowledge-augmented Self-training of A Question Rewriter for Conversational Knowledge Base Question Answering
Xirui Ke, Jing Zhang, Xin Lv, Yiqi Xu, Shulin Cao, Cuiping Li, Hong Chen, Juanzi Li
Abstract
The recent rise of conversational applications such as online customer service systems and intelligent personal assistants has promoted the development of conversational knowledge base question answering (ConvKBQA). Unlike traditional single-turn KBQA, ConvKBQA typically explores multi-turn questions around a topic, where ellipsis and coreference pose great challenges to single-turn KBQA systems, which require self-contained questions. In this paper, we propose a rewrite-and-reason framework that first produces a fully self-contained rewritten question based on the conversation history and then reasons out the answer with existing single-turn KBQA models. To overcome the absence of rewriting supervision signals, we introduce a knowledge-augmented self-training mechanism that transfers the question rewriter from another dataset and adapts it to the current knowledge base. Our question rewriter is decoupled from the subsequent QA process, making it easy to combine with either retrieval-based or semantic parsing-based KBQA models. Experimental results demonstrate the effectiveness of our method, which achieves a new state-of-the-art result. The code and dataset are available online.
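As a rough illustration of the decoupled rewrite-and-reason design described above, the sketch below rewrites an elliptical follow-up question into a self-contained one with an off-the-shelf seq2seq model and then hands it to an arbitrary single-turn KBQA model. The checkpoint name, history/question input format, and the `answer()` placeholder are assumptions made for illustration only, not the authors' released implementation.

```python
# Minimal sketch of a decoupled rewrite-and-reason pipeline.
# Assumptions: the "t5-base" checkpoint, the " ||| " separator, and answer()
# are illustrative placeholders, not the paper's actual code.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

REWRITER_NAME = "t5-base"  # any seq2seq rewriter checkpoint could be plugged in

tokenizer = AutoTokenizer.from_pretrained(REWRITER_NAME)
rewriter = AutoModelForSeq2SeqLM.from_pretrained(REWRITER_NAME)


def rewrite_question(history: list[str], question: str) -> str:
    """Turn an elliptical follow-up question into a self-contained one."""
    # Concatenate the conversation history with the current turn.
    source = " ||| ".join(history + [question])
    inputs = tokenizer(source, return_tensors="pt", truncation=True)
    output_ids = rewriter.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


def answer(question: str) -> str:
    """Placeholder for any existing single-turn KBQA model, retrieval-based
    or semantic parsing-based; the rewriter is agnostic to this choice."""
    raise NotImplementedError


# Usage: rewrite first, then reason with the unchanged single-turn system.
history = ["Who directed Inception?", "Christopher Nolan"]
rewritten = rewrite_question(history, "When was he born?")
# e.g. -> "When was Christopher Nolan born?"
# final_answer = answer(rewritten)
```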
- Anthology ID: 2022.findings-emnlp.133
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 1844–1856
- URL: https://aclanthology.org/2022.findings-emnlp.133
- DOI: 10.18653/v1/2022.findings-emnlp.133
- Cite (ACL): Xirui Ke, Jing Zhang, Xin Lv, Yiqi Xu, Shulin Cao, Cuiping Li, Hong Chen, and Juanzi Li. 2022. Knowledge-augmented Self-training of A Question Rewriter for Conversational Knowledge Base Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 1844–1856, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Knowledge-augmented Self-training of A Question Rewriter for Conversational Knowledge Base Question Answering (Ke et al., Findings 2022)
- PDF: https://preview.aclanthology.org/ingest-2024-clasp/2022.findings-emnlp.133.pdf