Improving Question Answering over Incomplete KBs with Knowledge-Aware Reader
Wenhan Xiong, Mo Yu, Shiyu Chang, Xiaoxiao Guo, William Yang Wang
Abstract
We propose a new end-to-end question answering model, which learns to aggregate answer evidence from an incomplete knowledge base (KB) and a set of retrieved text snippets. Under the assumptions that structured data is easier to query and the acquired knowledge can help the understanding of unstructured text, our model first accumulates knowledge of KB entities from a question-related KB sub-graph; it then reformulates the question in the latent space and reads the text with the accumulated entity knowledge at hand. The evidence from KB and text is finally aggregated to predict answers. On the widely-used KBQA benchmark WebQSP, our model achieves consistent improvements across settings with different extents of KB incompleteness.
- Anthology ID:
- P19-1417
- Volume:
- Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month:
- July
- Year:
- 2019
- Address:
- Florence, Italy
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 4258–4264
- URL:
- https://aclanthology.org/P19-1417
- DOI:
- 10.18653/v1/P19-1417
- Cite (ACL):
- Wenhan Xiong, Mo Yu, Shiyu Chang, Xiaoxiao Guo, and William Yang Wang. 2019. Improving Question Answering over Incomplete KBs with Knowledge-Aware Reader. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4258–4264, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal):
- Improving Question Answering over Incomplete KBs with Knowledge-Aware Reader (Xiong et al., ACL 2019)
- PDF:
- https://preview.aclanthology.org/ingestion-script-update/P19-1417.pdf
- Code
- xwhan/Knowledge-Aware-Reader
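The abstract describes a model that fuses KB-side and text-side evidence per candidate entity before scoring answers. The sketch below illustrates that final aggregation step with a simple learned gate over entity embeddings; all names, shapes, and the exact gating form are illustrative assumptions, not the paper's actual parameterization (see the linked repository for the real implementation).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def aggregate_answer_scores(q, kb_entity_emb, text_entity_emb, w_gate):
    """Score candidate entities by fusing KB and text evidence.

    A sigmoid gate over a linear projection of the KB embedding decides,
    per entity, how much to trust KB evidence vs. text evidence, so that
    candidates poorly covered by an incomplete KB can still be supported
    by retrieved text. This is an illustrative sketch, not the paper's
    exact formulation.

    q:               (d,)   latent question representation
    kb_entity_emb:   (n, d) per-candidate embeddings from the KB sub-graph
    text_entity_emb: (n, d) per-candidate embeddings from the text reader
    w_gate:          (d,)   hypothetical gate parameters
    """
    gate = sigmoid(kb_entity_emb @ w_gate)                       # (n,)
    fused = gate[:, None] * kb_entity_emb \
            + (1.0 - gate)[:, None] * text_entity_emb            # (n, d)
    return fused @ q                                             # (n,) scores

# Toy usage with random embeddings for 3 candidate entities.
rng = np.random.default_rng(0)
q = rng.normal(size=4)
kb = rng.normal(size=(3, 4))
txt = rng.normal(size=(3, 4))
w = rng.normal(size=4)
scores = aggregate_answer_scores(q, kb, txt, w)   # one score per candidate
```

In the actual model the per-entity representations come from a graph attention network over the KB sub-graph and a knowledge-aware text reader; the sketch only shows how two evidence sources can be interpolated into a single answer score.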