Abstract
Recently, end-to-end (E2E) trained models for question answering over knowledge graphs (KGQA) have delivered promising results using only a weakly supervised dataset. However, these models are trained and evaluated in a setting where hand-annotated question entities are supplied to the model, leaving the important and non-trivial task of entity resolution (ER) outside the scope of E2E learning. In this work, we extend the boundaries of E2E learning for KGQA to include the training of an ER component. Our model only needs the question text and the answer entities to train, and delivers a stand-alone QA model that does not require an additional ER component to be supplied at runtime. Our approach is fully differentiable, thanks to its reliance on a recent method for building differentiable KGs (Cohen et al., 2020). We evaluate our E2E trained model on two public datasets and show that it comes close to baseline models that use hand-annotated entities.
- Anthology ID:
- 2021.emnlp-main.345
- Volume:
- Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2021
- Address:
- Online and Punta Cana, Dominican Republic
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 4193–4200
- URL:
- https://aclanthology.org/2021.emnlp-main.345
- DOI:
- 10.18653/v1/2021.emnlp-main.345
- Cite (ACL):
- Amir Saffari, Armin Oliya, Priyanka Sen, and Tom Ayoola. 2021. End-to-End Entity Resolution and Question Answering Using Differentiable Knowledge Graphs. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4193–4200, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- End-to-End Entity Resolution and Question Answering Using Differentiable Knowledge Graphs (Saffari et al., EMNLP 2021)
- PDF:
- https://preview.aclanthology.org/starsem-semeval-split/2021.emnlp-main.345.pdf
- Data
- SimpleQuestions