Abstract
First-order factoid question answering assumes that the question can be answered by a single fact in a knowledge base (KB). While this does not seem like a challenging task, many recent attempts that apply either complex linguistic reasoning or deep neural networks achieve 65%–76% accuracy on benchmark sets. Our approach formulates the task as two machine learning problems: detecting the entities in the question, and classifying the question as one of the relation types in the KB. We train a recurrent neural network to solve each problem. On the SimpleQuestions dataset, our approach yields substantial improvements over previously published results, outperforming even neural networks based on much more complex architectures. The simplicity of our approach also has practical advantages, such as efficiency and modularity, that are especially valuable in an industry setting. In fact, we present a preliminary analysis of the performance of our model on real queries from Comcast's X1 entertainment platform, which serves millions of users every day.
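The abstract describes the approach only at a high level. The following PyTorch sketch shows one plausible instantiation of the two-RNN design: a bidirectional GRU that tags which question tokens form the entity mention, and a second bidirectional GRU that classifies the question as one of the KB relation types. The module names, the choice of GRU cells, and all dimensions are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch (not the authors' code) of the two-model setup described
# in the abstract: entity detection as per-token tagging, and relation
# prediction as whole-question classification. Sizes are assumptions.
import torch
import torch.nn as nn

class EntityTagger(nn.Module):
    """Labels each question token as entity (1) or not (0)."""
    def __init__(self, vocab_size, emb_dim=128, hidden=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True,
                          bidirectional=True)
        self.out = nn.Linear(2 * hidden, 2)  # per-token binary tag

    def forward(self, token_ids):             # (batch, seq_len)
        h, _ = self.rnn(self.emb(token_ids))  # (batch, seq_len, 2*hidden)
        return self.out(h)                    # (batch, seq_len, 2)

class RelationClassifier(nn.Module):
    """Classifies the whole question as one of the KB relation types."""
    def __init__(self, vocab_size, num_relations, emb_dim=128, hidden=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True,
                          bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_relations)

    def forward(self, token_ids):
        _, h_n = self.rnn(self.emb(token_ids))   # h_n: (2, batch, hidden)
        h = torch.cat([h_n[0], h_n[1]], dim=-1)  # final fwd+bwd states
        return self.out(h)                       # (batch, num_relations)
```

At query time the two outputs would typically be combined by matching the tagged token span against KB entity names and looking up the fact indexed by the resulting (entity, relation) pair; each model can be trained independently with a standard cross-entropy loss, which is part of what makes the design modular.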
- Anthology ID: D17-1307
- Volume: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
- Month: September
- Year: 2017
- Address: Copenhagen, Denmark
- Venue: EMNLP
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 2866–2872
- URL: https://aclanthology.org/D17-1307
- DOI: 10.18653/v1/D17-1307
- Cite (ACL): Ferhan Ture and Oliver Jojic. 2017. No Need to Pay Attention: Simple Recurrent Neural Networks Work!. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 2866–2872, Copenhagen, Denmark. Association for Computational Linguistics.
- Cite (Informal): No Need to Pay Attention: Simple Recurrent Neural Networks Work! (Ture & Jojic, EMNLP 2017)
- PDF: https://preview.aclanthology.org/ingestion-script-update/D17-1307.pdf
- Data: SimpleQuestions