EviNets: Neural Networks for Combining Evidence Signals for Factoid Question Answering

Denis Savenkov, Eugene Agichtein


Abstract
A critical task for question answering is the final answer selection stage, which has to combine multiple signals available about each answer candidate. This paper proposes EviNets: a novel neural network architecture for factoid question answering. EviNets scores candidate answer entities by combining the available supporting evidence, e.g., structured knowledge bases and unstructured text documents. EviNets represents each piece of evidence with a dense embedding vector, scores its relevance to the question, and aggregates the support for each candidate to predict its final score. Each of the components is generic and allows plugging in a variety of models for semantic similarity scoring and information aggregation. We demonstrate the effectiveness of EviNets in experiments on the existing TREC QA and WikiMovies benchmarks, and on the new Yahoo! Answers dataset introduced in this paper. EviNets can be extended to other information types and could facilitate future work on combining evidence signals for joint reasoning in question answering.
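The abstract's pipeline — embed each evidence piece, score its relevance to the question, then aggregate support per candidate — can be sketched roughly as follows. The dot-product similarity, softmax normalization, and sum-pooling aggregation are illustrative assumptions for one plausible instantiation, not the paper's exact formulation.

```python
import numpy as np

def score_candidates(q_emb, evidence_embs, evidence_to_candidate, n_candidates):
    """Score answer candidates by aggregating the relevance of their evidence.

    q_emb: (d,) question embedding
    evidence_embs: (m, d) one embedding per piece of evidence
    evidence_to_candidate: length-m array mapping each evidence piece
        to the index of the candidate entity it supports
    """
    # Relevance of each evidence piece to the question (dot-product similarity).
    rel = evidence_embs @ q_emb                      # shape (m,)
    # Softmax over all evidence so relevance scores are comparable.
    rel = np.exp(rel - rel.max())
    rel /= rel.sum()
    # Sum-pool the support each candidate entity receives.
    scores = np.zeros(n_candidates)
    np.add.at(scores, evidence_to_candidate, rel)
    return scores

# Toy example: 2 candidates, 3 pieces of evidence (random embeddings).
rng = np.random.default_rng(0)
q = rng.normal(size=4)
ev = rng.normal(size=(3, 4))
print(score_candidates(q, ev, np.array([0, 0, 1]), 2))
```

In a trained model the embeddings and the similarity function would be learned jointly; the generic interface above is what lets different similarity and aggregation models be plugged in.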
Anthology ID:
P17-2047
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
299–304
URL:
https://aclanthology.org/P17-2047
DOI:
10.18653/v1/P17-2047
Cite (ACL):
Denis Savenkov and Eugene Agichtein. 2017. EviNets: Neural Networks for Combining Evidence Signals for Factoid Question Answering. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 299–304, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
EviNets: Neural Networks for Combining Evidence Signals for Factoid Question Answering (Savenkov & Agichtein, ACL 2017)
PDF:
https://preview.aclanthology.org/auto-file-uploads/P17-2047.pdf
Data
WikiMovies, WikiQA