Abstract
Answer Sentence Selection (AS2) is an efficient approach for the design of open-domain Question Answering (QA) systems. In order to achieve low latency, traditional AS2 models score question-answer pairs individually, ignoring any information from the document each potential answer was extracted from. In contrast, more computationally expensive models designed for machine reading comprehension tasks typically receive one or more passages as input, which often results in better accuracy. In this work, we present an approach to efficiently incorporate contextual information in AS2 models. For each answer candidate, we first use unsupervised similarity techniques to extract relevant sentences from its source document, which we then feed into an efficient transformer architecture fine-tuned for AS2. Our best approach, which leverages a multi-way attention architecture to efficiently encode context, improves 6% to 11% over non-contextual state of the art in AS2 with minimal impact on system latency. All experiments in this work were conducted in English.
- Anthology ID:
- 2021.eacl-main.261
- Volume:
- Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
- Month:
- April
- Year:
- 2021
- Address:
- Online
- Editors:
- Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
- Venue:
- EACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 3005–3010
- URL:
- https://aclanthology.org/2021.eacl-main.261
- DOI:
- 10.18653/v1/2021.eacl-main.261
- Cite (ACL):
- Rujun Han, Luca Soldaini, and Alessandro Moschitti. 2021. Modeling Context in Answer Sentence Selection Systems on a Latency Budget. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 3005–3010, Online. Association for Computational Linguistics.
- Cite (Informal):
- Modeling Context in Answer Sentence Selection Systems on a Latency Budget (Han et al., EACL 2021)
- PDF:
- https://preview.aclanthology.org/corrections-2024-05/2021.eacl-main.261.pdf
- Data
- ASNQ, Natural Questions
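The context-extraction step mentioned in the abstract (selecting relevant sentences from a candidate's source document via unsupervised similarity) can be sketched roughly as below. This is an illustrative bag-of-words cosine-similarity version, not the authors' implementation; the function and parameter names are hypothetical.

```python
import math
from collections import Counter


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words term-frequency vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def select_context(candidate: str, doc_sentences: list[str], k: int = 2) -> list[str]:
    # Rank the other sentences of the source document by unsupervised
    # lexical similarity to the answer candidate; keep the top-k as the
    # context fed to the AS2 model alongside the candidate itself.
    cand_vec = Counter(candidate.lower().split())
    scored = [
        (cosine(cand_vec, Counter(s.lower().split())), s)
        for s in doc_sentences
        if s != candidate
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [s for _, s in scored[:k]]
```

The selected sentences would then be concatenated with the question and candidate as input to the transformer ranker; the paper's multi-way attention encoding of that context is not reproduced here.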