Abstract
We present 1-Pager, the first system that answers a question and retrieves evidence using a single Transformer-based model and decoding process. 1-Pager incrementally partitions the retrieval corpus using constrained decoding to select a document and answer string, and we show that this is competitive with comparable retrieve-and-read alternatives according to both retrieval and answer accuracy metrics. 1-Pager also outperforms the equivalent ‘closed-book’ question answering model by grounding predictions in an evidence corpus. While 1-Pager is not yet on par with more expensive systems that read many more documents before generating an answer, we argue that it provides an important step toward attributed generation by folding retrieval into the sequence-to-sequence paradigm that is currently dominant in NLP. We also show that the search paths used to partition the corpus are easy to read and understand, paving a way forward for interpretable neural retrieval.
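To make the constrained-decoding idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation): each document is summarized by a keyword path, and greedy decoding is restricted at every step to keywords that keep at least one document reachable, so the set of candidate documents shrinks as the path grows. The toy corpus, keyword paths, and scoring function below are invented stand-ins for 1-Pager's learned search paths and model scores.

```python
# Minimal sketch of corpus partitioning via constrained decoding.
# Everything here (corpus, keyword paths, score_fn) is a hypothetical stand-in.

corpus = {
    "doc1": ["astronomy", "planets", "mars"],
    "doc2": ["astronomy", "planets", "venus"],
    "doc3": ["astronomy", "stars", "sun"],
}

def allowed_next_keywords(prefix):
    """Keywords that keep at least one document reachable from this prefix."""
    return {
        path[len(prefix)]
        for path in corpus.values()
        if path[: len(prefix)] == prefix and len(path) > len(prefix)
    }

def matching_docs(prefix):
    """Documents whose keyword path is still consistent with the prefix."""
    return [d for d, path in corpus.items() if path[: len(prefix)] == prefix]

def constrained_decode(score_fn):
    """Greedily extend the keyword path, constrained to paths in the corpus."""
    prefix = []
    while True:
        options = allowed_next_keywords(prefix)
        if not options:
            break  # the path now identifies a unique (or exhausted) partition
        prefix.append(max(options, key=score_fn))  # stand-in for model logits
    return prefix, matching_docs(prefix)

# Toy run: the "model" simply prefers longer keywords.
print(constrained_decode(score_fn=len))  # (['astronomy', 'planets', 'venus'], ['doc2'])
```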
- Anthology ID: 2023.findings-emnlp.967
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 14529–14543
- URL: https://aclanthology.org/2023.findings-emnlp.967
- DOI: 10.18653/v1/2023.findings-emnlp.967
- Cite (ACL): Palak Jain, Livio Soares, and Tom Kwiatkowski. 2023. 1-PAGER: One Pass Answer Generation and Evidence Retrieval. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14529–14543, Singapore. Association for Computational Linguistics.
- Cite (Informal): 1-PAGER: One Pass Answer Generation and Evidence Retrieval (Jain et al., Findings 2023)
- PDF: https://preview.aclanthology.org/ingest-2024-clasp/2023.findings-emnlp.967.pdf