Answer Consolidation: Formulation and Benchmarking

Wenxuan Zhou, Qiang Ning, Heba Elfardy, Kevin Small, Muhao Chen


Abstract
Current question answering (QA) systems primarily consider the single-answer scenario, where each question is assumed to be paired with one correct answer. However, in many real-world QA applications, multiple-answer scenarios arise, where consolidating answers into a comprehensive and non-redundant set is a more efficient user interface. In this paper, we formulate the problem of answer consolidation, in which answers are partitioned into multiple groups, each representing a different aspect of the answer set. Given this partitioning, a comprehensive and non-redundant set of answers can then be constructed by picking one answer from each group. To initiate research on answer consolidation, we construct a dataset consisting of 4,699 questions and 24,006 sentences and evaluate multiple models. Despite the promising performance achieved by the best-performing supervised models, this task still leaves room for further improvement.
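A minimal sketch of the consolidation step described in the abstract, not the authors' implementation: answers are partitioned into groups (aspects), and the consolidated set is formed by picking one answer from each group. The greedy grouping driven by a pairwise `compatible` predicate is an illustrative assumption; the paper's actual models and grouping procedure may differ.

```python
from typing import Callable, List

def consolidate(answers: List[str],
                compatible: Callable[[str, str], bool]) -> List[str]:
    """Partition answers into compatibility groups, then keep one per group."""
    groups: List[List[str]] = []
    for ans in answers:
        # Assign the answer to the first group whose representative it is
        # compatible with; otherwise start a new group (a new aspect).
        for group in groups:
            if compatible(ans, group[0]):
                group.append(ans)
                break
        else:
            groups.append([ans])
    # One answer per group yields a comprehensive, non-redundant answer set.
    return [group[0] for group in groups]

if __name__ == "__main__":
    # Toy compatibility predicate (hypothetical): answers sharing a word
    # are treated as redundant restatements of the same aspect.
    same_aspect = lambda a, b: bool(set(a.lower().split()) & set(b.lower().split()))
    print(consolidate(
        ["vitamin C", "ascorbic acid", "citrus fruits contain vitamin C"],
        same_aspect,
    ))
```

In practice, the compatibility decision would come from a learned answer-pair model rather than the keyword heuristic used here for illustration.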
Anthology ID:
2022.naacl-main.320
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4314–4325
URL:
https://aclanthology.org/2022.naacl-main.320
DOI:
10.18653/v1/2022.naacl-main.320
Cite (ACL):
Wenxuan Zhou, Qiang Ning, Heba Elfardy, Kevin Small, and Muhao Chen. 2022. Answer Consolidation: Formulation and Benchmarking. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4314–4325, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Answer Consolidation: Formulation and Benchmarking (Zhou et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.naacl-main.320.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2022.naacl-main.320.mp4
Code
amazon-research/question-answer-consolidation
Data
MultiNLI