Asking Clarification Questions to Handle Ambiguity in Open-Domain QA

Dongryeol Lee, Segwang Kim, Minwoo Lee, Hwanhee Lee, Joonsuk Park, Sang-Woo Lee, Kyomin Jung


Abstract
Ambiguous questions persist in open-domain question answering, because formulating a precise question with a unique answer is often challenging. Previous works have tackled this issue by asking disambiguated questions for all possible interpretations of the ambiguous question. Instead, we propose to ask a clarification question, where the user’s response will help identify the interpretation that best aligns with the user’s intention. We first present CAmbigNQ, a dataset consisting of 5,653 ambiguous questions, each with relevant passages, possible answers, and a clarification question. The clarification questions were efficiently created by generating them using InstructGPT and manually revising them as necessary. We then define a pipeline of three tasks—(1) ambiguity detection, (2) clarification question generation, and (3) clarification-based QA. In the process, we adopt or design appropriate evaluation metrics to facilitate sound research. Lastly, we achieve F1 of 61.3, 25.1, and 40.5 on the three tasks, demonstrating the need for further improvements while providing competitive baselines for future work.
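To make the three-task pipeline from the abstract concrete, here is a minimal Python sketch of how the stages could compose. It is not the authors' implementation; the function names (detect_ambiguity, generate_clarification_question, answer_with_clarification) and their placeholder bodies are hypothetical, standing in for the paper's trained components.

```python
# Minimal sketch of the three-stage pipeline described in the abstract.
# All function bodies are hypothetical placeholders, not the authors' code.

def detect_ambiguity(question: str, passages: list[str]) -> bool:
    """Task 1: decide whether the question admits multiple interpretations."""
    # Placeholder heuristic; a real system would use a trained classifier.
    return len(passages) > 1

def generate_clarification_question(question: str, passages: list[str]) -> str:
    """Task 2: ask the user which interpretation they intend."""
    # Placeholder; the paper generates these with InstructGPT plus manual revision.
    return f"Could you clarify what you mean by: '{question}'?"

def answer_with_clarification(question: str, user_reply: str, passages: list[str]) -> str:
    """Task 3: answer the question given the user's clarifying response."""
    # Placeholder reader; a real system would run a QA model over the passages.
    return passages[0] if passages else ""

def pipeline(question: str, passages: list[str], ask_user) -> str:
    """Run detection, optionally ask a clarification question, then answer."""
    if detect_ambiguity(question, passages):
        clarification = generate_clarification_question(question, passages)
        reply = ask_user(clarification)
        return answer_with_clarification(question, reply, passages)
    return answer_with_clarification(question, "", passages)
```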
Anthology ID: 2023.findings-emnlp.772
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11526–11544
URL: https://aclanthology.org/2023.findings-emnlp.772
DOI: 10.18653/v1/2023.findings-emnlp.772
Cite (ACL): Dongryeol Lee, Segwang Kim, Minwoo Lee, Hwanhee Lee, Joonsuk Park, Sang-Woo Lee, and Kyomin Jung. 2023. Asking Clarification Questions to Handle Ambiguity in Open-Domain QA. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11526–11544, Singapore. Association for Computational Linguistics.
Cite (Informal): Asking Clarification Questions to Handle Ambiguity in Open-Domain QA (Lee et al., Findings 2023)
PDF: https://preview.aclanthology.org/landing_page/2023.findings-emnlp.772.pdf