Capturing Conversational Interaction for Question Answering via Global History Reasoning

Jin Qian, Bowei Zou, Mengxing Dong, Xiao Li, AiTi Aw, Yu Hong


Abstract
Conversational Question Answering (ConvQA) requires answering the current question conditioned on the observable paragraph-level context and the conversation history. Previous work has intensively studied history-dependent reasoning, perceiving and absorbing topic-related information from prior utterances during interactive encoding; this yields significant improvements over history-independent reasoning. This paper further strengthens the ConvQA encoder by establishing long-distance dependencies among all utterances in a multi-turn conversation. We use multi-layer transformers to resolve long-distance relationships, which potentially contribute to the reweighting of attentive information in historical utterances. Experiments on QuAC show that our method obtains a substantial improvement (1%), yielding an F1 score of 73.7%. All source code is available at https://github.com/jaytsien/GHR.
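The core idea in the abstract, self-attention across all conversation turns so that each utterance can reweight information from distant history, can be sketched as follows. This is a minimal illustration with NumPy, not the authors' implementation; the hidden size, number of layers, and per-utterance encodings are placeholder assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for the attention weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def transformer_layer(H, Wq, Wk, Wv):
    # Self-attention over utterance vectors: every turn attends to every
    # other turn, establishing long-distance dependencies across the
    # whole conversation history.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (turns, turns) reweighting
    return H + A @ V  # residual update of each turn's representation

rng = np.random.default_rng(0)
d = 16       # hidden size (illustrative)
turns = 5    # current question plus four history turns
H = rng.normal(size=(turns, d))  # stand-in for per-utterance encodings
for _ in range(2):               # a small "multi-layer" stack
    Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
    H = transformer_layer(H, Wq, Wk, Wv)
print(H.shape)  # each turn now carries globally reweighted history
```

In the paper's setting, the per-utterance vectors would come from the ConvQA encoder rather than random initialization; the stack of attention layers is what lets a late question draw on topic information from early turns.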
Anthology ID:
2022.findings-naacl.159
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2071–2078
URL:
https://aclanthology.org/2022.findings-naacl.159
DOI:
10.18653/v1/2022.findings-naacl.159
Cite (ACL):
Jin Qian, Bowei Zou, Mengxing Dong, Xiao Li, AiTi Aw, and Yu Hong. 2022. Capturing Conversational Interaction for Question Answering via Global History Reasoning. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2071–2078, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Capturing Conversational Interaction for Question Answering via Global History Reasoning (Qian et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2022.findings-naacl.159.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-1/2022.findings-naacl.159.mp4
Code:
jaytsien/ghr
Data:
QuAC