Hypothetical Training for Robust Machine Reading Comprehension of Tabular Context

Moxin Li, Wenjie Wang, Fuli Feng, Hanwang Zhang, Qifan Wang, Tat-Seng Chua


Abstract
Machine Reading Comprehension (MRC) models easily learn spurious correlations from complex contexts such as tabular data. Counterfactual training, which augments the factual data with counterfactual examples, has emerged as a promising solution. However, constructing faithful counterfactual examples is costly because it is difficult to preserve the consistency and dependencies within tabular data. In this paper, we take a more efficient approach: we ask hypothetical questions, such as “in which year would the net profit be larger if the revenue in 2019 were $38,298?”, whose effect on the answer is equivalent to that of the expensive counterfactual tables. We propose a hypothetical training framework that uses paired examples with different hypothetical questions to supervise the direction of the model gradient toward the counterfactual answer change. Superior generalization results on tabular MRC datasets, including a newly constructed stress test and MultiHiertt, validate the effectiveness of our approach.
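The abstract describes supervising the direction of the model gradient with paired hypothetical questions. Below is a minimal, hypothetical sketch of what such a pairwise objective could look like; it is not the authors' implementation, and `model`, the loss weighting `lambda_pair`, and the one-hot answer encoding are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def hypothetical_training_loss(model, table, q_fact, q_hyp,
                               a_fact, a_hyp, lambda_pair=1.0):
    """Illustrative pairwise loss for hypothetical training.

    `model` is a placeholder MRC model mapping (question, table) to
    answer logits; `a_fact` / `a_hyp` are gold answer indices for the
    factual and hypothetical questions. Shapes and encodings are
    assumptions, not the paper's exact formulation.
    """
    logits_fact = model(q_fact, table)   # predictions for the factual question
    logits_hyp = model(q_hyp, table)     # predictions for the hypothetical question

    # Standard supervised terms on both questions.
    loss_fact = F.cross_entropy(logits_fact, a_fact)
    loss_hyp = F.cross_entropy(logits_hyp, a_hyp)

    # Pairwise term: the shift between the two predictive distributions
    # should mirror the shift between the two gold answers, so the
    # gradient points toward the counterfactual answer change.
    num_classes = logits_fact.size(-1)
    pred_shift = logits_hyp.softmax(-1) - logits_fact.softmax(-1)
    gold_shift = (F.one_hot(a_hyp, num_classes).float()
                  - F.one_hot(a_fact, num_classes).float())
    loss_pair = F.mse_loss(pred_shift, gold_shift)

    return loss_fact + loss_hyp + lambda_pair * loss_pair
```

Under these assumptions, minimizing `loss_pair` penalizes the model when its prediction change between the factual and hypothetical questions fails to match the known answer change, which is one plausible way to realize the gradient-direction supervision the abstract describes.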
Anthology ID:
2023.findings-acl.79
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1220–1236
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-acl.79/
DOI:
10.18653/v1/2023.findings-acl.79
Cite (ACL):
Moxin Li, Wenjie Wang, Fuli Feng, Hanwang Zhang, Qifan Wang, and Tat-Seng Chua. 2023. Hypothetical Training for Robust Machine Reading Comprehension of Tabular Context. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1220–1236, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Hypothetical Training for Robust Machine Reading Comprehension of Tabular Context (Li et al., Findings 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-acl.79.pdf