Realistic Data Augmentation Framework for Enhancing Tabular Reasoning

Dibyakanti Kumar, Vivek Gupta, Soumya Sharma, Shuo Zhang


Abstract
Existing approaches to constructing training data for Natural Language Inference (NLI) tasks, such as semi-structured table reasoning, rely either on crowdsourcing or on fully automatic methods. However, the former is expensive and time-consuming, which limits scale, while the latter often produces naive examples that lack complex reasoning. This paper develops a realistic semi-automated framework for data augmentation for tabular inference. Instead of manually generating a hypothesis for each table, our methodology generates hypothesis templates that are transferable to similar tables. In addition, our framework entails the creation of rational counterfactual tables based on human-written logical constraints and premise paraphrasing. For our case study, we use INFOTABS (Gupta et al., 2020), an entity-centric tabular inference dataset. We observed that our framework could generate human-like tabular inference examples, which could benefit training data augmentation, especially in scenarios with limited supervision.
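As a rough illustration of the template idea described in the abstract (not the authors' actual implementation), the sketch below instantiates a hand-written hypothesis template against entity-centric tables that share a key; the table contents, template string, and function name are all hypothetical.

```python
# Hypothetical sketch of template-based hypothesis generation for
# entity-centric tables, in the spirit of the framework described above.
# Tables, template, and names are illustrative only.

from typing import Dict

# An entity-centric table is modeled as a mapping from keys to values.
albert = {"title": "Albert Einstein", "Born": "14 March 1879", "Fields": "Physics"}
marie = {"title": "Marie Curie", "Born": "7 November 1867", "Fields": "Physics, Chemistry"}

# One hand-written template, reusable across tables that contain the same keys.
TEMPLATE = "{title} was born on {Born}."


def instantiate(template: str, table: Dict[str, str]) -> str:
    """Fill a hypothesis template with values from a single table."""
    return template.format(**table)


for table in (albert, marie):
    # The same template transfers to every table sharing the 'Born' key,
    # yielding a hypothesis without writing one manually per table.
    print(instantiate(TEMPLATE, table))
```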
Anthology ID:
2022.findings-emnlp.324
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4411–4429
URL:
https://aclanthology.org/2022.findings-emnlp.324
DOI:
10.18653/v1/2022.findings-emnlp.324
Cite (ACL):
Dibyakanti Kumar, Vivek Gupta, Soumya Sharma, and Shuo Zhang. 2022. Realistic Data Augmentation Framework for Enhancing Tabular Reasoning. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 4411–4429, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Realistic Data Augmentation Framework for Enhancing Tabular Reasoning (Kumar et al., Findings 2022)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.324.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.324.mp4