Turning Tables: Generating Examples from Semi-structured Tables for Endowing Language Models with Reasoning Skills

Ori Yoran, Alon Talmor, Jonathan Berant


Abstract
Models pre-trained with a language modeling objective possess ample world knowledge and language skills, but are known to struggle in tasks that require reasoning. In this work, we propose to leverage semi-structured tables, and automatically generate at scale question-paragraph pairs, where answering the question requires reasoning over multiple facts in the paragraph. We add a pre-training step over this synthetic data, which includes examples that require 16 different reasoning skills such as number comparison, conjunction, and fact composition. To improve data efficiency, we sample examples from reasoning skills where the model currently errs. We evaluate our approach on three reasoning-focused reading comprehension datasets, and show that our model, PReasM, substantially outperforms T5, a popular pre-trained encoder-decoder model. Moreover, sampling examples based on model errors leads to faster training and higher performance.
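The abstract mentions error-driven sampling: drawing more pre-training examples from reasoning skills the model currently gets wrong. The sketch below is only a minimal illustration of that general idea, not the authors' implementation; the skill names, error rates, and helper functions are hypothetical.

```python
import random

def sample_skill(error_rates):
    """Pick a reasoning skill with probability proportional to the
    model's current error rate on that skill (error-driven sampling)."""
    skills = list(error_rates)
    weights = [error_rates[s] for s in skills]
    return random.choices(skills, weights=weights, k=1)[0]

def build_batch(examples_by_skill, error_rates, batch_size=8):
    """Assemble a training batch by repeatedly sampling a skill
    according to current errors, then drawing one of its examples."""
    batch = []
    for _ in range(batch_size):
        skill = sample_skill(error_rates)
        batch.append(random.choice(examples_by_skill[skill]))
    return batch

# Hypothetical usage with made-up skills, examples, and error rates.
examples_by_skill = {
    "number comparison": ["qa pair 1", "qa pair 2"],
    "conjunction": ["qa pair 3"],
    "composition": ["qa pair 4", "qa pair 5"],
}
error_rates = {"number comparison": 0.10, "conjunction": 0.45, "composition": 0.30}
batch = build_batch(examples_by_skill, error_rates)
```

Under this scheme, skills with higher error rates (here, "conjunction") are over-represented in each batch, which is the intuition behind the faster training and higher performance reported in the abstract.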
Anthology ID:
2022.acl-long.416
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6016–6031
URL:
https://aclanthology.org/2022.acl-long.416
DOI:
10.18653/v1/2022.acl-long.416
Cite (ACL):
Ori Yoran, Alon Talmor, and Jonathan Berant. 2022. Turning Tables: Generating Examples from Semi-structured Tables for Endowing Language Models with Reasoning Skills. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6016–6031, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Turning Tables: Generating Examples from Semi-structured Tables for Endowing Language Models with Reasoning Skills (Yoran et al., ACL 2022)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-long.416.pdf
Software:
2022.acl-long.416.software.zip
Video:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-long.416.mp4
Code:
oriyor/turning_tables
Data:
DROP
IIRC