Latent Compositional Representations Improve Systematic Generalization in Grounded Question Answering

Ben Bogin, Sanjay Subramanian, Matt Gardner, Jonathan Berant


Abstract
Answering questions that involve multi-step reasoning requires decomposing them and using the answers of intermediate steps to reach the final answer. However, state-of-the-art models in grounded question answering often do not explicitly perform decomposition, leading to difficulties in generalization to out-of-distribution examples. In this work, we propose a model that computes a representation and denotation for all question spans in a bottom-up, compositional manner using a CKY-style parser. Our model induces latent trees, driven only by end-to-end supervision (the final answer). We show that this inductive bias towards tree structures dramatically improves systematic generalization to out-of-distribution examples, compared to strong baselines on an arithmetic expressions benchmark as well as on CLOSURE, a dataset that focuses on systematic generalization for grounded question answering. On this challenging dataset, our model reaches an accuracy of 96.1%, significantly higher than prior models that almost perfectly solve the task on a random, in-distribution split.
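
To make the bottom-up span composition concrete, below is a minimal sketch of a CKY-style chart that builds a representation for every question span from its sub-spans. This is not the paper's implementation: the `compose` function, the uniform averaging over split points, and the toy vectors are illustrative stand-ins for what the model learns (including its latent, soft weighting of splits).

```python
# Minimal sketch of CKY-style bottom-up span composition (NOT the authors' code).
# Assumes a hypothetical `compose(left, right)` function and token vectors;
# the actual model learns the composition and weights split points softly.
import numpy as np

def compose(left, right):
    # Hypothetical composition; the real model uses a learned function here.
    return np.tanh(left + right)

def cky_span_representations(word_vecs):
    n = len(word_vecs)
    # chart[(i, j)] holds a representation for the token span i..j (inclusive).
    chart = {(i, i): word_vecs[i] for i in range(n)}
    for length in range(2, n + 1):            # span length
        for i in range(0, n - length + 1):    # span start
            j = i + length - 1                # span end (inclusive)
            # Average over all binary split points; a trained model would
            # instead weight splits with learned (latent-tree) scores.
            candidates = [compose(chart[(i, k)], chart[(k + 1, j)])
                          for k in range(i, j)]
            chart[(i, j)] = np.mean(candidates, axis=0)
    return chart

# Toy usage: 4 tokens with random 8-dim vectors; chart[(0, 3)] is the
# representation of the full "question".
vecs = [np.random.randn(8) for _ in range(4)]
chart = cky_span_representations(vecs)
print(chart[(0, 3)].shape)  # (8,)
```

The chart fills in O(n^3) time, one entry per span, which is what lets the model assign a representation (and denotation) to every question span rather than only to the full question.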
Anthology ID:
2021.tacl-1.12
Volume:
Transactions of the Association for Computational Linguistics, Volume 9
Year:
2021
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
195–210
URL:
https://aclanthology.org/2021.tacl-1.12
DOI:
10.1162/tacl_a_00361
Cite (ACL):
Ben Bogin, Sanjay Subramanian, Matt Gardner, and Jonathan Berant. 2021. Latent Compositional Representations Improve Systematic Generalization in Grounded Question Answering. Transactions of the Association for Computational Linguistics, 9:195–210.
Cite (Informal):
Latent Compositional Representations Improve Systematic Generalization in Grounded Question Answering (Bogin et al., TACL 2021)
PDF:
https://preview.aclanthology.org/remove-xml-comments/2021.tacl-1.12.pdf
Video:
https://preview.aclanthology.org/remove-xml-comments/2021.tacl-1.12.mp4