Do Deep Neural Networks Capture Compositionality in Arithmetic Reasoning?

Keito Kudo, Yoichi Aoki, Tatsuki Kuribayashi, Ana Brassard, Masashi Yoshikawa, Keisuke Sakaguchi, Kentaro Inui


Abstract
Compositionality is a pivotal property of symbolic reasoning. However, how well recent neural models capture compositionality in symbolic reasoning tasks remains underexplored. This study empirically addresses this question by systematically examining recently published pre-trained seq2seq models with a carefully controlled dataset of multi-hop arithmetic symbolic reasoning. We introduce a skill tree on compositionality in arithmetic symbolic reasoning that defines the hierarchical levels of complexity along three compositionality dimensions: systematicity, productivity, and substitutivity. Our experiments revealed that among the three types of composition, the models struggled most with systematicity, performing poorly even on relatively simple compositions. This difficulty was not resolved even after training the models with intermediate reasoning steps.
Anthology ID:
2023.eacl-main.98
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1351–1362
URL:
https://aclanthology.org/2023.eacl-main.98
DOI:
10.18653/v1/2023.eacl-main.98
Cite (ACL):
Keito Kudo, Yoichi Aoki, Tatsuki Kuribayashi, Ana Brassard, Masashi Yoshikawa, Keisuke Sakaguchi, and Kentaro Inui. 2023. Do Deep Neural Networks Capture Compositionality in Arithmetic Reasoning?. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1351–1362, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Do Deep Neural Networks Capture Compositionality in Arithmetic Reasoning? (Kudo et al., EACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.eacl-main.98.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.eacl-main.98.mp4