Revisiting Generative Commonsense Reasoning: A Pre-Ordering Approach

Chao Zhao, Faeze Brahman, Tenghao Huang, Snigdha Chaturvedi


Abstract
Pre-trained models (PTMs) have led to great improvements in natural language generation (NLG). However, it is still unclear how much commonsense knowledge they possess. With the goal of evaluating commonsense knowledge of NLG models, recent work has proposed the problem of generative commonsense reasoning, e.g., composing a logical sentence given a set of unordered concepts. Existing approaches to this problem hypothesize that PTMs lack sufficient parametric knowledge for this task, which can be overcome by introducing external knowledge or task-specific pre-training objectives. Departing from this trend, we argue that a PTM's inherent ability for generative commonsense reasoning is underestimated due to the order-agnostic property of its input. In particular, we hypothesize that the order of the input concepts can affect the PTM's ability to utilize its commonsense knowledge. To this end, we propose a pre-ordering approach that carefully manipulates the order of the given concepts before generation. Experiments show that our approach can outperform more sophisticated models that have access to large amounts of external data and resources.
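As a rough illustration of the pre-ordering idea, the sketch below reorders a CommonGen-style concept set before feeding it to a seq2seq PTM. It is not the paper's method: the paper trains a dedicated ordering module, whereas here an exhaustive permutation search scored by the model's own beam score is a hypothetical stand-in, and the generic facebook/bart-large checkpoint stands in for a model fine-tuned on CommonGen.

```python
# Minimal sketch of concept pre-ordering before generation.
# Assumptions (not from the paper): facebook/bart-large stands in for a
# CommonGen-fine-tuned model, and the model's own beam-search score
# stands in for the paper's learned ordering module.
from itertools import permutations

import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
model.eval()

@torch.no_grad()
def generate_with_score(concepts):
    """Generate a sentence from the concepts in the given order and
    return it with its beam-search sequence score (a length-normalized
    log-probability)."""
    inputs = tokenizer(" ".join(concepts), return_tensors="pt")
    out = model.generate(
        **inputs,
        max_length=32,
        num_beams=4,
        output_scores=True,
        return_dict_in_generate=True,
    )
    text = tokenizer.decode(out.sequences[0], skip_special_tokens=True)
    return text, out.sequences_scores[0].item()

concepts = ["dog", "frisbee", "catch", "throw"]

# Pre-ordering: pick the concept order whose generation the model scores
# highest, then keep that generation. Exhaustive search is feasible only
# for the small concept sets (3-5 concepts) used in CommonGen.
sentence, _ = max(
    (generate_with_score(list(p)) for p in permutations(concepts)),
    key=lambda pair: pair[1],
)
print(sentence)
```

The point of the sketch is only that input order is a free variable the generator is sensitive to; the paper replaces the brute-force search with a learned pre-ordering model.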
Anthology ID:
2022.findings-naacl.129
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1709–1718
URL:
https://aclanthology.org/2022.findings-naacl.129
DOI:
10.18653/v1/2022.findings-naacl.129
Cite (ACL):
Chao Zhao, Faeze Brahman, Tenghao Huang, and Snigdha Chaturvedi. 2022. Revisiting Generative Commonsense Reasoning: A Pre-Ordering Approach. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1709–1718, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Revisiting Generative Commonsense Reasoning: A Pre-Ordering Approach (Zhao et al., Findings 2022)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2022.findings-naacl.129.pdf
Video:
https://preview.aclanthology.org/emnlp22-frontmatter/2022.findings-naacl.129.mp4
Code:
zhaochaocs/planned-ptm