Abductive Commonsense Reasoning Exploiting Mutually Exclusive Explanations

Wenting Zhao, Justin Chiu, Claire Cardie, Alexander Rush


Abstract
Abductive reasoning aims to find plausible explanations for an event. This style of reasoning is critical for commonsense tasks where there are often multiple plausible explanations. Existing approaches for abductive reasoning in natural language processing (NLP) often rely on manually generated annotations for supervision; however, such annotations can be subjective and biased. Instead of using direct supervision, this work proposes an approach for abductive commonsense reasoning that exploits the fact that only a subset of explanations is correct for a given context. The method uses posterior regularization to enforce a mutual exclusion constraint, encouraging the model to learn the distinction between fluent explanations and plausible ones. We evaluate our approach on a diverse set of abductive reasoning datasets; experimental results show that our approach outperforms or is comparable to directly applying pretrained language models in a zero-shot manner and other knowledge-augmented zero-shot methods.
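The core intuition in the abstract is that candidate explanations for the same context compete for probability mass: if one becomes more plausible, the others must become less so. The toy sketch below illustrates only that mutual-exclusion idea via a softmax over hypothetical language-model log-likelihoods; it is not the paper's posterior-regularization method, and the scores are made up for illustration.

```python
import math

def mutually_exclusive_scores(log_likelihoods):
    """Normalize per-candidate log-likelihoods into a probability
    distribution over mutually exclusive explanations (a softmax):
    raising one explanation's probability necessarily lowers the rest."""
    m = max(log_likelihoods)  # subtract the max for numerical stability
    exps = [math.exp(ll - m) for ll in log_likelihoods]
    z = sum(exps)
    return [e / z for e in exps]

# Hypothetical LM log-likelihoods for three candidate explanations
probs = mutually_exclusive_scores([-4.2, -1.1, -3.7])
best = probs.index(max(probs))  # index of the most plausible candidate
```

In a zero-shot setting one would obtain the per-candidate scores from a pretrained language model; the normalization step is what encodes the constraint that only one explanation in the set is correct.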
Anthology ID:
2023.acl-long.831
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14883–14896
URL:
https://aclanthology.org/2023.acl-long.831
DOI:
10.18653/v1/2023.acl-long.831
Cite (ACL):
Wenting Zhao, Justin Chiu, Claire Cardie, and Alexander Rush. 2023. Abductive Commonsense Reasoning Exploiting Mutually Exclusive Explanations. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14883–14896, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Abductive Commonsense Reasoning Exploiting Mutually Exclusive Explanations (Zhao et al., ACL 2023)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.831.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.831.mp4