ACCENT: An Automatic Event Commonsense Evaluation Metric for Open-Domain Dialogue Systems

Sarik Ghazarian, Yijia Shao, Rujun Han, Aram Galstyan, Nanyun Peng


Abstract
Commonsense reasoning is omnipresent in human communications and thus is an important feature for open-domain dialogue systems. However, evaluating commonsense in dialogue systems is still an open challenge. We take the first step by focusing on event commonsense, which considers events and their relations, and is crucial in both dialogues and general commonsense reasoning. We propose ACCENT, an event commonsense evaluation metric empowered by commonsense knowledge bases (CSKBs). ACCENT first extracts event-relation tuples from a dialogue, and then evaluates the response by scoring the tuples in terms of their compatibility with the CSKB. To evaluate ACCENT, we construct the first public event commonsense evaluation dataset for open-domain dialogues. Our experiments show that ACCENT is an efficient metric for event commonsense evaluation, which achieves higher correlations with human judgments than existing baselines.
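As a rough illustration of the two-step pipeline the abstract describes (tuple extraction followed by CSKB compatibility scoring), the Python sketch below shows how the two steps might compose into a single response-level score. All names here (EventTuple, extract_event_relation_tuples, compatibility_score, accent_score) are hypothetical stand-ins for illustration only, not the paper's released code or API.

# Minimal sketch of an ACCENT-style two-step pipeline, assuming hypothetical
# helper names. Step 1 extracts event-relation tuples from the dialogue;
# step 2 scores each tuple's compatibility with a commonsense knowledge
# base (CSKB) such as ATOMIC; the final score averages over tuples.
from dataclasses import dataclass
from typing import List

@dataclass
class EventTuple:
    head: str      # event phrase drawn from the dialogue history/response
    relation: str  # commonsense relation, e.g. "xReact" or "xWant"
    tail: str      # event phrase drawn from the response

def extract_event_relation_tuples(history: str, response: str) -> List[EventTuple]:
    """Step 1 (stub): parse the dialogue into event-relation tuples.
    The paper uses a trained generation model for this; here we return
    a fixed example purely for illustration."""
    return [EventTuple("PersonX loses the game", "xReact", "PersonX feels thrilled")]

def compatibility_score(t: EventTuple) -> float:
    """Step 2 (stub): score how compatible a tuple is with the CSKB.
    A real scorer would query or embed the CSKB; this placeholder
    returns a constant low score for the incompatible example above."""
    return 0.1  # losing a game rarely implies feeling thrilled

def accent_score(history: str, response: str) -> float:
    """Aggregate per-tuple compatibility into a response-level score."""
    tuples = extract_event_relation_tuples(history, response)
    if not tuples:
        return 1.0  # no event tuples extracted: nothing to penalize
    return sum(compatibility_score(t) for t in tuples) / len(tuples)

print(accent_score("A: How was the match?", "B: We lost, so I'm thrilled."))

Running the sketch prints a low score (0.1), reflecting that the response's event tuple conflicts with commonsense expectations; a compatible response would score closer to 1.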
Anthology ID:
2023.acl-long.241
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4398–4419
URL:
https://aclanthology.org/2023.acl-long.241
DOI:
10.18653/v1/2023.acl-long.241
Cite (ACL):
Sarik Ghazarian, Yijia Shao, Rujun Han, Aram Galstyan, and Nanyun Peng. 2023. ACCENT: An Automatic Event Commonsense Evaluation Metric for Open-Domain Dialogue Systems. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4398–4419, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
ACCENT: An Automatic Event Commonsense Evaluation Metric for Open-Domain Dialogue Systems (Ghazarian et al., ACL 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.acl-long.241.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2023.acl-long.241.mp4