Investigating Evaluation of Open-Domain Dialogue Systems With Human Generated Multiple References

Prakhar Gupta, Shikib Mehri, Tiancheng Zhao, Amy Pavel, Maxine Eskenazi, Jeffrey Bigham


Abstract
The aim of this paper is to mitigate the shortcomings of automatic evaluation of open-domain dialog systems through multi-reference evaluation. Existing metrics have been shown to correlate poorly with human judgement, particularly in open-domain dialog. One alternative is to collect human annotations for evaluation, which can be expensive and time-consuming. To demonstrate the effectiveness of multi-reference evaluation, we augment the test set of DailyDialog with multiple references. A series of experiments show that the use of multiple references results in improved correlation between several automatic metrics and human judgement for both the quality and the diversity of system output.
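The abstract's central idea is that word-overlap metrics become more reliable when each test context is paired with several human-written references instead of a single gold response. Below is a minimal sketch of that scoring setup using NLTK's sentence_bleu, which natively accepts multiple references; the function name, data, and metric choice are illustrative assumptions, not the authors' released multirefeval code.

```python
# Minimal sketch of multi-reference evaluation (illustrative; not the paper's code).
# Assumes NLTK is installed: pip install nltk
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

smooth = SmoothingFunction().method1

def multi_ref_bleu(hypothesis, references):
    """BLEU-4 for one system response scored against several human references.

    `references` is a list of token lists; sentence_bleu matches the
    hypothesis n-grams against all references, so any acceptable reply
    can receive credit.
    """
    return sentence_bleu(references, hypothesis, smoothing_function=smooth)

# Toy example: one system response and three human-written references.
hyp = "i usually go hiking on weekends".split()
refs = [
    "i like to go hiking at the weekend".split(),
    "on weekends i usually hike".split(),
    "hiking , mostly on the weekend".split(),
]

print("single-reference BLEU:", sentence_bleu([refs[0]], hyp, smoothing_function=smooth))
print("multi-reference BLEU: ", multi_ref_bleu(hyp, refs))
```

With multiple references, a valid response that happens to differ from the single gold reply is no longer penalized, which is the mechanism behind the improved metric-human correlation reported in the paper.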
Anthology ID: W19-5944
Volume: Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue
Month: September
Year: 2019
Address: Stockholm, Sweden
Editors: Satoshi Nakamura, Milica Gasic, Ingrid Zukerman, Gabriel Skantze, Mikio Nakano, Alexandros Papangelis, Stefan Ultes, Koichiro Yoshino
Venue: SIGDIAL
SIG: SIGDIAL
Publisher: Association for Computational Linguistics
Pages: 379–391
URL: https://aclanthology.org/W19-5944
DOI: 10.18653/v1/W19-5944
Cite (ACL): Prakhar Gupta, Shikib Mehri, Tiancheng Zhao, Amy Pavel, Maxine Eskenazi, and Jeffrey Bigham. 2019. Investigating Evaluation of Open-Domain Dialogue Systems With Human Generated Multiple References. In Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue, pages 379–391, Stockholm, Sweden. Association for Computational Linguistics.
Cite (Informal): Investigating Evaluation of Open-Domain Dialogue Systems With Human Generated Multiple References (Gupta et al., SIGDIAL 2019)
PDF: https://preview.aclanthology.org/ml4al-ingestion/W19-5944.pdf
Code: prakharguptaz/multirefeval
Data: DailyDialog