Learning From Free-Text Human Feedback – Collect New Datasets Or Extend Existing Ones?

Dominic Petrak, Nafise Moosavi, Ye Tian, Nikolai Rozanov, Iryna Gurevych


Abstract
Continuous learning from free-text human feedback, such as error corrections, new knowledge, or alternative responses, is essential for today's chatbots and virtual assistants to stay up-to-date, engaging, and socially acceptable. However, annotated data for research on methods for learning from such feedback is scarce. To address this, we examine the error and user response types of six popular dialogue datasets of various types, including MultiWoZ, PersonaChat, and Wizards-of-Wikipedia, to assess their extendibility with the needed annotations. For this corpus study, we manually annotate a subset of each dataset with error and user response types, using an improved version of the Integrated Error Taxonomy and a newly proposed user response type taxonomy. We provide the resulting dataset (EURTAD) to the community. Our findings provide new insights into dataset composition, including error types, user response types, and the relations between them.
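To make the kind of annotation described above concrete, the sketch below models one plausible record layout for a dialogue exchange labeled with both an error type and a user response type. It is a minimal illustration under stated assumptions, not the paper's actual schema: the field names, the example taxonomy labels, and the tallying helper are all hypothetical stand-ins for the improved Integrated Error Taxonomy and the proposed user response type taxonomy.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical error-type labels, standing in for the paper's improved
# Integrated Error Taxonomy (the actual label set differs).
class ErrorType(Enum):
    NONE = "no_error"
    FACTUALLY_INCORRECT = "factually_incorrect"
    IGNORES_QUESTION = "ignores_question"

# Hypothetical user-response-type labels, standing in for the paper's
# newly proposed user response type taxonomy.
class UserResponseType(Enum):
    NONE = "no_feedback"
    CORRECTION = "correction"
    NEW_KNOWLEDGE = "new_knowledge"
    ALTERNATIVE_RESPONSE = "alternative_response"

@dataclass
class AnnotatedExchange:
    """One system turn plus the user's follow-up, with both annotations."""
    dataset: str            # e.g. "MultiWoZ" or "PersonaChat"
    dialogue_id: str
    system_utterance: str
    user_followup: str
    error_type: ErrorType = ErrorType.NONE
    user_response_type: UserResponseType = UserResponseType.NONE

def error_response_counts(exchanges):
    """Tally (error type, user response type) pairs -- the kind of
    relation between the two taxonomies a corpus study can report on."""
    counts = {}
    for ex in exchanges:
        key = (ex.error_type, ex.user_response_type)
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Tabulating such pairs per dataset is one straightforward way to surface how error types and user response types co-occur, which is the sort of composition analysis the abstract refers to.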
Anthology ID: 2023.emnlp-main.1011
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 16259–16279
URL: https://aclanthology.org/2023.emnlp-main.1011
DOI: 10.18653/v1/2023.emnlp-main.1011
Cite (ACL):
Dominic Petrak, Nafise Moosavi, Ye Tian, Nikolai Rozanov, and Iryna Gurevych. 2023. Learning From Free-Text Human Feedback – Collect New Datasets Or Extend Existing Ones?. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 16259–16279, Singapore. Association for Computational Linguistics.
Cite (Informal):
Learning From Free-Text Human Feedback – Collect New Datasets Or Extend Existing Ones? (Petrak et al., EMNLP 2023)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/2023.emnlp-main.1011.pdf
Video: https://preview.aclanthology.org/naacl-24-ws-corrections/2023.emnlp-main.1011.mp4