Emotion Impacts Speech Recognition Performance

Rushab Munot, Ani Nenkova


Abstract
It has been established that the performance of speech recognition systems depends on multiple factors, including lexical content, speaker identity, and dialect. Here we use three English datasets of acted emotion to demonstrate that emotional content also impacts the performance of commercial systems. On two of the corpora, emotion is a bigger contributor to recognition errors than speaker identity, and on two, neutral speech is recognized considerably better than emotional speech. We further evaluate the commercial systems on spontaneous interactions that contain portions of emotional speech. We propose, and validate on the acted datasets, a method that allows us to evaluate the overall impact of emotion on recognition even when manual transcripts are not available. Using this method, we show that emotion in natural spontaneous dialogue is a less prominent but still significant factor in recognition accuracy.
Anthology ID:
N19-3003
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Sudipta Kar, Farah Nadeem, Laura Burdick, Greg Durrett, Na-Rae Han
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
16–21
URL:
https://aclanthology.org/N19-3003
DOI:
10.18653/v1/N19-3003
Cite (ACL):
Rushab Munot and Ani Nenkova. 2019. Emotion Impacts Speech Recognition Performance. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 16–21, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Emotion Impacts Speech Recognition Performance (Munot & Nenkova, NAACL 2019)
PDF:
https://preview.aclanthology.org/naacl24-info/N19-3003.pdf
Video:
https://preview.aclanthology.org/naacl24-info/N19-3003.mp4