Analyzing Learned Representations of a Deep ASR Performance Prediction Model

Zied Elloumi, Laurent Besacier, Olivier Galibert, Benjamin Lecouteux


Abstract
This paper addresses a relatively new task: prediction of ASR performance on unseen broadcast programs. In a previous paper, we presented an ASR performance prediction system using CNNs that encode both text (ASR transcript) and speech, in order to predict word error rate. This work is dedicated to the analysis of speech signal embeddings and text embeddings learnt by the CNN while training our prediction model. We try to better understand which information is captured by the deep model and its relation to different conditioning factors. It is shown that hidden layers convey a clear signal about speech style, accent and broadcast type. We then try to leverage these three types of information at training time through multi-task learning. Our experiments show that this allows us to train slightly more efficient ASR performance prediction systems that, in addition, simultaneously tag the analyzed utterances according to their speech style, accent and broadcast program origin.
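
As a rough illustration of the kind of model the abstract describes, the PyTorch sketch below shows a dual CNN encoder over an ASR transcript and speech features, with a word error rate (WER) regression head plus auxiliary classification heads for speech style, accent and broadcast program, trained jointly in a multi-task fashion. All layer sizes, feature dimensions, class counts and loss weights are illustrative assumptions, not the paper's actual configuration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class WERPredictor(nn.Module):
        """Illustrative dual-encoder WER predictor with auxiliary heads
        (hypothetical configuration, not the published architecture)."""
        def __init__(self, vocab_size=20000, emb_dim=100, n_filters=128,
                     n_styles=2, n_accents=2, n_programs=4):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Text branch: 1-D convolutions over word embeddings of the ASR transcript
            self.text_cnn = nn.Sequential(
                nn.Conv1d(emb_dim, n_filters, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveMaxPool1d(1))
            # Speech branch: 1-D convolutions over acoustic frames (e.g. 40-dim filterbanks)
            self.speech_cnn = nn.Sequential(
                nn.Conv1d(40, n_filters, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveMaxPool1d(1))
            joint = 2 * n_filters
            self.wer_head = nn.Linear(joint, 1)              # main task: WER regression
            self.style_head = nn.Linear(joint, n_styles)     # auxiliary: speech style
            self.accent_head = nn.Linear(joint, n_accents)   # auxiliary: accent
            self.program_head = nn.Linear(joint, n_programs) # auxiliary: broadcast program

        def forward(self, tokens, frames):
            # tokens: (batch, seq_len) word ids; frames: (batch, 40, n_frames)
            t = self.text_cnn(self.embed(tokens).transpose(1, 2)).squeeze(-1)
            s = self.speech_cnn(frames).squeeze(-1)
            h = torch.cat([t, s], dim=-1)                    # joint utterance representation
            return (self.wer_head(h).squeeze(-1), self.style_head(h),
                    self.accent_head(h), self.program_head(h))

    # Multi-task training objective: WER regression plus weighted auxiliary
    # classification losses (the 0.3 weights are arbitrary for illustration).
    # wer_pred, style_logits, accent_logits, prog_logits = model(tokens, frames)
    # loss = (F.mse_loss(wer_pred, wer_true)
    #         + 0.3 * F.cross_entropy(style_logits, style_labels)
    #         + 0.3 * F.cross_entropy(accent_logits, accent_labels)
    #         + 0.3 * F.cross_entropy(prog_logits, prog_labels))

At inference time, such a model would output a predicted WER for each utterance together with predicted speech style, accent and broadcast program tags, matching the joint prediction-and-tagging behaviour described in the abstract.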
Anthology ID:
W18-5402
Volume:
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Month:
November
Year:
2018
Address:
Brussels, Belgium
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9–15
URL:
https://aclanthology.org/W18-5402
DOI:
10.18653/v1/W18-5402
Cite (ACL):
Zied Elloumi, Laurent Besacier, Olivier Galibert, and Benjamin Lecouteux. 2018. Analyzing Learned Representations of a Deep ASR Performance Prediction Model. In Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, pages 9–15, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Analyzing Learned Representations of a Deep ASR Performance Prediction Model (Elloumi et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/auto-file-uploads/W18-5402.pdf