Does Joint Training Really Help Cascaded Speech Translation?

Viet Anh Khoa Tran, David Thulke, Yingbo Gao, Christian Herold, Hermann Ney


Abstract
Currently, in speech translation, the straightforward approach of cascading a recognition system with a translation system delivers state-of-the-art results. However, fundamental challenges such as error propagation from the automatic speech recognition system still remain. To mitigate these problems, researchers have recently turned their attention to direct data and proposed various joint training methods. In this work, we seek to answer the question of whether joint training really helps cascaded speech translation. We review recent papers on the topic and also investigate a joint training criterion that marginalizes over the transcription posterior probabilities. Our findings show that a strong cascaded baseline can diminish any improvements obtained using joint training, and we suggest alternatives to joint training. We hope this work can serve as a refresher on the current speech translation landscape and motivate research into more efficient and creative ways of utilizing direct data for speech translation.
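The joint criterion sketched in the abstract can be written as follows (a sketch in our own notation; the top-k approximation via a k-best list \(\mathcal{F}_k(x)\) is our assumption, not necessarily the paper's exact formulation). Let \(x\) denote the source audio, \(f\) a transcription, and \(e\) the target translation:

\[
p(e \mid x) \;=\; \sum_{f} p_{\mathrm{ASR}}(f \mid x)\, p_{\mathrm{MT}}(e \mid f)
\;\approx\; \sum_{f \in \mathcal{F}_k(x)} p_{\mathrm{ASR}}(f \mid x)\, p_{\mathrm{MT}}(e \mid f),
\]

so that maximizing \(\log p(e \mid x)\) on direct speech translation data updates the ASR and MT components jointly, rather than training each on its own paired data.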
Anthology ID:
2022.emnlp-main.297
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4480–4487
URL:
https://aclanthology.org/2022.emnlp-main.297
DOI:
10.18653/v1/2022.emnlp-main.297
Cite (ACL):
Viet Anh Khoa Tran, David Thulke, Yingbo Gao, Christian Herold, and Hermann Ney. 2022. Does Joint Training Really Help Cascaded Speech Translation? In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 4480–4487, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Does Joint Training Really Help Cascaded Speech Translation? (Tran et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2022.emnlp-main.297.pdf