Abstract
The success of end-to-end speech-to-text translation (ST) is often achieved by utilizing source transcripts, e.g., by pre-training with automatic speech recognition (ASR) and machine translation (MT) tasks, or by introducing additional ASR and MT data. Unfortunately, transcripts are only sometimes available since numerous unwritten languages exist worldwide. In this paper, we aim to utilize large amounts of target-side monolingual data to enhance ST without transcripts. Motivated by the remarkable success of back translation in MT, we develop a back translation algorithm for ST (BT4ST) to synthesize pseudo ST data from monolingual target data. To ease the challenges posed by short-to-long generation and one-to-many mapping, we introduce self-supervised discrete units and achieve back translation by cascading a target-to-unit model and a unit-to-speech model. With our synthetic ST data, we achieve an average boost of 2.3 BLEU on MuST-C En-De, En-Fr, and En-Es datasets. More experiments show that our method is especially effective in low-resource scenarios.
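For concreteness, the cascaded back-translation pipeline summarized in the abstract can be sketched as a small Python loop: monolingual target text is mapped to self-supervised discrete units, the units are rendered as synthetic source speech, and each synthetic waveform is paired with its original target sentence to form pseudo ST data. This is a rough illustration under assumed interfaces, not the authors' released implementation; the names synthesize_pseudo_st_data, target_to_unit, and unit_to_speech are hypothetical placeholders.

    from typing import Callable, List, Sequence, Tuple

    def synthesize_pseudo_st_data(
        target_sentences: List[str],
        # assumed interface: target-to-unit model (target text -> discrete speech units)
        target_to_unit: Callable[[str], List[int]],
        # assumed interface: unit-to-speech model (discrete units -> waveform samples)
        unit_to_speech: Callable[[List[int]], Sequence[float]],
    ) -> List[Tuple[Sequence[float], str]]:
        """Turn monolingual target text into pseudo (source speech, target text) ST pairs."""
        pseudo_pairs = []
        for text in target_sentences:
            units = target_to_unit(text)            # step 1: generate discrete units from the target sentence
            waveform = unit_to_speech(units)        # step 2: render the units as synthetic source-language speech
            pseudo_pairs.append((waveform, text))   # pair synthetic speech with the original target text
        return pseudo_pairs

Cascading through discrete units rather than generating waveforms directly from text is what the abstract describes as easing the short-to-long generation and one-to-many mapping challenges; the resulting pseudo pairs are then used as additional training data for the ST model.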
- Anthology ID: 2023.acl-long.251
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 4567–4587
- URL: https://aclanthology.org/2023.acl-long.251
- DOI: 10.18653/v1/2023.acl-long.251
- Cite (ACL): Qingkai Fang and Yang Feng. 2023. Back Translation for Speech-to-text Translation Without Transcripts. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4567–4587, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Back Translation for Speech-to-text Translation Without Transcripts (Fang & Feng, ACL 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2023.acl-long.251.pdf