Beyond Sentence-Level End-to-End Speech Translation: Context Helps

Biao Zhang, Ivan Titov, Barry Haddow, Rico Sennrich


Abstract
Document-level contextual information has shown benefits for text-based machine translation, but whether and how context helps end-to-end (E2E) speech translation (ST) remains under-studied. We fill this gap through extensive experiments with a simple concatenation-based context-aware ST model, paired with adaptive feature selection on speech encodings for computational efficiency. We investigate several decoding approaches and introduce in-model ensemble decoding, which jointly performs document- and sentence-level translation with the same model. Our results on the MuST-C benchmark with Transformer demonstrate the effectiveness of context for E2E ST. Compared to sentence-level ST, context-aware ST obtains better translation quality (+0.18–2.61 BLEU), improves pronoun and homophone translation, shows better robustness to (artificial) audio segmentation errors, and reduces latency and flicker, delivering higher quality for simultaneous translation.
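To make the two key ideas named in the abstract concrete, the following is a minimal, hypothetical Python sketch of concatenation-based context modelling and in-model ensemble decoding. It is not the authors' implementation (their code is linked below under bzhangGo/zero); the model.log_probs interface, the break token, and all function names here are assumptions for illustration only.

    # Hypothetical sketch only; see bzhangGo/zero for the actual code.
    # `model.log_probs(input_ids, prefix)` is an assumed interface that
    # returns a numpy array of next-token log-probabilities.
    import numpy as np

    def build_context_input(prev_segments, current_segment, break_id):
        # Concatenation-based context: prepend the previous segments to
        # the current one, separated by an assumed break token.
        inputs = []
        for seg in prev_segments:
            inputs.extend(seg + [break_id])
        inputs.extend(current_segment)
        return inputs

    def ensemble_step(model, doc_input, sent_input, prefix, alpha=0.5):
        # In-model ensemble: the SAME model scores the next token under
        # both the document-level (context-concatenated) input and the
        # sentence-level input; the two log-distributions are interpolated.
        lp_doc = model.log_probs(doc_input, prefix)
        lp_sent = model.log_probs(sent_input, prefix)
        return alpha * lp_doc + (1.0 - alpha) * lp_sent

    def greedy_decode(model, doc_input, sent_input, bos_id, eos_id, max_len=128):
        # Greedy decoding with the ensembled distribution at every step.
        prefix = [bos_id]
        for _ in range(max_len):
            lp = ensemble_step(model, doc_input, sent_input, prefix)
            tok = int(np.argmax(lp))
            prefix.append(tok)
            if tok == eos_id:
                break
        return prefix

The sketch uses greedy search for brevity; the same per-step interpolation would slot into beam search, and alpha trades off document-level against sentence-level behaviour.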
Anthology ID:
2021.acl-long.200
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
2566–2578
URL:
https://aclanthology.org/2021.acl-long.200
DOI:
10.18653/v1/2021.acl-long.200
Cite (ACL):
Biao Zhang, Ivan Titov, Barry Haddow, and Rico Sennrich. 2021. Beyond Sentence-Level End-to-End Speech Translation: Context Helps. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 2566–2578, Online. Association for Computational Linguistics.
Cite (Informal):
Beyond Sentence-Level End-to-End Speech Translation: Context Helps (Zhang et al., ACL 2021)
PDF:
https://aclanthology.org/2021.acl-long.200.pdf
Code:
bzhangGo/zero
Data:
MuST-C