Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation
Abstract
We address the problem of simultaneous translation by modifying the Neural MT decoder to operate with a dynamically built encoder and attention. We propose a tunable agent which decides the best segmentation strategy for a user-defined BLEU loss and Average Proportion (AP) constraint. Our agent outperforms the previously proposed Wait-if-diff and Wait-if-worse agents (Cho and Esipova, 2016) on BLEU with lower latency. Secondly, we propose data-driven changes to Neural MT training to better match the incremental decoding framework.
- Anthology ID:
- N18-2079
- Volume:
- Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
- Month:
- June
- Year:
- 2018
- Address:
- New Orleans, Louisiana
- Editors:
- Marilyn Walker, Heng Ji, Amanda Stent
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 493–499
- URL:
- https://aclanthology.org/N18-2079
- DOI:
- 10.18653/v1/N18-2079
- Cite (ACL):
- Fahim Dalvi, Nadir Durrani, Hassan Sajjad, and Stephan Vogel. 2018. Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 493–499, New Orleans, Louisiana. Association for Computational Linguistics.
- Cite (Informal):
- Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation (Dalvi et al., NAACL 2018)
- PDF:
- https://aclanthology.org/N18-2079.pdf
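
The abstract tunes a segmentation agent against an Average Proportion (AP) constraint. As a minimal illustration only (not code from the paper), the sketch below computes AP as defined by Cho and Esipova (2016): the average, over target tokens, of the fraction of the source sentence that had been read when each target token was emitted. The function name and the `read_counts` argument are assumptions for this example.

```python
def average_proportion(read_counts, src_len):
    """Average Proportion (AP) latency metric (Cho and Esipova, 2016).

    read_counts[t] = number of source tokens consumed before emitting
    the t-th target token; src_len = source sentence length.
    AP approaches 1 when the full source is read before any target token
    is produced, and is smaller the more simultaneous the decoding.
    """
    tgt_len = len(read_counts)
    return sum(read_counts) / (src_len * tgt_len)


# Example: a 6-token source translated into 4 target tokens, where the
# decoder had read 2, 3, 5, and 6 source tokens at each emission step.
print(average_proportion([2, 3, 5, 6], src_len=6))  # ~0.667
```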