Abstract
Transformers have brought a remarkable improvement in the performance of neural machine translation (NMT) systems, but they can be surprisingly vulnerable to noise. In this work, we investigate how noise breaks Transformers and whether there are solutions to deal with such issues. There is a large body of work in the NMT literature on analyzing the behavior of conventional models for the problem of noise, but Transformers are relatively understudied in this context. Motivated by this, we introduce a novel data-driven technique called Target Augmented Fine-tuning (TAFT) to incorporate noise during training. This idea is comparable to the well-known fine-tuning strategy. Moreover, we propose two other novel extensions to the original Transformer, Controlled Denoising (CD) and Dual-Channel Decoding (DCD), which modify the neural architecture as well as the training process to handle noise. One important characteristic of our techniques is that they only affect the training phase and impose no overhead at inference time. We evaluated our techniques on the English–German pair in both directions and observed that our models have a higher tolerance to noise: specifically, they perform with no deterioration when up to 10% of the test words are corrupted by noise.
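The exact definitions of TAFT, CD, and DCD are given in the full paper; as a rough illustration of the kind of noise injection the abstract describes, the sketch below corrupts up to 10% of the words in a sentence with simple character-level edits. The `corrupt_sentence` and `corrupt_word` helpers and the choice of edit operations are hypothetical and are not taken from the paper.

```python
import random

def corrupt_word(word: str, rng: random.Random) -> str:
    """Apply one random character-level edit: swap, drop, or duplicate.

    Hypothetical noise model, not the paper's exact procedure.
    """
    if len(word) < 2:
        return word
    i = rng.randrange(len(word) - 1)
    op = rng.choice(["swap", "drop", "dup"])
    if op == "swap":  # transpose two adjacent characters
        return word[:i] + word[i + 1] + word[i] + word[i + 2:]
    if op == "drop":  # delete one character
        return word[:i] + word[i + 1:]
    return word[:i] + word[i] + word[i:]  # duplicate one character

def corrupt_sentence(sentence: str, noise_ratio: float = 0.1,
                     seed: int = 0) -> str:
    """Corrupt up to `noise_ratio` of the words in a sentence."""
    rng = random.Random(seed)
    words = sentence.split()
    n_noisy = min(max(1, int(len(words) * noise_ratio)), len(words))
    for i in rng.sample(range(len(words)), n_noisy):
        words[i] = corrupt_word(words[i], rng)
    return " ".join(words)

# Example: build a noisy copy of a source sentence, to be paired with
# its clean reference translation during robustness fine-tuning.
print(corrupt_sentence("the quick brown fox jumps over the lazy dog"))
```

In a fine-tuning setup of this kind, the noisy source sentences would be paired with the unchanged target sentences, so the model learns to produce clean translations from partially corrupted input.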
- Anthology ID: 2021.findings-emnlp.323
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2021
- Month: November
- Year: 2021
- Address: Punta Cana, Dominican Republic
- Venue: Findings
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 3831–3840
- URL: https://aclanthology.org/2021.findings-emnlp.323
- DOI: 10.18653/v1/2021.findings-emnlp.323
- Cite (ACL): Peyman Passban, Puneeth Saladi, and Qun Liu. 2021. Revisiting Robust Neural Machine Translation: A Transformer Case Study. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3831–3840, Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal): Revisiting Robust Neural Machine Translation: A Transformer Case Study (Passban et al., Findings 2021)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2021.findings-emnlp.323.pdf