Addressing the Length Bias Challenge in Document-Level Neural Machine Translation

Zhang Zhuocheng, Shuhao Gu, Min Zhang, Yang Feng


Abstract
Document-level neural machine translation (DNMT) has shown promising results by incorporating context information through increased maximum lengths of source and target sentences. However, this approach also introduces a length bias problem: DNMT suffers significant translation quality degradation when decoding sentences that are much shorter or longer than the maximum sentence length seen during training. To prevent the model from neglecting shorter sentences, we sample the training data to ensure a more uniform distribution across different sentence lengths while progressively increasing the maximum sentence length during training. Additionally, we introduce a length-normalized attention mechanism that helps the model focus on the target information and mitigates attention divergence when processing longer sentences. Furthermore, during the decoding stage of DNMT, we propose a sliding decoding strategy that keeps target sentences from exceeding the maximum length encountered during training. Experimental results show that our method achieves state-of-the-art performance on several open datasets, and further analysis shows that it significantly alleviates the length bias problem.
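
A minimal sketch of one way the length-normalized attention could work, assuming a log-length rescaling of the attention logits; the scaling factor, the threshold at the training maximum, and the function name are illustrative assumptions, not the paper's exact formulation or released code:

import math
import torch

def length_normalized_attention(q, k, v, train_max_len=512):
    """q, k, v: (batch, heads, seq_len, d_head). Hypothetical helper,
    not the paper's implementation."""
    d = q.size(-1)
    n = k.size(-2)
    # Illustrative normalizer: sharpen the logits once the key length
    # exceeds the training maximum; shorter inputs are left unchanged,
    # so attention weights stay peaked instead of flattening out as
    # the context grows.
    scale = max(math.log(n) / math.log(train_max_len), 1.0) if n > 1 else 1.0
    logits = torch.matmul(q, k.transpose(-2, -1)) * (scale / math.sqrt(d))
    return torch.matmul(torch.softmax(logits, dim=-1), v)

For example, with keys of length 2048 and train_max_len=512, the logits are sharpened by log(2048)/log(512) ≈ 1.22, counteracting the softmax flattening over longer sequences, while sequences at or below the training maximum pass through unchanged.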
Anthology ID:
2023.findings-emnlp.773
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11545–11556
URL:
https://aclanthology.org/2023.findings-emnlp.773
DOI:
10.18653/v1/2023.findings-emnlp.773
Cite (ACL):
Zhang Zhuocheng, Shuhao Gu, Min Zhang, and Yang Feng. 2023. Addressing the Length Bias Challenge in Document-Level Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11545–11556, Singapore. Association for Computational Linguistics.
Cite (Informal):
Addressing the Length Bias Challenge in Document-Level Neural Machine Translation (Zhuocheng et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-emnlp.773.pdf