@inproceedings{shao-etal-2024-understanding,
    title = "Understanding and Addressing the Under-Translation Problem from the Perspective of Decoding Objective",
    author = "Shao, Chenze  and
      Meng, Fandong  and
      Zeng, Jiali  and
      Zhou, Jie",
    editor = "Ku, Lun-Wei  and
      Martins, Andre  and
      Srikumar, Vivek",
    booktitle = "Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2024.acl-long.209/",
    doi = "10.18653/v1/2024.acl-long.209",
    pages = "3800--3814",
    abstract = "Neural Machine Translation (NMT) has made remarkable progress over the past years. However, under-translation and over-translation remain two challenging problems in state-of-the-art NMT systems. In this work, we conduct an in-depth analysis on the underlying cause of under-translation in NMT, providing an explanation from the perspective of decoding objective. To optimize the beam search objective, the model tends to overlook words it is less confident about, leading to the under-translation phenomenon. Correspondingly, the model{'}s confidence in predicting the End Of Sentence (EOS) diminishes when under-translation occurs, serving as a mild penalty for under-translated candidates. Building upon this analysis, we propose employing the confidence of predicting EOS as a detector for under-translation, and strengthening the confidence-based penalty to penalize candidates with a high risk of under-translation. Experiments on both synthetic and real-world data show that our method can accurately detect and rectify under-translated outputs, with minor impact on other correct translations."
}