Challenges of Neural Machine Translation for Short Texts

Yu Wan, Baosong Yang, Derek Fai Wong, Lidia Sam Chao, Liang Yao, Haibo Zhang, Boxing Chen


Abstract
Short texts (STs) appear in a variety of scenarios, including queries, dialog, and entity names. Most existing studies in neural machine translation (NMT) focus on tackling open problems concerning long sentences rather than short ones. The intuition behind this is that, with respect to human learning and processing, short sequences are generally regarded as easy examples. In this article, we first dispel this speculation by conducting preliminary experiments, showing that the conventional state-of-the-art NMT approach, namely, Transformer (Vaswani et al. 2017), still suffers from over-translation and mistranslation errors on STs. After empirically investigating the rationale behind this, we summarize two challenges in NMT for STs, each associated with one of the translation error types above: (1) the imbalanced length distribution in the training set degrades the calibration of model inference on STs, leading to more over-translation cases; and (2) the lack of contextual information forces NMT to have higher data uncertainty on short sentences, and thus the NMT model suffers from considerable mistranslation errors. Some existing approaches, such as balancing the data distribution for training (e.g., data upsampling) and complementing contextual information (e.g., introducing translation memory), can alleviate these translation issues in NMT for STs. We encourage researchers to investigate other challenges in NMT for STs, thus reducing ST translation errors and enhancing translation quality.
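The data-rebalancing remedy mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical Python helper (`upsample_short_pairs`; the length threshold and boost factor are illustrative assumptions, not values taken from the paper) that duplicates short sentence pairs so the length distribution of a parallel training corpus is less skewed toward long sentences:

```python
import random

def upsample_short_pairs(pairs, max_short_len=10, boost=3, seed=0):
    """Duplicate short source-target pairs so that short sequences are
    better represented in the training data.

    Hypothetical helper for illustration only: the "short" threshold
    (max_short_len tokens on the source side) and the boost factor are
    arbitrary choices, not settings reported in the paper.
    """
    random.seed(seed)
    upsampled = []
    for src, tgt in pairs:
        upsampled.append((src, tgt))
        # Treat a pair as "short" when its source side has few tokens,
        # and add (boost - 1) extra copies of it.
        if len(src.split()) <= max_short_len:
            upsampled.extend([(src, tgt)] * (boost - 1))
    random.shuffle(upsampled)
    return upsampled

# Usage: rebalance a toy parallel corpus before NMT training.
corpus = [
    ("hello", "bonjour"),
    ("how are you doing today my dear friend and colleague thanks",
     "comment vas-tu aujourd'hui mon cher ami et collegue merci"),
]
print(len(upsample_short_pairs(corpus)))  # 4: the short pair now appears 3 times
```

In practice the same idea can be applied per length bucket rather than with a single threshold, so that the sampled length distribution matches a target distribution more closely.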
Anthology ID:
2022.cl-2.3
Volume:
Computational Linguistics, Volume 48, Issue 2 - June 2022
Month:
June
Year:
2022
Address:
Cambridge, MA
Venue:
CL
Publisher:
MIT Press
Pages:
321–342
URL:
https://aclanthology.org/2022.cl-2.3
DOI:
10.1162/coli_a_00435
Cite (ACL):
Yu Wan, Baosong Yang, Derek Fai Wong, Lidia Sam Chao, Liang Yao, Haibo Zhang, and Boxing Chen. 2022. Challenges of Neural Machine Translation for Short Texts. Computational Linguistics, 48(2):321–342.
Cite (Informal):
Challenges of Neural Machine Translation for Short Texts (Wan et al., CL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.cl-2.3.pdf