Uncertainty-Aware Curriculum Learning for Neural Machine Translation
Yikai Zhou, Baosong Yang, Derek F. Wong, Yu Wan, Lidia S. Chao
Abstract
Neural machine translation (NMT) has been shown to benefit from curriculum learning, which presents examples in an easy-to-hard order across training stages. The keys lie in the assessment of data difficulty and model competence. We propose uncertainty-aware curriculum learning, motivated by two intuitions: 1) the higher the uncertainty in a translation pair, the more complex and rarer the information it contains; and 2) the end of the decline in model uncertainty indicates the completion of the current training stage. Specifically, we take the cross-entropy of an example as its data difficulty and exploit the variance of distributions over the weights of the network to represent the model uncertainty. Extensive experiments on various translation tasks reveal that our approach outperforms the strong baseline and related methods in both translation quality and convergence speed. Quantitative analyses reveal that the proposed strategy offers NMT the ability to automatically govern its learning schedule.
- Anthology ID:
- 2020.acl-main.620
- Volume:
- Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
- Month:
- July
- Year:
- 2020
- Address:
- Online
- Editors:
- Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 6934–6944
- URL:
- https://aclanthology.org/2020.acl-main.620
- DOI:
- 10.18653/v1/2020.acl-main.620
- Cite (ACL):
- Yikai Zhou, Baosong Yang, Derek F. Wong, Yu Wan, and Lidia S. Chao. 2020. Uncertainty-Aware Curriculum Learning for Neural Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6934–6944, Online. Association for Computational Linguistics.
- Cite (Informal):
- Uncertainty-Aware Curriculum Learning for Neural Machine Translation (Zhou et al., ACL 2020)
- PDF:
- https://preview.aclanthology.org/bionlp-24-ingestion/2020.acl-main.620.pdf
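The two intuitions in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes per-token target probabilities are available, uses Monte Carlo-style stochastic forward passes (here an arbitrary `forward_fn`) to approximate the variance over network weights, and the function names are hypothetical.

```python
import numpy as np

def example_difficulty(target_probs):
    # Cross-entropy of the target tokens under the model:
    # higher loss -> harder, rarer example (intuition 1).
    return -np.mean(np.log(target_probs))

def model_uncertainty(forward_fn, x, n_samples=8):
    # Mean variance across stochastic forward passes (e.g. with dropout
    # active) as a proxy for the variance of the distribution over
    # weights; a plateau in its decline would end a stage (intuition 2).
    preds = np.stack([forward_fn(x) for _ in range(n_samples)])
    return float(preds.var(axis=0).mean())

def curriculum_order(difficulties):
    # Present training examples in easy-to-hard order.
    return np.argsort(difficulties)
```

For instance, an example whose target tokens receive probabilities `[0.9, 0.8]` scores a lower difficulty than one with `[0.1, 0.2]`, so `curriculum_order` schedules it earlier; a deterministic `forward_fn` yields zero model uncertainty, while a noisy one yields a positive value.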