Yongdong Zhang
2020
Curriculum Learning for Natural Language Understanding
Benfeng Xu | Licheng Zhang | Zhendong Mao | Quan Wang | Hongtao Xie | Yongdong Zhang
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
With the great success of pre-trained language models, the pretrain-finetune paradigm has become the dominant solution for natural language understanding (NLU) tasks. At the fine-tuning stage, target task data is usually introduced in a completely random order and treated equally. However, examples in NLU tasks can vary greatly in difficulty, and, similar to the human learning process, language models can benefit from an easy-to-difficult curriculum. Based on this idea, we propose our Curriculum Learning approach. By reviewing the training set in a crossed way, we are able to distinguish easy examples from difficult ones and arrange a curriculum for language models. Without any manual model architecture design or use of external data, our Curriculum Learning approach obtains significant and universal performance improvements on a wide range of NLU tasks.
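The abstract's "crossed review" of the training set can be read as a fold-based scheme: hold each example out, score it with models trained on the other folds, and treat high held-out loss as high difficulty. The sketch below illustrates that general idea only; it is not the paper's implementation, and all names (`cross_review_difficulty`, `arrange_curriculum`, the toy `train_fn`) are illustrative assumptions.

```python
# Hedged sketch of an easy-to-difficult curriculum via fold-based "cross
# review" difficulty scoring. Assumption: difficulty of an example is the
# loss assigned by a model that never saw it during training.
from typing import Callable, List, Sequence, Tuple

Example = Tuple[float, int]  # toy (feature, label) pair
LossFn = Callable[[Example], float]


def cross_review_difficulty(
    dataset: Sequence[Example],
    n_folds: int,
    train_fn: Callable[[List[Example]], LossFn],
) -> List[float]:
    """Score each example with a model trained on all OTHER folds."""
    folds = [list(range(i, len(dataset), n_folds)) for i in range(n_folds)]
    models = []
    for i in range(n_folds):
        # Train on every fold except fold i.
        subset = [dataset[j] for k in range(n_folds) if k != i for j in folds[k]]
        models.append(train_fn(subset))
    scores = [0.0] * len(dataset)
    for i in range(n_folds):
        for j in folds[i]:
            # Held-out loss: higher means harder.
            scores[j] = models[i](dataset[j])
    return scores


def arrange_curriculum(
    dataset: Sequence[Example], scores: Sequence[float]
) -> List[Example]:
    """Order training examples from easy (low loss) to difficult (high loss)."""
    order = sorted(range(len(dataset)), key=lambda i: scores[i])
    return [dataset[i] for i in order]


# Toy "model": predict the mean label of its training subset;
# loss is the absolute error on an example's label.
def train_fn(subset: List[Example]) -> LossFn:
    mean = sum(y for _, y in subset) / len(subset)
    return lambda ex: abs(ex[1] - mean)


data = [(0.1, 0), (0.2, 0), (0.9, 1), (0.8, 1), (0.5, 1)]
scores = cross_review_difficulty(data, n_folds=2, train_fn=train_fn)
curriculum = arrange_curriculum(data, scores)
```

A fine-tuning loop would then consume `curriculum` in order (or in stages that gradually admit harder examples) instead of a uniformly shuffled training set.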
2015
SOLAR: Scalable Online Learning Algorithms for Ranking
Jialei Wang | Ji Wan | Yongdong Zhang | Steven Hoi
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)