Large-Scale Transfer Learning for Natural Language Generation
Sergey Golovanov | Rauf Kurbanov | Sergey Nikolenko | Kyryl Truskovskyi | Alexander Tselousov | Thomas Wolf
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (2019)
Large-scale pretrained language models define the state of the art in natural language processing, achieving outstanding performance on a variety of tasks. We study how these architectures can be applied and adapted for natural language generation, comparing a number of architectural and training schemes. We focus in particular on open-domain dialog as a typical high-entropy generation task, presenting and comparing different architectures for adapting pretrained models, and achieving state-of-the-art results.