Large-Scale Transfer Learning for Natural Language Generation
Sergey Golovanov, Rauf Kurbanov, Sergey Nikolenko, Kyryl Truskovskyi, Alexander Tselousov, Thomas Wolf
Abstract
Large-scale pretrained language models define the state of the art in natural language processing, achieving outstanding performance on a variety of tasks. We study how these architectures can be applied and adapted for natural language generation, comparing a number of architectural and training schemes. We focus in particular on open-domain dialog as a typical high-entropy generation task, presenting and comparing different architectures for adapting pretrained models, with state-of-the-art results.
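As a rough illustration of the kind of adaptation the abstract describes, the sketch below fine-tunes a pretrained GPT-2 on a single concatenated persona + history + reply sequence with a language-modeling loss, in the spirit of the single-input schemes the paper compares. This is a minimal sketch, not the authors' code: it uses the Hugging Face `transformers` API, and the persona/history/reply strings and the use of the EOS token as a separator are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of adapting a pretrained
# language model to dialog generation by concatenating persona, history,
# and reply into one sequence and fine-tuning with an LM loss.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical dialog example: persona facts, dialog history, gold reply.
persona = "i like to ski. my wife does not like me anymore."
history = "hi, how are you doing today?"
reply = "i am doing great, just got back from the slopes!"

# Concatenate contexts and reply into a single input sequence, using the
# EOS token as a separator (an assumption, not the paper's exact scheme).
sep = tokenizer.eos_token
text = persona + sep + history + sep + reply
enc = tokenizer(text, return_tensors="pt")

# Mask the loss on the context tokens so only the reply is predicted;
# label -100 is ignored by the language-modeling loss.
ctx_len = len(tokenizer(persona + sep + history + sep)["input_ids"])
labels = enc["input_ids"].clone()
labels[:, :ctx_len] = -100

outputs = model(input_ids=enc["input_ids"],
                attention_mask=enc["attention_mask"],
                labels=labels)
outputs.loss.backward()  # an optimizer step would follow in a training loop
```

At generation time the same concatenated context would be fed to `model.generate`; the paper's comparison is between this kind of single-input adaptation and multi-input, encoder-decoder-style adaptations of the same pretrained weights.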
- Anthology ID: P19-1608
- Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month: July
- Year: 2019
- Address: Florence, Italy
- Editors: Anna Korhonen, David Traum, Lluís Màrquez
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 6053–6058
- URL: https://preview.aclanthology.org/add_missing_videos/P19-1608/
- DOI: 10.18653/v1/P19-1608
- Cite (ACL): Sergey Golovanov, Rauf Kurbanov, Sergey Nikolenko, Kyryl Truskovskyi, Alexander Tselousov, and Thomas Wolf. 2019. Large-Scale Transfer Learning for Natural Language Generation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 6053–6058, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): Large-Scale Transfer Learning for Natural Language Generation (Golovanov et al., ACL 2019)
- PDF: https://preview.aclanthology.org/add_missing_videos/P19-1608.pdf