Transformers: State-of-the-Art Natural Language Processing
Thomas Wolf | Lysandre Debut | Victor Sanh | Julien Chaumond | Clement Delangue | Anthony Moi | Pierric Cistac | Tim Rault | Remi Louf | Morgan Funtowicz | Joe Davison | Sam Shleifer | Patrick von Platen | Clara Ma | Yacine Jernite | Julien Plu | Canwen Xu | Teven Le Scao | Sylvain Gugger | Mariama Drame | Quentin Lhoest | Alexander Rush
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, 2020
Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
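The "unified API" the abstract refers to is the from_pretrained loading pattern shared across the library's architectures. Below is a minimal sketch of that pattern, assuming a recent transformers release with PyTorch installed; the checkpoint name bert-base-uncased is just an illustrative choice from the community model hub.

```python
# Load a pretrained model and its tokenizer through the library's
# Auto classes, encode a sentence, and run a forward pass.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize the input text and return PyTorch tensors.
inputs = tokenizer("Transformers is an open-source library.", return_tensors="pt")

# The model outputs per-token hidden states.
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```

Because the interface is shared, swapping in a different architecture generally amounts to changing the checkpoint name while the surrounding code stays the same, which is the extensibility the abstract highlights.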