Ben Wang




2022

GPT-NeoX-20B: An Open-Source Autoregressive Language Model
Sidney Black | Stella Biderman | Eric Hallahan | Quentin Anthony | Leo Gao | Laurence Golding | Horace He | Connor Leahy | Kyle McDonell | Jason Phang | Michael Pieler | Usvsn Sai Prashanth | Shivanshu Purohit | Laria Reynolds | Jonathan Tow | Ben Wang | Samuel Weinbach
Proceedings of BigScience Episode #5 -- Workshop on Challenges & Perspectives in Creating Large Language Models

We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights will be made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model that has publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B’s architecture and training, and evaluate its performance. We open-source the training and evaluation code, as well as the model weights, at https://github.com/EleutherAI/gpt-neox.
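
As the abstract notes, the weights are openly released and the training code is available at https://github.com/EleutherAI/gpt-neox. A minimal sketch of loading the released model for text generation, assuming the checkpoint is mirrored on the Hugging Face Hub under the ID "EleutherAI/gpt-neox-20b" and that the transformers library is used (both are assumptions of this sketch, not details given in the listing above):

    # Minimal sketch: generate text with the released GPT-NeoX-20B weights.
    # Assumes the checkpoint is available on the Hugging Face Hub as
    # "EleutherAI/gpt-neox-20b" (hypothetical mirror ID for illustration);
    # the official training/evaluation code is at
    # https://github.com/EleutherAI/gpt-neox.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neox-20b")

    # Tokenize a prompt and sample a short continuation.
    inputs = tokenizer("GPT-NeoX-20B is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0]))

Note that a 20-billion-parameter dense model of this kind requires tens of gigabytes of memory to load in full precision, so in practice the weights are often loaded with reduced precision or across multiple devices.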