The Annotated Transformer

Alexander Rush


Abstract
A major goal of open-source NLP is to quickly and accurately reproduce the results of new work, in a manner that the community can easily use and modify. While most papers publish enough detail for replication, it can still be difficult to achieve good results in practice. This paper presents a worked exercise in paper reproduction, with the goal of implementing the results of the recent Transformer model. The replication aims for a simple code structure that closely follows the original work while achieving an efficient, usable system.
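The model being reproduced is built around scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / √d_k)·V. As a quick illustration of that formula (a minimal NumPy sketch, not the paper's actual PyTorch implementation):

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Tiny illustrative example: 2 queries against 3 key/value pairs, dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = attention(Q, K, V)
print(out.shape)  # (2, 4)
```

The sketch omits masking, multiple heads, and batching, which the full implementation handles.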
Anthology ID:
W18-2509
Volume:
Proceedings of Workshop for NLP Open Source Software (NLP-OSS)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Eunjeong L. Park, Masato Hagiwara, Dmitrijs Milajevs, Liling Tan
Venue:
NLPOSS
Publisher:
Association for Computational Linguistics
Pages:
52–60
URL:
https://aclanthology.org/W18-2509
DOI:
10.18653/v1/W18-2509
Cite (ACL):
Alexander Rush. 2018. The Annotated Transformer. In Proceedings of Workshop for NLP Open Source Software (NLP-OSS), pages 52–60, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
The Annotated Transformer (Rush, NLPOSS 2018)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/W18-2509.pdf
Code:
harvardnlp/annotated-transformer