NarrowBERT: Accelerating Masked Language Model Pretraining and Inference

Haoxin Li, Phillip Keung, Daniel Cheng, Jungo Kasai, Noah A. Smith


Abstract
Large-scale language model pretraining is a very successful form of self-supervised learning in natural language processing, but it is increasingly expensive to perform as the models and pretraining corpora have become larger over time. We propose NarrowBERT, a modified transformer encoder that increases the throughput for masked language model pretraining by more than 2x. NarrowBERT sparsifies the transformer model such that the self-attention queries and feedforward layers only operate on the masked tokens of each sentence during pretraining, rather than all of the tokens as with the usual transformer encoder. We also show that NarrowBERT increases the throughput at inference time by as much as 3.5x with minimal (or no) performance degradation on sentence encoding tasks like MNLI. Finally, we examine the performance of NarrowBERT on the IMDB and Amazon reviews classification and CoNLL NER tasks and show that it is also comparable to standard BERT performance.
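The narrowing described in the abstract (attention queries and feedforward layers applied only to masked positions, with keys and values still drawn from every token) is easiest to see in code. Below is a minimal PyTorch-style sketch of that idea; the class name NarrowEncoderLayer, the masked_idx argument, and all dimensions are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a "narrow" encoder layer: queries and the feedforward
# network are computed only for masked positions, while keys/values come
# from all tokens. Names and sizes are illustrative, not from the paper's code.
import torch
import torch.nn as nn


class NarrowEncoderLayer(nn.Module):
    def __init__(self, d_model=768, n_heads=12, d_ff=3072):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, hidden, masked_idx):
        # hidden:     (batch, seq_len, d_model) states for all tokens
        # masked_idx: (batch, n_masked) positions of the [MASK] tokens
        idx = masked_idx.unsqueeze(-1).expand(-1, -1, hidden.size(-1))
        narrow = torch.gather(hidden, 1, idx)  # (batch, n_masked, d_model)

        # Queries come only from masked positions; keys/values from all tokens.
        attn_out, _ = self.attn(narrow, hidden, hidden, need_weights=False)
        narrow = self.norm1(narrow + attn_out)

        # The feedforward layer is likewise applied only to masked positions.
        narrow = self.norm2(narrow + self.ff(narrow))
        return narrow  # fed to the masked-language-modeling head
```

Since masked language modeling typically masks only a small fraction of positions (around 15% in standard BERT pretraining), restricting the query and feedforward computation to those positions shrinks the per-layer work substantially, which is the source of the throughput gains reported in the abstract.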
Anthology ID: 2023.acl-short.146
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1723–1730
URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-short.146/
DOI: 10.18653/v1/2023.acl-short.146
Cite (ACL): Haoxin Li, Phillip Keung, Daniel Cheng, Jungo Kasai, and Noah A. Smith. 2023. NarrowBERT: Accelerating Masked Language Model Pretraining and Inference. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1723–1730, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): NarrowBERT: Accelerating Masked Language Model Pretraining and Inference (Li et al., ACL 2023)
PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-short.146.pdf
Video: https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-short.146.mp4