General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference

Jingfei Du, Myle Ott, Haoran Li, Xing Zhou, Veselin Stoyanov


Abstract
The state of the art on many NLP tasks is currently achieved by large pre-trained language models, which require a considerable amount of computation. We aim to reduce the inference cost in a setting where many different predictions are made on a single piece of text. In that case, computational cost during inference can be amortized over the different predictions (tasks) using a shared text encoder. We compare approaches for training such an encoder and show that encoders pre-trained over multiple tasks generalize well to unseen tasks. We also compare ways of extracting fixed- and limited-size representations from this encoder, including pooling features extracted from multiple layers or positions. Our best approach compares favorably to knowledge distillation, achieving higher accuracy and lower computational cost once the system is handling around 7 tasks. Further, we show that through binary quantization, we can reduce the size of the extracted representations by a factor of 16 to store them for later use. The resulting method offers a compelling solution for using large-scale pre-trained models at a fraction of the computational cost when multiple tasks are performed on the same text.
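To make the approach concrete, here is a minimal sketch (not the authors' released code) of the setup the abstract describes: a shared, frozen encoder is run once per text, its hidden states are pooled across the last few layers into a fixed-size embedding, the embedding is binarized for cheap storage, and several lightweight task heads later reuse it. The hidden size, number of pooled layers, and head shapes below are illustrative assumptions.

    import numpy as np

    HIDDEN = 768            # assumed hidden size of the pre-trained encoder
    NUM_LAYERS_POOLED = 4   # assumed: pool features from the last 4 layers

    def pool_layers(layer_states):
        """Average-pool over token positions in each of the last few layers,
        then concatenate into one fixed-size text representation."""
        pooled = [states.mean(axis=0) for states in layer_states[-NUM_LAYERS_POOLED:]]
        return np.concatenate(pooled)          # shape: (NUM_LAYERS_POOLED * HIDDEN,)

    def quantize_binary(embedding):
        """Keep 1 bit (the sign) per dimension, packed 8 dims per byte.
        Relative to float16 storage this is roughly the 16x reduction
        mentioned in the abstract."""
        bits = (embedding > 0).astype(np.uint8)
        return np.packbits(bits)

    def dequantize_binary(packed, dim):
        """Unpack the stored bits back to a +/-1 vector for the task heads."""
        bits = np.unpackbits(packed)[:dim].astype(np.float32)
        return bits * 2.0 - 1.0

    # Toy usage: one encoding pass, reused by several cheap task heads.
    rng = np.random.default_rng(0)
    # Stand-in for the encoder's per-layer hidden states (12 tokens x 12 layers).
    fake_layer_states = [rng.standard_normal((12, HIDDEN)) for _ in range(12)]
    embedding = pool_layers(fake_layer_states)

    stored = quantize_binary(embedding)                       # cached for later use
    restored = dequantize_binary(stored, embedding.shape[0])

    # Each task head is just a small linear classifier over the shared embedding.
    task_heads = {t: rng.standard_normal((3, embedding.shape[0])) for t in ("task_a", "task_b")}
    predictions = {t: int(np.argmax(W @ restored)) for t, W in task_heads.items()}
    print(predictions)

Because the encoder forward pass is paid only once per text, each additional task adds only the cost of its small head, which is why the method overtakes per-task distilled models once enough tasks share the same input.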
Anthology ID: 2020.findings-emnlp.271
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Editors: Trevor Cohn, Yulan He, Yang Liu
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 3018–3030
URL: https://aclanthology.org/2020.findings-emnlp.271
DOI: 10.18653/v1/2020.findings-emnlp.271
Cite (ACL): Jingfei Du, Myle Ott, Haoran Li, Xing Zhou, and Veselin Stoyanov. 2020. General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3018–3030, Online. Association for Computational Linguistics.
Cite (Informal): General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference (Du et al., Findings 2020)
PDF: https://preview.aclanthology.org/naacl24-info/2020.findings-emnlp.271.pdf
Video: https://slideslive.com/38940109
Data: GLUE