AdapLeR: Speeding up Inference by Adaptive Length Reduction

Ali Modarressi, Hosein Mohebbi, Mohammad Taher Pilehvar


Abstract
Pre-trained language models have shown stellar performance in various downstream tasks. However, this usually comes at the cost of high latency and computation, hindering their usage in resource-limited settings. In this work, we propose a novel approach for reducing the computational cost of BERT with minimal loss in downstream performance. Our method dynamically eliminates less contributing tokens through layers, resulting in progressively shorter sequence lengths and, consequently, lower computational cost. To determine the importance of each token representation, we train a Contribution Predictor for each layer using a gradient-based saliency method. Our experiments on several diverse classification tasks show speedups of up to 22x during inference without much sacrifice in performance. We also validate the quality of the tokens selected by our method using human annotations from the ERASER benchmark. Compared to other widely used strategies for selecting important tokens, such as saliency and attention, our proposed method has a significantly lower false positive rate in generating rationales. Our code is freely available at https://github.com/amodaresi/AdapLeR.
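To make the mechanism in the abstract concrete, below is a minimal PyTorch sketch of the inference-time idea only: after each encoder layer, a small Contribution Predictor scores every token representation, and low-scoring tokens are dropped so that subsequent layers process a shorter sequence. The MLP predictor architecture, the fixed 0.5 threshold, and the stock `TransformerEncoderLayer` stand-ins for BERT layers are all illustrative assumptions, not the paper's implementation; the authors' actual code is in the linked repository.

```python
import torch
import torch.nn as nn

HIDDEN, N_LAYERS, SEQ_LEN = 64, 4, 32  # toy dimensions

class ContributionPredictor(nn.Module):
    """Scores each token's contribution in [0, 1].
    A small per-layer MLP is assumed here; in the paper these
    predictors are trained against gradient-based saliency targets."""
    def __init__(self, hidden):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(hidden, hidden // 2),
            nn.ReLU(),
            nn.Linear(hidden // 2, 1),
        )

    def forward(self, h):                              # h: (1, seq, hidden)
        return torch.sigmoid(self.mlp(h)).squeeze(-1)  # (1, seq)

# Stock encoder layers stand in for BERT layers in this sketch.
layers = nn.ModuleList(
    nn.TransformerEncoderLayer(HIDDEN, nhead=4, batch_first=True)
    for _ in range(N_LAYERS))
predictors = nn.ModuleList(
    ContributionPredictor(HIDDEN) for _ in range(N_LAYERS))

h = torch.randn(1, SEQ_LEN, HIDDEN)  # stand-in for embedded input
with torch.no_grad():
    for layer, predictor in zip(layers, predictors):
        h = layer(h)
        keep = predictor(h)[0] > 0.5  # threshold value is illustrative
        keep[0] = True                # always retain [CLS]
        h = h[:, keep, :]             # shorter sequence for next layer
        print(f"{h.shape[1]} tokens remain after this layer")
```

The sketch covers inference only; how the predictors are trained (against the gradient-based saliency method mentioned in the abstract) is described in the paper and implemented in the repository.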
Anthology ID:
2022.acl-long.1
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1–15
URL:
https://aclanthology.org/2022.acl-long.1
DOI:
10.18653/v1/2022.acl-long.1
Cite (ACL):
Ali Modarressi, Hosein Mohebbi, and Mohammad Taher Pilehvar. 2022. AdapLeR: Speeding up Inference by Adaptive Length Reduction. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1–15, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
AdapLeR: Speeding up Inference by Adaptive Length Reduction (Modarressi et al., ACL 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.acl-long.1.pdf
Software:
2022.acl-long.1.software.zip
Video:
https://preview.aclanthology.org/auto-file-uploads/2022.acl-long.1.mp4
Code:
amodaresi/adapler
Data:
GLUE, HateXplain, IMDb Movie Reviews, MRPC, MultiNLI, MultiRC, QNLI, SST