Consistent Accelerated Inference via Confident Adaptive Transformers

Tal Schuster, Adam Fisch, Tommi Jaakkola, Regina Barzilay


Abstract
We develop a novel approach for confidently accelerating inference in the large and expensive multilayer Transformers that are now ubiquitous in natural language processing (NLP). Amortized or approximate computational methods increase efficiency, but can come with unpredictable performance costs. In this work, we present CATs – Confident Adaptive Transformers – in which we simultaneously increase computational efficiency while guaranteeing a specifiable degree of consistency with the original model with high confidence. Our method trains additional prediction heads on top of intermediate layers and dynamically decides when to stop allocating computational effort to each input using a meta consistency classifier. To calibrate our early prediction stopping rule, we formulate a unique extension of conformal prediction. We demonstrate the effectiveness of this approach on four classification and regression tasks.
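The abstract describes an early-exit scheme: lightweight prediction heads attached to intermediate layers, a meta consistency classifier that decides when to stop, and a calibrated stopping threshold. The sketch below illustrates that mechanism in PyTorch under stated assumptions; all names (EarlyExitEncoder, calibrate_threshold, the single-feature meta classifier) are hypothetical, the calibration shown is a simplified empirical stand-in rather than the paper's conformal procedure, and the authors' actual implementation lives in TalSchuster/CATs.

```python
# Minimal sketch of CAT-style early-exit inference (hypothetical names;
# the authors' real implementation is in TalSchuster/CATs).
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    """Toy multilayer encoder with a prediction head after every layer and a
    small meta classifier scoring confidence that the current early prediction
    is consistent with the full model's final prediction."""

    def __init__(self, num_layers=12, hidden=768, num_labels=2):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(hidden, nhead=12, batch_first=True)
            for _ in range(num_layers)
        )
        # One lightweight classification head per intermediate layer.
        self.heads = nn.ModuleList(
            nn.Linear(hidden, num_labels) for _ in range(num_layers)
        )
        # Meta consistency classifier; here it sees only the softmax scores of
        # the early prediction (an assumption; richer features are possible).
        self.meta = nn.Linear(num_labels, 1)

    @torch.no_grad()
    def forward(self, x, threshold):
        """Run layers one at a time (batch size 1 assumed); exit as soon as
        the meta classifier's confidence clears the calibrated threshold."""
        for i, (layer, head) in enumerate(zip(self.layers, self.heads)):
            x = layer(x)
            logits = head(x[:, 0])              # predict from position 0 ([CLS])
            conf = torch.sigmoid(self.meta(logits.softmax(-1))).item()
            if conf >= threshold or i == len(self.layers) - 1:
                return logits, i + 1            # number of layers actually used

def calibrate_threshold(meta_scores, consistent, epsilon=0.05):
    """Simplified empirical calibration (NOT the paper's exact conformal
    procedure): the smallest threshold such that, among calibration examples
    scoring at or above it, at most an epsilon fraction of early predictions
    disagreed with the full model."""
    best, inconsistent_kept = 1.0, 0
    ranked = sorted(zip(meta_scores, consistent), reverse=True)
    for k, (score, ok) in enumerate(ranked, start=1):
        inconsistent_kept += (not ok)
        if inconsistent_kept / k <= epsilon:
            best = score
    return best

# Usage sketch: calibrate on held-out meta scores, then run adaptive inference.
tau = calibrate_threshold([0.95, 0.9, 0.8, 0.6], [True, True, True, False])
model = EarlyExitEncoder().eval()
logits, layers_used = model(torch.randn(1, 16, 768), threshold=tau)
```

In the paper, the threshold is instead calibrated with an extension of conformal prediction, so that early predictions agree with the full model at a user-specified rate with high confidence.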
Anthology ID: 2021.emnlp-main.406
Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2021
Address: Online and Punta Cana, Dominican Republic
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 4962–4979
URL: https://aclanthology.org/2021.emnlp-main.406
DOI: 10.18653/v1/2021.emnlp-main.406
Cite (ACL): Tal Schuster, Adam Fisch, Tommi Jaakkola, and Regina Barzilay. 2021. Consistent Accelerated Inference via Confident Adaptive Transformers. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4962–4979, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): Consistent Accelerated Inference via Confident Adaptive Transformers (Schuster et al., EMNLP 2021)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/2021.emnlp-main.406.pdf
Video: https://preview.aclanthology.org/naacl-24-ws-corrections/2021.emnlp-main.406.mp4
Code: TalSchuster/CATs
Data: AG News, IMDb Movie Reviews, VitaminC