Operator Selection and Ordering in a Pipeline Approach to Efficiency Optimizations for Transformers

Ji Xin, Raphael Tang, Zhiying Jiang, Yaoliang Yu, Jimmy Lin


Abstract
There exists a wide variety of efficiency methods for natural language processing (NLP) tasks, such as pruning, distillation, dynamic inference, quantization, etc. From a different perspective, we can consider an efficiency method as an operator applied to a model. Naturally, we may construct a pipeline of operators, i.e., apply multiple efficiency methods to the model sequentially. In this paper, we study the plausibility of this idea, and more importantly, the commutativity and cumulativeness of efficiency operators. We make two interesting observations from our experiments: (1) The operators are commutative—the order of efficiency methods within the pipeline has little impact on the final results; (2) The operators are also cumulative—the final results of combining several efficiency methods can be estimated by combining the results of individual methods. These observations deepen our understanding of efficiency operators and provide useful guidelines for building them in real-world applications.
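To make the pipeline view concrete, the following minimal sketch composes two efficiency operators (magnitude pruning and post-training dynamic quantization) on a toy PyTorch model. The operator functions, the model, and the pruning amount are illustrative assumptions for exposition, not the authors' implementation; distillation and dynamic inference are omitted for brevity.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def prune_operator(model, amount=0.3):
    # Illustrative pruning operator: L1 magnitude pruning on every Linear layer.
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=amount)
            prune.remove(module, "weight")  # make the pruning mask permanent
    return model

def quantize_operator(model):
    # Illustrative quantization operator: post-training dynamic quantization
    # of Linear layers to int8.
    return torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def apply_pipeline(model, operators):
    # The pipeline view: apply each efficiency operator to the model in sequence.
    for op in operators:
        model = op(model)
    return model

if __name__ == "__main__":
    # A toy stand-in for a Transformer; the paper's experiments use full models.
    model = nn.Sequential(nn.Linear(768, 768), nn.ReLU(), nn.Linear(768, 2))
    compressed = apply_pipeline(model, [prune_operator, quantize_operator])
    print(compressed)

Under this framing, the paper's commutativity question asks whether reordering the list passed to apply_pipeline changes the final accuracy/efficiency trade-off, and cumulativeness asks whether the combined effect can be predicted from each operator's effect in isolation.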
Anthology ID:
2023.findings-acl.180
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2870–2882
URL:
https://aclanthology.org/2023.findings-acl.180
DOI:
10.18653/v1/2023.findings-acl.180
Cite (ACL):
Ji Xin, Raphael Tang, Zhiying Jiang, Yaoliang Yu, and Jimmy Lin. 2023. Operator Selection and Ordering in a Pipeline Approach to Efficiency Optimizations for Transformers. In Findings of the Association for Computational Linguistics: ACL 2023, pages 2870–2882, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Operator Selection and Ordering in a Pipeline Approach to Efficiency Optimizations for Transformers (Xin et al., Findings 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-acl.180.pdf