HULK: An Energy Efficiency Benchmark Platform for Responsible Natural Language Processing

Xiyou Zhou, Zhiyu Chen, Xiaoyong Jin, William Yang Wang


Abstract
Computation-intensive pretrained models have been taking the lead on many natural language processing benchmarks such as GLUE. However, energy efficiency during model training and inference has become a critical bottleneck. We introduce HULK, a multi-task energy efficiency benchmarking platform for responsible natural language processing. With HULK, we compare pretrained models’ energy efficiency from the perspectives of time and cost. Baseline benchmarking results are provided for further analysis. The fine-tuning efficiency of different pretrained models can differ significantly across tasks, and a smaller number of parameters does not necessarily imply better efficiency. We analyze this phenomenon and demonstrate a method for comparing the multi-task efficiency of pretrained models. Our platform is available at https://hulkbenchmark.github.io/ .
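The time-and-cost view of efficiency described in the abstract can be sketched as follows. This is a minimal illustration, not the HULK implementation: the `measure_efficiency` helper and the hourly GPU price are hypothetical, and the workload is a stand-in for a model's fine-tuning or inference step.

```python
import time

def measure_efficiency(fn, n_runs=100, usd_per_hour=3.06):
    """Time repeated calls to `fn` and convert wall-clock time into a
    dollar cost at a given cloud-GPU hourly rate (hypothetical figure)."""
    start = time.perf_counter()
    for _ in range(n_runs):
        fn()
    elapsed = time.perf_counter() - start
    return {
        "seconds": elapsed,              # total wall-clock time
        "per_run_s": elapsed / n_runs,   # average time per run
        "usd": elapsed / 3600.0 * usd_per_hour,  # estimated monetary cost
    }

# Usage: a cheap CPU-bound loop stands in for a model call.
stats = measure_efficiency(lambda: sum(i * i for i in range(10_000)), n_runs=50)
```

Reporting both wall-clock time and the implied cost lets models with different parameter counts be compared on the same footing, which is the kind of comparison the abstract argues parameter count alone cannot support.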
Anthology ID:
2021.eacl-demos.39
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Month:
April
Year:
2021
Address:
Online
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
329–336
URL:
https://aclanthology.org/2021.eacl-demos.39
DOI:
10.18653/v1/2021.eacl-demos.39
Cite (ACL):
Xiyou Zhou, Zhiyu Chen, Xiaoyong Jin, and William Yang Wang. 2021. HULK: An Energy Efficiency Benchmark Platform for Responsible Natural Language Processing. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations, pages 329–336, Online. Association for Computational Linguistics.
Cite (Informal):
HULK: An Energy Efficiency Benchmark Platform for Responsible Natural Language Processing (Zhou et al., EACL 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.eacl-demos.39.pdf
Data
GLUE, MultiNLI, SQuAD, SST, SuperGLUE