Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain
Isabella Olariu | Cedric Lothritz | Jacques Klein | Tegawendé Bissyandé | Siwen Guo | Shohreh Haddadan
Findings of the Association for Computational Linguistics: EMNLP 2023
Large-scale language models with millions, billions, or trillions of trainable parameters are becoming increasingly popular. However, they risk becoming rapidly over-parameterized, and the cost of fully fine-tuning them grows significantly. Storing them also becomes progressively impractical, since it requires keeping a separate copy of all the fine-tuned weights for each task. By freezing all pre-trained weights during fine-tuning and training only a small number of additional parameters, parameter-efficient tuning approaches have become an appealing alternative to traditional fine-tuning. The performance of these approaches has been evaluated on common NLP tasks from the GLUE benchmark and shown to match full fine-tuning performance; however, their impact is less researched in domain-specific fields such as finance. This work compares the performance of a set of financial BERT-like models, adapted with different parameter-efficient tuning methods, to their fully fine-tuned counterparts. We find that results are comparable to traditional fine-tuning while gaining in time and resource efficiency.
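To make the core idea concrete, the sketch below shows one common parameter-efficient tuning method, LoRA, applied to a BERT-like classifier via the Hugging Face peft library. It is a minimal illustration of freezing all pre-trained weights and training only a small set of injected parameters; the model name and hyperparameters are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal LoRA sketch, assuming the `transformers` and `peft` libraries.
# The backbone and hyperparameters below are placeholders, not the paper's config.
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

# Load a BERT-like backbone for a binary sequence classification task.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Inject small trainable low-rank matrices into the attention projections;
# all original pre-trained weights remain frozen.
config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,                                # rank of the low-rank update
    lora_alpha=16,                      # scaling factor for the update
    lora_dropout=0.1,
    target_modules=["query", "value"],  # which projections receive adapters
)
model = get_peft_model(model, config)

# Only the adapter parameters require gradients, so per-task storage reduces
# to the small adapter weights instead of a full copy of the model.
model.print_trainable_parameters()
```

After training, only the adapter weights need to be saved per task, which is what makes the storage cost negligible compared to keeping a fully fine-tuned copy of the model for every task.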