To Adapt or to Fine-tune: A Case Study on Abstractive Summarization

Zheng Zhao, Pinzhen Chen


Abstract
Recent advances in the field of abstractive summarization leverage pre-trained language models rather than train a model from scratch. However, such models are sluggish to train and accompanied by a massive overhead. Researchers have proposed a few lightweight alternatives such as smaller adapters to mitigate the drawbacks. Nonetheless, it remains uncertain whether using adapters benefits the task of summarization, in terms of improved efficiency without an unpleasant sacrifice in performance. In this work, we carry out multifaceted investigations on fine-tuning and adapters for summarization tasks with varying complexity: language, domain, and task transfer. In our experiments, fine-tuning a pre-trained language model generally attains a better performance than using adapters; the performance gap positively correlates with the amount of training data used. Notably, adapters exceed fine-tuning under extremely low-resource conditions. We further provide insights on multilinguality, model convergence, and robustness, hoping to shed light on the pragmatic choice of fine-tuning or adapters in abstractive summarization.
Anthology ID:
2022.ccl-1.73
Volume:
Proceedings of the 21st Chinese National Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Nanchang, China
Editors:
Maosong Sun (孙茂松), Yang Liu (刘洋), Wanxiang Che (车万翔), Yang Feng (冯洋), Xipeng Qiu (邱锡鹏), Gaoqi Rao (饶高琦), Yubo Chen (陈玉博)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
824–835
Language:
English
URL:
https://aclanthology.org/2022.ccl-1.73
Cite (ACL):
Zheng Zhao and Pinzhen Chen. 2022. To Adapt or to Fine-tune: A Case Study on Abstractive Summarization. In Proceedings of the 21st Chinese National Conference on Computational Linguistics, pages 824–835, Nanchang, China. Chinese Information Processing Society of China.
Cite (Informal):
To Adapt or to Fine-tune: A Case Study on Abstractive Summarization (Zhao & Chen, CCL 2022)
PDF:
https://preview.aclanthology.org/naacl24-info/2022.ccl-1.73.pdf
Code
zsquaredz/adapt_vs_finetune
Data
BookSum, CNN/Daily Mail, NCLS, WikiLingua, XL-Sum
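Example (illustrative):
The abstract contrasts full fine-tuning of a pre-trained model with training small adapters. The sketch below is not the authors' implementation (that is in zsquaredz/adapt_vs_finetune); it is a minimal, hypothetical PyTorch illustration of the difference in trainable parameters, where the Adapter class, count_trainable helper, hidden size, and bottleneck size are all illustrative choices.

import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, plus a residual."""
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

def count_trainable(model: nn.Module) -> int:
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Toy stand-in for one layer of a pre-trained summarization model.
hidden = 768
base_layer = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, hidden))

# Full fine-tuning: every base parameter stays trainable.
print("full fine-tuning params:", count_trainable(base_layer))

# Adapter tuning: freeze the base layer and train only the small adapter.
for p in base_layer.parameters():
    p.requires_grad = False
adapter_tuned = nn.Sequential(base_layer, Adapter(hidden))
print("adapter-tuned params:", count_trainable(adapter_tuned))

Running this prints roughly 1.2M trainable parameters for the full layer versus about 0.1M for the adapter-only setup, which is the efficiency trade-off the paper investigates.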