Investigating Adapters for Parameter-efficient Low-resource Automatic Speech Recognition
Ahnaf Mozib Samin, Shekhar Nayak, Andrea De Marco, Claudia Borg
Abstract
Recent years have witnessed the adoption of parameter-efficient adapters in pre-trained language models for natural language processing. Yet, their application in speech processing remains less studied. In this work, we explore adapters for low-resource speech recognition, introducing a novel technique, ConvAdapt, into pre-trained speech models. We investigate various aspects such as data requirements, transfer learning within adapters, and scaling of feed-forward layers in adapters. Our findings reveal that bottleneck adapters are competitive with full fine-tuning given at least 10 hours of data, but they are not as effective in few-shot learning scenarios. Notably, ConvAdapt demonstrates improved performance in such cases. In addition, transfer learning in adapters shows promise, motivating further research on related languages. Furthermore, adapter-tuning larger speech models surpasses fine-tuning when ample data is available, potentially due to reduced overfitting compared to full fine-tuning.
- Anthology ID:
- 2025.repl4nlp-1.8
- Volume:
- Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025)
- Month:
- May
- Year:
- 2025
- Address:
- Albuquerque, NM
- Editors:
- Vaibhav Adlakha, Alexandra Chronopoulou, Xiang Lorraine Li, Bodhisattwa Prasad Majumder, Freda Shi, Giorgos Vernikos
- Venues:
- RepL4NLP | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 100–107
- URL:
- https://preview.aclanthology.org/moar-dois/2025.repl4nlp-1.8/
- DOI:
- 10.18653/v1/2025.repl4nlp-1.8
- Cite (ACL):
- Ahnaf Mozib Samin, Shekhar Nayak, Andrea De Marco, and Claudia Borg. 2025. Investigating Adapters for Parameter-efficient Low-resource Automatic Speech Recognition. In Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025), pages 100–107, Albuquerque, NM. Association for Computational Linguistics.
- Cite (Informal):
- Investigating Adapters for Parameter-efficient Low-resource Automatic Speech Recognition (Samin et al., RepL4NLP 2025)
- PDF:
- https://preview.aclanthology.org/moar-dois/2025.repl4nlp-1.8.pdf
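The bottleneck adapters discussed in the abstract follow a standard recipe: a small module inserted into a frozen pre-trained model that projects the hidden state down to a low dimension, applies a nonlinearity, projects back up, and adds a residual connection. The sketch below illustrates that recipe in plain NumPy under assumed dimensions (model width 8, bottleneck width 2); it is not the paper's implementation, and the paper's ConvAdapt variant is not reproduced here since its exact design is not described in this page.

```python
import numpy as np

def bottleneck_adapter(x, w_down, w_up):
    """Standard bottleneck adapter: down-project, ReLU, up-project, residual add.

    Only w_down and w_up are trained; the surrounding model stays frozen,
    which is what makes adapter-tuning parameter-efficient.
    """
    h = np.maximum(x @ w_down, 0.0)  # down-projection + ReLU
    return x + h @ w_up              # up-projection + residual connection

# Hypothetical dimensions: batch of 3 frames, model dim 8, bottleneck dim 2.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8))
w_down = rng.standard_normal((8, 2)) * 0.01
w_up = np.zeros((2, 8))  # zero-init up-projection: adapter starts as identity

print(np.allclose(bottleneck_adapter(x, w_down, w_up), x))  # True at initialization
```

Zero-initializing the up-projection makes the adapter an identity map at the start of training, so inserting it does not perturb the pre-trained model's behavior before any low-resource data is seen.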