Adaptation of Large Language Models

Zixuan Ke, Yifei Ming, Shafiq Joty


Abstract
This tutorial on the adaptation of Large Language Models (LLMs) addresses the growing demand for models that go beyond the static capabilities of generic LLMs by providing an overview of dynamic, domain-specific, and task-adaptive adaptation techniques. While general LLMs have demonstrated strong generalization across a variety of tasks, they often struggle in specialized domains such as finance, healthcare, and code generation for underrepresented languages. Moreover, their static nature limits their ability to evolve with a changing world, and their often extreme size makes them impractical and costly to deploy at scale. As a result, LLM adaptation has drawn considerable attention since the advent of LLMs and is of core importance, both for industry, which focuses on serving its targeted users, and for academia, which can greatly benefit from small but powerful LLMs.
Anthology ID:
2025.naacl-tutorial.5
Volume:
Proceedings of the 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 5: Tutorial Abstracts)
Month:
May
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Maria Lomeli, Swabha Swayamdipta, Rui Zhang
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
30–37
URL:
https://preview.aclanthology.org/landing_page/2025.naacl-tutorial.5/
Cite (ACL):
Zixuan Ke, Yifei Ming, and Shafiq Joty. 2025. Adaptation of Large Language Models. In Proceedings of the 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 5: Tutorial Abstracts), pages 30–37, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Adaptation of Large Language Models (Ke et al., NAACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.naacl-tutorial.5.pdf