Domain Adaptation of Foundation LLMs for e-Commerce
Christian Herold, Michael Kozielski, Tala Bazazo, Pavel Petrushkov, Yannick Versley, Seyyed Hadi Hashemi, Patrycja Cieplicka, Dominika Basaj, Shahram Khadivi
Abstract
We present the e-Llama models: 8 billion and 70 billion parameter large language models that are adapted towards the e-commerce domain. These models are meant as foundation models with deep knowledge about e-commerce that form a base for instruction- and fine-tuning. The e-Llama models are obtained by continuously pretraining the Llama 3.1 base models on 1 trillion tokens of domain-specific data. We discuss our approach and motivate our choice of hyperparameters with a series of ablation studies. To quantify how well the models have been adapted to the e-commerce domain, we define and implement a set of multilingual, e-commerce-specific evaluation tasks. We show that, when carefully choosing the training setup, the Llama 3.1 models can be adapted towards the new domain without sacrificing significant performance on general-domain tasks. We also explore the possibility of merging the adapted model and the base model for better control of the performance trade-off between domains.
- Anthology ID:
- 2025.acl-industry.74
- Volume:
- Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track)
- Month:
- July
- Year:
- 2025
- Address:
- Vienna, Austria
- Editors:
- Georg Rehm, Yunyao Li
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1039–1049
- URL:
- https://preview.aclanthology.org/landing_page/2025.acl-industry.74/
- Cite (ACL):
- Christian Herold, Michael Kozielski, Tala Bazazo, Pavel Petrushkov, Yannick Versley, Seyyed Hadi Hashemi, Patrycja Cieplicka, Dominika Basaj, and Shahram Khadivi. 2025. Domain Adaptation of Foundation LLMs for e-Commerce. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track), pages 1039–1049, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal):
- Domain Adaptation of Foundation LLMs for e-Commerce (Herold et al., ACL 2025)
- PDF:
- https://preview.aclanthology.org/landing_page/2025.acl-industry.74.pdf
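The abstract mentions merging the adapted model and the base model to control the performance trade-off between the e-commerce and general domains. A common way to realize such a merge is linear interpolation of the two models' weights; the sketch below illustrates that idea on plain Python state dicts. The linear form, the function name, and the `alpha` parameter are assumptions for illustration, not the paper's exact method.

```python
def merge_state_dicts(base, adapted, alpha=0.5):
    """Linearly interpolate two model state dicts, parameter by parameter:
    merged = (1 - alpha) * base + alpha * adapted.

    alpha=0.0 recovers the base (general-domain) model, alpha=1.0 the
    fully adapted (e-commerce) model; values in between trade off the two.
    Weights are represented here as flat lists of floats per parameter name;
    with a real checkpoint the same arithmetic would apply to tensors.
    """
    # Merging only makes sense when both models share the same architecture,
    # i.e. identical parameter names and shapes.
    assert base.keys() == adapted.keys(), "models must share architecture"
    return {
        name: [(1.0 - alpha) * b + alpha * a
               for b, a in zip(base[name], adapted[name])]
        for name in base
    }


# Toy usage: two "models" with a single two-element parameter.
base_model = {"w": [0.0, 2.0]}
adapted_model = {"w": [2.0, 4.0]}
merged = merge_state_dicts(base_model, adapted_model, alpha=0.5)
print(merged)  # halfway between the two models
```

In practice the merge ratio would be chosen by evaluating several `alpha` values on both the domain-specific and the general-domain benchmarks described in the paper.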