Adapting General-Purpose Embedding Models to Private Datasets Using Keyword-based Retrieval

Yubai Wei, Jiale Han, Yi Yang


Abstract
Text embedding models are a cornerstone of AI applications such as retrieval-augmented generation (RAG). While general-purpose text embedding models demonstrate strong performance on generic retrieval benchmarks, their effectiveness diminishes when applied to private datasets (e.g., company-specific proprietary data), which often contain specialized terminology and lingo. In this work, we introduce BMEmbed, a novel method for adapting general-purpose text embedding models to private datasets. By leveraging the well-established keyword-based retrieval technique BM25, we construct supervisory signals from the ranking of keyword-based retrieval results to facilitate model adaptation. We evaluate BMEmbed across a range of domains, datasets, and models, showing consistent improvements in retrieval performance. Moreover, we provide empirical insights into how BM25-based signals improve embeddings by fostering alignment and uniformity, highlighting the value of this approach in adapting models to domain-specific data. We release the source code for the research community.
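The core idea in the abstract is to turn BM25 rankings over an unlabeled private corpus into supervision for embedding adaptation. The sketch below illustrates that idea in a minimal, self-contained form: a plain-Python BM25 scorer ranks documents for a query, and the ranking is converted into a pseudo-positive and pseudo-negatives of the kind one could feed to a contrastive fine-tuning loss. The corpus, query, and pair-construction heuristic are illustrative assumptions, not the paper's exact pipeline.

```python
import math
from collections import Counter

def bm25_scores(query_tokens, docs_tokens, k1=1.5, b=0.75):
    """Score each document against the query with Okapi BM25."""
    N = len(docs_tokens)
    avgdl = sum(len(d) for d in docs_tokens) / N
    # document frequency of each term across the corpus
    df = Counter()
    for d in docs_tokens:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs_tokens:
        tf = Counter(d)
        s = 0.0
        for t in query_tokens:
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            )
        scores.append(s)
    return scores

# Hypothetical private corpus with domain-specific lingo (illustrative only).
corpus = [
    "checkout service latency SLO breach during peak traffic",
    "quarterly revenue report for the finance team",
    "gpu kernel scheduling notes for the inference cluster",
]
docs_tokens = [doc.lower().split() for doc in corpus]
query = "checkout latency"
scores = bm25_scores(query.lower().split(), docs_tokens)

# Rank documents by BM25 score; treat the top hit as a pseudo-positive and
# lower-ranked documents as negatives -- the kind of ranking-derived
# supervision a contrastive adaptation loss could consume.
ranked = sorted(range(len(corpus)), key=lambda i: -scores[i])
positive = corpus[ranked[0]]
negatives = [corpus[i] for i in ranked[1:]]
print(positive)  # the document sharing the query's keywords ranks first
```

In a full adaptation loop, such (query, positive, negatives) triples would drive a standard contrastive objective over the embedding model; the value of the BM25 signal is that it requires no labeled relevance data from the private corpus.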
Anthology ID:
2025.findings-acl.357
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6856–6870
URL:
https://preview.aclanthology.org/landing_page/2025.findings-acl.357/
Cite (ACL):
Yubai Wei, Jiale Han, and Yi Yang. 2025. Adapting General-Purpose Embedding Models to Private Datasets Using Keyword-based Retrieval. In Findings of the Association for Computational Linguistics: ACL 2025, pages 6856–6870, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Adapting General-Purpose Embedding Models to Private Datasets Using Keyword-based Retrieval (Wei et al., Findings 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.findings-acl.357.pdf