Understanding Cross-Domain Adaptation in Low-Resource Topic Modeling

Pritom Saha Akash, Kevin Chen-Chuan Chang


Abstract
Topic modeling plays a vital role in uncovering hidden semantic structures within text corpora, but existing models struggle in low-resource settings where limited target-domain data leads to unstable and incoherent topic inference. We address this challenge by formally introducing domain adaptation for low-resource topic modeling, where a high-resource source domain informs a low-resource target domain without overwhelming it with irrelevant content. We establish a finite-sample generalization bound showing that effective knowledge transfer depends on robust performance in both domains, minimizing latent-space discrepancy, and preventing overfitting to the data. Guided by these insights, we propose DALTA (Domain-Aligned Latent Topic Adaptation), a new framework that employs a shared encoder for domain-invariant features, specialized decoders for domain-specific nuances, and adversarial alignment to selectively transfer relevant information. Experiments on diverse low-resource datasets demonstrate that DALTA consistently outperforms state-of-the-art methods in terms of topic coherence, stability, and transferability.
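The abstract describes DALTA's architecture at a high level: a shared encoder producing domain-invariant topic proportions, with separate decoders per domain. The paper itself is not reproduced here, so the following is only a minimal illustrative sketch of that shape, not the authors' implementation: the vocabulary size, topic count, linear encoder/decoder parameterization, and domain names are all assumptions, and the adversarial alignment component is noted only in a comment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): vocabulary V, K topics.
V, K = 50, 5

# Shared encoder: one weight matrix mapping bag-of-words counts to
# topic logits, used for both domains (the domain-invariant part).
W_enc = rng.normal(scale=0.1, size=(V, K))

# Domain-specific decoders: each domain keeps its own topic-word matrix,
# so domain-specific nuances stay out of the shared latent space.
W_dec = {
    "source": rng.normal(scale=0.1, size=(K, V)),
    "target": rng.normal(scale=0.1, size=(K, V)),
}

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def encode(bow):
    """Topic proportions theta from the shared encoder (simplex-valued)."""
    return softmax(bow @ W_enc)

def decode(theta, domain):
    """Reconstruct a word distribution with the given domain's decoder."""
    return softmax(theta @ W_dec[domain])

# One toy document: the same theta feeds either domain's decoder.
# In the full framework, a domain discriminator would additionally be
# trained on theta (e.g. via gradient reversal) so that source and
# target encodings become indistinguishable; omitted here.
doc = rng.poisson(1.0, size=V).astype(float)
theta = encode(doc)
recon_src = decode(theta, "source")
recon_tgt = decode(theta, "target")
```

The split mirrors the bound sketched in the abstract: transfer flows only through the shared latent `theta`, while each decoder absorbs what is domain-specific.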
Anthology ID:
2025.acl-long.298
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5988–6001
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.298/
Cite (ACL):
Pritom Saha Akash and Kevin Chen-Chuan Chang. 2025. Understanding Cross-Domain Adaptation in Low-Resource Topic Modeling. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5988–6001, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Understanding Cross-Domain Adaptation in Low-Resource Topic Modeling (Akash & Chang, ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.298.pdf