@inproceedings{akash-chang-2025-understanding,
    title = "Understanding Cross-Domain Adaptation in Low-Resource Topic Modeling",
    author = "Akash, Pritom Saha  and
      Chang, Kevin Chen-Chuan",
    editor = "Che, Wanxiang  and
      Nabende, Joyce  and
      Shutova, Ekaterina  and
      Pilehvar, Mohammad Taher",
    booktitle = "Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.acl-long.298/",
    doi = "10.18653/v1/2025.acl-long.298",
    pages = "5988--6001",
    ISBN = "979-8-89176-251-0",
    abstract = "Topic modeling plays a vital role in uncovering hidden semantic structures within text corpora, but existing models struggle in low-resource settings where limited target-domain data leads to unstable and incoherent topic inference. We address this challenge by formally introducing domain adaptation for low-resource topic modeling, where a high-resource source domain informs a low-resource target domain without overwhelming it with irrelevant content. We establish a finite-sample generalization bound showing that effective knowledge transfer depends on robust performance in both domains, minimizing latent-space discrepancy, and preventing overfitting to the data. Guided by these insights, we propose DALTA (Domain-Aligned Latent Topic Adaptation), a new framework that employs a shared encoder for domain-invariant features, specialized decoders for domain-specific nuances, and adversarial alignment to selectively transfer relevant information. Experiments on diverse low-resource datasets demonstrate that DALTA consistently outperforms state-of-the-art methods in terms of topic coherence, stability, and transferability."
}