Abstract
Neural Topic Models (NTMs) and Large Language Models (LLMs) have emerged as prominent areas of research in natural language processing. Yet NTMs that draw on LLMs typically use contextual embeddings, which are neither optimal for clustering nor capable of topic generation. Our study addresses this gap by introducing a novel framework, Diffusion-Enhanced Topic Modeling using Encoder-Decoder-based LLMs (DeTiME). DeTiME leverages encoder-decoder LLMs to produce highly clusterable embeddings, yielding topics with both superior clusterability and enhanced semantic coherence compared to existing methods. By exploiting the power of diffusion, the framework can also generate content relevant to the identified topics, so users can efficiently produce highly clustered topics and related content simultaneously. DeTiME can further be used to generate clustered embeddings on their own. Notably, the framework is efficient to train and highly adaptable, demonstrating its potential for a wide array of applications.
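As a concrete illustration of the embedding-and-clustering idea the abstract describes, the sketch below shows one way to pool encoder states from an encoder-decoder LLM and cluster the resulting document embeddings. This is not the authors' released code: the checkpoint (google/flan-t5-base), mean pooling, and the k-means cluster count are illustrative assumptions.

```python
# Hypothetical sketch: encoder embeddings from an encoder-decoder LLM,
# then k-means clustering as a stand-in for the topic-assignment step.
# Model name, mean pooling, and n_clusters are illustrative assumptions.
import torch
from transformers import AutoTokenizer, T5EncoderModel
from sklearn.cluster import KMeans

docs = [
    "The central bank raised interest rates again this quarter.",
    "A new vaccine trial reported strong immune responses.",
    "Midfield pressing decided the championship final.",
]

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
encoder = T5EncoderModel.from_pretrained("google/flan-t5-base")
encoder.eval()

batch = tokenizer(docs, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**batch).last_hidden_state  # (batch, seq_len, dim)

# Mean-pool token states, ignoring padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Cluster the pooled embeddings; each cluster plays the role of a topic.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    embeddings.numpy()
)
print(labels)
```

The paper's contribution goes beyond this baseline (notably the diffusion-based generation of topic-relevant content), but the pooled-encoder route shown here is a common starting point for obtaining clusterable document embeddings from such models.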
- Anthology ID: 2023.findings-emnlp.606
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 9040–9057
- URL: https://aclanthology.org/2023.findings-emnlp.606
- DOI: 10.18653/v1/2023.findings-emnlp.606
- Cite (ACL): Weijie Xu, Wenxiang Hu, Fanyou Wu, and Srinivasan Sengamedu. 2023. DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 9040–9057, Singapore. Association for Computational Linguistics.
- Cite (Informal): DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM (Xu et al., Findings 2023)
- PDF: https://preview.aclanthology.org/naacl24-info/2023.findings-emnlp.606.pdf