Abstract
Neural topic models have been widely used to extract common topics across documents. Recently, contrastive learning has been applied to variational autoencoder-based neural topic models, achieving promising results. However, due to the unidirectional structure of the variational autoencoder, the contrastive loss enhances the encoder rather than the decoder, leading to a gap between model training and evaluation. To address this limitation, we propose a novel neural topic modeling framework based on cycle adversarial training and contrastive learning that applies contrastive learning to the generator directly. Specifically, a self-supervised contrastive loss is proposed to make the generator capture similar topic information, which leads to better topic-word distributions. Meanwhile, a discriminative contrastive loss is proposed to cooperate with the self-supervised contrastive loss, balancing generation and discrimination. Moreover, based on the reconstruction ability of the cycle generative adversarial network, a novel data augmentation strategy is designed and applied to the topic distribution directly. Experiments conducted on four benchmark datasets show that the proposed approach outperforms competitive baselines.
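To make the contrastive objective concrete, below is a minimal, hypothetical sketch (not the authors' released code) of an InfoNCE-style contrastive loss over topic distributions: each document's topic vector is paired with a cycle-reconstructed counterpart as the positive view, while the other documents in the batch serve as negatives. The function name `topic_contrastive_loss`, the argument `theta_aug`, and the temperature value are illustrative assumptions; the paper's actual objective and augmentation pipeline may differ.

```python
import torch
import torch.nn.functional as F

def topic_contrastive_loss(theta: torch.Tensor,
                           theta_aug: torch.Tensor,
                           temperature: float = 0.5) -> torch.Tensor:
    """theta, theta_aug: (batch, num_topics) topic distributions.

    theta_aug is assumed to be the augmented view obtained via cycle
    reconstruction (document -> topic -> document -> topic).
    """
    z1 = F.normalize(theta, dim=-1)
    z2 = F.normalize(theta_aug, dim=-1)
    # Pairwise cosine similarities, scaled by temperature: (batch, batch).
    logits = z1 @ z2.t() / temperature
    # Diagonal entries are positive pairs; off-diagonals act as negatives.
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

# Minimal usage with random stand-in topic vectors:
theta = torch.softmax(torch.randn(8, 20), dim=-1)
theta_aug = torch.softmax(theta.log() + 0.1 * torch.randn_like(theta), dim=-1)
loss = topic_contrastive_loss(theta, theta_aug)
```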
- Anthology ID:
- 2023.findings-acl.616
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2023
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 9720–9731
- URL:
- https://aclanthology.org/2023.findings-acl.616
- Cite (ACL):
- Boyu Wang, Linhai Zhang, Deyu Zhou, Yi Cao, and Jiandong Ding. 2023. Neural Topic Modeling based on Cycle Adversarial Training and Contrastive Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9720–9731, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Neural Topic Modeling based on Cycle Adversarial Training and Contrastive Learning (Wang et al., Findings 2023)
- PDF:
- https://aclanthology.org/2023.findings-acl.616.pdf