Adaptive Mixed Component LDA for Low Resource Topic Modeling

Suzanna Sia, Kevin Duh


Abstract
Probabilistic topic models in low-resource scenarios face less reliable estimates due to the sparsity of discrete word co-occurrence counts, and do not have the luxury of retraining word or topic embeddings with neural methods. In this challenging resource-constrained setting, we explore mixture models which interpolate between discrete and continuous topic-word distributions that utilise pre-trained embeddings to improve topic coherence. We introduce an automatic trade-off between the discrete and continuous representations via an adaptive mixture coefficient, which places greater weight on the discrete representation when the corpus statistics are more reliable. The adaptive mixture coefficient takes into account global corpus statistics and the uncertainty in each topic's continuous distribution. Our approach outperforms the fully discrete, fully continuous, and static mixture models on topic coherence in low-resource settings. We additionally demonstrate the generalisability of our method by extending it to handle multilingual document collections.
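The interpolation described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function names, the smoothing constant `beta`, and the `scale`-based form of the adaptive coefficient are all assumptions; the paper derives its coefficient from global corpus statistics and per-topic uncertainty.

```python
import numpy as np

def topic_word_distribution(counts, topic_vec, word_embs, lam, beta=0.01):
    """Interpolate a count-based (discrete) and an embedding-based
    (continuous) distribution over the vocabulary for one topic.
    Sketch only; names and smoothing are hypothetical."""
    discrete = (counts + beta) / (counts + beta).sum()   # smoothed co-occurrence estimate
    logits = word_embs @ topic_vec                       # similarity to a topic vector
    continuous = np.exp(logits - logits.max())           # numerically stable softmax
    continuous /= continuous.sum()
    return lam * discrete + (1.0 - lam) * continuous

def adaptive_lambda(counts, scale=100.0):
    """Hypothetical adaptive mixture coefficient: weight the discrete
    estimate more heavily as the observed count mass grows."""
    n = counts.sum()
    return n / (n + scale)

# Toy low-resource setup with stand-in pre-trained embeddings.
rng = np.random.default_rng(0)
vocab, dim = 50, 8
counts = rng.integers(0, 5, size=vocab).astype(float)    # sparse counts
word_embs = rng.normal(size=(vocab, dim))                # pre-trained embeddings (stand-in)
topic_vec = rng.normal(size=dim)

lam = adaptive_lambda(counts)
dist = topic_word_distribution(counts, topic_vec, word_embs, lam)
```

Because both components are valid probability distributions, any convex combination of them is too, so `dist` sums to one regardless of how the coefficient is chosen.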
Anthology ID:
2021.eacl-main.209
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2451–2469
URL:
https://aclanthology.org/2021.eacl-main.209
DOI:
10.18653/v1/2021.eacl-main.209
Bibkey:
Cite (ACL):
Suzanna Sia and Kevin Duh. 2021. Adaptive Mixed Component LDA for Low Resource Topic Modeling. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 2451–2469, Online. Association for Computational Linguistics.
Cite (Informal):
Adaptive Mixed Component LDA for Low Resource Topic Modeling (Sia & Duh, EACL 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2021.eacl-main.209.pdf
Code
suzyahyah/adaptive_mixture_topic_model