Tree-Structured Neural Topic Model

Masaru Isonuma, Junichiro Mori, Danushka Bollegala, Ichiro Sakata


Abstract
This paper presents a tree-structured neural topic model, which has a topic distribution over a tree with an infinite number of branches. Our model parameterizes an unbounded ancestral and fraternal topic distribution by applying doubly-recurrent neural networks. With the help of autoencoding variational Bayes, our model improves data scalability and achieves competitive performance when inducing latent topics and tree structures, as compared to a prior tree-structured topic model (Blei et al., 2010). This work extends the tree-structured topic model such that it can be incorporated into neural models for downstream tasks.
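To make the doubly-recurrent parameterization concrete, below is a minimal sketch, assuming PyTorch; it is not the authors' released code, and all class and variable names are hypothetical. An ancestral GRU cell carries state from a parent topic to its children, a fraternal GRU cell carries state from an elder sibling to the next, and each node's combined state is mapped to a topic-word distribution. The tree is fixed here for brevity, whereas the model in the paper grows and prunes branches dynamically.

```python
# Hypothetical sketch of a doubly-recurrent network over a topic tree.
import torch
import torch.nn as nn

class DoublyRecurrentTopicTree(nn.Module):
    def __init__(self, hidden_dim, vocab_size):
        super().__init__()
        self.ancestral = nn.GRUCell(hidden_dim, hidden_dim)  # parent -> child state
        self.fraternal = nn.GRUCell(hidden_dim, hidden_dim)  # sibling -> sibling state
        self.combine = nn.Linear(2 * hidden_dim, hidden_dim)
        self.to_words = nn.Linear(hidden_dim, vocab_size)    # topic-word logits
        self.root = nn.Parameter(torch.randn(hidden_dim))    # root topic embedding

    def node_state(self, parent_h, sibling_h, x):
        # Fuse the ancestral and fraternal recurrences into one node embedding.
        a = self.ancestral(x, parent_h)
        f = self.fraternal(x, sibling_h)
        return torch.tanh(self.combine(torch.cat([a, f], dim=-1)))

    def forward(self, children_per_node):
        # children_per_node: dict mapping node id -> number of children.
        states = {0: self.root.unsqueeze(0)}       # node 0 is the root topic
        next_id, frontier = 1, [0]
        while frontier:                            # breadth-first tree expansion
            node = frontier.pop(0)
            sibling_h = torch.zeros_like(states[node])  # first child: no elder sibling
            for _ in range(children_per_node.get(node, 0)):
                x = states[node]                   # input: parent topic embedding
                h = self.node_state(states[node], sibling_h, x)
                states[next_id] = h
                sibling_h = h                      # becomes the elder sibling
                frontier.append(next_id)
                next_id += 1
        # One word distribution per tree node (topic).
        all_states = torch.cat(list(states.values()), dim=0)
        return torch.softmax(self.to_words(all_states), dim=-1)

# Usage: a root with 3 children, each child with 2 children -> 1 + 3 + 6 = 10 topics.
model = DoublyRecurrentTopicTree(hidden_dim=32, vocab_size=1000)
beta = model({0: 3, 1: 2, 2: 2, 3: 2})
print(beta.shape)  # torch.Size([10, 1000])
```

In the paper itself, a document's proportions over these topics are then obtained via stick-breaking constructions over tree paths and levels, which is what allows the number of branches to remain unbounded.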
Anthology ID: 2020.acl-main.73
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 800–806
URL: https://aclanthology.org/2020.acl-main.73
DOI: 10.18653/v1/2020.acl-main.73
Cite (ACL): Masaru Isonuma, Junichiro Mori, Danushka Bollegala, and Ichiro Sakata. 2020. Tree-Structured Neural Topic Model. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 800–806, Online. Association for Computational Linguistics.
Cite (Informal): Tree-Structured Neural Topic Model (Isonuma et al., ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.73.pdf
Video: http://slideslive.com/38928818