TAN-NTM: Topic Attention Networks for Neural Topic Modeling

Madhur Panwar, Shashank Shailabh, Milan Aggarwal, Balaji Krishnamurthy


Abstract
Topic models have been widely used to learn text representations and gain insight into document corpora. To perform topic discovery, most existing neural models take either a document bag-of-words (BoW) or a sequence of tokens as input, followed by variational inference and BoW reconstruction to learn the topic-word distribution. However, leveraging the topic-word distribution to learn better features during document encoding has not been explored much. To this end, we develop a framework, TAN-NTM, which processes a document as a sequence of tokens through an LSTM whose contextual outputs are attended in a topic-aware manner. We propose a novel attention mechanism which factors in the topic-word distribution to enable the model to attend to relevant words that convey topic-related cues. The output of the topic attention module is then used to carry out variational inference. We perform extensive ablations and experiments, achieving a ~9-15 percentage-point improvement over the NPMI coherence scores of existing SOTA topic models on several benchmark datasets: 20Newsgroups, Yelp Review Polarity and AGNews. Further, we show that our method learns better latent document-topic features than existing topic models, improving performance on two downstream tasks: document classification and topic-guided keyphrase generation.
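The central idea, biasing attention over LSTM hidden states with cues from the topic-word distribution, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function name, the use of the per-token maximum over topics as a salience term, and the query vector `w` are all assumptions for illustration.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

def topic_attention(H, token_ids, beta, w):
    """Hypothetical topic-aware attention sketch (not the paper's exact mechanism).

    H:         (T, d) LSTM contextual outputs for T tokens
    token_ids: (T,)   vocabulary index of each token
    beta:      (K, V) topic-word distribution over K topics, vocab size V
    w:         (d,)   learned attention query vector (illustrative)
    Returns a (d,) attended document encoding.
    """
    # topic salience of each token: its highest probability under any topic
    salience = beta[:, token_ids].max(axis=0)            # (T,)
    # combine content-based scores with topic cues
    scores = H @ w + np.log(salience + 1e-9)             # (T,)
    alpha = softmax(scores)                              # attention weights
    return alpha @ H                                     # weighted sum of states
```

In TAN-NTM the resulting attended encoding is what feeds the variational inference step, so tokens that carry strong topic cues contribute more to the latent document-topic representation.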
Anthology ID:
2021.acl-long.299
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
3865–3880
URL:
https://aclanthology.org/2021.acl-long.299
DOI:
10.18653/v1/2021.acl-long.299
Cite (ACL):
Madhur Panwar, Shashank Shailabh, Milan Aggarwal, and Balaji Krishnamurthy. 2021. TAN-NTM: Topic Attention Networks for Neural Topic Modeling. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3865–3880, Online. Association for Computational Linguistics.
Cite (Informal):
TAN-NTM: Topic Attention Networks for Neural Topic Modeling (Panwar et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.acl-long.299.pdf
Optional supplementary material:
 2021.acl-long.299.OptionalSupplementaryMaterial.zip
Video:
 https://preview.aclanthology.org/emnlp-22-attachments/2021.acl-long.299.mp4
Data
AG News | Yelp Review Polarity