Abstract
Neural topic models can augment or replace bag-of-words inputs with the learned representations of deep pre-trained transformer-based word prediction models. One added benefit when using representations from multilingual models is that they facilitate zero-shot polylingual topic modeling. However, while it has been widely observed that pre-trained embeddings should be fine-tuned to a given task, it is not immediately clear what supervision should look like for an unsupervised task such as topic modeling. Thus, we propose several methods for fine-tuning encoders to improve both monolingual and zero-shot polylingual neural topic modeling. We consider fine-tuning on auxiliary tasks, constructing a new topic classification task, integrating the topic classification objective directly into topic model training, and continued pre-training. We find that fine-tuning encoder representations on topic classification and integrating the topic classification task directly into topic modeling improves topic quality, and that fine-tuning encoder representations on any task is the most important factor for facilitating cross-lingual transfer.
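To make the fine-tuning step the abstract describes more concrete, here is a minimal sketch (not the authors' code; see the repository linked under "Code" for that) of fine-tuning a multilingual encoder on a coarse topic classification task, after which the encoder's document representations can be fed to a neural topic model. It assumes a Hugging Face transformers environment; the model name, label count, and the toy `docs`/`labels` are placeholders, not the paper's actual setup.

```python
# Hypothetical sketch: fine-tune a multilingual encoder on topic classification,
# then reuse its document embeddings as contextualized input to a topic model.
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "bert-base-multilingual-cased"  # placeholder multilingual encoder
NUM_TOPIC_LABELS = 4  # e.g., MLDoc's four coarse document categories

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=NUM_TOPIC_LABELS)

# Placeholder data: raw documents paired with coarse topic label ids.
docs = ["Stocks rallied after the earnings report.",
        "The match ended in a penalty shootout."]
labels = [1, 3]

class TopicClsDataset(torch.utils.data.Dataset):
    """Wraps tokenized documents and topic labels for the Trainer."""
    def __init__(self, docs, labels):
        self.enc = tokenizer(docs, truncation=True, padding=True, max_length=128)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ft-topic-cls", num_train_epochs=3),
    train_dataset=TopicClsDataset(docs, labels),
)
trainer.train()

# After fine-tuning, pooled or mean-pooled hidden states of the encoder serve
# as the contextualized document representations consumed by the topic model.
```

For the topic-modeling side, the contextualized-topic-models package linked under "Code" provides the data preparation and model classes that consume such embeddings; the exact auxiliary tasks and the integration of the classification objective into topic model training are detailed in the paper itself.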
- Anthology ID: 2021.naacl-main.243
- Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
- Month: June
- Year: 2021
- Address: Online
- Editors: Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 3054–3068
- URL: https://aclanthology.org/2021.naacl-main.243
- DOI: 10.18653/v1/2021.naacl-main.243
- Cite (ACL): Aaron Mueller and Mark Dredze. 2021. Fine-tuning Encoders for Improved Monolingual and Zero-shot Polylingual Neural Topic Modeling. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3054–3068, Online. Association for Computational Linguistics.
- Cite (Informal): Fine-tuning Encoders for Improved Monolingual and Zero-shot Polylingual Neural Topic Modeling (Mueller & Dredze, NAACL 2021)
- PDF: https://aclanthology.org/2021.naacl-main.243.pdf
- Code: aaronmueller/contextualized-topic-models
- Data: MLDoc, MultiNLI, SNLI