Unsupervised Keyphrase Extraction via Interpretable Neural Networks
Rishabh Joshi, Vidhisha Balachandran, Emily Saldanha, Maria Glenski, Svitlana Volkova, Yulia Tsvetkov
Abstract
Keyphrase extraction aims at automatically extracting a list of "important" phrases representing the key concepts in a document. Prior approaches for unsupervised keyphrase extraction resorted to heuristic notions of phrase importance via embedding clustering or graph centrality, requiring extensive domain expertise. Our work presents a simple alternative approach which defines keyphrases as document phrases that are salient for predicting the topic of the document. To this end, we propose INSPECT—an approach that uses self-explaining models for identifying influential keyphrases in a document by measuring the predictive impact of input phrases on the downstream task of document topic classification. We show that this novel method not only alleviates the need for ad-hoc heuristics but also achieves state-of-the-art results in unsupervised keyphrase extraction on four datasets across two domains: scientific publications and news articles.
- Anthology ID:
- 2023.findings-eacl.82
- Volume:
- Findings of the Association for Computational Linguistics: EACL 2023
- Month:
- May
- Year:
- 2023
- Address:
- Dubrovnik, Croatia
- Editors:
- Andreas Vlachos, Isabelle Augenstein
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1107–1119
- URL:
- https://aclanthology.org/2023.findings-eacl.82
- DOI:
- 10.18653/v1/2023.findings-eacl.82
- Cite (ACL):
- Rishabh Joshi, Vidhisha Balachandran, Emily Saldanha, Maria Glenski, Svitlana Volkova, and Yulia Tsvetkov. 2023. Unsupervised Keyphrase Extraction via Interpretable Neural Networks. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1107–1119, Dubrovnik, Croatia. Association for Computational Linguistics.
- Cite (Informal):
- Unsupervised Keyphrase Extraction via Interpretable Neural Networks (Joshi et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/naacl24-info/2023.findings-eacl.82.pdf
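The abstract's core idea, scoring candidate phrases by how much they influence a topic classifier's prediction, can be illustrated with a toy occlusion-style sketch. This is not the authors' INSPECT method (which uses self-explaining models); it is a minimal stand-in where `topic_score` is a hypothetical bag-of-words topic classifier and a phrase's importance is the drop in the topic score when its tokens are masked out.

```python
# Hypothetical sketch: rank candidate phrases by their predictive
# impact on a document-topic classifier (occlusion-based proxy for
# the salience idea described in the abstract; not the paper's model).
from collections import Counter

# Toy "topic classifier": density of machine-learning keywords.
ML_WORDS = {"neural", "network", "training", "gradient"}

def topic_score(tokens):
    counts = Counter(tokens)
    hits = sum(counts[w] for w in ML_WORDS)
    return hits / max(len(tokens), 1)

def phrase_importance(doc_tokens, phrase_tokens):
    """Drop in the topic score when the phrase's tokens are occluded."""
    base = topic_score(doc_tokens)
    masked = [t for t in doc_tokens if t not in set(phrase_tokens)]
    return base - topic_score(masked)

doc = "training a neural network requires gradient descent on data".split()
candidates = [["neural", "network"], ["gradient", "descent"], ["data"]]
ranked = sorted(candidates,
                key=lambda p: phrase_importance(doc, p),
                reverse=True)
```

Phrases whose removal most reduces the classifier's confidence in the topic rank highest; here `["neural", "network"]` ranks above `["data"]`, mirroring the intuition that topic-salient phrases make good keyphrases.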