SimCKP: Simple Contrastive Learning of Keyphrase Representations

Minseok Choi, Chaeheon Gwak, Seho Kim, Si Kim, Jaegul Choo


Abstract
Keyphrase generation (KG) aims to generate a set of summarizing words or phrases given a source document, while keyphrase extraction (KE) aims to identify them from the text. Because the search space is much smaller in KE, it is often combined with KG to predict keyphrases that may or may not exist in the corresponding document. However, current unified approaches adopt sequence labeling and maximization-based generation that primarily operate at a token level, falling short in observing and scoring keyphrases as a whole. In this work, we propose SimCKP, a simple contrastive learning framework that consists of two stages: 1) An extractor-generator that extracts keyphrases by learning context-aware phrase-level representations in a contrastive manner while also generating keyphrases that do not appear in the document; 2) A reranker that adapts scores for each generated phrase by likewise aligning their representations with the corresponding document. Experimental results on multiple benchmark datasets demonstrate the effectiveness of our proposed approach, which outperforms the state-of-the-art models by a significant margin.
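The abstract describes aligning phrase-level representations with the document representation in a contrastive manner. The sketch below is not the authors' implementation; it is a minimal, hypothetical illustration of an InfoNCE-style objective of the kind the abstract suggests, using toy list-based embeddings and an assumed `contrastive_loss` helper: gold-keyphrase embeddings are pulled toward the document embedding while other candidate phrases serve as in-batch negatives.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(doc_emb, phrase_embs, positive_ids, tau=0.1):
    """InfoNCE-style loss (illustrative): positives are gold keyphrases,
    all other candidate phrases act as negatives. `tau` is a temperature."""
    sims = [cosine(doc_emb, p) / tau for p in phrase_embs]
    m = max(sims)  # subtract max for numerical stability
    log_denom = m + math.log(sum(math.exp(s - m) for s in sims))
    # Average negative log-likelihood of the positive phrases.
    return sum(log_denom - sims[i] for i in positive_ids) / len(positive_ids)
```

With a document embedding `[1, 0]` and candidate phrases `[[1, 0], [0, 1], [-1, 0]]`, treating the aligned phrase (index 0) as positive yields a near-zero loss, while treating the anti-aligned phrase (index 2) as positive yields a large one, which is the behavior a contrastive ranking objective needs.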
Anthology ID:
2023.findings-emnlp.199
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3003–3015
URL:
https://aclanthology.org/2023.findings-emnlp.199
DOI:
10.18653/v1/2023.findings-emnlp.199
Cite (ACL):
Minseok Choi, Chaeheon Gwak, Seho Kim, Si Kim, and Jaegul Choo. 2023. SimCKP: Simple Contrastive Learning of Keyphrase Representations. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 3003–3015, Singapore. Association for Computational Linguistics.
Cite (Informal):
SimCKP: Simple Contrastive Learning of Keyphrase Representations (Choi et al., Findings 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.findings-emnlp.199.pdf