McPhraSy: Multi-Context Phrase Similarity and Clustering
Amir Cohen, Hila Gonen, Ori Shapira, Ran Levy, Yoav Goldberg
Abstract
Phrase similarity is a key component of many NLP applications. Current phrase similarity methods focus on embedding the phrase itself and use the phrase context only during training of the pretrained model. To better leverage the information in the context, we propose McPhraSy (Multi-context Phrase Similarity), a novel algorithm for estimating the similarity of phrases based on multiple contexts. At inference time, McPhraSy represents each phrase by considering multiple contexts in which it appears and computes the similarity of two phrases by aggregating the pairwise similarities between the contexts of the phrases. Incorporating context during inference enables McPhraSy to outperform current state-of-the-art models on two phrase similarity datasets by up to 13.3%. Finally, we also present a new downstream task that relies on phrase similarity – keyphrase clustering – and create a new benchmark for it in the product reviews domain. We show that McPhraSy surpasses all other baselines for this task.
- Anthology ID:
- 2022.findings-emnlp.259
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2022
- Month:
- December
- Year:
- 2022
- Address:
- Abu Dhabi, United Arab Emirates
- Editors:
- Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue:
- Findings
- Association for Computational Linguistics
- Pages:
- 3538–3550
- URL:
- https://aclanthology.org/2022.findings-emnlp.259
- DOI:
- 10.18653/v1/2022.findings-emnlp.259
- Cite (ACL):
- Amir Cohen, Hila Gonen, Ori Shapira, Ran Levy, and Yoav Goldberg. 2022. McPhraSy: Multi-Context Phrase Similarity and Clustering. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 3538–3550, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal):
- McPhraSy: Multi-Context Phrase Similarity and Clustering (Cohen et al., Findings 2022)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.259.pdf
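The abstract describes the core inference-time idea: represent each phrase by a set of embeddings of the contexts it appears in, then score two phrases by aggregating pairwise similarities between their context sets. The sketch below, using NumPy, illustrates that aggregation step only; the choice of cosine similarity, the mean/top-k aggregation, and the function name are illustrative assumptions, not the paper's exact formulation (which in practice would use context embeddings from a pretrained encoder).

```python
import numpy as np

def multi_context_similarity(ctx_a, ctx_b, top_k=None):
    """Aggregate pairwise cosine similarities between two sets of
    context embeddings (one row per context).

    Hypothetical sketch: the exact aggregation used by McPhraSy
    (mean, top-k, weighting, etc.) is an assumption here.
    """
    a = np.asarray(ctx_a, dtype=float)
    b = np.asarray(ctx_b, dtype=float)
    # L2-normalize rows so dot products become cosine similarities.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    sims = (a @ b.T).ravel()  # all pairwise context-context similarities
    if top_k is not None:
        sims = np.sort(sims)[-top_k:]  # keep only the strongest matches
    return float(sims.mean())
```

With identical single contexts the score is 1.0, with orthogonal contexts it is 0.0, and `top_k` lets the score focus on the best-matching context pairs rather than averaging over all of them.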