CATE: A Contrastive Pre-trained Model for Metaphor Detection with Semi-supervised Learning

Zhenxi Lin, Qianli Ma, Jiangyue Yan, Jieyu Chen


Abstract
Metaphors are ubiquitous in natural language, and detecting them requires contextual reasoning about whether a semantic incongruence actually exists. Most existing work addresses this problem using pre-trained contextualized models. Despite their success, these models require a large amount of labeled data and are not linguistically based. In this paper, we propose a ContrAstive pre-Trained modEl (CATE) for metaphor detection with semi-supervised learning. Our model first uses a pre-trained model to obtain contextual representations of target words and then employs a contrastive objective, grounded in linguistic theories, to increase the distance between the literal and metaphorical senses of target words. Furthermore, we propose a simple strategy to collect large-scale candidate instances from a general corpus and generalize the model via self-training. Extensive experiments show that CATE outperforms state-of-the-art baselines on several benchmark datasets.
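
The abstract describes a contrastive objective that separates a target word's metaphorical usage from its literal sense in representation space. Below is a minimal, hypothetical sketch of one such margin-based objective in PyTorch; the tensor names, cosine-distance choice, and margin value are illustrative assumptions and not the authors' released implementation.

```python
# Illustrative sketch only: push the contextual embedding of a target word
# away from an embedding of its literal (basic) sense when the usage is
# metaphorical, and pull them together when it is literal.
import torch
import torch.nn.functional as F

def contrastive_loss(target_ctx_emb, literal_sense_emb, is_metaphor, margin=0.5):
    """target_ctx_emb:    (B, H) contextual embeddings of target words
       literal_sense_emb: (B, H) embeddings of the targets' literal senses
       is_metaphor:       (B,) 1.0 if the usage is metaphorical, else 0.0"""
    # Cosine distance between the contextual and literal-sense embeddings.
    dist = 1.0 - F.cosine_similarity(target_ctx_emb, literal_sense_emb, dim=-1)
    # Literal usages: minimize the distance; metaphorical usages: enforce a margin.
    loss_literal = (1.0 - is_metaphor) * dist
    loss_metaphor = is_metaphor * F.relu(margin - dist)
    return (loss_literal + loss_metaphor).mean()

# Example usage with random tensors (batch of 4, hidden size 768):
B, H = 4, 768
loss = contrastive_loss(torch.randn(B, H), torch.randn(B, H),
                        torch.tensor([0.0, 1.0, 1.0, 0.0]))
```

In a semi-supervised setup like the one sketched in the abstract, a loss of this form would be combined with self-training: the model labels candidate instances drawn from a general corpus and the most confident predictions are added to the training set.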
Anthology ID:
2021.emnlp-main.316
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3888–3898
URL:
https://aclanthology.org/2021.emnlp-main.316
DOI:
10.18653/v1/2021.emnlp-main.316
Cite (ACL):
Zhenxi Lin, Qianli Ma, Jiangyue Yan, and Jieyu Chen. 2021. CATE: A Contrastive Pre-trained Model for Metaphor Detection with Semi-supervised Learning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3888–3898, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
CATE: A Contrastive Pre-trained Model for Metaphor Detection with Semi-supervised Learning (Lin et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/2021.emnlp-main.316.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-3/2021.emnlp-main.316.mp4