ZhichunRoad at SemEval-2022 Task 2: Adversarial Training and Contrastive Learning for Multiword Representations

Xuange Cui, Wei Xiong, Songlin Wang


Abstract
This paper presents our contribution to SemEval-2022 Task 2: Multilingual Idiomaticity Detection and Sentence Embedding. In Subtask A, we explore the impact of three different pre-trained multilingual language models. To enhance model generalization and robustness, we use the exponential moving average (EMA) method and an adversarial attack strategy. In Subtask B, we add an effective cross-attention module for modeling the relationship between two sentences. We jointly train the model with a contrastive learning objective and employ momentum contrast to enlarge the number of negative pairs. Additionally, we use the alignment and uniformity properties to measure the quality of sentence embeddings. Our approach obtained competitive results in both subtasks.
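The alignment and uniformity properties mentioned in the abstract are the standard sentence-embedding quality measures of Wang and Isola (2020): alignment is the mean distance between embeddings of positive pairs, and uniformity is the log of the mean Gaussian potential over all embedding pairs. A minimal NumPy sketch of these definitions (function names and defaults are illustrative, not taken from the paper's code):

```python
import numpy as np

def alignment(x, y, alpha=2):
    """Mean distance^alpha between L2-normalized positive-pair embeddings.

    x, y: arrays of shape (N, D); lower values mean better alignment.
    """
    return np.mean(np.linalg.norm(x - y, axis=1) ** alpha)

def uniformity(x, t=2):
    """Log of the mean Gaussian potential over all distinct pairs.

    x: L2-normalized embeddings of shape (N, D); lower values mean the
    embeddings are spread more uniformly on the unit hypersphere.
    """
    # Squared pairwise distances between all rows of x.
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    iu = np.triu_indices(x.shape[0], k=1)   # distinct pairs only
    return np.log(np.mean(np.exp(-t * sq[iu])))
```

Both measures assume the embeddings are L2-normalized before being passed in; identical positive pairs give an alignment of exactly 0, and more dispersed embeddings drive uniformity lower (more negative).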
Anthology ID:
2022.semeval-1.24
Volume:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
SemEval
SIGs:
SIGLEX | SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
197–203
URL:
https://aclanthology.org/2022.semeval-1.24
DOI:
10.18653/v1/2022.semeval-1.24
Cite (ACL):
Xuange Cui, Wei Xiong, and Songlin Wang. 2022. ZhichunRoad at SemEval-2022 Task 2: Adversarial Training and Contrastive Learning for Multiword Representations. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 197–203, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
ZhichunRoad at SemEval-2022 Task 2: Adversarial Training and Contrastive Learning for Multiword Representations (Cui et al., SemEval 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.semeval-1.24.pdf