drsphelps at SemEval-2022 Task 2: Learning idiom representations using BERTRAM

Dylan Phelps


Abstract
This paper describes our system for SemEval-2022 Task 2, Multilingual Idiomaticity Detection and Sentence Embedding, Subtask B. We modify a standard BERT sentence transformer by adding a dedicated embedding for each idiom, created using BERTRAM from a small number of contexts. We show that this technique improves the quality of idiom representations and leads to better performance on the task. Analysis of our final results further shows that the quality of the produced idiom embeddings is highly sensitive to the quality of the input contexts.
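The core idea in the abstract — treating each multi-word idiom as a single vocabulary item with its own embedding, rather than composing it from constituent word vectors — can be illustrated with a minimal toy sketch. This is not the paper's code: the vocabulary, the 4-dimensional vectors, and the averaging step (standing in for the BERTRAM-produced embedding) are all hypothetical simplifications.

```python
# Illustrative sketch (not the paper's implementation): register a
# multi-word idiom as ONE vocabulary entry with its own vector, so a
# downstream sentence encoder treats it non-compositionally.
# Averaging context vectors below is a stand-in for BERTRAM.
from typing import Dict, List

DIM = 4  # toy embedding size; BERT-base actually uses 768

vocab: Dict[str, List[float]] = {
    "he": [0.1] * DIM,
    "is": [0.2] * DIM,
    "a": [0.05] * DIM,
    "big": [0.9, 0.1, 0.0, 0.0],
    "fish": [0.0, 0.8, 0.1, 0.0],
}

def add_idiom_embedding(idiom: str, context_vectors: List[List[float]]) -> str:
    """Add the idiom as a single token; its vector here is the mean of
    a few context vectors (a placeholder for the BERTRAM embedding)."""
    token = idiom.replace(" ", "_")
    n = len(context_vectors)
    vocab[token] = [sum(v[i] for v in context_vectors) / n for i in range(DIM)]
    return token

def tokenize(sentence: str, idioms: List[str]) -> List[str]:
    """Greedily replace known idioms with their single-token form
    before whitespace tokenization."""
    for idiom in idioms:
        sentence = sentence.replace(idiom, idiom.replace(" ", "_"))
    return sentence.split()

add_idiom_embedding("big fish", [[0.3, 0.3, 0.2, 0.1], [0.5, 0.1, 0.2, 0.3]])
tokens = tokenize("he is a big fish", ["big fish"])
# "big fish" is now one token with its own context-derived vector,
# instead of being built from the embeddings of "big" and "fish".
```

In the actual system, the analogous steps would be adding the idiom as a new token to the BERT tokenizer, resizing the embedding matrix, and writing the BERTRAM-trained vector into the new row.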
Anthology ID:
2022.semeval-1.18
Volume:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
SemEval
SIGs:
SIGLEX | SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
158–164
URL:
https://aclanthology.org/2022.semeval-1.18
DOI:
10.18653/v1/2022.semeval-1.18
Cite (ACL):
Dylan Phelps. 2022. drsphelps at SemEval-2022 Task 2: Learning idiom representations using BERTRAM. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 158–164, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
drsphelps at SemEval-2022 Task 2: Learning idiom representations using BERTRAM (Phelps, SemEval 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.semeval-1.18.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2022.semeval-1.18.mp4
Data
CC100