Leveraging Three Types of Embeddings from Masked Language Models in Idiom Token Classification

Ryosuke Takahashi, Ryohei Sasano, Koichi Takeda


Abstract
Many linguistic expressions have both idiomatic and literal interpretations, and the automatic distinction between these two interpretations has been studied for decades. Recent research has shown that contextualized word embeddings derived from masked language models (MLMs) can give promising results for idiom token classification. This indicates that a contextualized word embedding alone contains information about whether the word is being used in a literal sense or not. However, we believe that more types of information can be derived from MLMs and that leveraging such information can improve idiom token classification. In this paper, we leverage three types of embeddings from MLMs: uncontextualized token embeddings and masked token embeddings, in addition to the standard contextualized word embeddings. We show that the newly added embeddings significantly improve idiom token classification on both English and Japanese datasets.
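To make the three embedding types concrete, the following is a minimal sketch (not the authors' code) of how each could be extracted with the Hugging Face Transformers library. The model name (bert-base-uncased), the example sentence, the target token, and the final feature concatenation are all illustrative assumptions, not details from the paper.

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "He decided to kick the bucket list into action."
target = "kick"  # a token of the potentially idiomatic expression

enc = tokenizer(sentence, return_tensors="pt")
ids = enc["input_ids"][0]
target_id = tokenizer.convert_tokens_to_ids(target)
pos = (ids == target_id).nonzero(as_tuple=True)[0][0].item()

with torch.no_grad():
    # 1. Contextualized word embedding: the final hidden state
    #    at the target position in the original sentence.
    contextualized = model(**enc).last_hidden_state[0, pos]

    # 2. Uncontextualized token embedding: the static input-embedding
    #    row for the target token, before any self-attention layers.
    uncontextualized = model.get_input_embeddings().weight[target_id]

    # 3. Masked token embedding: the final hidden state at the target
    #    position when the target token is replaced with [MASK].
    masked_ids = ids.clone()
    masked_ids[pos] = tokenizer.mask_token_id
    masked = model(input_ids=masked_ids.unsqueeze(0),
                   attention_mask=enc["attention_mask"]).last_hidden_state[0, pos]

# One plausible use: concatenate the three vectors as features for a
# downstream idiom token classifier (e.g., logistic regression or an MLP).
features = torch.cat([contextualized, uncontextualized, masked])

Intuitively, the uncontextualized embedding captures the word's default, context-free sense, while the masked embedding captures what the context alone predicts at that position; comparing these with the contextualized embedding gives the classifier signals about literal versus idiomatic usage.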
Anthology ID:
2022.starsem-1.21
Volume:
Proceedings of the 11th Joint Conference on Lexical and Computational Semantics
Month:
July
Year:
2022
Address:
Seattle, Washington
Editors:
Vivi Nastase, Ellie Pavlick, Mohammad Taher Pilehvar, Jose Camacho-Collados, Alessandro Raganato
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
234–239
URL:
https://aclanthology.org/2022.starsem-1.21
DOI:
10.18653/v1/2022.starsem-1.21
Cite (ACL):
Ryosuke Takahashi, Ryohei Sasano, and Koichi Takeda. 2022. Leveraging Three Types of Embeddings from Masked Language Models in Idiom Token Classification. In Proceedings of the 11th Joint Conference on Lexical and Computational Semantics, pages 234–239, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Leveraging Three Types of Embeddings from Masked Language Models in Idiom Token Classification (Takahashi et al., *SEM 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.starsem-1.21.pdf