Abstract
Recent work on automatic sequential metaphor detection has used recurrent neural networks initialized with different pre-trained word embeddings, sometimes combined with hand-engineered features. To capture lexical and orthographic information automatically, we propose adding character-based word representations. To contrast literal and contextual meaning, we employ a similarity network. We explore these components in two architectures for metaphor identification: a BiLSTM model and a BERT-like Transformer encoder model. We participate in the Second Shared Task on Metaphor Detection on both the VUA and TOEFL datasets with these models. Experimental results demonstrate the effectiveness of our method, which outperforms all systems that participated in the previous shared task.
- Anthology ID:
- 2020.figlang-1.18
- Volume:
- Proceedings of the Second Workshop on Figurative Language Processing
- Month:
- July
- Year:
- 2020
- Address:
- Online
- Editors:
- Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee, Anna Feldman, Debanjan Ghosh
- Venue:
- Fig-Lang
- Publisher:
- Association for Computational Linguistics
- Pages:
- 116–125
- URL:
- https://aclanthology.org/2020.figlang-1.18
- DOI:
- 10.18653/v1/2020.figlang-1.18
- Cite (ACL):
- Tarun Kumar and Yashvardhan Sharma. 2020. Character aware models with similarity learning for metaphor detection. In Proceedings of the Second Workshop on Figurative Language Processing, pages 116–125, Online. Association for Computational Linguistics.
- Cite (Informal):
- Character aware models with similarity learning for metaphor detection (Kumar & Sharma, Fig-Lang 2020)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-2/2020.figlang-1.18.pdf
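The similarity component described in the abstract contrasts a word's literal (context-independent) meaning with its meaning in context; a low similarity between the two representations can signal metaphorical use. A minimal sketch of that intuition using cosine similarity over toy vectors (this is an illustration of the general idea, not the authors' implementation; the vector values are invented):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy vectors (hypothetical values): a word's literal embedding,
# and two contextual embeddings of the same word.
literal = [1.0, 0.0, 0.0]
context_literal_use = [0.9, 0.1, 0.0]     # word used literally
context_metaphoric_use = [0.1, 0.9, 0.2]  # word used metaphorically

# Literal use stays close to the literal embedding;
# metaphorical use drifts away, yielding a lower similarity.
print(cosine(literal, context_literal_use))
print(cosine(literal, context_metaphoric_use))
```

In the paper's models this kind of similarity signal is learned by a network rather than computed with a fixed cosine, but the contrast it exploits is the same: metaphorical usage pulls the contextual representation away from the literal one.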