Multi-Module Recurrent Neural Networks with Transfer Learning

Filip Skurniak, Maria Janicka, Aleksander Wawer


Abstract
This paper describes several solutions designed and tested for the problem of word-level metaphor detection. The proposed systems are all based on variants of recurrent neural network architectures. Specifically, we explore multiple sources of information: pre-trained word embeddings (GloVe), a dictionary of word concreteness, and a transfer learning scenario based on the states of an encoder network from a neural machine translation system. One of the architectures combines all three components: (1) a neural CRF (Conditional Random Fields) tagger, trained directly on the metaphor data set; (2) a neural machine translation encoder used in a transfer learning scenario; and (3) a neural network that predicts the final labels, also trained directly on the metaphor data set. Our results vary between test sets: the standalone neural CRF performs best on the submission data, while the combined system scores highest on a test subset randomly selected from the training data.
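The paper itself includes no code, but the multi-source setup described in the abstract can be made concrete. The following PyTorch snippet is a minimal sketch, assuming three per-token inputs: pre-trained GloVe vectors, a scalar concreteness rating looked up in a dictionary, and precomputed, frozen hidden states from an NMT encoder. All module names, dimensions, and the combination strategy (simple concatenation followed by a BiLSTM) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class MultiSourceTagger(nn.Module):
    # Sketch: concatenate GloVe embeddings, a concreteness score, and
    # frozen NMT encoder states per token, then score each token with
    # a BiLSTM for metaphor vs. literal labels. (Illustrative only.)
    def __init__(self, glove_weights, nmt_dim, hidden_dim=128, n_labels=2):
        super().__init__()
        # Pre-trained GloVe vectors, kept frozen (an assumption).
        self.embed = nn.Embedding.from_pretrained(glove_weights, freeze=True)
        feat_dim = glove_weights.size(1) + 1 + nmt_dim  # +1: concreteness
        self.bilstm = nn.LSTM(feat_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.scores = nn.Linear(2 * hidden_dim, n_labels)

    def forward(self, token_ids, concreteness, nmt_states):
        # token_ids:    (batch, seq) indices into the GloVe vocabulary
        # concreteness: (batch, seq) dictionary ratings, e.g. 0 if unknown
        # nmt_states:   (batch, seq, nmt_dim) frozen NMT encoder states
        feats = torch.cat([self.embed(token_ids),
                           concreteness.unsqueeze(-1),
                           nmt_states], dim=-1)
        out, _ = self.bilstm(feats)
        return self.scores(out)  # (batch, seq, n_labels) per-token scores

In the combined system the abstract describes, such per-token scores would feed a CRF layer that decodes the whole label sequence jointly rather than classifying each word independently; that decoding step is omitted here for brevity.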
Anthology ID: W18-0917
Volume: Proceedings of the Workshop on Figurative Language Processing
Month: June
Year: 2018
Address: New Orleans, Louisiana
Editors: Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee Leong
Venue: Fig-Lang
Publisher: Association for Computational Linguistics
Pages: 128–132
URL: https://aclanthology.org/W18-0917
DOI: 10.18653/v1/W18-0917
Cite (ACL): Filip Skurniak, Maria Janicka, and Aleksander Wawer. 2018. Multi-Module Recurrent Neural Networks with Transfer Learning. In Proceedings of the Workshop on Figurative Language Processing, pages 128–132, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal): Multi-Module Recurrent Neural Networks with Transfer Learning (Skurniak et al., Fig-Lang 2018)
PDF: https://aclanthology.org/W18-0917.pdf