Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection

Yuri Bizzoni, Mehdi Ghanimifard


Abstract
We present and compare two alternative deep neural architectures for word-level metaphor detection in text: a bi-LSTM model and a new structure based on recursive feed-forward concatenation of the input. We discuss different versions of these models and the effect that input manipulations, specifically reducing sentence length and introducing concreteness scores for words, have on their performance.
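The first architecture described in the abstract, a bi-LSTM tagger that labels each word as literal or metaphorical, can be sketched as follows. This is a minimal illustration in PyTorch with assumed hyperparameters (embedding size, hidden size) and an optional per-word concreteness feature mirroring the input manipulation mentioned above; it is not the authors' implementation, which is available in the GU-CLASP/ocota repository linked below.

# Minimal sketch of a bi-LSTM word-level metaphor tagger (illustrative only;
# hyperparameters and the concreteness feature are assumptions, not the paper's).
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=128, num_labels=2,
                 use_concreteness=False):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Optionally append a scalar concreteness score to each word embedding.
        self.use_concreteness = use_concreteness
        lstm_in = emb_dim + (1 if use_concreteness else 0)
        self.bilstm = nn.LSTM(lstm_in, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Forward and backward hidden states are concatenated, hence 2 * hidden_dim.
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids, concreteness=None):
        x = self.embedding(token_ids)                  # (batch, seq, emb_dim)
        if self.use_concreteness:
            x = torch.cat([x, concreteness.unsqueeze(-1)], dim=-1)
        h, _ = self.bilstm(x)                          # (batch, seq, 2 * hidden_dim)
        return self.classifier(h)                      # per-token label logits

# Usage: tag a batch of two padded 5-token sentences.
model = BiLSTMTagger(vocab_size=10_000, use_concreteness=True)
tokens = torch.randint(1, 10_000, (2, 5))
conc = torch.rand(2, 5)            # word concreteness scores in [0, 1]
logits = model(tokens, conc)       # shape: (2, 5, 2)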
Anthology ID:
W18-0911
Volume:
Proceedings of the Workshop on Figurative Language Processing
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee Leong
Venue:
Fig-Lang
Publisher:
Association for Computational Linguistics
Pages:
91–101
URL:
https://aclanthology.org/W18-0911
DOI:
10.18653/v1/W18-0911
Cite (ACL):
Yuri Bizzoni and Mehdi Ghanimifard. 2018. Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection. In Proceedings of the Workshop on Figurative Language Processing, pages 91–101, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection (Bizzoni & Ghanimifard, Fig-Lang 2018)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/W18-0911.pdf
Code:
GU-CLASP/ocota