Neural Metaphor Detection in Context

Ge Gao, Eunsol Choi, Yejin Choi, Luke Zettlemoyer


Abstract
We present end-to-end neural models for detecting metaphorical word use in context. We show that relatively standard BiLSTM models which operate on complete sentences work well in this setting, in comparison to previous work that used more restricted forms of linguistic context. These models establish a new state-of-the-art on existing verb metaphor detection benchmarks, and show strong performance on jointly predicting the metaphoricity of all words in a running text.
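The abstract describes a sequence-labeling setup: a BiLSTM reads the complete sentence and each word is classified as metaphorical or literal from its forward and backward hidden states. The sketch below illustrates that general idea in plain NumPy; it is not the authors' implementation (their models use pretrained word embeddings and trained parameters), and all dimensions, weights, and function names here are hypothetical, untrained stand-ins.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """A single LSTM cell with one fused weight matrix for the four gates."""
    def __init__(self, in_dim, hid, rng):
        self.hid = hid
        # W maps [x; h] to the stacked (input, forget, output, candidate) gates.
        self.W = rng.standard_normal((4 * hid, in_dim + hid)) * 0.1
        self.b = np.zeros(4 * hid)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        c = f * c + i * g        # update cell state
        h = o * np.tanh(c)       # emit hidden state
        return h, c

def bilstm_tag(embeds, fwd, bwd, W_out, b_out):
    """Label every token in a sentence as literal (0) or metaphorical (1).

    embeds: list of per-token embedding vectors for the whole sentence.
    fwd/bwd: LSTMCells run left-to-right and right-to-left.
    W_out, b_out: a linear layer over the concatenated BiLSTM states.
    """
    T, hid = len(embeds), fwd.hid
    # Forward pass over the full sentence.
    hs_f, h, c = [], np.zeros(hid), np.zeros(hid)
    for t in range(T):
        h, c = fwd.step(embeds[t], h, c)
        hs_f.append(h)
    # Backward pass, so each token also sees its right context.
    hs_b, h, c = [None] * T, np.zeros(hid), np.zeros(hid)
    for t in reversed(range(T)):
        h, c = bwd.step(embeds[t], h, c)
        hs_b[t] = h
    # Per-token binary classification from both directions' states.
    logits = [W_out @ np.concatenate([hs_f[t], hs_b[t]]) + b_out
              for t in range(T)]
    return [int(np.argmax(l)) for l in logits]

# Demo on random (untrained) weights and random "embeddings".
rng = np.random.default_rng(0)
in_dim, hid, T = 5, 8, 4
fwd, bwd = LSTMCell(in_dim, hid, rng), LSTMCell(in_dim, hid, rng)
W_out, b_out = rng.standard_normal((2, 2 * hid)) * 0.1, np.zeros(2)
embeds = [rng.standard_normal(in_dim) for _ in range(T)]
labels = bilstm_tag(embeds, fwd, bwd, W_out, b_out)
print(labels)  # one 0/1 label per token
```

Because the whole sentence is encoded before any token is labeled, every prediction conditions on both left and right context, which is the contrast the abstract draws with prior work that used more restricted linguistic context.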
Anthology ID: D18-1060
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 607–613
URL: https://aclanthology.org/D18-1060
DOI: 10.18653/v1/D18-1060
Cite (ACL): Ge Gao, Eunsol Choi, Yejin Choi, and Luke Zettlemoyer. 2018. Neural Metaphor Detection in Context. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 607–613, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal): Neural Metaphor Detection in Context (Gao et al., EMNLP 2018)
PDF: https://preview.aclanthology.org/naacl24-info/D18-1060.pdf
Video: https://preview.aclanthology.org/naacl24-info/D18-1060.mp4
Code: gao-g/metaphor-in-context