Linguistic Analysis Improves Neural Metaphor Detection

Kevin Stowe, Sarah Moeller, Laura Michaelis, Martha Palmer


Abstract
In the field of metaphor detection, deep learning systems are ubiquitous and achieve strong performance on many tasks. However, due to the complicated procedures for manually identifying metaphors, the datasets available are relatively small and fraught with complications. We show that using syntactic features and lexical resources can automatically provide additional high-quality training data for metaphoric language, and this data can cover gaps and inconsistencies in metaphor annotation, improving state-of-the-art word-level metaphor identification. This novel application of automatically improving training data improves classification across numerous tasks, and reconfirms the necessity of high-quality data for deep learning frameworks.
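
The abstract describes, at a high level, using lexical resources to expand word-level metaphor training data. Purely as an illustrative sketch, and not the authors' pipeline, the hypothetical snippet below propagates a word-level label to related verbs drawn from WordNet via NLTK. The seed sentences, the relatedness heuristic, and the assumption that labels survive substitution are invented here for illustration only; it requires nltk with the 'wordnet' corpus downloaded.

# Hypothetical sketch of lexical-resource-based training-data expansion for
# word-level metaphor detection. Not the method from the paper; it only
# illustrates the general idea of propagating labels to related lemmas.
# Setup (once): import nltk; nltk.download('wordnet')

from nltk.corpus import wordnet as wn

# Seed examples: (sentence tokens, index of the labeled word, label).
# These examples are invented for illustration.
seed = [
    (["the", "idea", "collapsed", "under", "scrutiny"], 2, "metaphor"),
    (["the", "bridge", "collapsed", "in", "the", "storm"], 2, "literal"),
]

def related_lemmas(word, pos=wn.VERB):
    """Collect lemmas sharing a synset with `word` (a crude relatedness proxy)."""
    lemmas = set()
    for synset in wn.synsets(word, pos=pos):
        for lemma in synset.lemmas():
            name = lemma.name().replace("_", " ")
            if name != word:
                lemmas.add(name)
    return lemmas

def expand(examples):
    """Create additional labeled examples by substituting related verbs,
    assuming (for this toy sketch) that the label carries over."""
    new_examples = []
    for tokens, idx, label in examples:
        for substitute in related_lemmas(tokens[idx]):
            new_tokens = tokens[:idx] + [substitute] + tokens[idx + 1:]
            new_examples.append((new_tokens, idx, label))
    return new_examples

if __name__ == "__main__":
    for tokens, idx, label in expand(seed)[:5]:
        print(label, " ".join(tokens))

A real pipeline would also need syntactic constraints (e.g., matching the verb's argument structure) and filtering before such examples could serve as training data; this sketch omits both.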
Anthology ID: K19-1034
Volume: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Mohit Bansal, Aline Villavicencio
Venue: CoNLL
SIG: SIGNLL
Publisher: Association for Computational Linguistics
Pages: 362–371
URL: https://aclanthology.org/K19-1034
DOI: 10.18653/v1/K19-1034
Cite (ACL):
Kevin Stowe, Sarah Moeller, Laura Michaelis, and Martha Palmer. 2019. Linguistic Analysis Improves Neural Metaphor Detection. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 362–371, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Linguistic Analysis Improves Neural Metaphor Detection (Stowe et al., CoNLL 2019)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/K19-1034.pdf