ALBERT-BiLSTM for Sequential Metaphor Detection
Shuqun Li, Jingjie Zeng, Jinhui Zhang, Tao Peng, Liang Yang, Hongfei Lin
Abstract
Metaphor is a common form of expression in daily life. Understanding a metaphor requires recognizing the metaphorical words, which play a central role in its meaning. For the metaphor detection task, we design a sequence labeling model based on ALBERT-BiLSTM-softmax. With this model, we carry out extensive experiments and compare the results of different processing choices, such as different input sentences and tokens, and decoding with CRF versus softmax. We then adopt several tricks to further improve the results. Finally, our model achieves a 0.707 F1-score on the all-POS subtask and a 0.728 F1-score on the verb subtask of the TOEFL dataset.
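The paper itself includes no code; the following is a minimal sketch of the kind of ALBERT-BiLSTM-softmax tagger the abstract describes, assuming PyTorch and the HuggingFace transformers library. The `albert-base-v2` checkpoint, hidden sizes, and two-label scheme (literal vs. metaphor) are illustrative assumptions, not the authors' configuration.

```python
# Sketch (not the authors' code) of an ALBERT + BiLSTM + softmax sequence
# tagger for token-level metaphor detection. Checkpoint name, hidden sizes,
# and the 2-label scheme are assumptions for illustration.
import torch
import torch.nn as nn
from transformers import AlbertModel, AlbertTokenizerFast


class AlbertBiLSTMTagger(nn.Module):
    def __init__(self, model_name="albert-base-v2", lstm_hidden=256, num_labels=2):
        super().__init__()
        self.encoder = AlbertModel.from_pretrained(model_name)
        self.bilstm = nn.LSTM(
            input_size=self.encoder.config.hidden_size,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * lstm_hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # Contextual subword representations from ALBERT.
        hidden = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # BiLSTM over the subword sequence.
        lstm_out, _ = self.bilstm(hidden)
        # Per-token label scores; softmax (via cross-entropy during training)
        # yields the metaphor / literal decision for each token.
        return self.classifier(lstm_out)


if __name__ == "__main__":
    tokenizer = AlbertTokenizerFast.from_pretrained("albert-base-v2")
    model = AlbertBiLSTMTagger()
    batch = tokenizer(["He planted the seed of doubt."],
                      return_tensors="pt", padding=True)
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # (batch, seq_len, num_labels)
```

The CRF variant the abstract compares against would replace the final linear-plus-softmax decision with a CRF decoding layer over the same BiLSTM outputs.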
- Anthology ID: 2020.figlang-1.17
- Volume: Proceedings of the Second Workshop on Figurative Language Processing
- Month: July
- Year: 2020
- Address: Online
- Editors: Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee, Anna Feldman, Debanjan Ghosh
- Venue: Fig-Lang
- Publisher: Association for Computational Linguistics
- Pages: 110–115
- URL: https://aclanthology.org/2020.figlang-1.17
- DOI: 10.18653/v1/2020.figlang-1.17
- Cite (ACL): Shuqun Li, Jingjie Zeng, Jinhui Zhang, Tao Peng, Liang Yang, and Hongfei Lin. 2020. ALBERT-BiLSTM for Sequential Metaphor Detection. In Proceedings of the Second Workshop on Figurative Language Processing, pages 110–115, Online. Association for Computational Linguistics.
- Cite (Informal): ALBERT-BiLSTM for Sequential Metaphor Detection (Li et al., Fig-Lang 2020)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2020.figlang-1.17.pdf