Go Figure! Multi-task transformer-based architecture for metaphor detection using idioms: ETS team in 2020 metaphor shared task
Xianyang Chen, Chee Wee (Ben) Leong, Michael Flor, Beata Beigman Klebanov
Abstract
This paper describes the ETS entry to the 2020 Metaphor Detection shared task. Our contribution consists of a sequence of experiments using BERT, starting with a baseline, strengthening it by spell-correcting the TOEFL corpus, followed by a multi-task learning setting, where one of the tasks is the token-level metaphor classification as per the shared task, while the other is meant to provide additional training that we hypothesized to be relevant to the main task. In one case, out-of-domain data manually annotated for metaphor is used for the auxiliary task; in the other case, in-domain data automatically annotated for idioms is used for the auxiliary task. Both multi-task experiments yield promising results.
- Anthology ID:
- 2020.figlang-1.32
- Volume:
- Proceedings of the Second Workshop on Figurative Language Processing
- Month:
- July
- Year:
- 2020
- Address:
- Online
- Editors:
- Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee (Ben) Leong, Anna Feldman, Debanjan Ghosh
- Venue:
- Fig-Lang
- Association for Computational Linguistics
- Pages:
- 235–243
- URL:
- https://aclanthology.org/2020.figlang-1.32
- DOI:
- 10.18653/v1/2020.figlang-1.32
- Cite (ACL):
- Xianyang Chen, Chee Wee (Ben) Leong, Michael Flor, and Beata Beigman Klebanov. 2020. Go Figure! Multi-task transformer-based architecture for metaphor detection using idioms: ETS team in 2020 metaphor shared task. In Proceedings of the Second Workshop on Figurative Language Processing, pages 235–243, Online. Association for Computational Linguistics.
- Cite (Informal):
- Go Figure! Multi-task transformer-based architecture for metaphor detection using idioms: ETS team in 2020 metaphor shared task (Chen et al., Fig-Lang 2020)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-1/2020.figlang-1.32.pdf
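The multi-task setting described in the abstract, a shared encoder feeding both a token-level metaphor head and an auxiliary head (e.g. out-of-domain metaphor or automatically annotated idioms), can be sketched as below. This is a minimal illustration, not the authors' implementation: it uses a small stand-in Transformer encoder in place of BERT, and all class names, label counts, and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MultiTaskTokenClassifier(nn.Module):
    """Shared encoder with two token-level classification heads.

    The main head predicts metaphor vs. literal per token (as in the
    shared task); the auxiliary head predicts labels for the secondary
    task (e.g. idiom vs. non-idiom). A small TransformerEncoder stands
    in for BERT here; sizes are illustrative.
    """

    def __init__(self, vocab_size=1000, d_model=64, n_aux_labels=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.metaphor_head = nn.Linear(d_model, 2)          # main task
        self.aux_head = nn.Linear(d_model, n_aux_labels)    # auxiliary task

    def forward(self, token_ids):
        hidden = self.encoder(self.embed(token_ids))        # (B, T, d_model)
        return self.metaphor_head(hidden), self.aux_head(hidden)

model = MultiTaskTokenClassifier()
ids = torch.randint(0, 1000, (2, 10))                       # batch of 2, length 10
main_logits, aux_logits = model(ids)

# During training, each batch contributes a per-task cross-entropy loss;
# how the two losses are weighted or alternated is a design choice.
```

The key property is that gradients from both heads update the shared encoder, so the auxiliary annotations act as extra supervision for the main metaphor task.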