Testing the Ability of Language Models to Interpret Figurative Language

Emmy Liu, Chenxuan Cui, Kenneth Zheng, Graham Neubig


Abstract
Figurative and metaphorical language are commonplace in discourse, and figurative expressions play an important role in communication and cognition. However, figurative language has been a relatively under-studied area in NLP, and it remains an open question to what extent modern language models can interpret nonliteral phrases. To address this question, we introduce Fig-QA, a Winograd-style nonliteral language understanding task consisting of correctly interpreting paired figurative phrases with divergent meanings. We evaluate the performance of several state-of-the-art language models on this task, and find that although language models achieve performance significantly over chance, they still fall short of human performance, particularly in zero- or few-shot settings. This suggests that further work is needed to improve the nonliteral reasoning capabilities of language models.
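The Winograd-style task described above amounts to a binary-choice evaluation: each figurative phrase comes with two candidate interpretations, and a model is credited when it prefers the correct one. A minimal sketch of that setup is below; the example items are invented placeholders (not actual Fig-QA data), and the word-overlap scorer is a stand-in for a real language-model plausibility score such as the log-likelihood of the interpretation given the phrase.

```python
# Sketch of a Winograd-style paired-phrase evaluation.
# The items below are invented placeholders, not actual Fig-QA examples.
items = [
    {"phrase": "Her mind was a steel trap.",
     "choices": ["She remembered everything.", "She forgot everything."],
     "label": 0},
    {"phrase": "His wallet was a bottomless pit.",
     "choices": ["He saved his money.", "He spent money constantly."],
     "label": 1},
]

def score(phrase: str, interpretation: str) -> float:
    """Stand-in for a language-model plausibility score.
    Here: a trivial word-overlap heuristic, for illustration only."""
    phrase_words = set(phrase.lower().split())
    interp_words = set(interpretation.lower().split())
    return float(len(phrase_words & interp_words))

def accuracy(dataset) -> float:
    """Fraction of items where the higher-scoring choice is correct."""
    correct = 0
    for item in dataset:
        pred = max(range(len(item["choices"])),
                   key=lambda j: score(item["phrase"], item["choices"][j]))
        correct += (pred == item["label"])
    return correct / len(dataset)
```

Swapping `score` for a real model's conditional log-likelihood turns this into the kind of evaluation the paper reports; the harness itself is model-agnostic.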
Anthology ID: 2022.naacl-main.330
Volume: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: July
Year: 2022
Address: Seattle, United States
Editors: Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 4437–4452
URL: https://aclanthology.org/2022.naacl-main.330
DOI: 10.18653/v1/2022.naacl-main.330
Cite (ACL):
Emmy Liu, Chenxuan Cui, Kenneth Zheng, and Graham Neubig. 2022. Testing the Ability of Language Models to Interpret Figurative Language. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4437–4452, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Testing the Ability of Language Models to Interpret Figurative Language (Liu et al., NAACL 2022)
PDF: https://preview.aclanthology.org/emnlp-22-attachments/2022.naacl-main.330.pdf
Video: https://preview.aclanthology.org/emnlp-22-attachments/2022.naacl-main.330.mp4
Code: nightingal3/fig-qa (+ additional community code)
Data: Fig-QA, ANLI, WinoGrande