Abstract
Noun compound interpretation is the task of expressing a noun compound (e.g. chocolate bunny) in a free-text paraphrase that makes the relationship between the constituent nouns explicit (e.g. bunny-shaped chocolate). We propose modifications to the data and evaluation setup of the standard task (Hendrickx et al., 2013), and show that GPT-3 solves it almost perfectly. We then investigate the task of noun compound conceptualization, i.e. paraphrasing a novel or rare noun compound. E.g., chocolate crocodile is a crocodile-shaped chocolate. This task requires creativity, commonsense, and the ability to generalize knowledge about similar concepts. While GPT-3's performance is not perfect, it is better than that of humans, likely thanks to its access to vast amounts of knowledge, and because conceptual processing is effortful for people (Connell and Lynott, 2012). Finally, we estimate the extent to which GPT-3 is reasoning about the world vs. parroting its training data. We find that the outputs from GPT-3 often have significant overlap with a large web corpus, but that the parroting strategy is less beneficial for novel noun compounds.
- Anthology ID:
- 2023.findings-acl.169
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2023
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Editors:
- Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2698–2710
- URL:
- https://aclanthology.org/2023.findings-acl.169
- DOI:
- 10.18653/v1/2023.findings-acl.169
- Cite (ACL):
- Albert Coil and Vered Shwartz. 2023. From chocolate bunny to chocolate crocodile: Do Language Models Understand Noun Compounds? In Findings of the Association for Computational Linguistics: ACL 2023, pages 2698–2710, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- From chocolate bunny to chocolate crocodile: Do Language Models Understand Noun Compounds? (Coil & Shwartz, Findings 2023)
- PDF:
- https://preview.aclanthology.org/proper-vol2-ingestion/2023.findings-acl.169.pdf