A Computational Approach to Visual Metonymy

Saptarshi Ghosh, Linfeng Liu, Tianyu Jiang


Abstract
Images often communicate more than they literally depict: a set of tools can suggest an occupation, and a cultural artifact can suggest a tradition. This kind of indirect visual reference, known as visual metonymy, invites viewers to recover a target concept via associated cues rather than explicit depiction. In this work, we present the first computational investigation of visual metonymy. We introduce a novel pipeline grounded in semiotic theory that leverages large language models and text-to-image models to generate metonymic visual representations. Using this framework, we construct ViMET, the first visual metonymy dataset, comprising 2,000 multiple-choice questions that evaluate the cognitive reasoning abilities of multimodal language models. Experimental results on our dataset reveal a significant gap between human performance (86.9%) and state-of-the-art vision-language models (65.9%), highlighting limitations in machines' ability to interpret indirect visual references. Our dataset is publicly available at: https://github.com/cincynlp/ViMET.
Anthology ID:
2026.eacl-long.92
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2075–2099
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.92/
Cite (ACL):
Saptarshi Ghosh, Linfeng Liu, and Tianyu Jiang. 2026. A Computational Approach to Visual Metonymy. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2075–2099, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
A Computational Approach to Visual Metonymy (Ghosh et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.92.pdf