Abstract
As an interesting and challenging task, sarcasm generation has attracted widespread attention. Although recent studies have made promising progress, none of them considers generating a sarcastic description for a given image, as people do on Twitter. In this paper, we present the Multi-modal Sarcasm Generation (MSG) task: given an image with hashtags that provide the sarcastic target, MSG aims to generate sarcastic descriptions as humans do. Unlike textual sarcasm generation, MSG is more challenging because it is difficult to accurately capture the key information from images, hashtags, and OCR tokens, and to exploit multi-modal incongruity to generate sarcastic descriptions. To support research on MSG, we develop MuSG, a new dataset with 5,000 images and related Twitter text. We also propose a multi-modal Transformer-based method as a solution to the MSG task: the input features are embedded in a common space and passed through multi-modal Transformer layers, which generate the sarcastic descriptions in an auto-regressive manner. Both automatic and manual evaluations demonstrate the superiority of our method. The dataset and code will be available soon.
- Anthology ID:
- 2023.findings-acl.346
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2023
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Editors:
- Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 5601–5613
- URL:
- https://aclanthology.org/2023.findings-acl.346
- DOI:
- 10.18653/v1/2023.findings-acl.346
- Cite (ACL):
- Wenye Zhao, Qingbao Huang, Dongsheng Xu, and Peizhi Zhao. 2023. Multi-modal Sarcasm Generation: Dataset and Solution. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5601–5613, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Multi-modal Sarcasm Generation: Dataset and Solution (Zhao et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-acl.346.pdf
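The abstract's pipeline (embed image, hashtag, and OCR inputs into a common space, then decode the description auto-regressively through Transformer layers) can be sketched at a very high level as follows. Every name, embedding, and scoring function below is an illustrative toy placeholder, not the authors' released model or code.

```python
# Toy sketch of the generation loop the abstract describes. The "encoders"
# and the "Transformer" scoring step are stubs standing in for learned
# components; only the overall data flow mirrors the described method.
from typing import List

DIM = 4  # toy common-space dimensionality


def embed(tokens: List[str]) -> List[List[float]]:
    """Stand-in for learned modality encoders: map tokens to DIM-d vectors."""
    return [[(hash(t) % 97) / 97.0] * DIM for t in tokens]


def decode_step(context: List[List[float]], vocab: List[str]) -> str:
    """Placeholder for the multi-modal Transformer layers: pool the context
    and greedily pick the vocabulary token whose toy embedding is closest."""
    pooled = sum(v[0] for v in context) / len(context)
    return min(vocab, key=lambda w: abs((hash(w) % 97) / 97.0 - pooled))


def generate(image_regions, hashtags, ocr_tokens, vocab, max_len=5):
    # All modalities are projected into the common space and concatenated.
    context = embed(image_regions) + embed(hashtags) + embed(ocr_tokens)
    output: List[str] = []
    for _ in range(max_len):
        nxt = decode_step(context, vocab)   # predict the next token
        output.append(nxt)
        context = context + embed([nxt])    # auto-regression: feed it back in
    return output


caption = generate(["img_region_0"], ["#mondays"], ["SALE"],
                   vocab=["love", "really", "great", "fun", "working"])
print(caption)
```

The key structural point is the last two lines of the loop: each predicted token is appended to the context before the next prediction, which is what "auto-regressive paradigm" refers to in the abstract.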