Abstract
Recent work has adopted models of pragmatic reasoning for the generation of informative language in, e.g., image captioning. We propose a simple but highly effective relaxation of fully rational decoding, based on an existing incremental and character-level approach to pragmatically informative neural image captioning. We implement a mixed, ‘fast’ and ‘slow’, speaker that applies pragmatic reasoning occasionally (only word-initially), while unrolling the language model. In our evaluation, we find that increased informativeness through pragmatic decoding generally lowers quality and, somewhat counter-intuitively, increases repetitiveness in captions.
- Anthology ID:
- 2021.inlg-1.41
- Volume:
- Proceedings of the 14th International Conference on Natural Language Generation
- Month:
- August
- Year:
- 2021
- Address:
- Aberdeen, Scotland, UK
- Editors:
- Anya Belz, Angela Fan, Ehud Reiter, Yaji Sripada
- Venue:
- INLG
- SIG:
- SIGGEN
- Publisher:
- Association for Computational Linguistics
- Pages:
- 371–376
- URL:
- https://aclanthology.org/2021.inlg-1.41
- DOI:
- 10.18653/v1/2021.inlg-1.41
- Cite (ACL):
- Sina Zarrieß, Hendrik Buschmeier, Ting Han, and Simeon Schüz. 2021. Decoding, Fast and Slow: A Case Study on Balancing Trade-Offs in Incremental, Character-level Pragmatic Reasoning. In Proceedings of the 14th International Conference on Natural Language Generation, pages 371–376, Aberdeen, Scotland, UK. Association for Computational Linguistics.
- Cite (Informal):
- Decoding, Fast and Slow: A Case Study on Balancing Trade-Offs in Incremental, Character-level Pragmatic Reasoning (Zarrieß et al., INLG 2021)
- PDF:
- https://preview.aclanthology.org/fix-dup-bibkey/2021.inlg-1.41.pdf
- Data
- MS COCO
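The ‘fast and slow’ idea from the abstract — applying pragmatic (listener-aware) rescoring only at word-initial positions while otherwise decoding greedily from the literal language model — can be illustrated with a minimal toy sketch. Everything here is an assumption for illustration: `literal_speaker`, `listener_score`, the tiny vocabulary, and the `alpha` weight are hypothetical stand-ins, not the authors' actual model or code.

```python
import math

# Toy character vocabulary; "</s>" ends the caption.
VOCAB = ["a", "b", " ", "</s>"]

def literal_speaker(prefix):
    """Hypothetical toy stand-in for a character-level neural LM:
    P(next char | prefix). Prefers 'a' word-initially, otherwise
    tends to close the current word."""
    if prefix == "" or prefix.endswith(" "):
        return {"a": 0.5, "b": 0.4, " ": 0.05, "</s>": 0.05}
    return {"a": 0.1, "b": 0.1, " ": 0.4, "</s>": 0.4}

def listener_score(prefix, char, target="b"):
    """Hypothetical listener: rewards continuations that keep the
    caption discriminative for the target image (here, ones using 'b')."""
    return 1.0 if char == target or char in (" ", "</s>") else 0.1

def mixed_decode(max_len=10, alpha=1.0):
    """Mixed 'fast'/'slow' decoding: pragmatic rescoring only
    word-initially (slow), plain greedy literal decoding elsewhere (fast)."""
    prefix = ""
    while len(prefix) < max_len:
        probs = literal_speaker(prefix)
        word_initial = prefix == "" or prefix.endswith(" ")
        if word_initial:
            # Slow step: combine literal speaker with listener evidence.
            scores = {c: math.log(p) + alpha * math.log(listener_score(prefix, c))
                      for c, p in probs.items()}
        else:
            # Fast step: literal speaker only.
            scores = {c: math.log(p) for c, p in probs.items()}
        char = max(scores, key=scores.get)
        if char == "</s>":
            break
        prefix += char
    return prefix
```

With `alpha=0` the listener is switched off and decoding reduces to the plain literal speaker; with `alpha=1` the word-initial rescoring steers output toward the discriminative character, sketching how occasional pragmatic steps can change captions without rescoring every character.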