Pixel-Level BPE for Auto-Regressive Image Generation

Anton Razzhigaev, Anton Voronov, Andrey Kaznacheev, Andrey Kuznetsov, Denis Dimitrov, Alexander Panchenko


Abstract
Pixel-level autoregression with Transformer models (Image GPT, or iGPT) is a recent approach to image generation that has not received much attention or elaboration, largely because the quadratic complexity of attention imposes huge memory requirements and thus restricts the resolution of the generated images. In this paper, we propose to tackle this problem by adapting Byte-Pair Encoding (BPE), originally proposed for text processing, to the image domain in order to drastically reduce the length of the modeled sequence. Our results demonstrate that it is possible to decrease the amount of computation required to generate images pixel by pixel while preserving their quality and the expressiveness of the features extracted from the model. They also suggest that there is room for improvement for iGPT-like models through more thorough research into optimal sequence encoding techniques for images.
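The core idea — learning BPE merges over sequences of quantized pixel values to shorten the sequence an autoregressive model must process — can be illustrated with a minimal sketch. This is a generic BPE learner written for this page, not the paper's implementation; the token representation (strings joined with "+") and the tie-breaking behavior are illustrative assumptions.

```python
from collections import Counter

def apply_merge(seq, pair, new_tok):
    """Replace every non-overlapping occurrence of `pair` in `seq` with `new_tok`."""
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(new_tok)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

def learn_bpe(sequences, num_merges):
    """Learn up to `num_merges` BPE merge rules over token sequences.

    Each sequence is a list of base tokens, e.g. quantized pixel
    (palette) indices rendered as strings. Returns the learned merge
    rules and the re-encoded (shortened) sequences.
    """
    seqs = [list(s) for s in sequences]
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for seq in seqs:
            pairs.update(zip(seq, seq[1:]))  # count adjacent token pairs
        if not pairs:
            break
        best, count = pairs.most_common(1)[0]
        if count < 2:  # no pair worth merging
            break
        new_tok = best[0] + "+" + best[1]  # merged token (illustrative naming)
        merges.append((best, new_tok))
        seqs = [apply_merge(s, best, new_tok) for s in seqs]
    return merges, seqs

# A 6-token "pixel row" with a repeated pattern compresses to 2 tokens:
merges, encoded = learn_bpe([["0", "0", "1", "0", "0", "1"]], num_merges=2)
print(encoded[0])  # a shorter sequence over an enlarged vocabulary
```

The trade-off mirrors BPE in NLP: frequent pixel patterns become single tokens, so sequence length (and hence the quadratic attention cost) drops at the price of a larger vocabulary.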
Anthology ID:
2022.mmmpie-1.4
Volume:
Proceedings of the First Workshop on Performance and Interpretability Evaluations of Multimodal, Multipurpose, Massive-Scale Models
Month:
October
Year:
2022
Address:
Virtual
Venue:
MMMPIE
Publisher:
International Conference on Computational Linguistics
Note:
Pages:
26–30
URL:
https://aclanthology.org/2022.mmmpie-1.4
Cite (ACL):
Anton Razzhigaev, Anton Voronov, Andrey Kaznacheev, Andrey Kuznetsov, Denis Dimitrov, and Alexander Panchenko. 2022. Pixel-Level BPE for Auto-Regressive Image Generation. In Proceedings of the First Workshop on Performance and Interpretability Evaluations of Multimodal, Multipurpose, Massive-Scale Models, pages 26–30, Virtual. International Conference on Computational Linguistics.
Cite (Informal):
Pixel-Level BPE for Auto-Regressive Image Generation (Razzhigaev et al., MMMPIE 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.mmmpie-1.4.pdf
Data
ImageNet