APoLLo : Unified Adapter and Prompt Learning for Vision Language Models

Sanjoy Chowdhury, Sayan Nag, Dinesh Manocha


Abstract
The choice of input text prompt plays a critical role in the performance of Vision-Language Pretrained (VLP) models such as CLIP. We present APoLLo, a unified multi-modal approach that combines Adapter and Prompt learning for Vision-Language models. Our method is designed to substantially improve the generalization capabilities of VLP models when they are fine-tuned in a few-shot setting. We introduce trainable cross-attention-based adapter layers in conjunction with vision and language encoders to strengthen the alignment between the two modalities. We enforce consistency between the respective encoder branches (receiving augmented inputs) to prevent overfitting in downstream tasks. Our method is evaluated on three representative tasks: generalization to novel classes, cross-dataset evaluation, and unseen domain shifts. In practice, APoLLo achieves a relative gain of up to 6.03% over MaPLe (SOTA) on novel classes across 10 diverse image recognition datasets.
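The abstract's central mechanism, a trainable cross-attention adapter in which one modality's tokens attend to the other's, can be illustrated with a minimal sketch. This is not the authors' implementation: the bottleneck design, dimensions, and function names (`cross_attention_adapter`, `W_down`, `W_up`) are assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention_adapter(x_text, x_image, W_q, W_k, W_v, W_down, W_up):
    """Hypothetical sketch: text tokens (queries) attend to image tokens
    (keys/values); a low-rank bottleneck (W_down, W_up) keeps the adapter
    lightweight; a residual connection preserves the frozen encoder's
    original features."""
    q = x_text @ W_q                                   # (n_txt, d)
    k = x_image @ W_k                                  # (n_img, d)
    v = x_image @ W_v                                  # (n_img, d)
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))     # (n_txt, n_img)
    fused = attn @ v                                   # (n_txt, d)
    adapted = np.maximum(fused @ W_down, 0.0) @ W_up   # ReLU bottleneck
    return x_text + adapted                            # residual output

# Toy usage with random features standing in for encoder outputs.
rng = np.random.default_rng(0)
d, r, n_txt, n_img = 8, 2, 4, 6
x_t = rng.standard_normal((n_txt, d))
x_i = rng.standard_normal((n_img, d))
W_q, W_k, W_v = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
W_down = rng.standard_normal((d, r)) * 0.1
W_up = rng.standard_normal((r, d)) * 0.1
out = cross_attention_adapter(x_t, x_i, W_q, W_k, W_v, W_down, W_up)
print(out.shape)  # (4, 8)
```

The symmetric direction (image tokens attending to text tokens) would follow the same pattern with the roles of the two inputs swapped.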
Anthology ID:
2023.emnlp-main.629
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10173–10187
URL:
https://aclanthology.org/2023.emnlp-main.629
DOI:
10.18653/v1/2023.emnlp-main.629
Bibkey:
Cite (ACL):
Sanjoy Chowdhury, Sayan Nag, and Dinesh Manocha. 2023. APoLLo : Unified Adapter and Prompt Learning for Vision Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 10173–10187, Singapore. Association for Computational Linguistics.
Cite (Informal):
APoLLo : Unified Adapter and Prompt Learning for Vision Language Models (Chowdhury et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.emnlp-main.629.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2023.emnlp-main.629.mp4