Exploring Large Language Models for Multi-Modal Out-of-Distribution Detection

Yi Dai, Hao Lang, Kaisheng Zeng, Fei Huang, Yongbin Li


Abstract
Out-of-distribution (OOD) detection is essential for reliable and trustworthy machine learning. Recent multi-modal OOD detection leverages textual information from in-distribution (ID) class names for visual OOD detection, yet it currently neglects the rich contextual information of ID classes. Large language models (LLMs) encode a wealth of world knowledge and can be prompted to generate descriptive features for each class. However, our analysis shows that using such knowledge indiscriminately causes catastrophic damage to OOD detection because of LLM hallucinations. In this paper, we propose to apply world knowledge to enhance OOD detection performance through selective generation from LLMs. Specifically, we introduce a consistency-based uncertainty calibration method to estimate the confidence score of each generation. We further extract visual objects from each image to fully capitalize on the aforementioned world knowledge. Extensive experiments demonstrate that our method consistently outperforms the state-of-the-art.
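The consistency-based idea in the abstract can be illustrated with a minimal sketch: sample several LLM-generated descriptions per class, score each one by its agreement with the other samples, and keep only the confident ones. The function names, the Jaccard token-overlap similarity, and the threshold below are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of consistency-based confidence scoring.
# Assumption: several LLM descriptions have already been sampled per class;
# a description that most other samples agree with gets a high score,
# while a hallucinated outlier scores low and can be filtered out.

def jaccard(a: str, b: str) -> float:
    """Token-level Jaccard similarity between two descriptions."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def consistency_scores(samples: list[str]) -> list[float]:
    """Confidence of each sample = mean similarity to all other samples."""
    scores = []
    for i, s in enumerate(samples):
        others = [jaccard(s, t) for j, t in enumerate(samples) if j != i]
        scores.append(sum(others) / len(others) if others else 0.0)
    return scores

def select_confident(samples: list[str], threshold: float = 0.4) -> list[str]:
    """Keep only descriptions whose consistency score clears the threshold."""
    return [s for s, c in zip(samples, consistency_scores(samples)) if c >= threshold]
```

For example, given two similar bird descriptions and one unrelated sentence, the unrelated one receives a much lower score and is dropped by `select_confident`.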
Anthology ID:
2023.findings-emnlp.351
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5292–5305
URL:
https://aclanthology.org/2023.findings-emnlp.351
DOI:
10.18653/v1/2023.findings-emnlp.351
Cite (ACL):
Yi Dai, Hao Lang, Kaisheng Zeng, Fei Huang, and Yongbin Li. 2023. Exploring Large Language Models for Multi-Modal Out-of-Distribution Detection. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 5292–5305, Singapore. Association for Computational Linguistics.
Cite (Informal):
Exploring Large Language Models for Multi-Modal Out-of-Distribution Detection (Dai et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-emnlp.351.pdf