Attribution-Guided Multi-Object Hallucination and Bias Detection in Vision-Language Models

Sirat Samyoun, Yingtai Xiao, Jian Du


Abstract
Vision-Language Models excel in multi-modal tasks but often hallucinate objects or exhibit linguistic bias by over-repeating object names, especially in complex multi-object scenes. Existing methods struggle with multi-object grounding because language priors frequently dominate visual evidence, causing hallucinated or biased objects to produce attention distributions or similarity scores nearly indistinguishable from those of real objects. We introduce SHAPLENS, a Shapley value–based attribution framework using Kernel SHAP and multi-layer fusion to detect hallucinated and biased objects. Evaluated on ADE and COCO datasets across four leading VLMs, SHAPLENS improves hallucination detection accuracy by 8–12% and F1 by 10–14% over the best baselines. It also achieves up to 6% higher bias detection performance across three distinct bias types on a curated HQH benchmark and exhibits minimal degradation (<0.03%) across partial and perturbed contexts.
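The abstract's core attribution tool is Kernel SHAP: approximating Shapley values by a weighted least-squares fit over feature coalitions. Below is a minimal, self-contained sketch of exact Kernel SHAP for a single instance (enumerating all coalitions rather than sampling); it is an illustration of the general technique, not the paper's SHAPLENS implementation, and the function name and arguments are illustrative.

```python
import itertools
import math
import numpy as np

def kernel_shap(f, x, baseline):
    """Exact Kernel SHAP for one instance via full coalition enumeration.

    f        : model mapping a 1-D feature vector to a scalar
    x        : instance to explain (1-D array)
    baseline : reference vector substituted for "absent" features
    Returns one Shapley value per feature.
    """
    x, baseline = np.asarray(x, float), np.asarray(baseline, float)
    M = x.size
    total = f(x) - f(baseline)  # the phis must sum to this (efficiency axiom)

    Z, y, w = [], [], []
    for s in range(1, M):  # skip empty/full coalitions (infinite kernel weight)
        for idx in itertools.combinations(range(M), s):
            z = np.zeros(M)
            z[list(idx)] = 1.0
            Z.append(z)
            # evaluate the model with absent features set to the baseline
            y.append(f(baseline + z * (x - baseline)) - f(baseline))
            # Shapley kernel weight pi(z) = (M-1) / (C(M,s) * s * (M-s))
            w.append((M - 1) / (math.comb(M, s) * s * (M - s)))
    Z, y, w = np.array(Z), np.array(y), np.array(w)

    # Enforce sum(phi) = total by eliminating the last coefficient,
    # then solve the weighted least-squares problem.
    A = Z[:, :-1] - Z[:, [-1]]
    b = y - Z[:, -1] * total
    sw = np.sqrt(w)
    phi_head, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
    return np.append(phi_head, total - phi_head.sum())
```

For a linear model the fit is exact, so each feature's Shapley value reduces to its weight times its deviation from the baseline; in SHAPLENS the "features" would instead be image/text tokens scored for each candidate object.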
Anthology ID:
2026.eacl-long.210
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Marquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
4529–4548
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.210/
Cite (ACL):
Sirat Samyoun, Yingtai Xiao, and Jian Du. 2026. Attribution-Guided Multi-Object Hallucination and Bias Detection in Vision-Language Models. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4529–4548, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Attribution-Guided Multi-Object Hallucination and Bias Detection in Vision-Language Models (Samyoun et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.210.pdf