Abstract
Images can give us insights into the contextual meanings of words, but current image-text grounding approaches require detailed annotations. Such granular annotation is rare, expensive, and unavailable in most domain-specific contexts. In contrast, unlabeled multi-image, multi-sentence documents are abundant. Can lexical grounding be learned from such documents, even though they have significant lexical and visual overlap? Working with a case study dataset of real estate listings, we demonstrate the challenge of distinguishing highly correlated grounded terms, such as “kitchen” and “bedroom”, and introduce metrics to assess this document similarity. We present a simple unsupervised clustering-based method that increases precision and recall beyond object detection and image tagging baselines when evaluated on labeled subsets of the dataset. The proposed method is particularly effective for local contextual meanings of a word, for example associating “granite” with countertops in the real estate dataset and with rocky landscapes in a Wikipedia dataset.
- Anthology ID:
- 2020.emnlp-main.160
- Volume:
- Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2039–2045
- URL:
- https://aclanthology.org/2020.emnlp-main.160
- DOI:
- 10.18653/v1/2020.emnlp-main.160
- Cite (ACL):
- Gregory Yauney, Jack Hessel, and David Mimno. 2020. Domain-Specific Lexical Grounding in Noisy Visual-Textual Documents. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2039–2045, Online. Association for Computational Linguistics.
- Cite (Informal):
- Domain-Specific Lexical Grounding in Noisy Visual-Textual Documents (Yauney et al., EMNLP 2020)
- PDF:
- https://aclanthology.org/2020.emnlp-main.160.pdf
- Code:
- gyauney/domain-specific-lexical-grounding
- Data:
- COCO
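
The abstract describes the method only at a high level: cluster a document collection's images without supervision, then associate words with the image clusters they co-occur with. As a rough illustration of that idea, here is a minimal Python sketch using k-means over precomputed image embeddings and PMI for word-cluster association. The function name `ground_words` and the k-means/PMI choices are illustrative assumptions, not the paper's algorithm; the actual implementation is in the linked gyauney/domain-specific-lexical-grounding repository.

```python
# Illustrative sketch only: clustering-based lexical grounding via
# k-means + PMI. Not the paper's algorithm (see the linked repo).
import numpy as np
from sklearn.cluster import KMeans

def ground_words(image_feats, doc_words, doc_image_ids, n_clusters=40):
    """image_feats: (n_images, d) precomputed image embeddings.
    doc_words: one list of tokens per document.
    doc_image_ids: one list of image row indices per document."""
    clusters = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(image_feats)
    vocab = sorted({w for words in doc_words for w in words})
    w2i = {w: i for i, w in enumerate(vocab)}
    counts = np.zeros((len(vocab), n_clusters))
    # A word co-occurs with a cluster whenever a document containing
    # the word also contains an image assigned to that cluster.
    for words, img_ids in zip(doc_words, doc_image_ids):
        for w in set(words):
            for img in img_ids:
                counts[w2i[w], clusters[img]] += 1
    joint = counts / counts.sum()
    p_word = joint.sum(axis=1, keepdims=True)
    p_cluster = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.where(joint > 0, np.log(joint / (p_word * p_cluster)), -np.inf)
    # pmi[w, c] scores how strongly word w is grounded in cluster c.
    return vocab, clusters, pmi
```

Under this sketch, a locally grounded term like “granite” would surface as a high-PMI pairing with whichever cluster dominates its documents' images, which is the kind of domain-specific association the abstract reports.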