Word Representation Learning in Multimodal Pre-Trained Transformers: An Intrinsic Evaluation

Sandro Pezzelle, Ece Takmaz, Raquel Fernández


Abstract
This study carries out a systematic intrinsic evaluation of the semantic representations learned by state-of-the-art pre-trained multimodal Transformers. These representations are claimed to be task-agnostic and shown to help on many downstream language-and-vision tasks. However, the extent to which they align with human semantic intuitions remains unclear. We experiment with various models and obtain static word representations from the contextualized ones they learn. We then evaluate them against the semantic judgments provided by human speakers. In line with previous evidence, we observe a generalized advantage of multimodal representations over language-only ones on concrete word pairs, but not on abstract ones. On the one hand, this confirms the effectiveness of these models in aligning language and vision, which results in better semantic representations for concepts that are grounded in images. On the other hand, models are shown to follow different representation learning patterns, which sheds some light on how and when they perform multimodal integration.
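To make the evaluation protocol in the abstract concrete, below is a minimal sketch (not the authors' exact pipeline) of the general idea: derive a static vector for each word by averaging the contextualized embeddings a pre-trained model produces for it across contexts, then correlate model-based cosine similarities of word pairs with human similarity judgments via Spearman's rho. The model name, the example contexts, and the `human_pairs` data are illustrative assumptions.

```python
# Sketch: static word vectors from a contextualized model, scored against
# hypothetical human similarity judgments. Assumes `transformers`, `torch`,
# and `scipy` are installed; "bert-base-uncased" stands in for any of the
# (multimodal) Transformers evaluated in the paper.
import torch
from scipy.stats import spearmanr
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def static_vector(word, contexts):
    """Average the contextualized vectors of `word` over its contexts."""
    vecs = []
    word_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    for sent in contexts:
        enc = tokenizer(sent, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]  # (seq_len, dim)
        ids = enc["input_ids"][0].tolist()
        # Locate the subword span of `word` and mean-pool its vectors.
        for i in range(len(ids) - len(word_ids) + 1):
            if ids[i:i + len(word_ids)] == word_ids:
                vecs.append(hidden[i:i + len(word_ids)].mean(dim=0))
                break
    return torch.stack(vecs).mean(dim=0)

# Hypothetical human judgments: (word1, word2, similarity score).
human_pairs = [("car", "automobile", 9.2), ("car", "idea", 1.1)]
contexts = {w: [f"A photo of a {w}."] for w in ("car", "automobile", "idea")}

model_sims, human_sims = [], []
for w1, w2, score in human_pairs:
    v1 = static_vector(w1, contexts[w1])
    v2 = static_vector(w2, contexts[w2])
    model_sims.append(torch.cosine_similarity(v1, v2, dim=0).item())
    human_sims.append(score)

rho, _ = spearmanr(model_sims, human_sims)
print(f"Spearman correlation with human judgments: {rho:.3f}")
```

In practice one would use an established similarity benchmark with many word pairs, and for multimodal models the contexts would include paired images; the two-pair toy data above is only to keep the sketch runnable.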
Anthology ID:
2021.tacl-1.93
Volume:
Transactions of the Association for Computational Linguistics, Volume 9
Year:
2021
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
1563–1579
URL:
https://aclanthology.org/2021.tacl-1.93
DOI:
10.1162/tacl_a_00443
Cite (ACL):
Sandro Pezzelle, Ece Takmaz, and Raquel Fernández. 2021. Word Representation Learning in Multimodal Pre-Trained Transformers: An Intrinsic Evaluation. Transactions of the Association for Computational Linguistics, 9:1563–1579.
Cite (Informal):
Word Representation Learning in Multimodal Pre-Trained Transformers: An Intrinsic Evaluation (Pezzelle et al., TACL 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.tacl-1.93.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2021.tacl-1.93.mp4