Improving Multimodal fusion via Mutual Dependency Maximisation

Pierre Colombo, Emile Chapuis, Matthieu Labeau, Chloé Clavel


Abstract
Multimodal sentiment analysis is a trending area of research, and multimodal fusion is one of its most active topics. Acknowledging that humans communicate through a variety of channels (i.e., visual, acoustic, linguistic), multimodal systems aim at integrating different unimodal representations into a synthetic one. So far, a considerable effort has been made to develop complex architectures allowing the fusion of these modalities. However, such systems are mainly trained by minimising simple losses such as L1 or cross-entropy. In this work, we investigate unexplored penalties and propose a set of new objectives that measure the dependency between modalities. We demonstrate that our new penalties lead to a consistent improvement (up to 4.3 accuracy points) across a large variety of state-of-the-art models on two well-known sentiment analysis datasets: CMU-MOSI and CMU-MOSEI. Our method not only achieves a new SOTA on both datasets but also produces representations that are more robust to modality drops. Finally, a by-product of our method is a statistical network which can be used to interpret the high-dimensional representations learnt by the model.
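The abstract describes auxiliary objectives that maximise a measure of dependency between modality representations, estimated via a statistical (critic) network. A minimal sketch of this idea, assuming a Donsker–Varadhan-style mutual-information lower bound and a hypothetical fixed dot-product critic (the paper trains a neural critic; the names and data here are illustrative, not the authors' implementation):

```python
import numpy as np

def dv_mi_lower_bound(x, y, critic):
    """Donsker-Varadhan lower bound on the mutual information between
    paired representations x and y: E_joint[T] - log E_marginal[exp(T)]."""
    joint = critic(x, y)                 # critic scores on samples from the joint
    y_neg = np.roll(y, 1, axis=0)        # break the pairing -> product of marginals
    marginal = critic(x, y_neg)
    return joint.mean() - np.log(np.exp(marginal).mean())

def critic(x, y):
    # Hypothetical critic: a scaled dot product between the two representations.
    # In practice this would be a small trainable statistics network.
    return (x * y).sum(axis=1) / x.shape[1]

# Two synthetic "modalities" sharing a common latent factor z.
rng = np.random.default_rng(0)
z = rng.normal(size=(5000, 8))
x = z + 0.1 * rng.normal(size=(5000, 8))
y = z + 0.1 * rng.normal(size=(5000, 8))
print(dv_mi_lower_bound(x, y, critic))   # clearly positive for dependent views
```

Maximising such a bound as an auxiliary penalty (alongside the task loss) encourages the fused representation to retain information shared across modalities; for independent modalities the estimate stays near zero.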
Anthology ID:
2021.emnlp-main.21
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
231–245
URL:
https://aclanthology.org/2021.emnlp-main.21
DOI:
10.18653/v1/2021.emnlp-main.21
Cite (ACL):
Pierre Colombo, Emile Chapuis, Matthieu Labeau, and Chloé Clavel. 2021. Improving Multimodal fusion via Mutual Dependency Maximisation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 231–245, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Multimodal fusion via Mutual Dependency Maximisation (Colombo et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.21.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.21.mp4
Data
CMU-MOSEI