Andrew Owens


2022

Towards Understanding the Relation between Gestures and Language
Artem Abzaliev | Andrew Owens | Rada Mihalcea
Proceedings of the 29th International Conference on Computational Linguistics

In this paper, we explore the relation between gestures and language. Using a multimodal dataset consisting of TED talks, in which the language is aligned with the gestures made by the speakers, we adapt a semi-supervised multimodal model to learn gesture embeddings. We show that gestures are predictive of the native language of the speaker, and that gesture embeddings further improve language prediction results. In addition, gesture embeddings appear to encode some linguistic information, as we show by probing them for psycholinguistic categories. Finally, we analyze the words that lead to the most expressive gestures and find that function words drive the expressiveness of gestures.