Natural Language Informs the Interpretation of Iconic Gestures: A Computational Approach
Abstract
When giving descriptions, speakers often signify object shape or size with hand gestures. Such so-called ‘iconic’ gestures represent their meaning through their relevance to referents in the verbal content, rather than having a conventional form. The gesture form on its own is often ambiguous, and the aspect of the referent that it highlights is constrained by what the language makes salient. We show how the verbal content guides gesture interpretation through a computational model that frames the task as multi-label classification, mapping multimodal utterances to semantic categories, using annotated human-human data.
- Anthology ID:
- I17-2023
- Volume:
- Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
- Month:
- November
- Year:
- 2017
- Address:
- Taipei, Taiwan
- Editors:
- Greg Kondrak, Taro Watanabe
- Venue:
- IJCNLP
- Publisher:
- Asian Federation of Natural Language Processing
- Pages:
- 134–139
- URL:
- https://aclanthology.org/I17-2023
- Cite (ACL):
- Ting Han, Julian Hough, and David Schlangen. 2017. Natural Language Informs the Interpretation of Iconic Gestures: A Computational Approach. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 134–139, Taipei, Taiwan. Asian Federation of Natural Language Processing.
- Cite (Informal):
- Natural Language Informs the Interpretation of Iconic Gestures: A Computational Approach (Han et al., IJCNLP 2017)
- PDF:
- https://preview.aclanthology.org/teach-a-man-to-fish/I17-2023.pdf
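The abstract frames gesture interpretation as multi-label classification, mapping a multimodal utterance to a set of semantic categories. A minimal sketch of that general setup (not the authors' actual model; the feature names, toy data, and perceptron learner here are all hypothetical) trains one binary classifier per semantic category, so an utterance can receive several labels at once:

```python
# Minimal multi-label classification sketch: one binary perceptron per
# semantic category (one-vs-rest). All data and features are invented
# for illustration; the paper's real features and model may differ.

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Train a single binary perceptron; returns weights (bias first)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for x, target in zip(X, y):
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            err = target - (1 if z > 0 else 0)
            w[0] += lr * err
            for i, xi in enumerate(x):
                w[i + 1] += lr * err * xi
    return w

def predict(w, x):
    """Binary decision for one label given learned weights."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if z > 0 else 0

# Hypothetical utterances encoded as feature vectors:
# [mentions_shape_word, mentions_size_word, gesture_curvature, gesture_extent]
X = [[1, 0, 0.9, 0.2],
     [0, 1, 0.1, 0.8],
     [1, 1, 0.8, 0.7],
     [0, 0, 0.2, 0.1]]
# One binary label column per semantic category: [SHAPE, SIZE].
Y = [[1, 0], [0, 1], [1, 1], [0, 0]]

# Fit one perceptron per label column (the multi-label part).
models = [train_perceptron(X, [row[j] for row in Y]) for j in range(2)]
# Predicted label set for the first utterance.
labels = [predict(models[j], X[0]) for j in range(2)]
```

The one-vs-rest decomposition is the simplest way to realize multi-label output: each category is decided independently, so the verbal features and gesture features can jointly activate any subset of categories.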