Improving Handshape Representations for Sign Language Processing: A Graph Neural Network Approach

Alessa Carbo, Eric Nalisnick


Abstract
Handshapes serve a fundamental phonological role in signed languages, with American Sign Language employing approximately 50 distinct shapes. However, computational approaches rarely model handshapes explicitly, which limits both recognition accuracy and linguistic analysis. We introduce a novel graph neural network that separates temporal dynamics from static handshape configurations. Our approach combines anatomically informed graph structures with contrastive learning to address key challenges in handshape recognition, including subtle inter-class distinctions and temporal variations. We establish the first benchmark for structured handshape recognition in signing sequences, achieving 46% accuracy across 37 handshape classes, compared to 25% for baseline methods.
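The abstract describes an anatomically informed graph over hand joints feeding a graph neural network that encodes static handshape configurations. As a rough illustration of that idea (not the authors' implementation), the minimal sketch below builds a hand-skeleton graph and applies one graph-convolution step; the 21-landmark layout and edge list assume a MediaPipe-style hand skeleton, and all names and dimensions are hypothetical.

import numpy as np

NUM_JOINTS = 21  # wrist + 4 joints per finger (assumed MediaPipe-style layout)

# Anatomical edges: wrist to each finger base, then chains along each finger.
EDGES = [(0, 1), (1, 2), (2, 3), (3, 4),        # thumb
         (0, 5), (5, 6), (6, 7), (7, 8),        # index
         (0, 9), (9, 10), (10, 11), (11, 12),   # middle
         (0, 13), (13, 14), (14, 15), (15, 16), # ring
         (0, 17), (17, 18), (18, 19), (19, 20)] # pinky

def normalized_adjacency(num_nodes, edges):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    a = np.eye(num_nodes)
    for i, j in edges:
        a[i, j] = a[j, i] = 1.0
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def graph_conv(x, a_hat, weight):
    """One message-passing step: aggregate neighbor features, project, apply ReLU."""
    return np.maximum(a_hat @ x @ weight, 0.0)

rng = np.random.default_rng(0)
joints = rng.normal(size=(NUM_JOINTS, 3))   # 3D joint coordinates for a single frame
w = rng.normal(size=(3, 16)) * 0.1          # illustrative projection weights
a_hat = normalized_adjacency(NUM_JOINTS, EDGES)
handshape_embedding = graph_conv(joints, a_hat, w).mean(axis=0)  # pooled per-frame code
print(handshape_embedding.shape)  # (16,)

In the paper's framing, such a per-frame handshape code would be kept separate from the temporal stream and trained with a contrastive objective to sharpen subtle inter-class distinctions; the sketch above only shows the static, graph-structured half of that picture.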
Anthology ID:
2025.emnlp-main.1483
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
29110–29123
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1483/
Cite (ACL):
Alessa Carbo and Eric Nalisnick. 2025. Improving Handshape Representations for Sign Language Processing: A Graph Neural Network Approach. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 29110–29123, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Improving Handshape Representations for Sign Language Processing: A Graph Neural Network Approach (Carbo & Nalisnick, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1483.pdf
Checklist:
 2025.emnlp-main.1483.checklist.pdf