PoseStitch-SLT: Linguistically Inspired Pose-Stitching for End-to-End Sign Language Translation

Abhinav Joshi, Vaibhav Sharma, Sanjeet Singh, Ashutosh Modi


Abstract
Sign language translation remains a challenging task due to the scarcity of large-scale, sentence-aligned datasets. Prior work has focused on various feature extraction methods and architectural changes to support neural machine translation for sign languages. We propose PoseStitch-SLT, a novel pre-training scheme inspired by linguistic-template-based sentence generation. Through translation comparisons on two sign language datasets, How2Sign and iSign, we show that a simple transformer-based encoder-decoder architecture outperforms prior work when template-generated sentence pairs are included in training. We achieve BLEU-4 score improvements from 1.97 to 4.56 on How2Sign and from 0.55 to 3.43 on iSign, surpassing prior state-of-the-art methods for pose-based gloss-free translation. The results demonstrate the effectiveness of template-driven synthetic supervision in low-resource sign language settings.
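
The pose-stitching idea described in the abstract can be pictured as concatenating per-word pose clips according to a linguistic template to form synthetic (pose sequence, sentence) pairs for pre-training. The Python sketch below is a hypothetical illustration only: the names (stitch_poses, word_pose_bank), the template format, and the clip shapes are assumptions, not the authors' implementation.

import numpy as np

def stitch_poses(template, slot_values, word_pose_bank):
    """Fill a template such as "where is the {place}" with slot values and
    stitch the corresponding per-word pose clips into one pose sequence."""
    sentence = template.format(**slot_values)
    pose_clips = []
    for word in sentence.split():
        clip = word_pose_bank.get(word)  # assumed shape: (frames, keypoints, coords)
        if clip is not None:
            pose_clips.append(clip)
    # Concatenate clips along the time axis to form the synthetic pose sequence.
    stitched = np.concatenate(pose_clips, axis=0) if pose_clips else np.empty((0,))
    return stitched, sentence

# Example usage with dummy clips (12 frames, 75 keypoints, 2-D coordinates per word).
word_pose_bank = {w: np.random.rand(12, 75, 2) for w in ["where", "is", "the", "library"]}
poses, text = stitch_poses("where is the {place}", {"place": "library"}, word_pose_bank)
print(poses.shape, "->", text)

The resulting synthetic pairs would then serve as pre-training data for a transformer-based encoder-decoder before fine-tuning on the human-annotated sentences.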
Anthology ID:
2025.emnlp-main.698
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
13845–13864
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.698/
Cite (ACL):
Abhinav Joshi, Vaibhav Sharma, Sanjeet Singh, and Ashutosh Modi. 2025. PoseStitch-SLT: Linguistically Inspired Pose-Stitching for End-to-End Sign Language Translation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 13845–13864, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
PoseStitch-SLT: Linguistically Inspired Pose-Stitching for End-to-End Sign Language Translation (Joshi et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.698.pdf
Checklist:
2025.emnlp-main.698.checklist.pdf