TrInk: Ink Generation with Transformer Network
Zezhong Jin | Shubhang Desai | Xu Chen | Biyi Fang | Zhuoyi Huang | Zhe Li | Chong-Xin Gan | Xiao Tu | Man-Wai Mak | Yan Lu | Shujie Liu
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
In this paper, we propose TrInk, a Transformer-based model for ink generation, which effectively captures global dependencies. To better facilitate the alignment between the input text and the generated stroke points, we introduce scaled positional embeddings and a Gaussian memory mask in the cross-attention module. Additionally, we design both subjective and objective evaluation pipelines to comprehensively assess the legibility and style consistency of the generated handwriting. Experiments demonstrate that our Transformer-based model achieves a 35.56% reduction in character error rate (CER) and a 29.66% reduction in word error rate (WER) on the IAM-OnDB dataset compared to previous methods. We provide a demo page with handwriting samples from TrInk and baseline models at: https://akahello-a11y.github.io/trink-demo/
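The abstract's Gaussian memory mask can be illustrated with a minimal sketch: an additive attention bias that encourages each decoder (stroke) step to attend near a proportionally aligned encoder (text) position. The exact parameterization used in TrInk is not given here, so the linear alignment and the `sigma` width below are assumptions for illustration only.

```python
import numpy as np

def gaussian_memory_mask(num_queries: int, num_keys: int, sigma: float = 2.0) -> np.ndarray:
    """Additive bias for cross-attention logits (a sketch, not TrInk's exact form).

    Each decoder step i is softly aligned to an encoder position mu_i placed
    proportionally along the text sequence; attention logits are penalized by
    the squared distance from that expected position.
    """
    i = np.arange(num_queries)[:, None]                 # decoder (stroke) steps
    j = np.arange(num_keys)[None, :]                    # encoder (text) positions
    mu = i * (num_keys - 1) / max(num_queries - 1, 1)   # assumed monotonic linear alignment
    return -((j - mu) ** 2) / (2.0 * sigma ** 2)        # added to logits before softmax

mask = gaussian_memory_mask(num_queries=8, num_keys=4)
print(mask.shape)  # (8, 4)
```

In a Transformer this bias would be added to the cross-attention scores before the softmax, so positions far from the expected alignment receive exponentially less weight while the mask stays fully differentiable.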