Multi-features Enhanced Multi-task Learning for Vietnamese Treebank Conversion

Zhenguo Zhang, Jianjian Liu, Li Ying


Abstract
Pre-trained language representation-based dependency parsing models have achieved significant improvements in rich-resource languages. However, the performance of these models depends heavily on the quality and scale of the training data. Compared with Chinese and English, Vietnamese dependency treebanks are scarce. Since human annotation is labor-intensive and time-consuming, we propose a multi-features enhanced multi-task learning framework to convert all heterogeneous Vietnamese treebanks into a unified one. On the one hand, we exploit a Tree BiLSTM and pattern embeddings to extract global and local dependency tree features from the source treebank. On the other hand, we integrate these features into a multi-task learning framework so that source dependency parsing assists the conversion process. Experiments on the benchmark datasets show that our proposed model can effectively convert heterogeneous treebanks, further improving Vietnamese dependency parsing accuracy by about 7.12 LAS points.
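To make the setup concrete, below is a minimal sketch (in PyTorch, not the authors' released code) of the multi-task conversion idea the abstract describes: a shared encoder feeds two arc-scoring heads, an auxiliary parser for the source annotation scheme and a converter for the target scheme, while embedded features read off the given source tree are concatenated into the converter's input. The plain BiLSTM, the pattern-feature inventory, and the 0.5 auxiliary-loss weight are illustrative stand-ins for the paper's Tree BiLSTM, pattern embeddings, and tuned hyperparameters.

import torch
import torch.nn as nn

class TreebankConverter(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=200, n_patterns=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared sentence encoder; a plain BiLSTM stands in for the paper's Tree BiLSTM.
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Local source-tree features (e.g. head-direction/distance patterns), embedded.
        self.pattern_embed = nn.Embedding(n_patterns, emb_dim)
        # Auxiliary head: re-score arcs under the source scheme (the multi-task signal).
        self.src_arc = nn.Linear(2 * hidden, 2 * hidden)
        # Conversion head: score target-scheme arcs, conditioned on source-tree features.
        self.tgt_arc = nn.Linear(2 * hidden + emb_dim, 2 * hidden)

    def forward(self, words, patterns):
        h, _ = self.encoder(self.embed(words))             # (B, T, 2H) shared states
        src_scores = self.src_arc(h) @ h.transpose(1, 2)   # (B, T, T) source arc scores
        h_tgt = torch.cat([h, self.pattern_embed(patterns)], dim=-1)
        tgt_scores = self.tgt_arc(h_tgt) @ h.transpose(1, 2)  # (B, T, T) target arc scores
        return src_scores, tgt_scores

# Joint training: conversion is the main task, source parsing the auxiliary one.
model = TreebankConverter(vocab_size=1000)
words = torch.randint(0, 1000, (2, 8))      # toy batch: 2 sentences, 8 tokens each
patterns = torch.randint(0, 50, (2, 8))     # toy source-tree pattern ids per token
src_heads = torch.randint(0, 8, (2, 8))     # gold heads in the source scheme
tgt_heads = torch.randint(0, 8, (2, 8))     # gold heads in the target scheme
src_s, tgt_s = model(words, patterns)
ce = nn.CrossEntropyLoss()
loss = ce(tgt_s.reshape(-1, 8), tgt_heads.reshape(-1)) \
     + 0.5 * ce(src_s.reshape(-1, 8), src_heads.reshape(-1))  # 0.5: assumed weight
loss.backward()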
Anthology ID:
2024.ccl-1.80
Volume:
Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference)
Month:
July
Year:
2024
Address:
Taiyuan, China
Editors:
Sun Maosong, Liang Jiye, Han Xianpei, Liu Zhiyuan, He Yulan
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1035–1046
Language:
English
URL:
https://preview.aclanthology.org/author-degibert/2024.ccl-1.80/
Cite (ACL):
Zhenguo Zhang, Jianjian Liu, and Li Ying. 2024. Multi-features Enhanced Multi-task Learning for Vietnamese Treebank Conversion. In Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference), pages 1035–1046, Taiyuan, China. Chinese Information Processing Society of China.
Cite (Informal):
Multi-features Enhanced Multi-task Learning for Vietnamese Treebank Conversion (Zhang et al., CCL 2024)
PDF:
https://preview.aclanthology.org/author-degibert/2024.ccl-1.80.pdf