Abstract
Despite the rapid development of neural models, syntax still plays a crucial role in modern natural language processing. However, few studies have incorporated syntactic information into ancient Chinese understanding tasks, owing to the lack of syntactic annotation. This paper explores the role of syntax in ancient Chinese understanding using noisy syntax trees derived from unsupervised methods and from modern Chinese syntax parsers. On top of that, we propose a novel syntax encoding component, the confidence-based syntax encoding network (cSEN), to alleviate the side effects of the noise introduced by unsupervised syntax derivation and by the incompatibility between ancient and modern Chinese. Experiments on two typical ancient Chinese understanding tasks, ancient poetry theme classification and ancient-modern Chinese translation, demonstrate that syntactic information can effectively enhance the understanding of ancient Chinese over strong baselines, and that the proposed cSEN plays an important role in noisy scenarios.
- Anthology ID: 2023.acl-srw.15
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Vishakh Padmakumar, Gisela Vallejo, Yao Fu
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 83–92
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-srw.15/
- DOI: 10.18653/v1/2023.acl-srw.15
- Cite (ACL): Ping Wang, Shitou Zhang, Zuchao Li, and Jingrui Hou. 2023. Enhancing Ancient Chinese Understanding with Derived Noisy Syntax Trees. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 83–92, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Enhancing Ancient Chinese Understanding with Derived Noisy Syntax Trees (Wang et al., ACL 2023)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-srw.15.pdf