GRoWE: A GujiRoBERTa-Enhanced Approach to Ancient Chinese NER via Word-Word Relation Classification and Model Ensembling

Tian Xia, Yilin Wang, Xinkai Wang, Yahe Yang, Qun Zhao, Menghui Yang


Abstract
Named entity recognition is a fundamental task in ancient Chinese text analysis. Building on a pre-trained language model for ancient Chinese texts, this paper proposes a new named entity recognition method, GRoWE. It uses the ancient Chinese pre-trained language model GujiRoBERTa as the base model and superposes a word-word relation prediction model on top of it to construct a superposition model; ensemble strategies are then applied to multiple such superposition models. On the EvaHan 2025 public test set, the F1 score of the proposed method reaches 86.79%, 6.18% higher than that of the mainstream BERT_LSTM_CRF baseline, indicating that the model architecture and the ensemble strategy play an important role in improving named entity recognition on ancient Chinese texts.
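
To make the described architecture concrete, the following is a minimal sketch (not the authors' released code) of a word-word relation classification head stacked on a pretrained encoder, plus a simple voting ensemble over predicted entity spans. The HuggingFace model ID, the relation label set, and all hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Sketch only: word-word relation head over a pretrained encoder + span-voting
# ensemble. Model ID, label set, and sizes are assumptions for illustration.
from collections import Counter

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "hsc748NLP/GujiRoBERTa_jian_fan"  # assumed HF ID for GujiRoBERTa
NUM_RELATIONS = 4  # e.g. NONE, NNW, THW-PER, THW-LOC (illustrative label set)


class WordWordRelationNER(nn.Module):
    """Encoder followed by a pairwise classifier over all token pairs."""

    def __init__(self, model_id: str, num_relations: int, hidden: int = 256):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_id)
        dim = self.encoder.config.hidden_size
        # Pair representation: concatenation of the two token vectors.
        self.pair_mlp = nn.Sequential(
            nn.Linear(2 * dim, hidden),
            nn.GELU(),
            nn.Linear(hidden, num_relations),
        )

    def forward(self, input_ids, attention_mask):
        h = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                                   # (B, L, D)
        B, L, D = h.shape
        hi = h.unsqueeze(2).expand(B, L, L, D)                # token i
        hj = h.unsqueeze(1).expand(B, L, L, D)                # token j
        return self.pair_mlp(torch.cat([hi, hj], dim=-1))     # (B, L, L, R)


def vote_ensemble(span_sets, min_votes=2):
    """Keep (start, end, type) spans predicted by at least `min_votes` models."""
    counts = Counter(span for spans in span_sets for span in set(spans))
    return {span for span, c in counts.items() if c >= min_votes}


if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = WordWordRelationNER(MODEL_ID, NUM_RELATIONS)
    enc = tok("大學之道在明明德", return_tensors="pt")
    with torch.no_grad():
        logits = model(enc["input_ids"], enc["attention_mask"])
    print(logits.shape)  # (1, seq_len, seq_len, NUM_RELATIONS)

    # Ensemble example: spans from three hypothetical models, kept by majority.
    preds = [{(0, 1, "PER")}, {(0, 1, "PER")}, {(2, 3, "LOC")}]
    print(vote_ensemble(preds))
```

The pairwise logits would be decoded into entity spans (e.g., following a W2NER-style decoding of next-word and tail-head relations); the voting function shows one simple way several superposition models could be combined, though the paper's exact ensemble strategy may differ.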
Anthology ID:
2025.alp-1.24
Volume:
Proceedings of the Second Workshop on Ancient Language Processing
Month:
May
Year:
2025
Address:
The Albuquerque Convention Center, Laguna
Editors:
Adam Anderson, Shai Gordin, Bin Li, Yudong Liu, Marco C. Passarotti, Rachele Sprugnoli
Venues:
ALP | WS
Publisher:
Association for Computational Linguistics
Pages:
187–191
URL:
https://preview.aclanthology.org/landing_page/2025.alp-1.24/
Cite (ACL):
Tian Xia, Yilin Wang, Xinkai Wang, Yahe Yang, Qun Zhao, and Menghui Yang. 2025. GRoWE: A GujiRoBERTa-Enhanced Approach to Ancient Chinese NER via Word-Word Relation Classification and Model Ensembling. In Proceedings of the Second Workshop on Ancient Language Processing, pages 187–191, The Albuquerque Convention Center, Laguna. Association for Computational Linguistics.
Cite (Informal):
GRoWE: A GujiRoBERTa-Enhanced Approach to Ancient Chinese NER via Word-Word Relation Classification and Model Ensembling (Xia et al., ALP 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.alp-1.24.pdf