Zijian Cai


2022

OPDAI at SemEval-2022 Task 11: A hybrid approach for Chinese NER using outside Wikipedia knowledge
Ze Chen | Kangxu Wang | Jiewen Zheng | Zijian Cai | Jiarong He | Jin Gao
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)

This article describes the OPDAI submission to SemEval-2022 Task 11 on Chinese complex NER. We first explore the performance of model-based approaches and their ensembles, finding that fine-tuning the pre-trained Chinese RoBERTa-wwm model with word semantic representation and contextual gazetteer representation performs best among the single models. However, this model-based approach performs poorly on the test data because of low-context and unseen-entity cases. We therefore extend our system into two stages: (1) generating entity candidates with a neural model, soft templates, and a Wikipedia lexicon; and (2) predicting the final entities with a feature-based ranking model. On the final evaluation, our best submission achieves an F1 score of 0.7954, the third-best result in the Chinese sub-track.
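
To make the two-stage design concrete, here is a minimal Python sketch, not the OPDAI implementation: the Candidate type, the lexicon-only generator, and the fixed per-source weights are all illustrative assumptions (the paper's first stage also draws on a neural model and soft templates, and its second stage is a learned feature-based ranking model).

from dataclasses import dataclass

@dataclass
class Candidate:
    span: str
    start: int
    end: int
    source: str  # "model", "template", or "lexicon"

def generate_candidates(sentence: str, lexicon: set[str]) -> list[Candidate]:
    # Stage 1: merge candidates from several generators. Only the lexicon
    # generator is sketched here; a neural tagger and a soft-template matcher
    # would append candidates tagged with their own source.
    candidates = []
    for entry in lexicon:
        start = sentence.find(entry)
        if start != -1:
            candidates.append(Candidate(entry, start, start + len(entry), "lexicon"))
    return candidates

def rank_candidates(candidates: list[Candidate]) -> list[Candidate]:
    # Stage 2: score and rank the candidates. A real ranker would be a learned
    # model over features such as generator agreement, span length, and context
    # scores; the fixed per-source weight below is a stand-in.
    source_weight = {"model": 1.0, "template": 0.6, "lexicon": 0.8}
    return sorted(candidates, key=lambda c: source_weight[c.source], reverse=True)

if __name__ == "__main__":
    lexicon = {"维基百科", "北京"}
    sentence = "他在北京编辑维基百科条目。"
    for cand in rank_candidates(generate_candidates(sentence, lexicon)):
        print(cand)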

Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC
Ze Chen | Kangxu Wang | Zijian Cai | Jiewen Zheng | Jiarong He | Max Gao | Jason Zhang
Proceedings of the First Workshop on Ever Evolving NLP (EvoNLP)

This paper describes the dma submission to the TempoWiC task, which achieves a macro-F1 score of 77.05% and attains first place in the task. We first explore the impact of different pre-trained language models. We then adopt data cleaning, data augmentation, and adversarial training to improve the model's generalization and robustness. For further improvement, we integrate POS information and word semantic representation using a Mixture-of-Experts (MoE) approach. The experimental results show that the MoE overcomes the feature-overuse issue and combines the contextual, POS, and word semantic features well. Finally, we use a model ensemble for the final prediction, a method that has proven effective in prior work.
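
As a rough picture of how a gate can combine the three feature views, the following is a minimal PyTorch sketch, assuming one expert per view (context, POS, word semantics) and a softmax gate over the concatenated inputs; the FeatureMoE name, dimensions, and layer sizes are illustrative assumptions, not the paper's architecture.

import torch
import torch.nn as nn

class FeatureMoE(nn.Module):
    def __init__(self, ctx_dim=768, pos_dim=32, sem_dim=300, hidden=256, n_classes=2):
        super().__init__()
        # One expert per feature view, each projecting into a shared space.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d, hidden), nn.ReLU())
            for d in (ctx_dim, pos_dim, sem_dim)
        ])
        # The gate sees all raw features and weights the experts per example,
        # which is what lets the mixture avoid over-relying on a single view.
        self.gate = nn.Linear(ctx_dim + pos_dim + sem_dim, 3)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, ctx, pos, sem):
        expert_out = torch.stack(
            [exp(x) for exp, x in zip(self.experts, (ctx, pos, sem))], dim=1
        )  # (batch, 3, hidden)
        weights = torch.softmax(self.gate(torch.cat([ctx, pos, sem], dim=-1)), dim=-1)
        mixed = (weights.unsqueeze(-1) * expert_out).sum(dim=1)  # (batch, hidden)
        return self.classifier(mixed)

if __name__ == "__main__":
    model = FeatureMoE()
    logits = model(torch.randn(4, 768), torch.randn(4, 32), torch.randn(4, 300))
    print(logits.shape)  # torch.Size([4, 2])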