CrowNER at Rocling 2022 Shared Task: NER using MacBERT and Adversarial Training

Qiu-Xia Zhang, Te-Yu Chi, Te-Lun Yang, Jyh-Shing Roger Jang


Abstract
This study uses the training and validation data from the “ROCLING 2022 Chinese Health Care Named Entity Recognition Task” for modeling. The modeling process adopts techniques such as data augmentation and data post-processing, and uses the MacBERT pre-trained model to build an NER recognizer dedicated to the Chinese medical domain. During fine-tuning, we also added adversarial training methods such as FGM and PGD, and the results of the final tuned model were close to those of the best-performing team in the task evaluation. In addition, by introducing mixed-precision training, we also greatly reduced the time cost of training.
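The abstract mentions adding FGM-style adversarial training during fine-tuning. The sketch below shows one common way this is done in a PyTorch/transformers setup: perturb the word-embedding weights along the gradient direction, run a second forward/backward pass, then restore the embeddings. This is a minimal illustration of the general FGM technique, not the authors' actual code; the class name, `emb_name`, `epsilon`, and the schematic training loop are all assumptions.

```python
# Minimal FGM (Fast Gradient Method) sketch for adversarial fine-tuning.
# Hypothetical names: FGM, emb_name="word_embeddings", epsilon=1.0.
import torch

class FGM:
    """Perturb embedding weights along the gradient, then restore them."""
    def __init__(self, model, emb_name="word_embeddings", epsilon=1.0):
        self.model = model
        self.emb_name = emb_name
        self.epsilon = epsilon
        self.backup = {}

    def attack(self):
        # Add an L2-normalized gradient perturbation to the embedding weights.
        for name, param in self.model.named_parameters():
            if param.requires_grad and self.emb_name in name and param.grad is not None:
                self.backup[name] = param.data.clone()
                norm = torch.norm(param.grad)
                if norm != 0 and not torch.isnan(norm):
                    param.data.add_(self.epsilon * param.grad / norm)

    def restore(self):
        # Put the original embedding weights back after the adversarial pass.
        for name, param in self.model.named_parameters():
            if name in self.backup:
                param.data = self.backup[name]
        self.backup = {}

# Schematic training step (model, batch, optimizer are assumed to exist):
# fgm = FGM(model)
# loss = model(**batch).loss
# loss.backward()            # gradients on the clean input
# fgm.attack()               # perturb embeddings using those gradients
# model(**batch).loss.backward()   # accumulate adversarial gradients
# fgm.restore()              # remove the perturbation
# optimizer.step(); optimizer.zero_grad()
```

PGD follows the same pattern but applies several smaller, projected perturbation steps instead of a single one; mixed-precision training, as mentioned in the abstract, would typically be layered on top of this loop (e.g., via `torch.cuda.amp`) without changing the adversarial logic.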
Anthology ID:
2022.rocling-1.40
Volume:
Proceedings of the 34th Conference on Computational Linguistics and Speech Processing (ROCLING 2022)
Month:
November
Year:
2022
Address:
Taipei, Taiwan
Editors:
Yung-Chun Chang, Yi-Chin Huang
Venue:
ROCLING
Publisher:
The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages:
321–328
Language:
Chinese
URL:
https://aclanthology.org/2022.rocling-1.40
Cite (ACL):
Qiu-Xia Zhang, Te-Yu Chi, Te-Lun Yang, and Jyh-Shing Roger Jang. 2022. CrowNER at Rocling 2022 Shared Task: NER using MacBERT and Adversarial Training. In Proceedings of the 34th Conference on Computational Linguistics and Speech Processing (ROCLING 2022), pages 321–328, Taipei, Taiwan. The Association for Computational Linguistics and Chinese Language Processing (ACLCLP).
Cite (Informal):
CrowNER at Rocling 2022 Shared Task: NER using MacBERT and Adversarial Training (Zhang et al., ROCLING 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.rocling-1.40.pdf