CYUT at ROCLING-2021 Shared Task: Based on BERT and MacBERT

Xie-Sheng Hong, Shih-Hung Wu


Abstract
This paper presents our system description for the ROCLING 2021 shared task on dimensional sentiment analysis for educational texts. We submitted two runs in the final test, and both use a standard regression model. Run 1 uses the Chinese version of BERT as the base model, while Run 2 uses an early version of MacBERT, namely RoBERTa-wwm-ext, a Chinese RoBERTa-like BERT model. Both runs rely on the powerful pre-trained BERT model to provide text embeddings for training the regressor.
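The sketch below illustrates the kind of architecture the abstract describes: a pre-trained Chinese BERT (or RoBERTa-wwm-ext) encoder with a linear regression head predicting valence and arousal scores. It is a minimal assumption-based illustration, not the authors' released code; the Hugging Face checkpoint names ("bert-base-chinese" for Run 1, "hfl/chinese-roberta-wwm-ext" for Run 2), the [CLS]-pooling choice, and the MSE loss are assumptions, and the paper may use different training details.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertRegressor(nn.Module):
    """Pre-trained encoder + linear head for dimensional sentiment regression."""
    def __init__(self, pretrained="bert-base-chinese", n_targets=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(pretrained)
        # Two outputs: valence and arousal (assumed target layout).
        self.head = nn.Linear(self.encoder.config.hidden_size, n_targets)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]   # [CLS] token embedding as sentence vector
        return self.head(cls)               # predicted (valence, arousal)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = BertRegressor()                      # swap in "hfl/chinese-roberta-wwm-ext" for the Run 2 variant
batch = tokenizer(["這門課程非常有趣"], return_tensors="pt", padding=True)
pred = model(batch["input_ids"], batch["attention_mask"])
loss_fn = nn.MSELoss()                       # regression loss against gold valence/arousal scores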
Anthology ID:
2021.rocling-1.48
Volume:
Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021)
Month:
October
Year:
2021
Address:
Taoyuan, Taiwan
Venue:
ROCLING
Publisher:
The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages:
367–374
URL:
https://aclanthology.org/2021.rocling-1.48
Cite (ACL):
Xie-Sheng Hong and Shih-Hung Wu. 2021. CYUT at ROCLING-2021 Shared Task: Based on BERT and MacBERT. In Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021), pages 367–374, Taoyuan, Taiwan. The Association for Computational Linguistics and Chinese Language Processing (ACLCLP).
Cite (Informal):
CYUT at ROCLING-2021 Shared Task: Based on BERT and MacBERT (Hong & Wu, ROCLING 2021)
PDF:
https://preview.aclanthology.org/update-css-js/2021.rocling-1.48.pdf