Zihang Xu
2022
HFL at SemEval-2022 Task 8: A Linguistics-inspired Regression Model with Data Augmentation for Multilingual News Similarity
Zihang Xu | Ziqing Yang | Yiming Cui | Zhigang Chen
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
This paper describes our system for SemEval-2022 Task 8: Multilingual News Article Similarity. We propose a linguistics-inspired model trained with several task-specific strategies. The main techniques of our system are: 1) data augmentation, 2) a multi-label loss, 3) adapted R-Drop, and 4) sample reconstruction with a head-tail combination. We also present a brief analysis of methods that did not help, such as the two-tower architecture. Our system ranked 1st on the leaderboard, achieving a Pearson correlation coefficient of 0.818 on the official evaluation set.
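The abstract does not spell out the head-tail combination; a minimal sketch of the usual interpretation (truncating an over-length article by keeping its first and last tokens so both the lede and the closing paragraphs are seen by the encoder) is given below. The function name, the 512-token budget, and the 128/384 split are illustrative assumptions, not details taken from the paper.

def head_tail_truncate(token_ids, max_len=512, head_len=128):
    # Keep the first `head_len` and the last `max_len - head_len` tokens
    # of an over-length sequence; shorter sequences pass through unchanged.
    # (Sketch only: the paper's exact split sizes are not stated in the abstract.)
    if len(token_ids) <= max_len:
        return list(token_ids)
    tail_len = max_len - head_len
    return list(token_ids[:head_len]) + list(token_ids[-tail_len:])

For example, a 2,000-token article would be reduced to its first 128 and last 384 tokens before being fed to the similarity model.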
CINO: A Chinese Minority Pre-trained Language Model
Ziqing Yang | Zihang Xu | Yiming Cui | Baoxin Wang | Min Lin | Dayong Wu | Zhigang Chen
Proceedings of the 29th International Conference on Computational Linguistics
Multilingual pre-trained language models have shown impressive performance on cross-lingual tasks, which greatly facilitates natural language processing applications for low-resource languages. However, current multilingual models still perform poorly on some languages. In this paper, we propose CINO (Chinese Minority Pre-trained Language Model), a multilingual pre-trained language model for Chinese minority languages. It covers Standard Chinese, Yue Chinese, and six other ethnic minority languages. To evaluate the cross-lingual ability of multilingual models on ethnic minority languages, we collect documents from Wikipedia and news websites and construct two text classification datasets, WCM (Wiki-Chinese-Minority) and CMNews (Chinese-Minority-News). We show that CINO notably outperforms the baselines on various classification tasks. The CINO model and the datasets are publicly available at http://cino.hfl-rc.com.
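A minimal usage sketch for obtaining sentence representations from CINO with the standard Hugging Face transformers API is shown below. The checkpoint identifier "hfl/cino-base-v2" is an assumption about how the release is named on the hub; see http://cino.hfl-rc.com for the official distribution.

# Sketch: encode a sentence with a CINO checkpoint via transformers.
# The model id below is assumed, not quoted from the paper.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("hfl/cino-base-v2")
model = AutoModel.from_pretrained("hfl/cino-base-v2")

inputs = tokenizer("这是一个用于少数民族语言的预训练模型。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)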