Gang Rao
2021
RG PA at SemEval-2021 Task 1: A Contextual Attention-based Model with RoBERTa for Lexical Complexity Prediction
Gang Rao | Maochang Li | Xiaolong Hou | Lianxin Jiang | Yang Mo | Jianping Shen
Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)
In this paper we propose a contextual attention-based model with two-stage fine-tuning using RoBERTa. First, we perform the first-stage fine-tuning of RoBERTa on the task corpus so that the model can learn prior domain knowledge. We then obtain the contextual embedding of the context words from the token-level embeddings of the fine-tuned model. We use K-fold cross-validation to train K models and ensemble them to produce the final result. Finally, we attain 2nd place in the final evaluation phase of sub-task 2 with a Pearson correlation of 0.8575.
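A minimal sketch of one step described in the abstract: pooling the token-level RoBERTa embeddings of a target word's sub-word pieces into a single contextual embedding. This is not the authors' code; the checkpoint name and the helper function target_word_embedding are illustrative assumptions.

    # Sketch only: contextual embedding of a target word from token-level RoBERTa states.
    import torch
    from transformers import RobertaTokenizerFast, RobertaModel

    tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")  # assumed base checkpoint
    model = RobertaModel.from_pretrained("roberta-base")

    def target_word_embedding(sentence: str, target: str) -> torch.Tensor:
        """Mean of the last-layer hidden states of the sub-word tokens spanning `target`."""
        enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
        offsets = enc.pop("offset_mapping")[0]
        start = sentence.index(target)
        end = start + len(target)
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]  # (seq_len, hidden_size)
        # Keep sub-word tokens whose character span overlaps the target word.
        mask = [(s < end and e > start and e > s) for s, e in offsets.tolist()]
        return hidden[torch.tensor(mask)].mean(dim=0)  # (hidden_size,)

    emb = target_word_embedding("The cord is a small, flexible insulated cable.", "insulated")
    print(emb.shape)  # torch.Size([768])

In the described system, such embeddings would feed a regression head predicting the complexity score, with K fine-tuned models averaged at inference time.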
FPAI at SemEval-2021 Task 6: BERT-MRC for Propaganda Techniques Detection
Xiaolong Hou | Junsong Ren | Gang Rao | Lianxin Lian | Zhihao Ruan | Yang Mo | Jianping Shen
Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)
The objective of subtask 2 of SemEval-2021 Task 6 is to identify the techniques used in a text together with the span(s) covered by each technique. This paper describes the system and model we developed for the task. We first propose a pipeline system that identifies spans and then classifies the technique for each span in the input sequence, but it suffers severely when handling overlapping and nested spans. We then formulate the task as question answering under the MRC framework, which achieves a better result than the pipeline method. Moreover, data augmentation and loss design techniques are explored to alleviate data sparsity and imbalance. Finally, we attain 3rd place in the final evaluation phase.
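An illustrative sketch of the MRC framing mentioned in the abstract: each technique label is turned into a natural-language query and the model extracts the span it covers. This is not the authors' system; the query wording, the extract_span helper, and the untuned bert-base-uncased checkpoint (whose QA head is randomly initialized here) are assumptions used only to show the input/output structure.

    # Sketch only: MRC-style span extraction, one query per propaganda technique.
    import torch
    from transformers import BertTokenizerFast, BertForQuestionAnswering

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")  # untuned, for illustration

    def extract_span(technique: str, sentence: str) -> str:
        # Query construction is a design choice; here the technique name is embedded in a question.
        question = f"Which words use the technique: {technique}?"
        enc = tokenizer(question, sentence, return_tensors="pt")
        with torch.no_grad():
            out = model(**enc)
        start = out.start_logits.argmax(dim=-1).item()
        end = out.end_logits.argmax(dim=-1).item()
        tokens = enc["input_ids"][0][start:end + 1]
        return tokenizer.decode(tokens, skip_special_tokens=True)

    print(extract_span("Loaded Language", "They are destroying our glorious country."))

Because each technique gets its own query, overlapping or nested spans belonging to different techniques no longer conflict, which is the advantage over the pipeline formulation.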
Co-authors
- Xiaolong Hou 2
- Yang Mo 2
- Jianping Shen 2
- Maochang Li 1
- Lianxin Jiang 1