OleNet at SemEval-2019 Task 9: BERT based Multi-Perspective Models for Suggestion Mining

Jiaxiang Liu, Shuohuan Wang, Yu Sun


Abstract
This paper describes our system for Task 9 of SemEval-2019, which focuses on suggestion mining: classifying given sentences into suggestion and non-suggestion classes in domain-specific and cross-domain training settings, respectively. We propose a multi-perspective architecture for learning representations using different classical models, including Convolutional Neural Networks (CNN), Gated Recurrent Units (GRU), and Feed Forward Attention (FFA). To leverage the semantics distributed in large amounts of unsupervised data, we also adopt the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model as an encoder to produce sentence and word representations. The proposed architecture is applied to both sub-tasks, achieving F1-scores of 0.7812 for Subtask A and 0.8579 for Subtask B. We won first and second place for the two sub-tasks, respectively, in the final competition.
Anthology ID:
S19-2216
Volume:
Proceedings of the 13th International Workshop on Semantic Evaluation
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota, USA
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1231–1236
URL:
https://aclanthology.org/S19-2216
DOI:
10.18653/v1/S19-2216
Cite (ACL):
Jiaxiang Liu, Shuohuan Wang, and Yu Sun. 2019. OleNet at SemEval-2019 Task 9: BERT based Multi-Perspective Models for Suggestion Mining. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 1231–1236, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
Cite (Informal):
OleNet at SemEval-2019 Task 9: BERT based Multi-Perspective Models for Suggestion Mining (Liu et al., SemEval 2019)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/S19-2216.pdf