Adapting BERT to Implicit Discourse Relation Classification with a Focus on Discourse Connectives

Yudai Kishimoto, Yugo Murawaki, Sadao Kurohashi


Abstract
BERT, a neural network-based language model pre-trained on large corpora, is a breakthrough in natural language processing, significantly outperforming previous state-of-the-art models in numerous tasks. However, there have been few reports on its application to implicit discourse relation classification, and it is not clear how BERT is best adapted to the task. In this paper, we test three methods of adaptation. (1) We perform additional pre-training on text tailored to discourse classification. (2) In expectation of knowledge transfer from explicit discourse relations to implicit discourse relations, we add a task named explicit connective prediction at the additional pre-training step. (3) To exploit implicit connectives given by treebank annotators, we add a task named implicit connective prediction at the fine-tuning step. We demonstrate that these three techniques can be combined straightforwardly in a single training pipeline. Through comprehensive experiments, we found that the first two techniques provided additional gains while the last one did not.
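
Below is a minimal sketch, not the authors' released code, of the multi-task idea outlined in the abstract: a BERT encoder reads an argument pair, and two classification heads are trained jointly, one predicting the discourse relation and one predicting a connective. It assumes the Hugging Face transformers library and PyTorch; the model name, label inventory sizes, and example inputs are illustrative placeholders, not the paper's actual configuration.

    # Sketch of joint relation + connective prediction over a BERT encoder.
    # All hyperparameters and inputs below are illustrative assumptions.
    import torch
    import torch.nn as nn
    from transformers import BertModel, BertTokenizer

    class DiscourseBert(nn.Module):
        def __init__(self, num_relations=4, num_connectives=100):
            super().__init__()
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            hidden = self.bert.config.hidden_size
            self.relation_head = nn.Linear(hidden, num_relations)
            self.connective_head = nn.Linear(hidden, num_connectives)

        def forward(self, input_ids, attention_mask, token_type_ids):
            out = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask,
                            token_type_ids=token_type_ids)
            pooled = out.pooler_output  # [CLS] representation of the pair
            return self.relation_head(pooled), self.connective_head(pooled)

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = DiscourseBert()

    # Arg1/Arg2 encoded as a BERT sentence pair; labels are placeholders.
    enc = tokenizer("He was tired.", "He kept working.", return_tensors="pt")
    rel_logits, conn_logits = model(**enc)

    # Joint loss: relation classification plus connective prediction.
    loss = nn.functional.cross_entropy(rel_logits, torch.tensor([0])) \
         + nn.functional.cross_entropy(conn_logits, torch.tensor([5]))
    loss.backward()

In the spirit of the paper's second and third techniques, the connective head would be trained on explicit connectives (removed from the text) during additional pre-training, and on annotator-supplied implicit connectives during fine-tuning, while the relation head handles the target classification task.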
Anthology ID:
2020.lrec-1.145
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
1152–1158
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.145
Cite (ACL):
Yudai Kishimoto, Yugo Murawaki, and Sadao Kurohashi. 2020. Adapting BERT to Implicit Discourse Relation Classification with a Focus on Discourse Connectives. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 1152–1158, Marseille, France. European Language Resources Association.
Cite (Informal):
Adapting BERT to Implicit Discourse Relation Classification with a Focus on Discourse Connectives (Kishimoto et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.145.pdf