A simple but effective model for attachment in discourse parsing with multi-task learning for relation labeling
Zineb Bennis | Julie Hunter | Nicholas Asher
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics

In this paper, we present a discourse parsing model for conversation trained on the STAC corpus. We fine-tune a BERT-based model to encode pairs of discourse units and use a simple linear layer to predict discourse attachments. We then exploit a multi-task setting to predict relation labels. The multi-task approach effectively aids in the difficult task of relation type prediction; our F1 score of 57 surpasses the state of the art with no loss in performance for attachment, confirming the intuitive interdependence of these two tasks. Our method also improves over previous discourse parsing models by allowing longer input sizes and by permitting attachments in which one node has multiple parents, an important feature of multiparty conversation.
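The architecture described above can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: it assumes a pooled DU-pair embedding (in the paper, produced by a fine-tuned BERT encoder) and shows how a shared representation can feed both a binary attachment head and a relation-labeling head, trained jointly with a summed cross-entropy loss. The hidden size (768) and the number of relation labels (16) are assumptions for illustration.

```python
import torch
import torch.nn as nn

class PairDiscourseParser(nn.Module):
    """Hypothetical sketch of the multi-task setup: one shared pair
    representation feeds two linear heads, for attachment and relation."""
    def __init__(self, hidden_size=768, n_relations=16):
        super().__init__()
        # In the paper, a fine-tuned BERT encodes each discourse-unit pair;
        # here we take a pooled pair embedding of size `hidden_size` as given.
        self.attach_head = nn.Linear(hidden_size, 2)            # attached / not attached
        self.relation_head = nn.Linear(hidden_size, n_relations)  # relation label

    def forward(self, pair_embedding):
        return self.attach_head(pair_embedding), self.relation_head(pair_embedding)

model = PairDiscourseParser()
emb = torch.randn(4, 768)  # a batch of 4 hypothetical DU-pair embeddings
attach_logits, rel_logits = model(emb)

# Multi-task training signal: sum the two cross-entropy losses so that
# gradients from both tasks flow into the shared representation.
attach_gold = torch.tensor([1, 0, 1, 0])
rel_gold = torch.tensor([3, 0, 7, 0])
loss = (nn.functional.cross_entropy(attach_logits, attach_gold)
        + nn.functional.cross_entropy(rel_logits, rel_gold))
```

Because scores are produced per candidate pair rather than via a tree-constrained decoder, a node can receive positive attachment scores from several parents, which is consistent with the multi-parent attachments the abstract highlights.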