A Systematic Study of Neural Discourse Models for Implicit Discourse Relation

Attapol Rutherford, Vera Demberg, Nianwen Xue


Abstract
Inferring implicit discourse relations in natural language text is the most difficult subtask in discourse parsing. Many neural network models have been proposed to tackle this problem. However, comparisons for this task have not been unified, making it hard to draw clear conclusions about the effectiveness of the various architectures. Here, we propose neural network models based on feedforward and long short-term memory (LSTM) architectures and systematically study the effects of varying their structures. To our surprise, the best-configured feedforward architecture outperforms the LSTM-based models in most cases, despite thorough tuning. Further, we compare our best feedforward system with competitive convolutional and recurrent networks and find that the feedforward model can in fact be more effective. For the first time for this task, we compile and publish the outputs of previous neural and non-neural systems to establish a standard for further comparison.
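To make the feedforward setup concrete, below is a minimal sketch of the kind of model the abstract describes: each discourse argument (Arg1, Arg2) is reduced to a single pooled embedding, the two are concatenated, and a small multilayer perceptron scores the relation classes. The pooling choice, layer sizes, and the 11-way label set are illustrative assumptions (11 corresponds to the PDTB level-2 senses commonly used for this task), not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class FeedforwardDiscourseClassifier(nn.Module):
    """Minimal sketch of a feedforward implicit discourse relation
    classifier. Dimensions and pooling are illustrative assumptions,
    not the configuration reported in the paper."""

    def __init__(self, emb_dim=300, hidden_dim=300, num_classes=11):
        super().__init__()
        # Arg1 and Arg2 are each pooled to one vector, then concatenated.
        self.mlp = nn.Sequential(
            nn.Linear(2 * emb_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, arg1_embs, arg2_embs):
        # arg1_embs, arg2_embs: (batch, seq_len, emb_dim) word embeddings.
        # Mean-pool over tokens to get one fixed-size vector per argument.
        arg1 = arg1_embs.mean(dim=1)
        arg2 = arg2_embs.mean(dim=1)
        return self.mlp(torch.cat([arg1, arg2], dim=-1))

# Usage with random tensors standing in for real word embeddings.
model = FeedforwardDiscourseClassifier()
arg1 = torch.randn(8, 20, 300)  # batch of 8, 20 tokens in Arg1
arg2 = torch.randn(8, 25, 300)  # 25 tokens in Arg2
logits = model(arg1, arg2)      # (8, 11) relation scores
```

An LSTM-based variant would replace the mean-pooling step with the final hidden state of a recurrent encoder run over each argument; the paper's finding is that, with careful configuration, the simpler pooled feedforward model is competitive or better in most settings.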
Anthology ID:
E17-1027
Volume:
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Mirella Lapata, Phil Blunsom, Alexander Koller
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
281–291
URL:
https://aclanthology.org/E17-1027
Cite (ACL):
Attapol Rutherford, Vera Demberg, and Nianwen Xue. 2017. A Systematic Study of Neural Discourse Models for Implicit Discourse Relation. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers, pages 281–291, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
A Systematic Study of Neural Discourse Models for Implicit Discourse Relation (Rutherford et al., EACL 2017)
PDF:
https://preview.aclanthology.org/naacl24-info/E17-1027.pdf