Modeling discourse cohesion for discourse parsing via memory network

Yanyan Jia, Yuan Ye, Yansong Feng, Yuxuan Lai, Rui Yan, Dongyan Zhao


Abstract
Identifying long-span dependencies between discourse units is crucial to improving discourse parsing performance. Most existing approaches design sophisticated features or exploit various off-the-shelf tools, but achieve little success. In this paper, we propose a new transition-based discourse parser that makes use of memory networks to take discourse cohesion into account. The automatically captured discourse cohesion benefits discourse parsing, especially in long-span scenarios. Experiments on the RST Discourse Treebank show that our method outperforms traditional feature-based methods, and that the memory-based discourse cohesion significantly improves overall parsing performance.
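The paper itself gives the parser and memory-network details; purely as an illustration of the idea sketched in the abstract, the following is a minimal, hypothetical Python sketch (not the authors' implementation) of a shift-reduce loop over elementary discourse units (EDUs) in which an attention-style read over previously built spans supplies a cohesion-aware context vector for each transition decision. All names (`edu_vectors`, `score_action`, `memory_read`) are assumptions introduced here for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def memory_read(query, memory):
    """Attention-style read over stored span representations (memory-network idea)."""
    if not memory:
        return np.zeros_like(query)
    keys = np.stack(memory)           # (m, d) representations of already-built spans
    weights = softmax(keys @ query)   # relevance of each memory slot to the query
    return weights @ keys             # weighted sum = cohesion-aware context vector

def parse(edu_vectors, score_action):
    """Generic transition-based (shift-reduce) loop over EDU vectors.

    score_action(stack_top2, buffer_front, memory_vec) -> 'shift' or 'reduce'
    stands in for a learned classifier supplied by the caller.
    """
    stack, buffer, memory, tree = [], list(edu_vectors), [], []
    while buffer or len(stack) > 1:
        query = stack[-1] if stack else buffer[0]
        context = memory_read(query, memory)  # discourse-cohesion signal for long spans
        if len(stack) < 2 or (buffer and score_action(stack[-2:], buffer[0], context) == 'shift'):
            stack.append(buffer.pop(0))       # shift: push the next EDU onto the stack
        else:
            right, left = stack.pop(), stack.pop()
            span = (left + right) / 2         # toy composition of the merged span
            stack.append(span)
            memory.append(span)               # store the new span for future attention reads
            tree.append(('reduce', span))
    return tree
```

In this sketch the caller would supply `score_action` as a trained classifier over the stack, buffer, and memory context; any function returning 'shift' or 'reduce' makes the loop run end to end.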
Anthology ID: P18-2070
Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2018
Address: Melbourne, Australia
Editors: Iryna Gurevych, Yusuke Miyao
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 438–443
URL: https://aclanthology.org/P18-2070
DOI: 10.18653/v1/P18-2070
Cite (ACL): Yanyan Jia, Yuan Ye, Yansong Feng, Yuxuan Lai, Rui Yan, and Dongyan Zhao. 2018. Modeling discourse cohesion for discourse parsing via memory network. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 438–443, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal): Modeling discourse cohesion for discourse parsing via memory network (Jia et al., ACL 2018)
PDF: https://preview.aclanthology.org/landing_page/P18-2070.pdf
Presentation: P18-2070.Presentation.pdf
Video: https://preview.aclanthology.org/landing_page/P18-2070.mp4