Discourse Parsing Enhanced by Discourse Dependence Perception

Yuqing Xing, Longyin Zhang, Fang Kong, Guodong Zhou


Abstract
In recent years, top-down neural models have achieved notable success in text-level discourse parsing. Nevertheless, they still suffer from error propagation, which is especially harmful when predictions at the upper-level tree nodes are poor. In this work, we aim to alleviate this problem by learning directly from the correlations between EDUs, thereby shortening the hierarchical distances in the RST structure. Specifically, we contribute a joint top-down framework that learns from both discourse dependency and constituency parsing through one shared encoder and two independent decoders. Moreover, we explore a constituency-to-dependency conversion scheme tailored to the Chinese discourse corpus to ensure the quality of the joint learning process. Experimental results on CDTB show that the dependency information we use substantially improves rhetorical structure prediction, especially at the upper tree levels.
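To make the shared-encoder, two-decoder design concrete, here is a minimal PyTorch-style sketch, assuming illustrative module choices (a BiLSTM over EDU vectors, bilinear pair scoring for the dependency view, a pointer-style split scorer for the constituency view); none of the names, dimensions, or scoring heads below come from the paper itself.

    # Hypothetical sketch of a shared encoder feeding two independent
    # decoders, one per task; illustrative only, not the paper's model.
    import torch
    import torch.nn as nn

    class JointDiscourseParser(nn.Module):
        def __init__(self, hidden_size=256, num_relations=17):
            super().__init__()
            # One shared encoder yields EDU representations for both tasks.
            self.encoder = nn.LSTM(hidden_size, hidden_size,
                                   batch_first=True, bidirectional=True)
            # Dependency decoder: scores every (head, dependent) EDU pair.
            self.dep_scorer = nn.Bilinear(2 * hidden_size, 2 * hidden_size, 1)
            # Constituency decoder: scores candidate split points for
            # top-down tree building, plus a relation classifier.
            self.split_scorer = nn.Linear(2 * hidden_size, 1)
            self.rel_classifier = nn.Linear(2 * hidden_size, num_relations)

        def forward(self, edu_reprs):
            # edu_reprs: (batch, num_edus, hidden_size) pre-encoded EDU vectors
            h, _ = self.encoder(edu_reprs)                # (batch, n, 2*hidden)
            b, n, d = h.shape
            heads = h.unsqueeze(2).expand(b, n, n, d).reshape(-1, d)
            deps = h.unsqueeze(1).expand(b, n, n, d).reshape(-1, d)
            dep_scores = self.dep_scorer(heads, deps).view(b, n, n)
            split_scores = self.split_scorer(h).squeeze(-1)  # (batch, n)
            rel_logits = self.rel_classifier(h)              # (batch, n, R)
            return dep_scores, split_scores, rel_logits

    parser = JointDiscourseParser()
    edus = torch.randn(2, 8, 256)   # 2 documents, 8 EDUs each
    dep_scores, split_scores, rel_logits = parser(edus)

The paper's CDTB-specific constituency-to-dependency conversion is not reproduced here; the sketch below shows only the common nuclearity-based recipe from prior RST-to-dependency work (promote the nucleus child's head EDU as the subtree head and attach the satellite's head to it), purely for orientation.

    # Generic nuclearity-based tree-to-dependency conversion; the paper's
    # tailored scheme for the Chinese corpus may handle heads and
    # multi-nuclear relations differently.
    from dataclasses import dataclass
    from typing import Dict, Optional, Tuple

    @dataclass
    class Node:
        edu_id: Optional[int] = None      # set only on leaves
        left: Optional["Node"] = None
        right: Optional["Node"] = None
        nuclearity: str = ""              # "NS", "SN", or "NN" on internal nodes

    def to_dependencies(node: Node,
                        deps: Optional[Dict[int, int]] = None
                        ) -> Tuple[int, Dict[int, int]]:
        """Return (head EDU of this subtree, {dependent: head} arcs)."""
        if deps is None:
            deps = {}
        if node.edu_id is not None:       # leaf: it is its own head
            return node.edu_id, deps
        left_head, _ = to_dependencies(node.left, deps)
        right_head, _ = to_dependencies(node.right, deps)
        if node.nuclearity == "SN":       # right child is the nucleus
            deps[left_head] = right_head
            return right_head, deps
        # "NS", and "NN" under a leftmost-nucleus convention
        deps[right_head] = left_head
        return left_head, deps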
Anthology ID:
2022.aacl-main.28
Volume:
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
November
Year:
2022
Address:
Online only
Venues:
AACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
354–363
URL:
https://aclanthology.org/2022.aacl-main.28
Cite (ACL):
Yuqing Xing, Longyin Zhang, Fang Kong, and Guodong Zhou. 2022. Discourse Parsing Enhanced by Discourse Dependence Perception. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 354–363, Online only. Association for Computational Linguistics.
Cite (Informal):
Discourse Parsing Enhanced by Discourse Dependence Perception (Xing et al., AACL-IJCNLP 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.aacl-main.28.pdf