Discourse Information for Document-Level Temporal Dependency Parsing
Jingcheng Niu, Victoria Ng, Erin Rees, Simon De Montigny, Gerald Penn
Abstract
In this study, we examine the benefits of incorporating discourse information into document-level temporal dependency parsing. Specifically, we evaluate the effectiveness of integrating both high-level discourse profiling information, which describes the discourse function of sentences, and surface-level sentence position information into temporal dependency graph (TDG) parsing. Unexpectedly, our results suggest that simple sentence position information, particularly when encoded using our novel sentence-position embedding method, performs the best, perhaps because it does not rely on noisy model-generated feature inputs. Our proposed system surpasses the current state-of-the-art TDG parsing systems in performance. Furthermore, we aim to broaden the discussion on the relationship between temporal dependency parsing and discourse analysis, given the substantial similarities shared between the two tasks. We argue that discourse analysis results should not be merely regarded as an additional input feature for temporal dependency parsing. Instead, adopting advanced discourse analysis techniques and research insights can lead to more effective and comprehensive approaches to temporal information extraction tasks.
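The abstract refers to a sentence-position embedding that encodes surface-level sentence position as an input to the TDG parser. As a rough illustrative sketch only (the paper's exact architecture, dimensions, and scoring function are not specified here, and all module and parameter names below are hypothetical), the snippet adds a learned per-sentence-index embedding to contextual token representations before any pairwise parent-child scoring.

```python
import torch
import torch.nn as nn

class SentencePositionEncoder(nn.Module):
    """Illustrative sketch: augment contextual token representations with a
    learned sentence-position embedding before pairwise scoring in a TDG parser.

    This is not the paper's implementation; it only demonstrates the general
    idea of encoding each token's sentence index within the document.
    """

    def __init__(self, hidden_dim: int = 768, max_sentences: int = 128):
        super().__init__()
        # One embedding per sentence index; indices beyond max_sentences are
        # clipped to the last bucket (an assumption made for this sketch).
        self.sent_pos_emb = nn.Embedding(max_sentences, hidden_dim)
        self.max_sentences = max_sentences

    def forward(self, token_reprs: torch.Tensor, sent_ids: torch.Tensor) -> torch.Tensor:
        # token_reprs: (batch, seq_len, hidden_dim) encoder outputs (e.g. from BERT)
        # sent_ids:    (batch, seq_len) sentence index of each token in the document
        sent_ids = sent_ids.clamp(max=self.max_sentences - 1)
        return token_reprs + self.sent_pos_emb(sent_ids)


if __name__ == "__main__":
    encoder = SentencePositionEncoder(hidden_dim=768)
    reprs = torch.randn(2, 50, 768)           # dummy contextual embeddings
    sent_ids = torch.randint(0, 10, (2, 50))  # dummy sentence indices
    print(encoder(reprs, sent_ids).shape)     # torch.Size([2, 50, 768])
```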
- Anthology ID:
- 2023.codi-1.10
- Volume:
- Proceedings of the 4th Workshop on Computational Approaches to Discourse (CODI 2023)
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Editors:
- Michael Strube, Chloe Braud, Christian Hardmeier, Junyi Jessy Li, Sharid Loaiciga, Amir Zeldes
- Venue:
- CODI
- Publisher:
- Association for Computational Linguistics
- Pages:
- 82–88
- URL:
- https://aclanthology.org/2023.codi-1.10
- DOI:
- 10.18653/v1/2023.codi-1.10
- Cite (ACL):
- Jingcheng Niu, Victoria Ng, Erin Rees, Simon De Montigny, and Gerald Penn. 2023. Discourse Information for Document-Level Temporal Dependency Parsing. In Proceedings of the 4th Workshop on Computational Approaches to Discourse (CODI 2023), pages 82–88, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Discourse Information for Document-Level Temporal Dependency Parsing (Niu et al., CODI 2023)
- PDF:
- https://preview.aclanthology.org/naacl-24-ws-corrections/2023.codi-1.10.pdf