Abstract
Both syntactic and semantic structures are key linguistic contextual clues, and it has been well established that semantic parsing benefits from syntactic parsing. However, few works have attempted to let semantic parsing help syntactic parsing. As linguistic representation formalisms, both syntax and semantics may be represented in either span (constituent/phrase) or dependency form, yet joint learning over both formalisms has also seldom been explored. In this paper, we propose a novel joint model of syntactic and semantic parsing on both span and dependency representations, which incorporates syntactic information effectively into the neural network encoder and benefits from the two representation formalisms in a uniform way. Experiments show that semantics and syntax can benefit each other by optimizing joint objectives. Our single model achieves new state-of-the-art or competitive results on both span and dependency semantic parsing on the PropBank benchmarks and on both dependency and constituent syntactic parsing on the Penn Treebank.
- Anthology ID:
- 2020.findings-emnlp.398
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2020
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Editors:
- Trevor Cohn, Yulan He, Yang Liu
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 4438–4449
- URL:
- https://aclanthology.org/2020.findings-emnlp.398
- DOI:
- 10.18653/v1/2020.findings-emnlp.398
- Cite (ACL):
- Junru Zhou, Zuchao Li, and Hai Zhao. 2020. Parsing All: Syntax and Semantics, Dependencies and Spans. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4438–4449, Online. Association for Computational Linguistics.
- Cite (Informal):
- Parsing All: Syntax and Semantics, Dependencies and Spans (Zhou et al., Findings 2020)
- PDF:
- https://preview.aclanthology.org/ingest-2024-clasp/2020.findings-emnlp.398.pdf
- Code:
- DoodleJZ/ParsingAll
- Data:
- Penn Treebank