Pre- and In-Parsing Models for Neural Empty Category Detection

Yufei Chen, Yuanyuan Zhao, Weiwei Sun, Xiaojun Wan

Abstract
Motivated by the positive impact of empty categories on syntactic parsing, we study neural models for pre- and in-parsing detection of empty categories, which has not previously been investigated. We find several non-obvious facts: (a) a BiLSTM can capture the non-local contextual information that is essential for detecting empty categories, (b) even with a BiLSTM, syntactic information still enhances the detection, and (c) automatic detection of empty categories improves parsing quality for overt words. Our neural ECD models outperform the prior state of the art by significant margins.
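
To make the pre-parsing setting concrete: empty category detection can be cast as sequence labeling over the overt words, where a tagger predicts, for each token, which empty element (if any) immediately precedes it. The sketch below is not the authors' implementation; it is a minimal, hypothetical BiLSTM tagger in PyTorch, with illustrative layer sizes and an assumed label set.

```python
# Minimal sketch of pre-parsing empty category detection as sequence
# labeling (illustrative only, not the paper's implementation).
import torch
import torch.nn as nn

class BiLSTMECDTagger(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # A bidirectional LSTM gives each position a view of the whole
        # sentence, i.e. the non-local context that ECD relies on.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        states, _ = self.bilstm(self.embed(token_ids))
        # (batch, seq_len, num_labels): a score for each empty-category
        # label (or NONE) attaching before each overt token.
        return self.out(states)

# Toy usage with an assumed 3-way label set, e.g. NONE / *pro* / *PRO*.
model = BiLSTMECDTagger(vocab_size=10000, num_labels=3)
scores = model(torch.randint(0, 10000, (2, 8)))
predictions = scores.argmax(dim=-1)  # per-token empty-category decision
```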
Anthology ID: P18-1250
Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2018
Address: Melbourne, Australia
Editors: Iryna Gurevych, Yusuke Miyao
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 2687–2696
URL: https://aclanthology.org/P18-1250
DOI: 10.18653/v1/P18-1250
Cite (ACL):
Yufei Chen, Yuanyuan Zhao, Weiwei Sun, and Xiaojun Wan. 2018. Pre- and In-Parsing Models for Neural Empty Category Detection. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2687–2696, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Pre- and In-Parsing Models for Neural Empty Category Detection (Chen et al., ACL 2018)
PDF: https://preview.aclanthology.org/teach-a-man-to-fish/P18-1250.pdf
Software: P18-1250.Software.zip
Poster: P18-1250.Poster.pdf