A Neural Network Parser that Handles Sparse Data

James Henderson


Abstract
Previous work has demonstrated the viability of a particular neural network architecture, Simple Synchrony Networks, for syntactic parsing. Here we present additional results on the performance of this type of parser, including direct comparisons on the same dataset with a standard statistical parsing method, Probabilistic Context Free Grammars. We focus these experiments on demonstrating one of the main advantages of the SSN parser over the PCFG: handling sparse data. We use smaller datasets than are typically used with statistical methods, resulting in the PCFG finding parses for under half of the test sentences, while the SSN finds parses for all sentences. Even on the half that the PCFG does parse, the SSN performs better than the PCFG, as measured by recall and precision on both constituents and a dependency-like measure.
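The abstract evaluates both parsers by recall and precision over constituents. As a minimal sketch of how such a labelled-constituent comparison is typically scored (not code from the paper; the representation and function name are assumptions for illustration):

```python
from collections import Counter


def constituent_prf(gold, predicted):
    """Precision, recall, and F1 over labelled constituents.

    Each constituent is a (label, start, end) tuple; matches are counted
    multiset-style so repeated spans are only credited as often as they
    occur in the gold parse.
    """
    gold_counts = Counter(gold)
    pred_counts = Counter(predicted)

    # Predicted constituents that also appear in the gold bracketing.
    matched = sum(min(pred_counts[c], gold_counts[c]) for c in pred_counts)

    precision = matched / len(predicted) if predicted else 0.0
    recall = matched / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


if __name__ == "__main__":
    # Toy gold and predicted bracketings for a five-word sentence.
    gold = [("S", 0, 5), ("NP", 0, 2), ("VP", 2, 5), ("NP", 3, 5)]
    pred = [("S", 0, 5), ("NP", 0, 2), ("VP", 2, 5), ("PP", 3, 5)]
    p, r, f = constituent_prf(gold, pred)
    print(f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```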
Anthology ID:
2000.iwpt-1.14
Volume:
Proceedings of the Sixth International Workshop on Parsing Technologies
Month:
February 23-25
Year:
2000
Address:
Trento, Italy
Venues:
IWPT | WS
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
123–134
URL:
https://aclanthology.org/2000.iwpt-1.14
Cite (ACL):
James Henderson. 2000. A Neural Network Parser that Handles Sparse Data. In Proceedings of the Sixth International Workshop on Parsing Technologies, pages 123–134, Trento, Italy. Association for Computational Linguistics.
Cite (Informal):
A Neural Network Parser that Handles Sparse Data (Henderson, IWPT 2000)
PDF:
https://preview.aclanthology.org/update-css-js/2000.iwpt-1.14.pdf