Abstract
Sequence tagging models for constituent parsing are faster, but less accurate than other types of parsers. In this work, we address the following weaknesses of such constituent parsers: (a) high error rates around closing brackets of long constituents, (b) large label sets, leading to sparsity, and (c) error propagation arising from greedy decoding. To effectively close brackets, we train a model that learns to switch between tagging schemes. To reduce sparsity, we decompose the label set and use multi-task learning to jointly learn to predict sublabels. Finally, we mitigate issues from greedy decoding through auxiliary losses and sentence-level fine-tuning with policy gradient. Combining these techniques, we clearly surpass the performance of sequence tagging constituent parsers on the English and Chinese Penn Treebanks, and reduce their parsing time even further. On the SPMRL datasets, we observe even greater improvements across the board, including a new state of the art on Basque, Hebrew, Polish and Swedish.
- Anthology ID:
- N19-1341
- Volume:
- Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
- Month:
- June
- Year:
- 2019
- Address:
- Minneapolis, Minnesota
- Editors:
- Jill Burstein, Christy Doran, Thamar Solorio
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 3372–3383
- URL:
- https://aclanthology.org/N19-1341
- DOI:
- 10.18653/v1/N19-1341
- Cite (ACL):
- David Vilares, Mostafa Abdou, and Anders Søgaard. 2019. Better, Faster, Stronger Sequence Tagging Constituent Parsers. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 3372–3383, Minneapolis, Minnesota. Association for Computational Linguistics.
- Cite (Informal):
- Better, Faster, Stronger Sequence Tagging Constituent Parsers (Vilares et al., NAACL 2019)
- PDF:
- https://preview.aclanthology.org/autopr/N19-1341.pdf
- Code
- aghie/tree2labels + additional community code
- Data
- Penn Treebank
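As a minimal sketch of the label-set decomposition the abstract describes: in sequence-tagging constituent parsing, each word carries a combined label (e.g. a tree level plus a nonterminal), and splitting it into sublabels that are predicted jointly via multi-task learning shrinks each output vocabulary. The `@` separator and the helper names below are illustrative assumptions, not the paper's actual label format.

```python
# Hypothetical sketch of sublabel decomposition (the "@" format and
# function names are illustrative, not the paper's actual encoding).

def decompose_label(label):
    """Split a combined tag like '2@NP' into (level, nonterminal)."""
    level, nonterminal = label.split("@")
    return int(level), nonterminal

def build_sublabel_sets(labels):
    """Collect the two much smaller sublabel vocabularies that a
    multi-task model would predict with separate output heads."""
    levels, nonterminals = set(), set()
    for lab in labels:
        lv, nt = decompose_label(lab)
        levels.add(lv)
        nonterminals.add(nt)
    return levels, nonterminals

labels = ["1@S", "2@NP", "2@NP", "1@VP", "3@NP"]
levels, nonterminals = build_sublabel_sets(labels)
# 4 distinct combined labels reduce to 3 levels and 3 nonterminals;
# on a full treebank the combined label set grows multiplicatively,
# so predicting sublabels separately mitigates sparsity.
```

In a neural tagger, each sublabel set would get its own softmax head over a shared encoder, with the per-head losses summed during training.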