Improved Latent Tree Induction with Distant Supervision via Span Constraints
Zhiyang Xu | Andrew Drozdov | Jay Yoon Lee | Tim O’Gorman | Subendhu Rongali | Dylan Finkbeiner | Shilpa Suresh | Mohit Iyyer | Andrew McCallum
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
For over thirty years, researchers have developed and analyzed methods for latent tree induction as an approach for unsupervised syntactic parsing. Nonetheless, modern systems still do not perform well enough compared to their supervised counterparts to have any practical use as structural annotation of text. In this work, we present a technique that uses distant supervision in the form of span constraints (i.e., phrase bracketing) to improve performance in unsupervised constituency parsing. Using a relatively small number of span constraints, we can substantially improve the output from DIORA, an already competitive unsupervised parsing system. Compared with full parse tree annotation, span constraints can be acquired with minimal effort, such as with a lexicon derived from Wikipedia, to find exact text matches. Our experiments show that span constraints based on entities improve constituency parsing on the English WSJ Penn Treebank by more than 5 F1. Furthermore, our method extends to any domain where span constraints are easily attainable, and as a case study we demonstrate its effectiveness by parsing biomedical text from the CRAFT dataset.
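The abstract mentions acquiring span constraints via exact text matches against a lexicon (e.g., entity names derived from Wikipedia). A minimal sketch of that matching step might look like the following; the lexicon entries, example sentence, and function name are illustrative assumptions, and the paper's actual pipeline (training DIORA with these constraints) is not reproduced here.

```python
# Hedged sketch: derive span constraints by exact lexicon matching.
# Lexicon contents and the sentence below are made up for illustration.

def find_span_constraints(tokens, lexicon):
    """Return (start, end) token spans (end exclusive) whose surface
    form exactly matches a multi-token lexicon entry."""
    spans = []
    n = len(tokens)
    for start in range(n):
        # Only multi-token spans are useful as bracketing constraints.
        for end in range(start + 2, n + 1):
            if " ".join(tokens[start:end]) in lexicon:
                spans.append((start, end))
    return spans

lexicon = {"Penn Treebank", "New York"}  # e.g., entity names from Wikipedia
tokens = "Parsers are evaluated on the Penn Treebank corpus".split()
print(find_span_constraints(tokens, lexicon))  # [(5, 7)]
```

Each returned span asserts that the matched tokens should form a constituent in the induced tree, which is far cheaper to obtain than a full parse tree annotation.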