Abstract
Exact structured inference with neural network scoring functions is computationally challenging, but several methods have been proposed for approximating inference. One approach is to perform gradient descent with respect to the output structure directly (Belanger and McCallum, 2016). Another approach, proposed recently, is to train a neural network (an “inference network”) to perform inference (Tu and Gimpel, 2018). In this paper, we compare these two families of inference methods on three sequence labeling datasets. We choose sequence labeling because it permits us to use exact inference as a benchmark in terms of speed, accuracy, and search error. Across datasets, we demonstrate that inference networks achieve a better speed/accuracy/search-error trade-off than gradient descent, while also being faster than exact inference at similar accuracy levels. We find further benefit by combining inference networks and gradient descent, using the former to provide a warm start for the latter.
- Anthology ID: N19-1335
- Volume: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
- Month: June
- Year: 2019
- Address: Minneapolis, Minnesota
- Editors: Jill Burstein, Christy Doran, Thamar Solorio
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 3313–3324
- URL: https://aclanthology.org/N19-1335
- DOI: 10.18653/v1/N19-1335
- Cite (ACL): Lifu Tu and Kevin Gimpel. 2019. Benchmarking Approximate Inference Methods for Neural Structured Prediction. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 3313–3324, Minneapolis, Minnesota. Association for Computational Linguistics.
- Cite (Informal): Benchmarking Approximate Inference Methods for Neural Structured Prediction (Tu & Gimpel, NAACL 2019)
- PDF: https://preview.aclanthology.org/fix-dup-bibkey/N19-1335.pdf
- Data: CoNLL 2003
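As a concrete illustration of the first inference family the abstract mentions (gradient descent directly with respect to the output structure), below is a minimal NumPy sketch for a linear-chain sequence labeling score. The relaxation (row-wise softmax over label logits), the scoring function, and the hyperparameters are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def gd_inference(unary, trans, steps=200, lr=0.5):
    """Gradient-ascent inference over a relaxed label sequence.

    unary: (T, L) per-position label scores (e.g. from a BiLSTM).
    trans: (L, L) label-transition scores (a linear-chain term).
    Maximizes  sum_t <y_t, unary_t> + sum_t y_t^T trans y_{t+1}
    over row-stochastic y, parameterized as y = softmax(z).
    """
    T, L = unary.shape
    z = np.zeros((T, L))                      # logits of the relaxed output
    for _ in range(steps):
        y = softmax(z)
        g = unary.copy()                      # d(score)/dy, position-wise
        g[:-1] += y[1:] @ trans.T             # y_t's effect on pairwise terms
        g[1:]  += y[:-1] @ trans              # y_{t+1}'s effect
        # backprop through the row-wise softmax: dz = y * (g - <g, y>)
        z += lr * y * (g - (g * y).sum(axis=-1, keepdims=True))
    return softmax(z).argmax(axis=-1)         # discretize at the end

# Toy example: 3 positions, 3 labels, no transition preference,
# so the relaxed optimum should match the per-position argmax.
unary = np.array([[2., 0., 0.],
                  [0., 3., 0.],
                  [0., 0., 1.]])
pred = gd_inference(unary, np.zeros((3, 3)))
print(pred)  # [0 1 2]
```

In this sketch the argmax at the end discretizes the relaxed solution; the paper's inference-network alternative instead amortizes this iterative optimization into a single forward pass of a trained network.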