Abstract
This paper introduces rank-based training of structured prediction energy networks (SPENs). Our method samples output structures using gradient descent and minimizes the ranking violations of the sampled structures with respect to a scalar scoring function defined with domain knowledge. We have successfully trained a SPEN for citation field extraction without any labeled data, where the only source of supervision is a simple human-written scoring function. Such scoring functions are often easy to provide; the SPEN then furnishes an efficient structured prediction inference procedure.
- Anthology ID:
- N18-2021
- Volume:
- Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
- Month:
- June
- Year:
- 2018
- Address:
- New Orleans, Louisiana
- Editors:
- Marilyn Walker, Heng Ji, Amanda Stent
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 130–135
- URL:
- https://aclanthology.org/N18-2021
- DOI:
- 10.18653/v1/N18-2021
- Cite (ACL):
- Amirmohammad Rooshenas, Aishwarya Kamath, and Andrew McCallum. 2018. Training Structured Prediction Energy Networks with Indirect Supervision. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 130–135, New Orleans, Louisiana. Association for Computational Linguistics.
- Cite (Informal):
- Training Structured Prediction Energy Networks with Indirect Supervision (Rooshenas et al., NAACL 2018)
- PDF:
- https://preview.aclanthology.org/fix-dup-bibkey/N18-2021.pdf
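The training loop sketched in the abstract (sample candidate structures by gradient descent on the energy, then penalize ranking violations against a human-written scoring function) can be illustrated with a toy example. This is a minimal sketch, not the authors' code: the linear `energy`, the `reward` scoring function, and all other names here are hypothetical simplifications of the approach the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a "structure" is a relaxed label vector y in [0,1]^5,
# and the energy network is just linear in y (a stand-in for a deep SPEN).
W = rng.normal(size=5)

def energy(W, y):
    # Lower energy = better output, as in SPENs.
    return float(W @ y)

def reward(y):
    # Stand-in for the human-written scoring function: prefer mass
    # on the first two labels, penalize mass on the rest.
    return float(y[:2].sum() - y[2:].sum())

def sample_pair(W, steps=10, lr=0.5):
    # Gradient-descent sampling: start from noise, descend the energy,
    # and keep the intermediate iterates as candidate structures.
    y = rng.uniform(size=5)
    traj = [y.copy()]
    for _ in range(steps):
        y = np.clip(y - lr * W, 0.0, 1.0)  # dE/dy = W for the linear energy
        traj.append(y.copy())
    i, j = rng.choice(len(traj), size=2, replace=False)
    return traj[i], traj[j]

def ranking_update(W, y_a, y_b, margin=1.0, lr=0.1):
    # If `reward` prefers y_a, push energy(y_a) below energy(y_b) by a margin;
    # a violated ranking yields a hinge-loss gradient step on W.
    if reward(y_a) < reward(y_b):
        y_a, y_b = y_b, y_a
    if margin + energy(W, y_a) - energy(W, y_b) > 0:
        W = W - lr * (y_a - y_b)  # gradient of the hinge w.r.t. W
    return W

for _ in range(500):
    y_a, y_b = sample_pair(W)
    W = ranking_update(W, y_a, y_b)

# With enough updates the energy ordering typically comes to agree with
# `reward`; compare a high-reward and a low-reward corner of [0,1]^5:
y_good = np.array([1., 1., 0., 0., 0.])
y_bad = np.array([0., 0., 1., 1., 1.])
print(energy(W, y_good), energy(W, y_bad))
```

The appeal of this scheme, as the abstract notes, is that only pairwise comparisons under the scoring function supervise the energy network, so no labeled structures are needed; at test time, the same gradient-descent procedure over the trained energy serves as the inference step.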