Augmented Natural Language for Generative Sequence Labeling
Ben Athiwaratkun, Cicero Nogueira dos Santos, Jason Krone, Bing Xiang
Abstract
We propose a generative framework for joint sequence labeling and sentence-level classification. Our model performs multiple sequence labeling tasks at once using a single, shared natural language output space. Unlike prior discriminative methods, our model naturally incorporates label semantics and shares knowledge across tasks. Our framework is general purpose, performing well on few-shot, low-resource, and high-resource tasks. We demonstrate these advantages on popular named entity recognition, slot labeling, and intent classification benchmarks. We set a new state-of-the-art for few-shot slot labeling, improving substantially upon the previous 5-shot (75.0% to 90.9%) and 1-shot (70.4% to 81.0%) state-of-the-art results. Furthermore, our model generates large improvements (46.27% to 63.83%) in low-resource slot labeling over a BERT baseline by incorporating label semantics. We also maintain competitive results on high-resource tasks, performing within two points of the state-of-the-art on all tasks and setting a new state-of-the-art on the SNIPS dataset.
- Anthology ID:
- 2020.emnlp-main.27
- Volume:
- Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 375–385
- URL:
- https://aclanthology.org/2020.emnlp-main.27
- DOI:
- 10.18653/v1/2020.emnlp-main.27
- Cite (ACL):
- Ben Athiwaratkun, Cicero Nogueira dos Santos, Jason Krone, and Bing Xiang. 2020. Augmented Natural Language for Generative Sequence Labeling. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 375–385, Online. Association for Computational Linguistics.
- Cite (Informal):
- Augmented Natural Language for Generative Sequence Labeling (Athiwaratkun et al., EMNLP 2020)
- PDF:
- https://preview.aclanthology.org/ingestion-script-update/2020.emnlp-main.27.pdf
- Data:
- ATIS
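
The core idea described in the abstract, performing sequence labeling by generating the input sentence back with its labels written inline as natural language, can be illustrated with a small sketch. The bracket-and-pipe target format and the helper function below are assumptions made only for illustration; the paper's actual augmented natural language format may differ.

```python
# Hypothetical sketch: convert a BIO-tagged token sequence into an
# "augmented natural language" target string, so a generative (seq2seq)
# model can emit words and their slot labels as ordinary text.
# The "( span | label )" format here is an assumption, not the paper's
# confirmed output format.

from typing import List


def to_augmented_nl(tokens: List[str], bio_tags: List[str]) -> str:
    """Render tokens with BIO slot tags as an inline-labeled sentence."""
    out: List[str] = []
    span: List[str] = []
    label = None

    def flush() -> None:
        # Close the currently open labeled span, if any.
        nonlocal span, label
        if span:
            out.append(f"( {' '.join(span)} | {label} )")
            span, label = [], None

    for tok, tag in zip(tokens, bio_tags):
        if tag.startswith("B-"):
            flush()
            span, label = [tok], tag[2:]
        elif tag.startswith("I-") and span:
            span.append(tok)
        else:  # "O" token outside any slot
            flush()
            out.append(tok)
    flush()
    return " ".join(out)


if __name__ == "__main__":
    tokens = ["play", "songs", "by", "taylor", "swift"]
    tags = ["O", "O", "O", "B-artist", "I-artist"]
    print(to_augmented_nl(tokens, tags))
    # -> play songs by ( taylor swift | artist )
```

Because the target is plain text, label names such as "artist" contribute their own semantics to the output space, which is one way the abstract's claim about incorporating label semantics can be read.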