Intent Features for Rich Natural Language Understanding
Brian Lester, Sagnik Ray Choudhury, Rashmi Prasad, Srinivas Bangalore
Abstract
Complex natural language understanding modules in dialog systems yield a richer understanding of user utterances and are thus critical to providing a better user experience. However, these models are often built from scratch for specific clients and use cases, and they require the annotation of large datasets. This motivates sharing annotated data across multiple clients. To facilitate this, we introduce the idea of intent features: domain- and topic-agnostic properties of intents that can be learned from syntactic cues alone, and hence can be shared. We introduce a new neural network architecture, the Global-Local model, that shows significant improvement over strong baselines at identifying these features in a deployed, multi-intent natural language understanding module, and more generally in a classification setting where part of an utterance must be classified using the whole context.
- Anthology ID:
- 2021.naacl-industry.27
- Volume:
- Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers
- Month:
- June
- Year:
- 2021
- Address:
- Online
- Editors:
- Young-bum Kim, Yunyao Li, Owen Rambow
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 214–221
- URL:
- https://preview.aclanthology.org/ingest_wac_2008/2021.naacl-industry.27/
- DOI:
- 10.18653/v1/2021.naacl-industry.27
- Cite (ACL):
- Brian Lester, Sagnik Ray Choudhury, Rashmi Prasad, and Srinivas Bangalore. 2021. Intent Features for Rich Natural Language Understanding. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers, pages 214–221, Online. Association for Computational Linguistics.
- Cite (Informal):
- Intent Features for Rich Natural Language Understanding (Lester et al., NAACL 2021)
- PDF:
- https://preview.aclanthology.org/ingest_wac_2008/2021.naacl-industry.27.pdf
- Code:
- blester125/global-local-model
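The authors' implementation is in the repository above. As a rough illustration only (not the paper's actual architecture), the core idea the abstract describes, classifying one segment of an utterance while conditioning on the whole utterance, can be sketched by concatenating a "local" span representation with a "global" utterance representation before a classifier head. All names, sizes, and the mean-pooling choice below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a random embedding table and a linear classifier head.
VOCAB, DIM, N_CLASSES = 100, 16, 3
embeddings = rng.normal(size=(VOCAB, DIM))
W = rng.normal(size=(2 * DIM, N_CLASSES))  # acts on [local; global] features

def global_local_features(token_ids, span):
    """Concatenate a mean-pooled span (local view) with the mean-pooled
    utterance (global view), so the span is classified in full context."""
    vecs = embeddings[token_ids]                # (seq_len, DIM)
    local = vecs[span[0]:span[1]].mean(axis=0)  # the segment to classify
    global_ = vecs.mean(axis=0)                 # the whole utterance
    return np.concatenate([local, global_])     # (2 * DIM,)

def classify(token_ids, span):
    """Return the predicted class index for the given span."""
    logits = global_local_features(token_ids, span) @ W
    return int(np.argmax(logits))

utterance = rng.integers(0, VOCAB, size=12)     # a 12-token utterance
label = classify(utterance, span=(3, 6))        # classify tokens 3..5 in context
```

In practice the local and global views would come from a trained encoder rather than pooled static embeddings; the sketch only shows how the two representations combine.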