Mason Bretan
2021
Data Augmentation for Voice-Assistant NLU using BERT-based Interchangeable Rephrase
Akhila Yerukola | Mason Bretan | Hongxia Jin
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
We introduce a data augmentation technique based on byte pair encoding and a BERT-like self-attention model to boost performance on spoken language understanding tasks. We compare and evaluate this method with a range of augmentation techniques encompassing generative models such as VAEs and performance-boosting techniques such as synonym replacement and back-translation. We show our method performs strongly on domain and intent classification tasks for a voice assistant and in a user study focused on utterance naturalness and semantic similarity.
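The abstract describes the rephrase-based augmentation only at a high level. As a loose illustration of the general idea of using a BERT-style masked language model to produce utterance variants (not the paper's actual interchangeable-rephrase method), the sketch below uses the Hugging Face `transformers` fill-mask pipeline; the model name, one-token-at-a-time masking strategy, and example utterance are assumptions chosen for illustration only.

```python
# Minimal sketch: generate utterance variants with a masked language model.
# This is a generic fill-mask augmentation baseline, NOT the paper's
# byte-pair-encoding-based interchangeable-rephrase method.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")  # assumed model

def rephrase(utterance: str, num_variants: int = 3) -> list[str]:
    """Mask one token at a time and let the model propose replacements."""
    tokens = utterance.split()
    variants = []
    for i in range(len(tokens)):
        masked = tokens.copy()
        masked[i] = fill_mask.tokenizer.mask_token
        for pred in fill_mask(" ".join(masked), top_k=num_variants):
            candidate = pred["sequence"]
            if candidate.lower() != utterance.lower():
                variants.append(candidate)
    return variants

if __name__ == "__main__":
    # Hypothetical voice-assistant style utterance.
    for v in rephrase("set an alarm for seven tomorrow morning")[:5]:
        print(v)
```

In a real augmentation pipeline, the generated variants would be filtered for label preservation (e.g., keeping only rephrases that a classifier still assigns to the original intent) before being added to the training data.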