2023
Unified Contextual Query Rewriting
Yingxue Zhou | Jie Hao | Mukund Rungta | Yang Liu | Eunah Cho | Xing Fan | Yanbin Lu | Vishal Vasudevan | Kellen Gillespie | Zeynab Raeesy
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track)
Query rewriting (QR) is an important technique in conversational AI systems for reducing user friction (i.e., recovering from ASR or system errors) and handling contextual carryover (i.e., ellipsis and co-reference). Recently, generation-based QR models have achieved promising results on these two tasks separately. Although the tasks share many similarities, such as both taking the previous dialogue together with the current request as model input, no unified model solves them jointly. To this end, we propose a unified contextual query rewriting model that performs QR for both friction reduction and contextual carryover. Moreover, we incorporate multiple auxiliary tasks, such as trigger prediction and NLU interpretation, to boost rewrite performance. We leverage a text-to-text unified framework that trains the tasks independently with a weighted loss to account for task importance. We then propose new unified multitask learning strategies, including a sequential model that outputs one sentence covering multiple tasks, and a hybrid model in which some tasks are independent and others are generated sequentially. Our experimental results demonstrate the effectiveness of the proposed unified learning methods.
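The weighted-loss idea from the abstract can be sketched minimally: each task contributes its own loss, and a per-task weight encodes its importance in the combined objective. The task names and weight values below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a weighted multitask objective (assumed form).
# Per-task losses would normally come from a text-to-text model's
# decoder; here they are stand-in floats.

def weighted_multitask_loss(task_losses, task_weights):
    """Combine per-task losses into one scalar, weighting each task
    by its assigned importance."""
    assert set(task_losses) == set(task_weights), "tasks must match"
    return sum(task_weights[t] * task_losses[t] for t in task_losses)

# Hypothetical setup: rewrite generation is the primary task; trigger
# prediction and NLU interpretation are auxiliary tasks with smaller weights.
losses = {"rewrite": 2.0, "trigger": 0.8, "nlu": 1.5}
weights = {"rewrite": 1.0, "trigger": 0.3, "nlu": 0.5}
total = weighted_multitask_loss(losses, weights)  # 2.0 + 0.24 + 0.75 = 2.99
```

In the paper's sequential and hybrid variants, some task outputs would instead be concatenated into a single generated sequence rather than trained as separate weighted objectives.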