Dialog History Construction with Long-Short Term Memory for Robust Generative Dialog State Tracking

Byung-Jun Lee, Kee-Eung Kim


Abstract
One of the crucial components of a dialog system is the dialog state tracker, which infers the user’s intention from preliminary speech processing. Since the overall performance of the dialog system is heavily affected by that of the dialog state tracker, it has been one of the core areas of research on dialog systems. In this paper, we present a dialog state tracker that combines a generative probabilistic model of dialog state tracking with a recurrent neural network that encodes important aspects of the dialog history. We describe a two-step gradient descent algorithm that optimizes the tracker with a complex loss function. We demonstrate that this approach yields a dialog state tracker that performs competitively with the top-performing trackers that participated in the first and second Dialog State Tracking Challenges.
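The abstract describes encoding the dialog history with an LSTM into a fixed-length vector that the generative tracker can condition on. A minimal pure-Python sketch of the standard LSTM recurrence over per-turn feature vectors (illustrative only; the paper's feature representation, layer sizes, initialization, and the generative tracker itself are not reproduced here, and all names below are hypothetical):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, v):
    # multiply matrix (list of rows) by vector
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def lstm_encode(turns, Wx, Wh, b, H):
    """Run a standard LSTM cell over a sequence of dialog-turn feature
    vectors and return the final hidden state as a fixed-length summary
    of the dialog history. Gate order in the stacked weights: input,
    forget, output, candidate."""
    h = [0.0] * H  # hidden state
    c = [0.0] * H  # cell state
    for x in turns:
        # all four gate pre-activations in one stacked vector of size 4*H
        z = [a + p + q for a, p, q in zip(matvec(Wx, x), matvec(Wh, h), b)]
        i = [sigmoid(v) for v in z[:H]]          # input gate
        f = [sigmoid(v) for v in z[H:2 * H]]     # forget gate
        o = [sigmoid(v) for v in z[2 * H:3 * H]] # output gate
        g = [math.tanh(v) for v in z[3 * H:]]    # candidate cell update
        c = [fj * cj + ij * gj for fj, cj, ij, gj in zip(f, c, i, g)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
    return h

# toy usage: 5 turns of 4-dim features summarized into a 3-dim history vector
random.seed(0)
D, H, T = 4, 3, 5
rand = lambda n, m: [[random.uniform(-0.1, 0.1) for _ in range(m)] for _ in range(n)]
Wx, Wh = rand(4 * H, D), rand(4 * H, H)
b = [0.0] * (4 * H)
turns = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(T)]
history = lstm_encode(turns, Wx, Wh, b, H)
```

In the paper's setting, such a history vector would replace hand-crafted summaries of past turns as the conditioning input to the generative state tracker, and its weights would be trained end-to-end with the tracker's loss.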
Anthology ID:
2016.dnd-7.3
Volume:
Dialogue & Discourse, Volume 7
Year:
2016
Editors:
Massimo Poesio, Barbara Di Eugenio, David Schlangen, Jason D. Williams, Antoine Raux, Matthew Henderson, Jonathan Ginzburg
Venue:
DND
SIG:
SIGDIAL
Pages:
47–64
URL:
https://preview.aclanthology.org/ingest-dnd/2016.dnd-7.3/
DOI:
10.5087/dad.2016.302
Bibkey:
Cite (ACL):
Byung-Jun Lee and Kee-Eung Kim. 2016. Dialog History Construction with Long-Short Term Memory for Robust Generative Dialog State Tracking. Dialogue & Discourse, 7:47–64.
Cite (Informal):
Dialog History Construction with Long-Short Term Memory for Robust Generative Dialog State Tracking (Lee & Kim, DND 2016)
PDF:
https://preview.aclanthology.org/ingest-dnd/2016.dnd-7.3.pdf