Dialogue Act Classification with Context-Aware Self-Attention

Vipul Raheja, Joel Tetreault


Abstract
Recent work in Dialogue Act classification has treated the task as a sequence labeling problem using hierarchical deep neural networks. We build on this prior work by leveraging the effectiveness of a context-aware self-attention mechanism coupled with a hierarchical recurrent neural network. We conduct extensive evaluations on standard Dialogue Act classification datasets and show significant improvement over state-of-the-art results on the Switchboard Dialogue Act (SwDA) Corpus. We also investigate the impact of different utterance-level representation learning methods and show that our method is effective at capturing utterance-level semantic text representations while maintaining high accuracy.
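The core idea of conditioning utterance-level attention on surrounding dialogue context can be sketched as follows. This is a minimal additive-attention illustration in NumPy, not the authors' exact formulation; the function name, weight matrices `Wh`, `Wc`, and vector `v` are hypothetical, and the context vector `c` stands in for a summary of preceding utterances produced by the hierarchical RNN.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over attention scores
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def context_aware_self_attention(H, c, Wh, Wc, v):
    """Attend over word-level hidden states H (T x d), with scores
    conditioned on a dialogue-context vector c (d_c,).
    Returns a fixed-size utterance representation (d,).
    Hypothetical additive-attention sketch, not the paper's exact model."""
    # score_t = v . tanh(Wh h_t + Wc c): context shifts the attention scores
    scores = np.tanh(H @ Wh.T + (Wc @ c)) @ v   # (T,)
    alpha = softmax(scores)                     # attention weights sum to 1
    return alpha @ H                            # context-weighted sum (d,)

# Toy dimensions: 5 tokens, hidden size 8, context size 8, attention size 6
rng = np.random.default_rng(0)
T, d, d_c, d_a = 5, 8, 8, 6
H = rng.normal(size=(T, d))
c = rng.normal(size=(d_c,))
Wh = rng.normal(size=(d_a, d))
Wc = rng.normal(size=(d_a, d_c))
v = rng.normal(size=(d_a,))

u = context_aware_self_attention(H, c, Wh, Wc, v)
print(u.shape)  # (8,)
```

The utterance vector `u` would then feed into a conversation-level recurrent layer for sequence labeling of dialogue acts; changing `c` changes the attention weights, which is what makes the representation context-aware.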
Anthology ID:
N19-1373
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3727–3733
URL:
https://aclanthology.org/N19-1373
DOI:
10.18653/v1/N19-1373
Cite (ACL):
Vipul Raheja and Joel Tetreault. 2019. Dialogue Act Classification with Context-Aware Self-Attention. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 3727–3733, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Dialogue Act Classification with Context-Aware Self-Attention (Raheja & Tetreault, NAACL 2019)
PDF:
https://preview.aclanthology.org/autopr/N19-1373.pdf
Data
MRDA, Switchboard-1 Corpus