Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding
Guokan Shang, Antoine Tixier, Michalis Vazirgiannis, Jean-Pierre Lorré
Abstract
Abstractive community detection is an important spoken language understanding task, whose goal is to group utterances in a conversation according to whether they can be jointly summarized by a common abstractive sentence. This paper provides a novel approach to this task. We first introduce a neural contextual utterance encoder featuring three types of self-attention mechanisms. We then train it using the siamese and triplet energy-based meta-architectures. Experiments on the AMI corpus show that our system outperforms multiple energy-based and non-energy based baselines from the state-of-the-art. Code and data are publicly available.
- Anthology ID:
- 2020.aacl-main.34
- Volume:
- Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
- Month:
- December
- Year:
- 2020
- Address:
- Suzhou, China
- Editors:
- Kam-Fai Wong, Kevin Knight, Hua Wu
- Venue:
- AACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 313–327
- URL:
- https://aclanthology.org/2020.aacl-main.34
- Cite (ACL):
- Guokan Shang, Antoine Tixier, Michalis Vazirgiannis, and Jean-Pierre Lorré. 2020. Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 313–327, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding (Shang et al., AACL 2020)
- PDF:
- https://aclanthology.org/2020.aacl-main.34.pdf
- Code:
- guokan_shang/abscomm
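
The abstract frames training as learning utterance embeddings under an energy function with siamese and triplet meta-architectures, so that utterances belonging to the same abstractive community score lower (closer) energy than utterances from different communities. As a rough illustration only, the sketch below wires a placeholder encoder to PyTorch's built-in triplet margin loss; the encoder, dimensions, and toy data are hypothetical stand-ins, not the paper's self-attentive contextual encoder or its actual energy functions (see the linked guokan_shang/abscomm repository for the real implementation).

```python
# Hypothetical sketch of triplet energy-based training for abstractive
# community detection: an anchor utterance should embed closer to a
# positive (same community) than to a negative (different community).
import torch
import torch.nn as nn

class UtteranceEncoder(nn.Module):
    """Placeholder encoder: mean-pools word embeddings, then projects.
    The paper uses a contextual encoder with three self-attention
    mechanisms; this stand-in only fixes the input/output shapes."""
    def __init__(self, vocab_size: int, emb_dim: int = 128, out_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.proj = nn.Linear(emb_dim, out_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> embeddings: (batch, out_dim)
        return self.proj(self.embed(token_ids).mean(dim=1))

encoder = UtteranceEncoder(vocab_size=10_000)
# Energy = Euclidean distance between embeddings; the triplet loss pushes
# the anchor-positive energy below the anchor-negative energy by a margin.
triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# Toy batch: anchor and positive share a community, negative does not.
anchor = torch.randint(1, 10_000, (8, 20))
positive = torch.randint(1, 10_000, (8, 20))
negative = torch.randint(1, 10_000, (8, 20))

optimizer.zero_grad()
loss = triplet_loss(encoder(anchor), encoder(positive), encoder(negative))
loss.backward()
optimizer.step()
```

A siamese variant would instead score pairs of utterances with a contrastive objective (e.g., PyTorch's `nn.CosineEmbeddingLoss`) against a same-community / different-community label, rather than ranking a triplet.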