Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder

Daya Guo, Duyu Tang, Nan Duan, Jian Yin, Daxin Jiang, Ming Zhou


Abstract
Generating inferential texts about an event from different perspectives requires reasoning over the different contexts in which the event occurs. Existing works usually ignore context that is not explicitly provided, resulting in a context-independent semantic representation that struggles to support the generation. To address this, we propose an approach that automatically finds evidence for an event from a large text corpus, and leverages the evidence to guide the generation of inferential texts. Our approach works in an encoder-decoder manner and is equipped with a Vector Quantised-Variational Autoencoder, where the encoder outputs representations from a distribution over discrete variables. Such discrete representations enable automatically selecting relevant evidence, which not only facilitates evidence-aware generation, but also provides a natural way to uncover rationales behind the generation. Our approach achieves state-of-the-art performance on both the Event2Mind and Atomic datasets. More importantly, we find that with discrete representations, our model selectively uses evidence to generate different inferential texts.
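The core mechanism the abstract describes is the VQ-VAE quantization step: the encoder's continuous output is snapped to the nearest entry in a discrete codebook, and the resulting discrete index can then be used to select evidence. A minimal, stdlib-only sketch of that nearest-neighbor quantization (the names `codebook` and `quantize` are illustrative, not taken from the paper's code):

```python
# Minimal sketch of the vector-quantization step in a VQ-VAE.
# The encoder output z_e is replaced by the nearest codebook entry;
# the chosen index is the discrete latent variable that, in the
# paper's setting, could index a bucket of retrieved evidence.
# All names here are hypothetical, not the authors' implementation.

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def quantize(z_e, codebook):
    """Return (index, code vector) of the codebook entry nearest to z_e."""
    best = min(range(len(codebook)),
               key=lambda k: squared_distance(z_e, codebook[k]))
    return best, codebook[best]

# Toy 2-D codebook with three discrete codes.
codebook = [[0.0, 0.0], [1.0, 1.0], [-1.0, 1.0]]
idx, z_q = quantize([0.9, 1.2], codebook)
# idx == 1: the continuous encoder output is mapped to code [1.0, 1.0].
```

In training, the quantization is non-differentiable, so VQ-VAE uses a straight-through gradient estimator plus a codebook/commitment loss; this sketch shows only the forward selection of the discrete latent.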
Anthology ID:
2020.acl-main.544
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Note:
Pages:
6118–6129
URL:
https://aclanthology.org/2020.acl-main.544
DOI:
10.18653/v1/2020.acl-main.544
Cite (ACL):
Daya Guo, Duyu Tang, Nan Duan, Jian Yin, Daxin Jiang, and Ming Zhou. 2020. Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6118–6129, Online. Association for Computational Linguistics.
Cite (Informal):
Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder (Guo et al., ACL 2020)
PDF:
https://preview.aclanthology.org/starsem-semeval-split/2020.acl-main.544.pdf
Video:
http://slideslive.com/38928852
Code
microsoft/EA-VQ-VAE
Data
Event2Mind