Qintong Li
2022
Event Transition Planning for Open-ended Text Generation
Qintong Li | Piji Li | Wei Bi | Zhaochun Ren | Yuxuan Lai | Lingpeng Kong
Findings of the Association for Computational Linguistics: ACL 2022
Open-ended text generation tasks, such as dialogue generation and story completion, require models to generate a coherent continuation given limited preceding context. The open-ended nature of these tasks brings new challenges to today's neural auto-regressive text generators. Although these neural models are good at producing human-like text, they struggle to arrange causalities and relations between given facts and possible ensuing events. To bridge this gap, we propose a novel two-stage method that explicitly arranges the ensuing events in open-ended text generation. Our approach can be understood as a specially trained coarse-to-fine algorithm, where an event transition planner provides a “coarse” plot skeleton and a text generator in the second stage refines the skeleton. Experiments on two open-ended text generation tasks demonstrate that our proposed method effectively improves the quality of the generated text, especially in coherence and diversity. We will release the code to the community for further exploration.
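A minimal, hypothetical sketch of the plan-then-refine idea follows; it is not the paper's released implementation. Both stages are stood in for by the public GPT-2 checkpoint, and the prompt formats and stage names are illustrative assumptions.

```python
# Hedged sketch of a two-stage "plan then refine" pipeline, NOT the paper's
# actual implementation. The paper fine-tunes a dedicated event transition
# planner and a text generator; here the public "gpt2" weights stand in for
# both so the script runs as-is.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
planner = GPT2LMHeadModel.from_pretrained("gpt2")    # stage 1: event planner (stand-in)
generator = GPT2LMHeadModel.from_pretrained("gpt2")  # stage 2: text generator (stand-in)

def generate(model, prompt, max_new_tokens=40):
    """Sample a continuation and strip the prompt tokens from the output."""
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    new_tokens = out[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

context = "Jenny finally saved enough money for a guitar."
# Stage 1: produce a coarse event skeleton conditioned on the context.
skeleton = generate(planner, f"Context: {context}\nEvent plan:")
# Stage 2: refine the skeleton into a fluent continuation.
continuation = generate(generator, f"Context: {context}\nEvent plan: {skeleton}\nStory:")
print(continuation)
```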
ZeroGen: Efficient Zero-shot Learning via Dataset Generation
Jiacheng Ye | Jiahui Gao | Qintong Li | Hang Xu | Jiangtao Feng | Zhiyong Wu | Tao Yu | Lingpeng Kong
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
There has been growing interest in dataset generation recently due to the superior generative capacity of large pre-trained language models (PLMs). In this paper, we study a flexible and efficient zero-shot learning method, ZeroGen. Given a zero-shot task, we first generate a dataset from scratch using PLMs in an unsupervised manner. Then, we train a tiny task model (e.g., LSTM) under the supervision of the synthesized dataset. This approach allows highly efficient inference, as the final task model has orders of magnitude fewer parameters compared to PLMs (e.g., GPT2-XL). Apart from being annotation-free and efficient, we argue that ZeroGen can also provide useful insights from the perspective of data-free model-agnostic knowledge distillation and unreferenced text generation evaluation. Experiments and analysis on different NLP tasks, namely text classification, question answering, and natural language inference, show the effectiveness of ZeroGen.
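A hedged sketch of this style of pipeline (not the ZeroGen codebase itself): a PLM synthesizes labelled examples from hand-written label-conditioned prompts, and a small classifier, standing in here for the paper's tiny LSTM, is trained purely on the synthetic data. The prompts and the scikit-learn stand-in are assumptions.

```python
# Hedged sketch of a "generate a dataset, then train a tiny model" pipeline.
# The label-conditioned prompts and the logistic-regression task model are
# illustrative assumptions, not the paper's released components.
from transformers import pipeline
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

plm = pipeline("text-generation", model="gpt2")

prompts = {
    "positive": "The movie review in positive sentiment is:",
    "negative": "The movie review in negative sentiment is:",
}

# Step 1: synthesize a labelled dataset from scratch with the PLM.
texts, labels = [], []
for label, prompt in prompts.items():
    outputs = plm(prompt, max_new_tokens=30, num_return_sequences=8, do_sample=True)
    for o in outputs:
        texts.append(o["generated_text"][len(prompt):].strip())
        labels.append(label)

# Step 2: train a tiny task model only on the synthesized data.
vectorizer = CountVectorizer()
clf = LogisticRegression(max_iter=1000)
clf.fit(vectorizer.fit_transform(texts), labels)

print(clf.predict(vectorizer.transform(["An absolutely wonderful film."])))
```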
2020
EmpDG: Multi-resolution Interactive Empathetic Dialogue Generation
Qintong Li | Hongshen Chen | Zhaochun Ren | Pengjie Ren | Zhaopeng Tu | Zhumin Chen
Proceedings of the 28th International Conference on Computational Linguistics
A humanized dialogue system is expected to generate empathetic replies, which should be sensitive to the users’ expressed emotion. The task of empathetic dialogue generation is proposed to address this problem. The essential challenges lie in accurately capturing the nuances of human emotion and considering the potential of user feedback, both of which are overlooked by the majority of existing work. In response, we propose EmpDG, a multi-resolution adversarial model that generates more empathetic responses. EmpDG exploits both coarse-grained dialogue-level and fine-grained token-level emotions, the latter of which helps to better capture the nuances of user emotion. In addition, we introduce an interactive adversarial learning framework that exploits user feedback to identify whether the generated responses evoke emotion perceptivity in dialogues. Experimental results show that the proposed approach significantly outperforms the state-of-the-art baselines in both content quality and emotion perceptivity.
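To illustrate only the multi-resolution idea (this is not EmpDG's learned architecture), the sketch below pairs a fine-grained token-level emotion tag, derived from a toy lexicon, with a coarse dialogue-level label obtained by majority vote; the lexicon and the labelling rule are illustrative assumptions.

```python
# Minimal sketch of coarse (dialogue-level) vs. fine (token-level) emotion
# signals. The tiny lexicon and majority-vote rule are assumptions for
# illustration, not components of the EmpDG model.
EMOTION_LEXICON = {
    "thrilled": "joy", "happy": "joy",
    "lost": "sadness", "alone": "sadness",
    "furious": "anger",
}

def token_level_emotions(utterance):
    """Fine-grained signal: tag each token that matches the lexicon."""
    return [(tok, EMOTION_LEXICON.get(tok.lower().strip(".,!?"), "none"))
            for tok in utterance.split()]

def dialogue_level_emotion(utterances):
    """Coarse signal: majority emotion over all tagged tokens in the dialogue."""
    counts = {}
    for utt in utterances:
        for _, emo in token_level_emotions(utt):
            if emo != "none":
                counts[emo] = counts.get(emo, 0) + 1
    return max(counts, key=counts.get) if counts else "neutral"

dialogue = ["I feel so lost since my dog passed away.", "I am alone every evening."]
print(dialogue_level_emotion(dialogue))   # coarse label, e.g. "sadness"
print(token_level_emotions(dialogue[0]))  # fine-grained per-token tags
```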