@inproceedings{wu-etal-2021-transferable,
    title = "Transferable Persona-Grounded Dialogues via Grounded Minimal Edits",
    author = "Wu, Chen Henry  and
      Zheng, Yinhe  and
      Mao, Xiaoxi  and
      Huang, Minlie",
    editor = "Moens, Marie-Francine  and
      Huang, Xuanjing  and
      Specia, Lucia  and
      Yih, Scott Wen-tau",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2021.emnlp-main.183/",
    doi = "10.18653/v1/2021.emnlp-main.183",
    pages = "2368--2382",
    abstract = "Grounded dialogue models generate responses that are grounded on certain concepts. Limited by the distribution of grounded dialogue data, models trained on such data face the \textit{transferability} challenges in terms of the data distribution and the type of grounded concepts. To address the challenges, we propose the \textit{grounded minimal editing} framework, which minimally edits existing responses to be grounded on the given concept. Focusing on personas, we propose Grounded Minimal Editor (GME), which learns to edit by disentangling and recombining persona-related and persona-agnostic parts of the response. To evaluate persona-grounded minimal editing, we present the PersonaMinEdit dataset, and experimental results show that GME outperforms competitive baselines by a large margin. To evaluate the transferability, we experiment on the test set of BlendedSkillTalk and show that GME can edit dialogue models' responses to largely improve their persona consistency while preserving the use of knowledge and empathy."
}