Generating Faithful Text From a Knowledge Graph with Noisy Reference Text

Tahsina Hashem, Weiqing Wang, Derry Tanti Wijaya, Mohammed Eunus Ali, Yuan-Fang Li


Abstract
Knowledge Graph (KG)-to-Text generation aims at generating fluent natural-language text that accurately represents the information of a given knowledge graph. While significant progress has been made in this task by exploiting the power of pre-trained language models (PLMs) with appropriate graph structure-aware modules, existing models still fall short of generating faithful text, especially when the ground-truth natural-language text contains additional information that is not present in the graph. In this paper, we develop a KG-to-text generation model that can generate faithful natural-language text from a given graph, in the presence of noisy reference text. Our framework incorporates two core ideas: First, we utilize contrastive learning to enhance the model’s ability to differentiate between faithful and hallucinated information in the text, thereby encouraging the decoder to generate text that aligns with the input graph. Second, we empower the decoder to control the level of hallucination in the generated text by employing a controllable text generation technique. We evaluate our model’s performance using standard quantitative metrics as well as a ChatGPT-based quantitative and qualitative analysis. Our evaluation demonstrates the superior performance of our model over state-of-the-art KG-to-text models in terms of faithfulness.
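To make the first idea concrete, the sketch below illustrates one generic way a contrastive objective of this kind can be set up; it is not the paper's actual loss or architecture (those are described in the PDF linked below), and all function names, tensor shapes, and the InfoNCE-style formulation are assumptions chosen for illustration. The anchor is a pooled embedding of the input graph, the positive is an embedding of faithful reference text, and the negatives are embeddings of hallucinated text variants.

```python
# Illustrative sketch only: a generic InfoNCE-style contrastive loss that
# pulls a graph embedding toward a faithful-text embedding and pushes it
# away from hallucinated-text embeddings. All names are hypothetical and
# do not reflect the paper's actual implementation.
import torch
import torch.nn.functional as F


def contrastive_faithfulness_loss(graph_emb, faithful_emb, hallucinated_embs,
                                  temperature=0.1):
    """graph_emb:         (batch, dim)    pooled encoder representation of the KG
    faithful_emb:      (batch, dim)    pooled representation of faithful text
    hallucinated_embs: (batch, k, dim) pooled representations of k hallucinated texts
    """
    # Similarity between the graph and the faithful text (one positive per item).
    pos = F.cosine_similarity(graph_emb, faithful_emb, dim=-1) / temperature        # (batch,)
    # Similarities between the graph and each hallucinated variant (k negatives).
    neg = F.cosine_similarity(graph_emb.unsqueeze(1),
                              hallucinated_embs, dim=-1) / temperature              # (batch, k)
    logits = torch.cat([pos.unsqueeze(1), neg], dim=1)                              # (batch, 1+k)
    # The positive sits at index 0, so the target class is 0 for every item.
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)


# Toy usage with random embeddings (batch of 4, dim 256, 3 negatives each).
loss = contrastive_faithfulness_loss(
    torch.randn(4, 256), torch.randn(4, 256), torch.randn(4, 3, 256)
)
print(loss.item())
```

Minimizing such a loss encourages the encoder/decoder representations to score faithful text higher than hallucinated text with respect to the input graph, which is the intuition behind the contrastive component described in the abstract.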
Anthology ID:
2023.inlg-main.8
Volume:
Proceedings of the 16th International Natural Language Generation Conference
Month:
September
Year:
2023
Address:
Prague, Czechia
Editors:
C. Maria Keet, Hung-Yi Lee, Sina Zarrieß
Venues:
INLG | SIGDIAL
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
106–122
URL:
https://aclanthology.org/2023.inlg-main.8
DOI:
10.18653/v1/2023.inlg-main.8
Cite (ACL):
Tahsina Hashem, Weiqing Wang, Derry Tanti Wijaya, Mohammed Eunus Ali, and Yuan-Fang Li. 2023. Generating Faithful Text From a Knowledge Graph with Noisy Reference Text. In Proceedings of the 16th International Natural Language Generation Conference, pages 106–122, Prague, Czechia. Association for Computational Linguistics.
Cite (Informal):
Generating Faithful Text From a Knowledge Graph with Noisy Reference Text (Hashem et al., INLG-SIGDIAL 2023)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2023.inlg-main.8.pdf
Supplementary attachment:
2023.inlg-main.8.Supplementary_Attachment.pdf