KFCNet: Knowledge Filtering and Contrastive Learning for Generative Commonsense Reasoning

Haonan Li, Yeyun Gong, Jian Jiao, Ruofei Zhang, Timothy Baldwin, Nan Duan


Abstract
Pre-trained language models have led to substantial gains over a broad range of natural language processing (NLP) tasks, but have been shown to have limitations for natural language generation tasks with high-quality requirements on the output, such as commonsense generation and ad keyword generation. In this work, we present a novel Knowledge Filtering and Contrastive learning Network (KFCNet) which references external knowledge and achieves better generation performance. Specifically, we propose a BERT-based filter model to remove low-quality candidates, and apply contrastive learning separately to each of the encoder and decoder, within a general encoder–decoder architecture. The encoder contrastive module helps to capture global target semantics during encoding, and the decoder contrastive module enhances the utility of retrieved prototypes while learning general features. Extensive experiments on the CommonGen benchmark show that our model outperforms the previous state of the art by a large margin: +6.6 points (42.5 vs. 35.9) for BLEU-4, +3.7 points (33.3 vs. 29.6) for SPICE, and +1.3 points (18.3 vs. 17.0) for CIDEr. We further verify the effectiveness of the proposed contrastive module on ad keyword generation, and show that our model has potential commercial value.
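The abstract describes applying contrastive learning separately to the encoder and decoder of a seq2seq model. As an illustration only, the following is a minimal sketch of a generic in-batch InfoNCE-style contrastive loss that could be attached to pooled encoder (or decoder) states; the function names, the pairing of anchors with retrieved prototypes, and the temperature value are assumptions for illustration, not the paper's published implementation.

# Minimal sketch (assumptions noted above): in-batch InfoNCE contrastive loss
# over pooled sequence representations, as might be added alongside the usual
# generation loss in an encoder-decoder model.
import torch
import torch.nn.functional as F


def contrastive_loss(anchor: torch.Tensor,
                     positive: torch.Tensor,
                     temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE over a batch: the positive for anchor i is row i of `positive`;
    all other rows in the batch serve as negatives.

    anchor, positive: [batch_size, hidden_dim] pooled representations.
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    # Cosine-similarity logits between every anchor and every candidate.
    logits = anchor @ positive.t() / temperature          # [B, B]
    targets = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy usage: stand-ins for pooled states of concept inputs and of
    # retrieved prototype/target sentences (hypothetical pairing).
    src = torch.randn(8, 768)
    tgt = torch.randn(8, 768)
    print(contrastive_loss(src, tgt).item())

In this sketch the auxiliary loss would typically be summed with the standard cross-entropy generation loss during fine-tuning; how KFCNet weights or schedules the two objectives is not specified in the abstract.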
Anthology ID:
2021.findings-emnlp.249
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2918–2928
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.findings-emnlp.249/
DOI:
10.18653/v1/2021.findings-emnlp.249
Cite (ACL):
Haonan Li, Yeyun Gong, Jian Jiao, Ruofei Zhang, Timothy Baldwin, and Nan Duan. 2021. KFCNet: Knowledge Filtering and Contrastive Learning for Generative Commonsense Reasoning. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2918–2928, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
KFCNet: Knowledge Filtering and Contrastive Learning for Generative Commonsense Reasoning (Li et al., Findings 2021)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.findings-emnlp.249.pdf
Video:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.findings-emnlp.249.mp4
Data:
CommonGen