An Enhanced Knowledge Injection Model for Commonsense Generation

Zhihao Fan, Yeyun Gong, Zhongyu Wei, Siyuan Wang, Yameng Huang, Jian Jiao, Xuanjing Huang, Nan Duan, Ruofei Zhang


Abstract
Commonsense generation aims at generating a plausible everyday scenario description from a set of provided concepts. Inferring the relationships among the concepts from scratch is non-trivial; we therefore retrieve prototypes from external knowledge to assist in understanding the scenario and generating a better description. We integrate two additional modules for prototype modeling into the pretrained encoder-decoder model to enhance the knowledge injection procedure. We conduct experiments on the CommonGen benchmark, and the results show that our method significantly improves performance on all metrics.
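As a rough illustration of the retrieve-then-generate idea described in the abstract, the sketch below prepends retrieved prototype sentences to the concept set before feeding them to a pretrained encoder-decoder. It does not reproduce the paper's two prototype-modeling modules; the prototype sentences, the choice of BART, and the simple concatenation scheme are all assumptions made for illustration only.

# Minimal sketch (assumptions, not the paper's model): prototype sentences are
# concatenated with the concept set and passed to a pretrained encoder-decoder.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# A CommonGen-style concept set and hypothetical prototype sentences that a
# retriever might return from an external corpus (placeholders, not from the paper).
concepts = ["dog", "frisbee", "catch", "throw"]
prototypes = [
    "A man throws a frisbee across the park.",
    "The dog jumps up to catch the toy.",
]

# Concatenate prototypes with the concepts so the encoder can attend to both.
source = " ".join(prototypes) + " </s> " + " ".join(concepts)
inputs = tokenizer(source, return_tensors="pt", truncation=True)

# Generate a scenario description with beam search.
output_ids = model.generate(**inputs, num_beams=5, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))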
Anthology ID:
2020.coling-main.182
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2014–2025
URL:
https://aclanthology.org/2020.coling-main.182
DOI:
10.18653/v1/2020.coling-main.182
Cite (ACL):
Zhihao Fan, Yeyun Gong, Zhongyu Wei, Siyuan Wang, Yameng Huang, Jian Jiao, Xuanjing Huang, Nan Duan, and Ruofei Zhang. 2020. An Enhanced Knowledge Injection Model for Commonsense Generation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2014–2025, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
An Enhanced Knowledge Injection Model for Commonsense Generation (Fan et al., COLING 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/2020.coling-main.182.pdf
Data
CommonGen, CommonsenseQA, SWAG