DualTKB: A Dual Learning Bridge between Text and Knowledge Base

Pierre Dognin, Igor Melnyk, Inkit Padhi, Cicero Nogueira dos Santos, Payel Das


Abstract
In this work, we present a dual learning approach for unsupervised text-to-path and path-to-text transfers in Commonsense Knowledge Bases (KBs). We investigate the impact of weak supervision by creating a weakly supervised dataset and show that even a slight amount of supervision can significantly improve the model performance and enable better-quality transfers. We examine different model architectures and evaluation metrics, and propose a novel Commonsense KB completion metric tailored for generative models. Extensive experimental results show that the proposed method compares very favorably to the existing baselines. This approach is a viable step towards a more advanced system for automatic KB construction/expansion and the reverse operation of KB conversion to coherent textual descriptions.
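For intuition only, the sketch below illustrates one way the dual-learning idea from the abstract could be set up: a single sequence-to-sequence model handles both directions (text to KB path and path to text), trained with back-translation-style reconstruction cycles over unaligned corpora, plus an optional supervised term for the weakly supervised setting. This is a minimal sketch under those assumptions, not the authors' released implementation; all class names (e.g., TinySeq2Seq), token conventions, and hyperparameters are hypothetical.

```python
# Minimal dual-learning sketch (hypothetical, not the DualTKB code).
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Toy GRU encoder-decoder over a shared vocabulary of words and KB tokens."""
    def __init__(self, vocab_size: int, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, src: torch.Tensor, tgt_in: torch.Tensor) -> torch.Tensor:
        # Encode the source sequence; condition the decoder on the final hidden state.
        _, h = self.encoder(self.embed(src))
        dec_out, _ = self.decoder(self.embed(tgt_in), h)
        return self.out(dec_out)  # (batch, tgt_len, vocab)

    @torch.no_grad()
    def greedy_decode(self, src: torch.Tensor, bos_id: int, max_len: int = 32) -> torch.Tensor:
        # Greedy generation of the intermediate "translation" used in the cycle.
        # The BOS token doubles as a direction marker (text vs. path), an assumption here.
        _, h = self.encoder(self.embed(src))
        tok = torch.full((src.size(0), 1), bos_id, dtype=torch.long, device=src.device)
        outputs = [tok]
        for _ in range(max_len):
            dec_out, h = self.decoder(self.embed(tok), h)
            tok = self.out(dec_out).argmax(-1)
            outputs.append(tok)
        return torch.cat(outputs, dim=1)

def cycle_loss(model, batch, direction_bos, criterion):
    """Reconstruction loss for one cycle: text -> path -> text (or path -> text -> path)."""
    # 1) Translate into the other domain without gradients (back-translation step).
    pseudo = model.greedy_decode(batch, bos_id=direction_bos)
    # 2) Translate back with teacher forcing and score against the original sequence.
    logits = model(pseudo, batch[:, :-1])
    return criterion(logits.reshape(-1, logits.size(-1)), batch[:, 1:].reshape(-1))

def train_step(model, text_batch, path_batch, paired, opt, pad_id, bos_text, bos_path):
    """One update over unaligned text/path batches plus optional aligned pairs."""
    criterion = nn.CrossEntropyLoss(ignore_index=pad_id)
    loss = (cycle_loss(model, text_batch, bos_path, criterion)    # text -> path -> text
            + cycle_loss(model, path_batch, bos_text, criterion)) # path -> text -> path
    if paired is not None:  # weak supervision: a few aligned (text, path) examples
        src, tgt = paired
        logits = model(src, tgt[:, :-1])
        loss = loss + criterion(logits.reshape(-1, logits.size(-1)), tgt[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

In this sketch, gradients flow only through the reconstruction step of each cycle (the intermediate decode is detached), which is a common simplification in back-translation-style training; the supervised term is what a small weakly labeled set would contribute.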
Anthology ID:
2020.emnlp-main.694
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8605–8616
URL:
https://aclanthology.org/2020.emnlp-main.694
DOI:
10.18653/v1/2020.emnlp-main.694
Cite (ACL):
Pierre Dognin, Igor Melnyk, Inkit Padhi, Cicero Nogueira dos Santos, and Payel Das. 2020. DualTKB: A Dual Learning Bridge between Text and Knowledge Base. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 8605–8616, Online. Association for Computational Linguistics.
Cite (Informal):
DualTKB: A Dual Learning Bridge between Text and Knowledge Base (Dognin et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2020.emnlp-main.694.pdf
Video:
https://slideslive.com/38939126
Data
ATOMIC
ConceptNet