Self-Supervised Rule Learning to Link Text Segments to Relational Elements of Structured Knowledge
Shajith Ikbal | Udit Sharma | Hima Karanam | Sumit Neelam | Ronny Luss | Dheeraj Sreedhar | Pavan Kapanipathi | Naweed Khan | Kyle Erwin | Ndivhuwo Makondo | Ibrahim Abdelaziz | Achille Fokoue | Alexander Gray | Maxwell Crouse | Subhajit Chaudhury | Chitra Subramanian
Findings of the Association for Computational Linguistics: EMNLP 2023
We present a neuro-symbolic approach that self-learns rules serving as interpretable knowledge for relation linking in knowledge base question answering (KBQA) systems. These rules define natural language text predicates as a weighted mixture of knowledge base paths. The weights learned during training effectively serve as the mapping needed to perform relation linking. We use the popular masked training strategy to self-learn the rules. A key distinguishing aspect of our work is that the masked training operates over logical forms of the sentences rather than their natural language text. This offers the opportunity to extract extended context information from the structured knowledge source and use it to build robust and human-readable rules. We evaluate the accuracy and usefulness of the learned rules by using them to predict missing kinship relations in the CLUTRR dataset and to perform relation linking in a KBQA system on the SWQ-WD dataset. Results demonstrate the effectiveness of our approach: its generalizability, interpretability, and an average performance gain of 17% on the CLUTRR dataset.