MERIt: Meta-Path Guided Contrastive Learning for Logical Reasoning

Fangkai Jiao, Yangyang Guo, Xuemeng Song, Liqiang Nie


Abstract
Logical reasoning is of vital importance to natural language understanding. Previous studies either employ graph-based models to incorporate prior knowledge about logical relations, or introduce symbolic logic into neural models through data augmentation. These methods, however, heavily depend on annotated training data and thus suffer from over-fitting and poor generalization due to data sparsity. To address these problems, in this paper we propose MERIt, a MEta-path guided contrastive learning method for logical ReasonIng of text, which performs self-supervised pre-training on abundant unlabeled text. Two novel strategies serve as indispensable components of our method: a meta-path-based strategy to discover the logical structure in natural texts, followed by a counterfactual data augmentation strategy to eliminate the information shortcut induced by pre-training. Experimental results on two challenging logical reasoning benchmarks, i.e., ReClor and LogiQA, demonstrate that our method outperforms the state-of-the-art baselines by significant margins.
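To make the contrastive pre-training idea concrete, the following is a minimal, self-contained sketch of an InfoNCE-style contrastive objective, where a meta-path-consistent context/option pair plays the role of the positive and counterfactually augmented variants act as negatives. This is an illustration of the general technique only, not the authors' implementation; the function name and scalar-similarity interface are hypothetical.

```python
import math

def info_nce_loss(sim_pos, sim_negs, temperature=0.1):
    """InfoNCE-style contrastive loss for a single anchor.

    sim_pos  -- similarity between the anchor context and its logically
                consistent (positive) continuation (hypothetical scalar).
    sim_negs -- similarities to counterfactual / negative continuations.
    Returns -log softmax probability of the positive over all candidates.
    """
    logits = [sim_pos / temperature] + [s / temperature for s in sim_negs]
    m = max(logits)  # subtract max for numerical stability
    log_denom = m + math.log(sum(math.exp(z - m) for z in logits))
    return -(logits[0] - log_denom)
```

The loss shrinks as the positive pair's similarity rises above the negatives', which is the behavior any contrastive pre-training objective of this family relies on.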
Anthology ID:
2022.findings-acl.276
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3496–3509
URL:
https://aclanthology.org/2022.findings-acl.276
DOI:
10.18653/v1/2022.findings-acl.276
Cite (ACL):
Fangkai Jiao, Yangyang Guo, Xuemeng Song, and Liqiang Nie. 2022. MERIt: Meta-Path Guided Contrastive Learning for Logical Reasoning. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3496–3509, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
MERIt: Meta-Path Guided Contrastive Learning for Logical Reasoning (Jiao et al., Findings 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.findings-acl.276.pdf
Code
 sparkjiao/merit
Data
LogiQA, ReClor