Textual Enhanced Contrastive Learning for Solving Math Word Problems

Yibin Shen, Qianying Liu, Zhuoyuan Mao, Fei Cheng, Sadao Kurohashi


Abstract
Solving math word problems is a task that analyses the relations between quantities and requires an accurate understanding of contextual natural language information. Recent studies show that current models rely on shallow heuristics to predict solutions and can be easily misled by small textual perturbations. To address this problem, we propose a Textual Enhanced Contrastive Learning framework, which forces the models to distinguish semantically similar examples that hold different mathematical logic. We adopt a self-supervised strategy to enrich examples with subtle textual variance through textual reordering or problem re-construction. We then retrieve the samples that are hardest to differentiate from both equation and textual perspectives and guide the model to learn their representations. Experimental results show that our method achieves state-of-the-art results on both widely used benchmark datasets and exquisitely designed challenge datasets in English and Chinese.
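
The framework sketched in the abstract combines self-supervised augmentation with hard-negative retrieval under a contrastive objective. As a minimal, purely illustrative sketch of such an objective, assuming a standard InfoNCE-style formulation in PyTorch (the function name, tensor shapes, and temperature are assumptions, not details taken from the paper), the anchor problem is pulled toward a representation that shares its mathematical logic and pushed away from retrieved hard negatives that are textually similar but solved by different equations:

import torch
import torch.nn.functional as F

def contrastive_loss(anchor, positive, hard_negatives, temperature=0.1):
    """InfoNCE-style contrastive loss (illustrative sketch).

    anchor, positive: (batch, dim) problem representations with the same
                      mathematical logic.
    hard_negatives:   (batch, k, dim) retrieved hard negatives, i.e.
                      textually similar problems with different equations.
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    hard_negatives = F.normalize(hard_negatives, dim=-1)

    # Cosine similarity of the anchor with its positive and each hard negative.
    pos_sim = (anchor * positive).sum(-1, keepdim=True)           # (batch, 1)
    neg_sim = torch.einsum("bd,bkd->bk", anchor, hard_negatives)  # (batch, k)

    # The positive sits at index 0; cross-entropy pushes it above all negatives.
    logits = torch.cat([pos_sim, neg_sim], dim=-1) / temperature  # (batch, 1+k)
    labels = torch.zeros(anchor.size(0), dtype=torch.long, device=anchor.device)
    return F.cross_entropy(logits, labels)

# Example call with random features (purely illustrative):
# loss = contrastive_loss(torch.randn(8, 256), torch.randn(8, 256), torch.randn(8, 4, 256))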
Anthology ID:
2022.findings-emnlp.316
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4297–4307
URL:
https://aclanthology.org/2022.findings-emnlp.316
DOI:
10.18653/v1/2022.findings-emnlp.316
Cite (ACL):
Yibin Shen, Qianying Liu, Zhuoyuan Mao, Fei Cheng, and Sadao Kurohashi. 2022. Textual Enhanced Contrastive Learning for Solving Math Word Problems. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 4297–4307, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Textual Enhanced Contrastive Learning for Solving Math Word Problems (Shen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.316.pdf