Diff-Explainer: Differentiable Convex Optimization for Explainable Multi-hop Inference
Mokanarangan Thayaparan, Marco Valentino, Deborah Ferreira, Julia Rozanova, André Freitas
Abstract
This paper presents Diff-Explainer, the first hybrid framework for explainable multi-hop inference that integrates explicit constraints with neural architectures through differentiable convex optimization. Specifically, Diff-Explainer allows for the fine-tuning of neural representations within a constrained optimization framework to answer and explain multi-hop questions in natural language. To demonstrate the efficacy of the hybrid framework, we combine existing ILP-based solvers for multi-hop Question Answering (QA) with Transformer-based representations. An extensive empirical evaluation on scientific and commonsense QA tasks demonstrates that the integration of explicit constraints in an end-to-end differentiable framework can significantly improve the performance of non-differentiable ILP solvers (8.91%–13.3%). Moreover, additional analysis reveals that Diff-Explainer is able to achieve strong performance when compared to standalone Transformers and previous multi-hop approaches while still providing structured explanations in support of its predictions.
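The central mechanism described in the abstract, replacing a non-differentiable ILP solver with a differentiable convex optimization layer so that gradients can reach the Transformer-based scorer, can be sketched with off-the-shelf tools. The snippet below is a minimal illustration and not the authors' implementation: the box relaxation of the selection ILP, the quadratic regularizer, the `MAX_FACTS` budget constraint, the linear stand-in for the Transformer encoder, and the use of the cvxpylayers library are all illustrative assumptions.

```python
# Minimal sketch of a differentiable convex optimization layer over neural
# relevance scores, in the spirit of Diff-Explainer. This is NOT the paper's
# code: the relaxation, constraints, and scoring model are illustrative.
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

NUM_FACTS = 6   # candidate explanatory facts for one question (toy size)
MAX_FACTS = 3   # at most this many facts may be selected (assumed constraint)

# --- Convex relaxation of a fact-selection ILP -------------------------------
# Binary selection variables are relaxed to the box [0, 1]; the objective
# rewards facts with high learned relevance scores, with a small quadratic
# term to keep the problem strongly concave and its solution well-behaved.
x = cp.Variable(NUM_FACTS)                 # relaxed selection indicators
relevance = cp.Parameter(NUM_FACTS)        # scores produced by the neural encoder
objective = cp.Maximize(relevance @ x - 0.1 * cp.sum_squares(x))
constraints = [x >= 0, x <= 1, cp.sum(x) <= MAX_FACTS]
problem = cp.Problem(objective, constraints)

# Wrap the problem as a differentiable layer: gradients flow from the optimal
# selection x* back into the relevance scores (and hence into the encoder).
opt_layer = CvxpyLayer(problem, parameters=[relevance], variables=[x])

# --- Stand-in for a Transformer-based scorer ---------------------------------
# In the paper, relevance comes from fine-tuned Transformer representations of
# the question and candidate facts; a linear layer over random features is
# used here only to keep the sketch self-contained.
encoder = torch.nn.Linear(16, 1)
fact_features = torch.randn(NUM_FACTS, 16)

scores = encoder(fact_features).squeeze(-1)      # shape: (NUM_FACTS,)
selection, = opt_layer(scores)                   # solve the relaxed ILP

# Toy supervision: treat the first two facts as the gold explanation.
gold = torch.tensor([1.0, 1.0, 0.0, 0.0, 0.0, 0.0], dtype=selection.dtype)
loss = torch.nn.functional.mse_loss(selection, gold)
loss.backward()                                  # gradients reach encoder weights
print(selection.detach(), encoder.weight.grad.norm())
```

The property this sketch illustrates is that the solution of the relaxed optimization problem is differentiable with respect to the relevance scores, so supervision on answer and explanation selection can fine-tune the underlying encoder end to end, which is the general pattern the abstract attributes to Diff-Explainer.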
- Anthology ID: 2022.tacl-1.64
- Volume: Transactions of the Association for Computational Linguistics, Volume 10
- Year: 2022
- Address: Cambridge, MA
- Editors: Brian Roark, Ani Nenkova
- Venue: TACL
- Publisher: MIT Press
- Pages: 1103–1119
- URL: https://preview.aclanthology.org/remove-affiliations/2022.tacl-1.64/
- DOI: 10.1162/tacl_a_00508
- Cite (ACL): Mokanarangan Thayaparan, Marco Valentino, Deborah Ferreira, Julia Rozanova, and André Freitas. 2022. Diff-Explainer: Differentiable Convex Optimization for Explainable Multi-hop Inference. Transactions of the Association for Computational Linguistics, 10:1103–1119.
- Cite (Informal): Diff-Explainer: Differentiable Convex Optimization for Explainable Multi-hop Inference (Thayaparan et al., TACL 2022)
- PDF: https://preview.aclanthology.org/remove-affiliations/2022.tacl-1.64.pdf