Abstract
Learning contextual text embeddings that represent causal graphs has proven useful for improving the performance of downstream tasks such as causal treatment effect estimation. However, existing causal embeddings, which are trained to predict direct causal links, fail to capture the indirect causal links of the graph, leading to spurious correlations in downstream tasks. In this paper, we define the faithfulness property of contextual embeddings, which captures geometric distance-based properties of directed acyclic causal graphs. By incorporating these faithfulness properties, we learn text embeddings that are 31.3% more faithful to human-validated causal graphs with about 800K and 200K causal links, and achieve 21.1% better Precision-Recall AUC on a link prediction fine-tuning task. Further, in a crowdsourced causal question-answering task on Yahoo! Answers with questions of the form “What causes X?”, our faithful embeddings achieved a precision of the first-ranked answer (P@1) of 41.07%, outperforming the existing baseline by 10.2%.
- Anthology ID:
- 2021.acl-long.69
- Volume:
- Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
- Month:
- August
- Year:
- 2021
- Address:
- Online
- Venues:
- ACL | IJCNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 839–850
- URL:
- https://aclanthology.org/2021.acl-long.69
- DOI:
- 10.18653/v1/2021.acl-long.69
- Cite (ACL):
- Ananth Balashankar and Lakshminarayanan Subramanian. 2021. Learning Faithful Representations of Causal Graphs. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 839–850, Online. Association for Computational Linguistics.
- Cite (Informal):
- Learning Faithful Representations of Causal Graphs (Balashankar & Subramanian, ACL-IJCNLP 2021)
- PDF:
- https://preview.aclanthology.org/nodalida-main-page/2021.acl-long.69.pdf
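To make the abstract's notion of faithfulness concrete, here is a minimal illustrative sketch, not the paper's actual method: it scores how well embedding distances track shortest-path distances in a causal DAG. The DAG, embeddings, and the `faithfulness_score` helper are all hypothetical names introduced here for illustration.

```python
# Hypothetical illustration (not the paper's algorithm): measure how
# "faithful" node embeddings are to a causal DAG by correlating embedding
# distance with graph shortest-path distance over reachable node pairs.
from collections import deque
import numpy as np

def shortest_path_lengths(dag, source):
    """BFS distances from `source` along directed edges of the DAG."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt in dag.get(node, []):
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

def faithfulness_score(dag, embeddings):
    """Pearson correlation between graph distance and Euclidean embedding
    distance over all reachable ordered pairs (higher = more faithful)."""
    graph_d, embed_d = [], []
    for u in dag:
        for v, d in shortest_path_lengths(dag, u).items():
            if v != u:
                graph_d.append(d)
                embed_d.append(np.linalg.norm(embeddings[u] - embeddings[v]))
    return float(np.corrcoef(graph_d, embed_d)[0, 1])

# Toy causal chain: rain -> wet_road -> accident
dag = {"rain": ["wet_road"], "wet_road": ["accident"], "accident": []}
emb = {"rain": np.array([0.0, 0.0]),
       "wet_road": np.array([1.0, 0.0]),
       "accident": np.array([2.0, 0.0])}
print(faithfulness_score(dag, emb))  # near 1.0: distances align
```

A direct-link-only objective would be satisfied by any embedding separating linked pairs; a distance-based criterion like this one additionally penalizes placing indirectly linked concepts (e.g. "rain" and "accident") arbitrarily far apart or too close.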