Model-Agnostic Bias Measurement in Link Prediction

Lena Schwertmann, Manoj Prabhakar Kannan Ravi, Gerard de Melo


Abstract
Link prediction models based on factual knowledge graphs are commonly used in applications such as search and question answering. However, work investigating social bias in these models has been limited. Previous work focused on knowledge graph embeddings, so more recent classes of models that achieve superior results by fine-tuning Transformers have not yet been investigated. We therefore present a model-agnostic approach for bias measurement that leverages fairness metrics to compare bias in knowledge graph embedding-based predictions (KG only) with that in models using pre-trained, Transformer-based language models (KG+LM). We further create a dataset to measure gender bias in occupation predictions and assess whether the KG+LM models are more or less biased than the KG only models. We find that gender bias tends to be higher for the KG+LM models, and we analyze potential connections to the accuracy of the models and the data bias inherent in our dataset. Finally, we discuss the limitations and ethical considerations of our work. The repository containing the source code and the dataset is publicly available at https://github.com/lena-schwert/comparing-bias-in-KG-models.
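The comparison described in the abstract rests on applying a fairness metric to each model's occupation predictions, grouped by gender. The following Python is a minimal illustrative sketch only, assuming a demographic-parity-style metric and hypothetical toy data; the function name and inputs are assumptions for illustration, and the actual metrics and dataset are defined in the paper and its repository.

from collections import Counter

def demographic_parity_diff(predictions, genders, occupation):
    """P(predicted = occupation | male) - P(predicted = occupation | female)."""
    male = [p for p, g in zip(predictions, genders) if g == "male"]
    female = [p for p, g in zip(predictions, genders) if g == "female"]
    p_male = Counter(male)[occupation] / len(male) if male else 0.0
    p_female = Counter(female)[occupation] / len(female) if female else 0.0
    return p_male - p_female

# Toy usage: one predicted occupation per person entity (hypothetical data).
preds = ["nurse", "engineer", "engineer", "nurse", "nurse", "engineer"]
gend = ["female", "female", "male", "female", "male", "male"]
print(demographic_parity_diff(preds, gend, "engineer"))  # positive => skew toward men

Computing such a score per occupation for both a KG-only model and a KG+LM model allows the two model classes to be compared under the same metric, which is what makes the measurement model-agnostic.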
Anthology ID: 2023.findings-eacl.121
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1632–1648
URL: https://aclanthology.org/2023.findings-eacl.121
DOI: 10.18653/v1/2023.findings-eacl.121
Cite (ACL): Lena Schwertmann, Manoj Prabhakar Kannan Ravi, and Gerard de Melo. 2023. Model-Agnostic Bias Measurement in Link Prediction. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1632–1648, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Model-Agnostic Bias Measurement in Link Prediction (Schwertmann et al., Findings 2023)
PDF: https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-eacl.121.pdf
Video: https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-eacl.121.mp4