Unsupervised Mitigating Gender Bias by Character Components: A Case Study of Chinese Word Embedding
Xiuying Chen, Mingzhe Li, Rui Yan, Xin Gao, Xiangliang Zhang
Abstract
Word embeddings learned from massive text collections have been shown to exhibit significant levels of discriminative bias. However, debiasing for Chinese, one of the most widely spoken languages, has been less explored. Moreover, existing approaches rely on manually created supplementary data, which is time- and energy-consuming. In this work, we propose the first Chinese Gender-neutral word Embedding model (CGE), based on Word2vec, which learns gender-neutral word embeddings without any labeled data. Concretely, CGE utilizes and emphasizes the rich feminine and masculine information contained in radicals, i.e., a kind of component of Chinese characters, during the training procedure, which in turn alleviates discriminative gender bias. Experimental results on public benchmark datasets show that our unsupervised method outperforms state-of-the-art supervised debiased word embedding models without sacrificing the functionality of the embedding model.
- Anthology ID:
- 2022.gebnlp-1.14
- Volume:
- Proceedings of the 4th Workshop on Gender Bias in Natural Language Processing (GeBNLP)
- Month:
- July
- Year:
- 2022
- Address:
- Seattle, Washington
- Venue:
- GeBNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 121–128
- URL:
- https://aclanthology.org/2022.gebnlp-1.14
- DOI:
- 10.18653/v1/2022.gebnlp-1.14
- Cite (ACL):
- Xiuying Chen, Mingzhe Li, Rui Yan, Xin Gao, and Xiangliang Zhang. 2022. Unsupervised Mitigating Gender Bias by Character Components: A Case Study of Chinese Word Embedding. In Proceedings of the 4th Workshop on Gender Bias in Natural Language Processing (GeBNLP), pages 121–128, Seattle, Washington. Association for Computational Linguistics.
- Cite (Informal):
- Unsupervised Mitigating Gender Bias by Character Components: A Case Study of Chinese Word Embedding (Chen et al., GeBNLP 2022)
- PDF:
- https://preview.aclanthology.org/remove-xml-comments/2022.gebnlp-1.14.pdf
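The abstract's key idea is that some Chinese radicals (character components) carry gender information, e.g. 女 ("female") in characters like 妈 ("mother") or 她 ("she"). A minimal sketch of detecting such a radical-level gender signal is shown below; the radical table is a tiny hand-made sample for illustration only, not the decomposition data or training procedure used in the paper.

```python
# Hypothetical illustration of the radical-based gender signal described in
# the abstract. The character-to-radical mapping here is a small hand-picked
# sample, not the resource used by CGE.

RADICALS = {
    "妈": "女",  # mother — carries the "female" radical
    "她": "女",  # she
    "姐": "女",  # elder sister
    "他": "亻",  # he — "person" radical
    "桥": "木",  # bridge — "wood" radical, gender-neutral
}

FEMALE_RADICAL = "女"


def has_female_radical(word: str) -> bool:
    """Return True if any character in `word` carries the 女 radical."""
    return any(RADICALS.get(ch) == FEMALE_RADICAL for ch in word)
```

In CGE, such radical-level gender cues are emphasized during Word2vec training rather than used as a post-hoc filter; this sketch only shows where the unsupervised gender signal could come from.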