2022
SNP2Vec: Scalable Self-Supervised Pre-Training for Genome-Wide Association Study
Samuel Cahyawijaya | Tiezheng Yu | Zihan Liu | Xiaopu Zhou | Tze Wing Tiffany Mak | Yuk Yu Nancy Ip | Pascale Fung
Proceedings of the 21st Workshop on Biomedical Language Processing
Self-supervised pre-training methods have brought remarkable breakthroughs in the understanding of text, images, and speech. Recent work in genomics has also adopted these pre-training methods for genome understanding. However, these approaches focus only on understanding haploid sequences, which hinders their applicability to genetic variations, also known as single nucleotide polymorphisms (SNPs), which are crucial for genome-wide association studies. In this paper, we introduce SNP2Vec, a scalable self-supervised pre-training approach for understanding SNPs. We apply SNP2Vec to long-sequence genomics modeling, and we evaluate the effectiveness of our approach on predicting Alzheimer’s disease risk in a Chinese cohort. Our approach significantly outperforms existing polygenic risk score methods and all other baselines, including a model trained entirely on haploid sequences.
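For context on the baseline mentioned above: a classical polygenic risk score (PRS) is a weighted sum of an individual's effect-allele dosages, weighted by per-SNP effect sizes estimated from a GWAS. The sketch below is purely illustrative (the variable names and numbers are hypothetical, not from the paper):

```python
# Minimal illustrative polygenic risk score (PRS), the classical baseline
# the abstract compares against. Data and names here are hypothetical.

def polygenic_risk_score(dosages, effect_sizes):
    """Sum over SNPs of (allele dosage) x (effect size).

    dosages: copies of the effect allele per SNP (0, 1, or 2).
    effect_sizes: per-SNP weights, e.g. GWAS log odds ratios.
    """
    assert len(dosages) == len(effect_sizes)
    return sum(d * b for d, b in zip(dosages, effect_sizes))

# One individual genotyped at three SNPs:
score = polygenic_risk_score([2, 0, 1], [0.3, -0.1, 0.05])
print(score)  # 2*0.3 + 0*(-0.1) + 1*0.05 = 0.65
```

Unlike such a fixed linear score, SNP2Vec learns representations of SNP-bearing sequences via self-supervised pre-training, which is what the abstract evaluates against PRS methods.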