Hao Chen

HKUST


2025

Recent advances in language models have significantly catalyzed progress in computational biology. A growing body of research strives to construct unified foundation models for single-cell biology, with language models serving as the cornerstone. In this paper, we systematically review the development of foundation language models designed specifically for single-cell biology. Our survey offers a thorough analysis of the various incarnations of single-cell foundation language models, viewed through the lens of both pre-trained language models (PLMs) and large language models (LLMs). This includes an exploration of data tokenization strategies, pre-training and fine-tuning paradigms, and downstream single-cell data analysis tasks. Additionally, we discuss the challenges currently faced by these pioneering works and speculate on future research directions. Overall, this survey provides a comprehensive overview of existing single-cell foundation language models, paving the way for future research endeavors.
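To make the notion of data tokenization concrete: one widely used strategy (rank-value encoding, employed by models such as Geneformer) converts a cell's gene expression profile into a token sequence by ordering genes by descending expression. The sketch below is a minimal, simplified illustration of this idea; the function name, gene names, and counts are made up for demonstration and do not come from the surveyed paper.

```python
# Minimal sketch of rank-value tokenization for a single cell.
# Assumption: `vocab` maps each gene symbol to an integer token id,
# analogous to a word-to-id vocabulary in NLP.

def rank_tokenize(expression, vocab, max_len=None):
    """Order genes by descending expression, drop unexpressed genes,
    and map gene symbols to integer token ids via `vocab`."""
    expressed = [(g, x) for g, x in expression.items() if x > 0]
    # Sort by count (highest first); break ties alphabetically
    expressed.sort(key=lambda gx: (-gx[1], gx[0]))
    tokens = [vocab[g] for g, _ in expressed]
    return tokens[:max_len] if max_len is not None else tokens

# Hypothetical expression counts for one cell
cell = {"GAPDH": 120, "CD3E": 35, "ACTB": 120, "MS4A1": 0, "NKG7": 7}
vocab = {g: i for i, g in enumerate(sorted(cell))}
print(rank_tokenize(cell, vocab))  # token ids ordered by expression rank
```

The resulting sequence can then be fed to a transformer exactly like a sentence of word tokens, which is what lets the PLM/LLM machinery discussed in the survey transfer to single-cell data. Other surveyed models instead bin continuous expression values into discrete value tokens paired with gene tokens.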