SubmissionNumber#=%=#12
FinalPaperTitle#=%=#Puer at SemEval-2024 Task 2: A BioLinkBERT Approach to Biomedical Natural Language Inference
ShortPaperTitle#=%=#
NumberOfPages#=%=#6
CopyrightSigned#=%=#Jiaxu Dao
JobTitle#==#
Organization#==#
Abstract#==#This paper describes our application of BioLinkBERT to natural language inference for clinical trial reports, presented at SemEval-2024 Task 2. Focusing on the biomedical NLI task, our approach fine-tunes the BioLinkBERT-large model with a mixed loss function that combines contrastive learning and cross-entropy loss. This method surpassed the established baseline, achieving an F1 score of 0.72. In addition, we conducted a comparative analysis of several deep learning architectures, including BERT, ALBERT, and XLM-RoBERTa, in the context of medical text mining. The results demonstrate the effectiveness of our method and point to directions for future research in biomedical data processing. Our source code is available on GitHub at: https://github.com/daojiaxu/semeval2024_task2.
Author{1}{Firstname}#=%=#Jiaxu
Author{1}{Lastname}#=%=#Dao
Author{1}{Email}#=%=#daojiaxu@peu.edu.cn
Author{1}{Affiliation}#=%=#Pu'er University
Author{2}{Firstname}#=%=#Zhuoying
Author{2}{Lastname}#=%=#Li
Author{2}{Email}#=%=#lizhuoying@peu.edu.cn
Author{2}{Affiliation}#=%=#Pu'er University
Author{3}{Firstname}#=%=#Xiuzhong
Author{3}{Lastname}#=%=#Tang
Author{3}{Email}#=%=#tangxiuzhong@peu.edu.cn
Author{3}{Affiliation}#=%=#Pu'er University
Author{4}{Firstname}#=%=#Xiaoli
Author{4}{Lastname}#=%=#Lan
Author{4}{Email}#=%=#lanxiaoli@peu.edu.cn
Author{4}{Affiliation}#=%=#Pu'er University
Author{5}{Firstname}#=%=#Junde
Author{5}{Lastname}#=%=#Wang
Author{5}{Email}#=%=#wangjunde@peu.edu.cn
Author{5}{Affiliation}#=%=#Pu'er University

==========
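
Note: the abstract above refers to a mixed loss that combines contrastive learning with cross-entropy. The following is a minimal illustrative sketch of such an objective, not the authors' released implementation; the PyTorch framework, the supervised-contrastive formulation, the weighting coefficient lam, and the temperature value are all assumptions introduced here for clarity.

# Illustrative sketch only: mixed objective = cross-entropy + weighted
# supervised contrastive term, as described in the abstract. The names
# mixed_loss, lam, and temperature are hypothetical; assumes batch size > 1.
import torch
import torch.nn.functional as F

def mixed_loss(logits, embeddings, labels, lam=0.5, temperature=0.1):
    # logits: [B, C] classifier outputs; embeddings: [B, D] pooled encoder
    # states (e.g. the [CLS] vector from BioLinkBERT-large); labels: [B].
    # Standard cross-entropy over the NLI labels.
    ce = F.cross_entropy(logits, labels)

    # Supervised contrastive term: pull same-label examples together and
    # push different-label examples apart in the embedding space.
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.T / temperature                        # pairwise similarities
    pos = labels.unsqueeze(0) == labels.unsqueeze(1)   # positives share a label
    pos.fill_diagonal_(False)                          # exclude self-pairs
    not_self = ~torch.eye(len(labels), dtype=torch.bool, device=z.device)
    # Log-probability of each pair against all other (non-self) pairs.
    denom = torch.logsumexp(sim.masked_fill(~not_self, float('-inf')),
                            dim=1, keepdim=True)
    log_prob = sim - denom
    pos_count = pos.sum(1).clamp(min=1)
    contrastive = (-(log_prob * pos).sum(1) / pos_count).mean()

    # Weighted combination of the two terms.
    return ce + lam * contrastive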