SubmissionNumber#=%=#183
FinalPaperTitle#=%=#silp_nlp at SemEval-2024 Task 1: Cross-lingual Knowledge Transfer for Mono-lingual Learning
ShortPaperTitle#=%=#
NumberOfPages#=%=#7
CopyrightSigned#=%=#Sumit Singh
JobTitle#==#
Organization#==#Indian Institute of Information Technology, Allahabad
Abstract#==#Our team, silp_nlp, participated in all three tracks of SemEval-2024 Task 1: Semantic Textual Relatedness (STR). We built systems for 29 subtasks in total: nine for track A, ten for track B, and ten for track C. To make the most of the knowledge shared across subtasks, we used transformer-based pre-trained models, which are known for their strong cross-lingual transferability. For track A, we trained our model in two stages: in the first stage, we performed multi-lingual learning on data from all tracks; in the second stage, we fine-tuned the model for individual tracks. For track B, we used unigram and bigram representations with support vector regression (SVR) and eXtreme Gradient Boosting (XGBoost) regression. For track C, we again exploited cross-lingual transferability, without using any data from the targeted subtask. Our work highlights that knowledge gained from all subtasks can be transferred to an individual subtask when the base language model has strong cross-lingual characteristics. Our system ranked first in the Indonesian subtask of track B (C7) and in the top three for four other subtasks.
Author{1}{Firstname}#=%=#Sumit
Author{1}{Lastname}#=%=#Singh
Author{1}{Username}#=%=#sumitrsch
Author{1}{Email}#=%=#sumitrsch@gmail.com
Author{1}{Affiliation}#=%=#IIIT Allahabad
Author{2}{Firstname}#=%=#Pankaj Kumar
Author{2}{Lastname}#=%=#Goyal
Author{2}{Email}#=%=#pankajgoyal02003@gmail.com
Author{2}{Affiliation}#=%=#IIIT Allahabad
Author{3}{Firstname}#=%=#Uma Shanker
Author{3}{Lastname}#=%=#Tiwary
Author{3}{Email}#=%=#ust@iiita.ac.in
Author{3}{Affiliation}#=%=#IIIT Allahabad