SubmissionNumber#=%=#222
FinalPaperTitle#=%=#Calc-CMU at SemEval-2024 Task 7: Pre-Calc - Learning to Use the Calculator Improves Numeracy in Language Models
ShortPaperTitle#=%=#
NumberOfPages#=%=#8
CopyrightSigned#=%=#Vishruth Veerendranath
JobTitle#==#
Organization#==#Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA
Abstract#==#Quantitative and numerical comprehension in language is important in many fields such as education and finance, but remains challenging for language models. While tool and calculator use has been shown to improve mathematical reasoning in large pretrained decoder-only language models, it remains unexplored for smaller language models with encoders. In this paper, we propose Pre-Calc, a simple pre-finetuning objective of learning to use the calculator for both encoder-only and encoder-decoder architectures, formulated as a discriminative and a generative task, respectively. We pre-train BERT and RoBERTa for discriminative calculator use and Flan-T5 for generative calculator use on the MAWPS, SVAMP, and AsDiv-A datasets, which improves performance on downstream tasks that require numerical understanding. Our code and data are available at https://github.com/calc-cmu/pre-calc.
Author{1}{Firstname}#=%=#Vishruth
Author{1}{Lastname}#=%=#Veerendranath
Author{1}{Username}#=%=#vishruthnath
Author{1}{Email}#=%=#vveerend@andrew.cmu.edu
Author{1}{Affiliation}#=%=#Carnegie Mellon University
Author{2}{Firstname}#=%=#Vishwa
Author{2}{Lastname}#=%=#Shah
Author{2}{Username}#=%=#vishwa_shah_27
Author{2}{Email}#=%=#vishwavs@andrew.cmu.edu
Author{2}{Affiliation}#=%=#Carnegie Mellon University
Author{3}{Firstname}#=%=#Kshitish
Author{3}{Lastname}#=%=#Ghate
Author{3}{Username}#=%=#kghate
Author{3}{Email}#=%=#kghate@andrew.cmu.edu
Author{3}{Affiliation}#=%=#Carnegie Mellon University
==========