Abstract
In recent years, Large Language Models such as GPT-3 have shown remarkable capabilities in performing NLP tasks in zero- and few-shot settings. At the same time, experiments have highlighted the difficulty GPT-3 has in carrying out tasks that require a degree of reasoning, such as arithmetic operations. In this paper we evaluate the ability of Transformer Language Models to perform arithmetic operations following a pipeline that, before performing computations, decomposes numbers into units, tens, and so on. We refer to the models fine-tuned with this pipeline as Calculon and test them on addition, subtraction, and multiplication using the same test sets as GPT-3. Results show a 63% accuracy increase on the five-digit addition task. Moreover, we demonstrate the importance of the decomposition pipeline: fine-tuning the same Language Model without decomposing numbers results in 0% accuracy on the five-digit addition task.
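As a rough illustration of the decomposition idea, here is a minimal Python sketch: the exact textual format used to fine-tune Calculon is defined in the paper and the linked code, so the digit-name scheme and the helpers `decompose` and `make_prompt` below are assumptions made for illustration only.

```python
def decompose(n: int) -> str:
    """Spell out a non-negative integer digit by digit, least significant
    first, e.g. 1234 -> "4 units, 3 tens, 2 hundreds, 1 thousands".
    (Hypothetical naming; the paper defines its own decomposition format.)"""
    names = ["units", "tens", "hundreds", "thousands",
             "ten thousands", "hundred thousands"]
    digits = [int(d) for d in str(n)][::-1]  # least-significant digit first
    return ", ".join(f"{d} {names[i]}" for i, d in enumerate(digits))


def make_prompt(a: int, b: int, op: str = "+") -> str:
    """Build a hypothetical prompt in which both operands are decomposed
    before the model is asked to complete the result."""
    return f"{decompose(a)} {op} {decompose(b)} ="


print(make_prompt(41253, 7089))
# 3 units, 5 tens, 2 hundreds, 1 thousands, 4 ten thousands + 9 units, 8 tens, 0 hundreds, 7 thousands =
```

The intuition behind such a representation is that it makes each digit's place value explicit in the token sequence, rather than leaving it implicit in the digit's position within a number string.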
- Anthology ID: 2022.lrec-1.30
- Volume: Proceedings of the Thirteenth Language Resources and Evaluation Conference
- Month: June
- Year: 2022
- Address: Marseille, France
- Editors: Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
- Venue: LREC
- Publisher: European Language Resources Association
- Pages: 291–297
- URL: https://aclanthology.org/2022.lrec-1.30
- Cite (ACL): Matteo Muffo, Aldo Cocco, and Enrico Bertino. 2022. Evaluating Transformer Language Models on Arithmetic Operations Using Number Decomposition. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 291–297, Marseille, France. European Language Resources Association.
- Cite (Informal): Evaluating Transformer Language Models on Arithmetic Operations Using Number Decomposition (Muffo et al., LREC 2022)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2022.lrec-1.30.pdf
- Code: mmuffo94/TransformerLM_arithmetics