Isabella Olariu


2023

Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain
Isabella Olariu | Cedric Lothritz | Jacques Klein | Tegawendé Bissyandé | Siwen Guo | Shohreh Haddadan
Findings of the Association for Computational Linguistics: EMNLP 2023

Large-scale language models with millions, billions, or trillions of trainable parameters are becoming increasingly popular. However, they risk becoming rapidly over-parameterized, and the cost of fully fine-tuning them increases significantly. Storing them becomes progressively impractical, as it requires keeping a separate copy of all the fine-tuned weights for each task. By freezing all pre-trained weights during fine-tuning, parameter-efficient tuning approaches have become an appealing alternative to traditional fine-tuning. The performance of these approaches has been evaluated on common NLP tasks of the GLUE benchmark and shown to match full fine-tuning performance; however, their impact is less well studied in domain-specific fields such as finance. This work compares the performance of a set of financial BERT-like models to their fully fine-tuned counterparts by leveraging different parameter-efficient tuning methods. We find that results are comparable to traditional fine-tuning while gaining in time and resource efficiency.
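
The abstract hinges on one mechanism: freezing every pre-trained weight and training only a small set of added parameters. The sketch below illustrates that general idea in PyTorch with a bottleneck adapter on top of a BERT encoder. It is not the paper's setup; the model name ("bert-base-uncased"), the AdapterClassifier class, and all hyperparameters are assumptions chosen for the example.

    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class AdapterClassifier(nn.Module):
        # Frozen BERT backbone plus a small trainable bottleneck adapter and head.
        def __init__(self, backbone_name="bert-base-uncased", num_labels=2, bottleneck=64):
            super().__init__()
            self.backbone = AutoModel.from_pretrained(backbone_name)
            for param in self.backbone.parameters():
                param.requires_grad = False  # freeze all pre-trained weights
            hidden = self.backbone.config.hidden_size
            self.adapter = nn.Sequential(    # down-project, non-linearity, up-project
                nn.Linear(hidden, bottleneck),
                nn.ReLU(),
                nn.Linear(bottleneck, hidden),
            )
            self.classifier = nn.Linear(hidden, num_labels)

        def forward(self, input_ids, attention_mask):
            out = self.backbone(input_ids=input_ids, attention_mask=attention_mask)
            cls = out.last_hidden_state[:, 0]                 # [CLS] token representation
            return self.classifier(cls + self.adapter(cls))   # residual adapter connection

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AdapterClassifier()
    batch = tokenizer(["Quarterly revenue beat analyst expectations."], return_tensors="pt")
    with torch.no_grad():
        logits = model(batch["input_ids"], batch["attention_mask"])

    # Only the adapter and classifier contribute trainable parameters.
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"trainable parameters: {trainable:,} / {total:,}")

Parameter-efficient methods such as adapter tuning or LoRA typically insert such modules inside every transformer layer rather than only on the final representation; this sketch keeps a single adapter for brevity.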

Comparing Pre-Training Schemes for Luxembourgish BERT Models
Cedric Lothritz | Saad Ezzini | Christoph Purschke | Tegawendé Bissyandé | Jacques Klein | Isabella Olariu | Andrey Boytsov | Clément LeFebvre | Anne Goujon
Proceedings of the 19th Conference on Natural Language Processing (KONVENS 2023)

Evaluating Data Augmentation Techniques for the Training of Luxembourgish Language Models
Isabella Olariu | Cedric Lothritz | Tegawendé Bissyandé | Jacques Klein
Proceedings of the 19th Conference on Natural Language Processing (KONVENS 2023)