Class Balancing for Efficient Active Learning in Imbalanced Datasets

Yaron Fairstein, Oren Kalinsky, Zohar Karnin, Guy Kushilevitz, Alexander Libov, Sofia Tolmach


Abstract
Recent developments in active learning algorithms for NLP tasks show promising results in terms of reducing labelling complexity. In this paper we extend this effort to imbalanced datasets; we bridge between the active learning approach of obtaining diverse and informative examples and the heuristic of class balancing used for imbalanced datasets. We develop a novel tune-free weighting technique that can be applied to various existing active learning algorithms, adding a component of class balancing. We compare several active learning algorithms to their modified versions on multiple public datasets and show that when the classes are imbalanced, with equal manual annotation effort, the modified version significantly outperforms the original both in terms of the test metric and the number of minority examples obtained. Moreover, when the imbalance is mild or non-existent (classes are completely balanced), our technique does not harm the base algorithms.
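To make the idea concrete, below is a minimal, hypothetical sketch of class-balanced weighting applied to an acquisition function. It is not the paper's exact formula (the abstract does not specify one); the function and parameter names are illustrative assumptions, showing only the general pattern of boosting candidates from under-represented classes while keeping any base active learning score.

```python
import numpy as np

def class_balanced_scores(base_scores, predicted_classes, labeled_class_counts):
    """Illustrative sketch (not the authors' published method): reweight
    acquisition scores from any active learning strategy so that examples
    predicted to belong to under-represented classes are favored.

    base_scores          -- informativeness scores from a base AL strategy
    predicted_classes    -- model's predicted class index per unlabeled example
    labeled_class_counts -- number of labeled examples currently in each class
    """
    counts = np.asarray(labeled_class_counts, dtype=float)
    # Inverse-frequency weights: classes with few labeled examples get larger
    # weights (the +1 avoids division by zero for classes not yet labeled).
    weights = counts.sum() / (counts + 1.0)
    # Normalize so the average weight is roughly 1, keeping score scale stable.
    weights = weights / weights.sum() * len(weights)
    return np.asarray(base_scores) * weights[np.asarray(predicted_classes)]

# Example: uncertainty scores for 4 unlabeled examples with 2 classes,
# where class 1 is the minority (2 labeled examples vs. 10 for class 0).
scores = class_balanced_scores(
    base_scores=[0.9, 0.7, 0.6, 0.8],
    predicted_classes=[0, 1, 1, 0],
    labeled_class_counts=[10, 2],
)
print(scores)  # candidates predicted as the minority class are boosted
```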
Anthology ID:
2024.law-1.8
Volume:
Proceedings of The 18th Linguistic Annotation Workshop (LAW-XVIII)
Month:
March
Year:
2024
Address:
St. Julians, Malta
Editors:
Sophie Henning, Manfred Stede
Venues:
LAW | WS
Publisher:
Association for Computational Linguistics
Pages:
77–86
URL:
https://aclanthology.org/2024.law-1.8
Cite (ACL):
Yaron Fairstein, Oren Kalinsky, Zohar Karnin, Guy Kushilevitz, Alexander Libov, and Sofia Tolmach. 2024. Class Balancing for Efficient Active Learning in Imbalanced Datasets. In Proceedings of The 18th Linguistic Annotation Workshop (LAW-XVIII), pages 77–86, St. Julians, Malta. Association for Computational Linguistics.
Cite (Informal):
Class Balancing for Efficient Active Learning in Imbalanced Datasets (Fairstein et al., LAW-WS 2024)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2024.law-1.8.pdf