Nick Pogrebnyakov


Predicting the Success of Domain Adaptation in Text Similarity
Nick Pogrebnyakov | Shohreh Shaghaghian
Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)

Transfer learning methods, and in particular domain adaptation, help exploit labeled data in one domain to improve the performance of a task in another domain. However, it is still unclear which factors affect the success of domain adaptation. This paper models adaptation success and the selection of the most suitable source domain among several candidates for the text similarity task. We use descriptive domain information and cross-domain similarity metrics as predictive features. While the results are mostly positive, they also point to some domains for which adaptation success was difficult to predict.
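As a rough illustration of the approach this abstract describes (not the authors' code), the sketch below computes two plausible cross-domain similarity metrics, Jensen-Shannon divergence between unigram distributions and vocabulary overlap, that could serve as predictive features for adaptation success. The specific metric choices and function names are illustrative assumptions.

```python
# Hedged sketch: cross-domain similarity metrics as candidate features
# for predicting domain adaptation success. The choice of metrics
# (JS divergence, vocabulary overlap) is an assumption for illustration.
import math
from collections import Counter

def unigram_dist(tokens):
    """Normalized unigram distribution over a token list."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2): 0 for identical
    distributions, 1 for distributions with disjoint support."""
    vocab = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0.0) + q.get(w, 0.0)) for w in vocab}
    def kl(a):
        return sum(a[w] * math.log2(a[w] / m[w]) for w in a if a[w] > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

def vocab_overlap(p, q):
    """Jaccard overlap between the two domains' vocabularies."""
    return len(set(p) & set(q)) / len(set(p) | set(q))

def source_features(source_tokens, target_tokens):
    """Feature vector describing how close a candidate source
    domain is to the target domain."""
    p, q = unigram_dist(source_tokens), unigram_dist(target_tokens)
    return {"js_divergence": js_divergence(p, q),
            "vocab_overlap": vocab_overlap(p, q)}
```

Features like these, possibly alongside descriptive domain information, would then feed a model that predicts the adaptation gain and ranks candidate source domains.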

Active Curriculum Learning
Borna Jafarpour | Dawn Sepehr | Nick Pogrebnyakov
Proceedings of the First Workshop on Interactive Learning for Natural Language Processing

This paper investigates and reveals the relationship between two closely related machine learning disciplines, namely Active Learning (AL) and Curriculum Learning (CL), through the lens of several novel curricula. It also introduces Active Curriculum Learning (ACL), which improves AL by combining it with CL to benefit both from the dynamic nature of AL's informativeness measure and from the human insight embedded in curriculum heuristics. A comparison of ACL and AL on two public datasets for the Named Entity Recognition (NER) task shows the effectiveness of combining AL and CL in our proposed framework.
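One plausible reading of the ACL idea in this abstract can be sketched as follows (a hedged illustration, not the paper's exact method): rank unlabeled examples by an AL informativeness score (here, predictive entropy) while restricting the pool to examples whose curriculum difficulty (here, a sentence-length heuristic) fits the current stage of training. Both scoring choices are assumptions made for illustration.

```python
# Hedged sketch of combining AL informativeness with a curriculum
# heuristic. Entropy-based uncertainty and length-based difficulty are
# illustrative assumptions, not the paper's specific curricula.
import math

def entropy(probs):
    """AL informativeness: entropy of the model's predictive distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def difficulty(sentence):
    """Curriculum heuristic (assumed): longer sentences are harder."""
    return len(sentence.split())

def acl_rank(pool, model_probs, epoch, max_epochs):
    """Rank unlabeled sentences for annotation: most informative first,
    considering only sentences within the current difficulty budget."""
    # The difficulty budget grows linearly as training progresses.
    budget = max(difficulty(s) for s in pool)
    allowed = budget * (epoch + 1) / max_epochs
    scored = [(entropy(probs), s)
              for s, probs in zip(pool, model_probs)
              if difficulty(s) <= allowed]
    return [s for _, s in sorted(scored, reverse=True)]
```

Early in training only easy, highly uncertain examples are selected; later, the whole pool competes on informativeness alone, which is one way AL's dynamic scoring and a static curriculum can be combined.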