Mitigating Forgetting in Continual Learning with Selective Gradient Projection

Anika Singh, David Martinez, Aayush Dhaulakhandi, Varun Chopade, Likhith Malipati, Vasu Sharma, Kevin Zhu, Sunishchal Dev, Ryan Lagasse

2025
The 14th International Joint Conference on Natural Language Processing and The 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics
As neural networks are increasingly deployed in dynamic environments, they face catastrophic forgetting: the tendency to overwrite previously learned knowledge when adapting to new tasks, which severely degrades performance on earlier tasks. We propose Selective Forgetting-Aware Optimization (SFAO), a dynamic method that regulates gradient directions via cosine similarity and per-layer gating, enabling controlled forgetting while balancing plasticity and stability. SFAO selectively projects, accepts, or discards updates using a tunable mechanism with an efficient Monte Carlo approximation. Experiments on standard continual learning benchmarks show that SFAO achieves competitive accuracy with markedly lower memory cost (a 90% reduction) and reduced forgetting on MNIST-based benchmarks, making it suitable for resource-constrained scenarios.
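To make the abstract's gating idea concrete, below is a minimal sketch of a per-layer rule that compares the current-task gradient with a stored reference gradient by cosine similarity and then accepts, projects, or discards the update. The thresholds, the source of the reference gradients, and the projection formula are illustrative assumptions, not the paper's exact SFAO procedure.

# Hypothetical per-layer gradient gating inspired by the abstract's description.
# ref_grads maps parameter names to reference gradients from earlier tasks
# (how these are obtained is an assumption here).
import torch

def gate_gradients(model, ref_grads, accept_thresh=0.0, discard_thresh=-0.5):
    """Modify each parameter's .grad in place based on its cosine similarity
    with a stored reference gradient for that layer (illustrative rule)."""
    for name, param in model.named_parameters():
        if param.grad is None or name not in ref_grads:
            continue
        g = param.grad.flatten()
        r = ref_grads[name].flatten()
        cos = torch.nn.functional.cosine_similarity(g, r, dim=0)
        if cos >= accept_thresh:
            # Sufficiently aligned with past-task gradients: accept as-is.
            continue
        elif cos >= discard_thresh:
            # Mildly conflicting: project out the component along the
            # reference direction so the update does not oppose old tasks.
            proj = (g @ r) / (r @ r + 1e-12) * r
            param.grad.copy_((g - proj).view_as(param.grad))
        else:
            # Strongly conflicting: discard the update for this layer.
            param.grad.zero_()

In use, such a hook would be called after loss.backward() and before optimizer.step(); the two thresholds play the role of the tunable mechanism that trades plasticity against stability.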