
How-To · Machine Learning
Why Adam May Be Hurting Your Neural Network’s Memory
via Hackernoon · Adam Optimizer
This study shows that optimizer choice has a major impact on catastrophic forgetting in neural networks, with SGD consistently outperforming Adam and RMSProp. While hyperparameters influence the outcome, the choice of optimizer itself plays the larger role. The findings also reveal that commonly used metrics such as activation overlap may not reliably explain forgetting, highlighting the need for multi-metric evaluation, especially retention and relearning, in continual learning systems.
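As a rough illustration of the reported effect (not the study's actual protocol), the sketch below trains a toy classifier sequentially on two synthetic tasks and compares how much task-A accuracy survives after training on task B under SGD versus Adam. The data, model, and hyperparameters here are all illustrative assumptions.

```python
# Minimal sketch, assuming synthetic two-task continual learning:
# measure retention of task A after sequential training on task B,
# comparing SGD and Adam. Tasks, model, and learning rates are
# illustrative choices, not the study's setup.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Two-class problem on 20 features; `shift` changes the decision
    # boundary so each task is different.
    x = torch.randn(512, 20)
    y = (x[:, 0] + shift * x[:, 1] > 0).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

def train(model, opt, x, y, steps=200):
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

task_a = make_task(shift=1.0)
task_b = make_task(shift=-1.0)

for name, opt_cls, kwargs in [
    ("SGD",  torch.optim.SGD,  {"lr": 0.1}),
    ("Adam", torch.optim.Adam, {"lr": 1e-3}),
]:
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
    opt = opt_cls(model.parameters(), **kwargs)
    train(model, opt, *task_a)
    acc_a_before = accuracy(model, *task_a)
    train(model, opt, *task_b)  # sequential training induces forgetting
    acc_a_after = accuracy(model, *task_a)
    # Retention: how much task-A performance survives task-B training.
    print(f"{name}: task A accuracy {acc_a_before:.2f} -> {acc_a_after:.2f}")
```

The drop from the "before" to the "after" task-A accuracy is the retention measure the summary refers to; a relearning measure would additionally time how quickly task-A performance recovers when training on it resumes.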
Continue reading on Hackernoon


