
Does the Adam Optimizer Amplify Catastrophic Forgetting?
via Hackernoon
Catastrophic forgetting in neural networks is not just a model problem: it is heavily shaped by how we train and how we measure. This study shows that optimizer choice, particularly between SGD and Adam, significantly affects forgetting, with simpler methods such as SGD often performing better. It also finds that commonly used metrics can yield contradictory conclusions about the same runs, suggesting that current evaluation practice is unreliable. The takeaway: understanding and mitigating forgetting requires more rigorous, multi-metric evaluation frameworks.
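The point that different metrics can flip a comparison is easy to illustrate. Below is a minimal pure-Python sketch, using made-up accuracy numbers that are not from the study, of two common continual-learning metrics: average final accuracy and average forgetting, both computed from an accuracy matrix `R[i][j]` (accuracy on task `j` after sequentially training through task `i`). With these illustrative numbers, one metric favors "Adam" while the other favors "SGD".

```python
# Hypothetical accuracy matrices R[i][j]: accuracy on task j after the model
# has been trained sequentially through task i. Illustrative numbers only,
# not results from the study.
R_sgd  = [[0.90, 0.00],
          [0.70, 0.85]]
R_adam = [[0.92, 0.00],
          [0.60, 0.99]]

def avg_final_accuracy(R):
    """Mean accuracy over all tasks after the last task is learned."""
    return sum(R[-1]) / len(R[-1])

def avg_forgetting(R):
    """Average drop from each earlier task's best-ever accuracy to its final accuracy."""
    last = R[-1]
    drops = [max(R[i][j] for i in range(len(R))) - last[j]
             for j in range(len(R) - 1)]
    return sum(drops) / len(drops)

# The two metrics rank the optimizers differently on the same runs:
print(avg_final_accuracy(R_adam) > avg_final_accuracy(R_sgd))  # Adam "wins" on accuracy
print(avg_forgetting(R_adam) > avg_forgetting(R_sgd))          # yet Adam forgets more
```

Which optimizer "amplifies forgetting" here depends entirely on which metric you report, which is exactly the evaluation pitfall the article describes.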
Continue reading on Hackernoon


