Study Finds Optimizer Choice Significantly Impacts Model Retention

via Hackernoon · Adam Optimizer

This work revisits catastrophic forgetting in machine learning, showing that optimizer choice—alongside dataset and evaluation metrics—plays a far more significant role than previously understood. By comparing modern gradient-based optimizers such as SGD, RMSProp, and Adam across supervised and reinforcement learning settings, the study shows that forgetting is not just a function of model architecture or data exposure, but also of how learning itself is optimized.
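The effect described above can be illustrated with a toy experiment: train a model on one task, then on a second, and measure how much performance on the first task degrades under different optimizers. The sketch below is hypothetical and not the study's actual protocol — the tasks, hyperparameters, and the hand-rolled SGD/Adam updates are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: compare "forgetting" on two toy linear-regression
# tasks under plain SGD vs an Adam-style update. Setup is illustrative only.
rng = np.random.default_rng(0)

def make_task(w_true):
    """Generate a linear-regression task with the given true weights."""
    X = rng.normal(size=(200, 2))
    return X, X @ w_true

task_a = make_task(np.array([1.0, -2.0]))
task_b = make_task(np.array([-3.0, 0.5]))

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def train(w, X, y, optimizer, steps=500, lr=0.01):
    m = np.zeros_like(w)  # Adam first-moment buffer
    v = np.zeros_like(w)  # Adam second-moment buffer
    for t in range(1, steps + 1):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        if optimizer == "sgd":
            w = w - lr * grad
        else:  # Adam-style update with standard beta1=0.9, beta2=0.999
            m = 0.9 * m + 0.1 * grad
            v = 0.999 * v + 0.001 * grad**2
            m_hat = m / (1 - 0.9**t)   # bias correction
            v_hat = v / (1 - 0.999**t)
            w = w - lr * m_hat / (np.sqrt(v_hat) + 1e-8)
    return w

for opt in ("sgd", "adam"):
    w = train(np.zeros(2), *task_a, opt)
    loss_a_before = mse(w, *task_a)
    w = train(w, *task_b, opt)  # sequential training on task B
    forgetting = mse(w, *task_a) - loss_a_before
    print(f"{opt}: forgetting on task A = {forgetting:.3f}")
```

Because both optimizers eventually fit task B, the loss on task A rises in each case; the point the study makes is that the *magnitude* of that rise depends systematically on the update rule, not only on the data or architecture.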

Continue reading on Hackernoon
