Measuring Catastrophic Forgetting in AI

via Hackernoon, by Adam Optimizer

Catastrophic forgetting occurs when neural networks lose previously learned knowledge while acquiring new tasks. This section outlines three core ways to measure it: retention (performance drop on old tasks), relearning (speed of reacquisition), and activation overlap (shared internal representations). Together, these metrics provide complementary views into how and why AI systems forget, bridging insights from psychology and modern machine learning.
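The three metrics above can be sketched as simple formulas. This is a minimal illustration, not code from the article: the function names, inputs, and the Jaccard-style definition of activation overlap are assumptions made for clarity.

```python
# Hedged sketch of the three forgetting metrics described above.
# All names and thresholds here are illustrative assumptions.

def retention_drop(acc_old_before: float, acc_old_after: float) -> float:
    """Performance lost on the old task after training on the new one."""
    return acc_old_before - acc_old_after

def relearning_savings(epochs_original: int, epochs_relearn: int) -> float:
    """Savings-style measure of relearning speed:
    1.0 = old task relearned instantly, 0.0 = no faster than from scratch."""
    return (epochs_original - epochs_relearn) / epochs_original

def activation_overlap(act_a: list[float], act_b: list[float],
                       thresh: float = 0.0) -> float:
    """Fraction of hidden units active (above thresh) on both tasks,
    relative to units active on either task (Jaccard overlap)."""
    on_a = {i for i, v in enumerate(act_a) if v > thresh}
    on_b = {i for i, v in enumerate(act_b) if v > thresh}
    union = on_a | on_b
    return len(on_a & on_b) / len(union) if union else 0.0

# Example: accuracy on task A falls from 0.92 to 0.55 after training on
# task B, but task A is relearned in 3 epochs instead of the original 20.
print(retention_drop(0.92, 0.55))   # ~0.37: large drop, heavy forgetting
print(relearning_savings(20, 3))    # 0.85: fast reacquisition, knowledge retained
print(activation_overlap([1.2, 0.0, 0.4], [0.9, 0.3, 0.0]))  # ~0.33 shared units
```

High retention drop combined with high relearning savings suggests knowledge is suppressed rather than erased, while high activation overlap between tasks is one candidate explanation for why the new task interferes with the old one.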

Continue reading on Hackernoon

