Catastrophic forgetting is the phenomenon where a neural network's performance on a previously learned task degrades dramatically after it is trained on a new, distinct task. This occurs because standard gradient-based learning updates all of the network's connection weights to minimize loss on the new data, which inadvertently overwrites the representations crucial for the old task. It is a primary obstacle in continual learning and lifelong learning systems, which aim to accumulate knowledge sequentially like biological brains.
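The mechanism can be illustrated with a deliberately minimal sketch (not any particular continual-learning method): a single linear weight is fitted by gradient descent on task A, then on task B, with nothing constraining the weight to stay near its task-A solution. The task names, targets, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Hypothetical minimal demo of catastrophic forgetting:
# one linear weight w, trained with plain SGD on task A (target slope 2.0),
# then on task B (target slope -1.0). Nothing preserves task-A knowledge.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)

def train(w, target_slope, steps=200, lr=0.1):
    # Gradient descent on mean squared error for y = target_slope * x.
    # The same weight is overwritten regardless of what it encoded before.
    for _ in range(steps):
        grad = np.mean(2 * (w * x - target_slope * x) * x)
        w -= lr * grad
    return w

def task_a_loss(w):
    # Mean squared error of the current weight on task A's targets.
    return np.mean((w * x - 2.0 * x) ** 2)

w = 0.0
w = train(w, 2.0)              # learn task A
loss_after_a = task_a_loss(w)  # task A is learned well
w = train(w, -1.0)             # learn task B with the same weight
loss_after_b = task_a_loss(w)  # task A performance has collapsed

print(loss_after_a, loss_after_b)
```

After the second training phase the weight has moved to task B's solution, so the task-A error, which was near zero, becomes large. Continual-learning methods such as elastic weight consolidation or replay buffers exist precisely to prevent this overwrite.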
