The mechanism by which humans and animals store retrievable information over lifelong time scales has been a key and hitherto unresolved problem in neuroscience. Most current models of long-term memory exhibit “catastrophic forgetting,” as new memories overwrite old ones. The consolidation of memories (in part through the replay of old memories) is well documented. However, no existing neuronal circuit model shows how consolidation leads to lifelong learning of memories at the capacity scale expected in humans and animals. Our article in Scientific Reports introduces a recurrent neural network model that enables lifelong memory.
Our model can store a continuous stream of memories by Hebbian learning with synaptic weight decay (synaptic recycling). Consolidation is modeled as a regenerative stochastic process, whereby memories are revisited at random, with a probability that increases with the effect the memory has on network connectivity. This mechanism gives rise to realistic, gracefully decaying forgetting curves and memory lifetimes that are orders of magnitude longer than the characteristic synaptic recycling time scale. Our article includes analyses of the memory capabilities and intrinsic network properties, with novel outcomes such as a power-law relation between memory capacity and the number of neurons in the circuit. Finally, by perturbing our model, we study effects similar to known human memory deficits.
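To make the interplay of Hebbian imprinting, synaptic decay, and stochastic replay concrete, here is a minimal Python sketch. It is not the model from the paper: the network size, learning rate, decay constant, and the mapping from connectivity overlap to replay probability are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of neurons (illustrative)
eta = 0.05     # Hebbian learning rate (illustrative)
decay = 0.99   # multiplicative weight decay per step ("synaptic recycling")
T = 100        # number of incoming memories

W = np.zeros((N, N))   # recurrent weight matrix
memories = []          # binary (+1/-1) patterns stored so far


def hebbian_imprint(W, pattern, eta):
    """Add a Hebbian outer-product term for one pattern (no self-connections)."""
    dW = eta * np.outer(pattern, pattern)
    np.fill_diagonal(dW, 0.0)
    return W + dW


for t in range(T):
    # Synaptic recycling: every weight decays toward zero at each step.
    W *= decay

    # A new memory arrives in the stream and is imprinted by Hebbian learning.
    new_pattern = rng.choice([-1.0, 1.0], size=N)
    memories.append(new_pattern)
    W = hebbian_imprint(W, new_pattern, eta)

    # Consolidation as a stochastic replay process: each older memory is
    # revisited with a probability that grows with its current imprint on
    # the connectivity (overlap of the pattern with the weight matrix).
    for p in memories[:-1]:
        imprint = np.einsum('i,ij,j->', p, W, p) / N**2  # connectivity overlap
        replay_prob = float(np.clip(imprint, 0.0, 1.0))  # heuristic mapping
        if rng.random() < replay_prob:
            W = hebbian_imprint(W, p, eta)               # replay re-imprints it
```

In this sketch, memories whose trace in the weight matrix is still strong are replayed more often and thereby re-imprinted, which counteracts the decay and extends their lifetime well beyond the recycling time scale, while weakly imprinted memories fade gracefully.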