Neural noise limits the fidelity of representations in the brain. This limitation has been analyzed extensively for sensory coding. In short-term memory and integrator networks, however, where noise accumulates and can play an even more prominent role, much less is known about how neural noise interacts with neuronal and network parameters to determine the accuracy of the computation. Here we analytically derive how the stored memory in continuous attractor networks of probabilistically spiking neurons degrades over time through diffusion. By combining statistical and dynamical approaches, we establish a fundamental limit on the network’s ability to maintain a persistent state: The noise-induced drift of the memory state over time is strictly lower-bounded by the accuracy with which an ideal external observer can estimate the network’s instantaneous memory state. This result takes the form of an information-diffusion inequality. We derive some unexpected consequences: Despite the long persistence times of short-term memory networks, it does not pay to accumulate spikes for longer than the cellular time constant when reading out their contents. For certain neural transfer functions, the conditions for optimal sensory coding coincide with those for optimal storage, implying that short-term memory may be co-localized with sensory representation.
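
As an illustrative sketch only (the symbols D for the diffusion coefficient of the memory state, T for the readout window, and J(T) for the Fisher information an ideal observer accumulates over that window are notation introduced here for illustration, not the paper's exact statement), an inequality of this kind can be motivated by combining the Cramér-Rao bound with the linear growth of diffusive variance:
\[
\underbrace{2\,D\,T}_{\text{variance added by drift over } T}
\;\gtrsim\;
\underbrace{\frac{1}{J(T)}}_{\text{Cram\'er-Rao limit on readout error}}
\quad\Longrightarrow\quad
D \;\gtrsim\; \frac{1}{2\,T\,J(T)} .
\]
Read this way, the drift of the stored memory cannot be made arbitrarily small without the spike trains simultaneously becoming less informative to any downstream decoder, which is the tension the result above summarizes.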