Publications

Neural networks with nonlinear synapses and a static noise

The theory of neural networks is extended to include static noise as well as nonlinear updating of the synapses by learning. The noise appears either in the form of spin-glass interactions, which are independent of the learning process, or as random decay of the synapses. In an unsaturated network, the nonlinear learning algorithms may modify the energy surface and lead to interesting new computational capabilities. Close to saturation, they act as an additional source of static noise. The effect of the noise on memory storage is calculated.
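
The model described in the abstract lends itself to a quick numerical illustration. Below is a minimal sketch (not the paper's analytical calculation), assuming a Hopfield-type network of ±1 units, sign-clipping as one standard example of a nonlinear synaptic update, and symmetric Gaussian couplings as the static spin-glass noise; the network size, pattern load, and noise amplitude are illustrative choices, not values from the paper.

```python
# Minimal sketch: Hopfield-style associative memory with clipped
# ("nonlinear") Hebbian synapses plus independent static spin-glass noise.
# All parameter values (N, P, noise_amp, flip fraction) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N, P = 500, 25          # neurons and stored patterns (illustrative)
noise_amp = 0.1         # scale of the static spin-glass noise (illustrative)

# Random +/-1 memory patterns.
xi = rng.choice([-1, 1], size=(P, N))

# Linear Hebbian couplings, then nonlinear clipping of each synapse.
J = np.sign(xi.T @ xi / N)

# Static noise: symmetric Gaussian couplings independent of the learning.
Z = rng.normal(size=(N, N))
J = J + noise_amp * (Z + Z.T) / np.sqrt(2 * N)
np.fill_diagonal(J, 0.0)

# Zero-temperature asynchronous dynamics, starting from a corrupted
# version of pattern 0 (15% of spins flipped).
s = xi[0] * np.where(rng.random(N) < 0.15, -1, 1)
for _ in range(20):                       # sweeps over the network
    for i in rng.permutation(N):
        s[i] = 1 if J[i] @ s >= 0 else -1

overlap = (s @ xi[0]) / N                 # 1.0 means perfect retrieval
print(f"retrieval overlap with pattern 0: {overlap:.3f}")
```

Raising noise_amp or the load P/N in this sketch degrades the retrieval overlap, consistent with the abstract's point that both noise sources act against memory storage near saturation.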

Authors: H. Sompolinsky
Year of publication: 1986
Journal: Phys. Rev. A 34, 2571(R) – Published 1 September 1986

Labs: “Working memory”