Mutual information of population codes and distance measures in probability space

We studied the mutual information between a stimulus and a system consisting of stochastic, statistically independent elements that respond to the stimulus. Using statistical mechanical methods, we calculate the properties of the mutual information (MI) in the limit of large system size N. For continuous-valued stimuli, the MI increases logarithmically with N and is related to the log of the Fisher information of the system. For discrete stimuli, the MI saturates exponentially with N. We find that the exponent of saturation of the MI is the Chernoff distance between the response probabilities induced by different stimuli.
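The Chernoff distance mentioned in the abstract, C(p, q) = -min over 0 ≤ λ ≤ 1 of log Σ_x p(x)^λ q(x)^(1-λ), can be illustrated with a short sketch. The code below is an assumption-laden illustration, not the authors' method: it uses a simple grid search over λ and two hypothetical response distributions.

```python
import math

def chernoff_distance(p, q, grid=1000):
    """Chernoff distance between two discrete distributions p and q:
    C(p, q) = -min_{0 <= lam <= 1} log( sum_x p(x)^lam * q(x)^(1-lam) ).
    Computed here by a plain grid search over lam (a sketch, not optimized)."""
    best = float("inf")
    for i in range(grid + 1):
        lam = i / grid
        s = sum((pi ** lam) * (qi ** (1 - lam)) for pi, qi in zip(p, q))
        best = min(best, s)
    return -math.log(best)

# Hypothetical response distributions induced by two discrete stimuli.
p = [0.7, 0.2, 0.1]
q = [0.2, 0.5, 0.3]
C = chernoff_distance(p, q)

# For N independent elements, the gap between the MI and its ceiling
# shrinks on the order of exp(-N * C), i.e. the MI saturates
# exponentially in N with exponent C, as stated in the abstract.
print(C)
```

The Chernoff distance is symmetric in p and q and vanishes only when the two distributions coincide, which is why it serves as the saturation exponent regardless of which stimulus is taken as reference.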

Authors: Kang K, Sompolinsky H.
Year of publication: 2001
Journal: Phys Rev Lett. 2001 May 21;86(21):4958-61

Link to publication:
