An oscillator neural network model that is capable of processing local and global attributes of sensory input is proposed and analyzed. Local features in the input are encoded in the average firing rate of the neurons, while the relationships between these features can modulate the temporal structure of the neuronal output. Neurons that share the same receptive field interact via relatively strong feedback connections, whereas neurons with different receptive fields interact via specific, relatively weak connections. The model is studied in the context of processing visual stimuli that are coded for orientation. The effect of axonal propagation delays on the synchronization of oscillatory activity is analyzed. We compare our theoretical results with recent experimental evidence on coherent oscillatory activity in the cat visual cortex. The computational capabilities of the model for performing discrimination and segmentation tasks are demonstrated. Coding and linking of visual features other than orientation are discussed.
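The interplay between coupling and axonal propagation delay summarized above can be illustrated with a minimal phase-oscillator (Kuramoto-type) simulation. This is a hedged toy sketch, not the paper's actual model (which encodes features in firing rates and uses structured intra- and inter-column connectivity): the oscillator count `n`, coupling strength `k`, intrinsic frequency `omega`, and delay value below are arbitrary assumptions chosen for illustration.

```python
import numpy as np

def simulate_delayed_kuramoto(n=32, k=20.0, omega=2 * np.pi * 40.0,
                              delay=0.002, dt=1e-4, steps=10_000, seed=0):
    """Euler integration of n identical phase oscillators with all-to-all
    sine coupling acting through a uniform axonal delay; returns the
    Kuramoto order parameter r(t) (r = 1 means perfect phase synchrony).
    All parameter values here are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    d = max(1, round(delay / dt))                  # delay measured in time steps
    theta = np.empty((steps + d, n))
    theta[:d] = rng.uniform(0.0, 2 * np.pi, size=(d, n))  # random initial history
    r = np.empty(steps)
    for i in range(d, steps + d):
        past = theta[i - d]                        # delayed presynaptic phases
        # coupling on oscillator a: (k/n) * sum_b sin(theta_b(t - delay) - theta_a(t))
        coupling = (k / n) * np.sin(past[None, :] - theta[i - 1][:, None]).sum(axis=1)
        theta[i] = theta[i - 1] + dt * (omega + coupling)
        r[i - d] = np.abs(np.exp(1j * theta[i]).mean())  # phase coherence in [0, 1]
    return r

r = simulate_delayed_kuramoto()
print(f"coherence after {len(r) * 1e-4:.1f} s: r = {r[-1]:.3f}")
```

For a short delay relative to the oscillation period, as here, the in-phase state remains stable and the population locks to a common (delay-shifted) frequency; for longer delays the effective coupling at the locked frequency can change sign and destabilize synchrony, which is the kind of delay effect the abstract refers to.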
Cooperative dynamics in visual processing