Paper of the month – September 2021 (Sompolinsky’s Lab)

Statistical Mechanics of Deep Linear Neural Networks: The Backpropagating Kernel Renormalization

Qianyi Li, Haim Sompolinsky

Phys. Rev. X 11, 031059 (2021)

Lay summary:

Machine vision, speech recognition, and natural language processing programs all rely on deep learning: a form of artificial intelligence in which neural networks analyze raw input data and generate desired outputs. Deep learning works remarkably well on real-world problems, but researchers don’t fully understand why. The authors propose a new theory of deep learning that could help reveal why deep neural networks work. The method explains how task information propagates from layer to layer through a network, shaping its performance. The authors then evaluate which network features contribute most to deep learning’s success. The theory’s results describe the performance of a large family of neural networks, and the researchers are extending their work to other families, including those commonly used for image and speech recognition.
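
The paper’s analysis centers on deep linear networks, where each layer applies only a matrix multiplication, with no nonlinear activation. As a purely illustrative sketch (not the authors’ renormalization method), the Python snippet below builds such a network and tracks the Gram matrix, or kernel, of each layer’s representations: one simple way to watch how the geometry of the data changes from layer to layer. All sizes and the random Gaussian initialization here are arbitrary assumptions for illustration.

    # Illustrative sketch only: a deep *linear* network, the model family
    # analyzed in the paper. All sizes below are arbitrary choices.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, width, depth, n_samples = 10, 50, 4, 20

    # One random Gaussian weight matrix per layer, scaled by 1/sqrt(fan-in).
    layer_dims = [n_in] + [width] * depth
    weights = [rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)
               for d_in, d_out in zip(layer_dims[:-1], layer_dims[1:])]

    X = rng.standard_normal((n_in, n_samples))  # columns are input vectors

    # Propagate the inputs layer by layer and record each layer's kernel
    # (the width-normalized Gram matrix of its representations).
    h = X
    for ell, W in enumerate(weights, start=1):
        h = W @ h                      # linear layer: no activation function
        kernel = h.T @ h / h.shape[0]  # n_samples x n_samples layer kernel
        print(f"layer {ell}: kernel shape {kernel.shape}, "
              f"mean diagonal {kernel.diagonal().mean():.3f}")

    # Because every layer is linear, the end-to-end map collapses to a
    # single matrix product; the assertion checks this numerically.
    W_total = weights[0]
    for W in weights[1:]:
        W_total = W @ W_total
    assert np.allclose(W_total @ X, h)

That collapse of the whole network into one matrix is what makes deep linear networks analytically tractable, even though, as the paper shows, their learning behavior still depends nontrivially on depth and width.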
