FOLDS seminar: Propagation-of-Chaos in Shallow Neural Networks beyond Logarithmic Time.
September 25, 2025 at 12:00 PM - 1:00 PM
Zoom link: https://upenn.zoom.us/j/98220304722
The analysis of gradient-based learning in neural networks remains an outstanding challenge, even for the simplest shallow architectures.
A powerful mathematical framework that has emerged over recent years lifts the optimization to the space of probability measures and captures important empirical phenomena such as the ‘blessing of overparametrization’. However, the resulting learning guarantees remain mostly qualitative.
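For orientation (a standard formulation of the mean-field lift, not specific to this talk): a shallow network of width $N$ is identified with the empirical measure of its neurons,
\[
f_N(x) \;=\; \frac{1}{N}\sum_{i=1}^{N}\sigma(x;\theta_i) \;=\; \int \sigma(x;\theta)\, d\mu_N(\theta),
\qquad
\mu_N \;=\; \frac{1}{N}\sum_{i=1}^{N}\delta_{\theta_i},
\]
and as $N \to \infty$ gradient descent on the neurons $(\theta_i)$ is described by a Wasserstein gradient flow on the measure $\mu$.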
In this talk, we study the fluctuations between idealized mean-field dynamics and polynomially-sized networks over suitable time horizons, the so-called Propagation of Chaos (PoC). We provide a novel analysis that goes beyond traditional Gronwall-based PoC arguments by exploiting certain geometric properties of the optimization landscape, and we apply these results to representative settings such as single-index models, establishing polynomial learning guarantees.
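As background on why going beyond Gronwall matters (a standard estimate, not a statement of the talk's results): classical Gronwall-based PoC bounds the gap between the width-$N$ dynamics and the mean-field flow by something of the form
\[
\sup_{t \le T}\, \mathbb{E}\big[\, W_2^2(\mu_N^t, \mu^t) \,\big] \;\lesssim\; \frac{e^{CT}}{N},
\]
whose exponential growth in $T$ makes the approximation meaningful only over horizons $T = O(\log N)$, hence the interest in guarantees that hold over polynomial time horizons.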
Joint work with Margalit Glasgow and Denny Wu.

