The school is aimed primarily at the growing audience of theoretical physicists, applied mathematicians, computer scientists, and colleagues from other computational fields interested in machine learning, neural networks, and high-dimensional data analysis. We will cover the basics and frontiers of high-dimensional statistics, machine learning, the theory of computing and statistical learning, and the related mathematics and probability theory. We will put a special focus on methods of statistical physics and their results in the context of current questions and theories related to these problems. Open questions and directions will be discussed as well.

**Teachers**

- Francis Bach (Inria, ENS): *Sums-of-Squares: From Polynomials to Kernels* [videos]
- Yasaman Bahri (Google) and Boris Hanin (Princeton): *Deep Learning at Large and Infinite Width* [videos]
- Boaz Barak (Harvard): *Computational Complexity of Deep Learning: Fundamental Limitations and Empirical Phenomena* [videos]
- Giulio Biroli (ENS Paris): *High-Dimensional Non-Convex Landscapes and Gradient Descent Dynamics* [videos]
- Michael I. Jordan (Berkeley): *On Decisions, Dynamics, Incentives, and Mechanism Design* [videos]
- Julia Kempe (NYU): *Data, Physics and Kernels, and How (Statistical) Physics Tools Can Help the DL Practitioner* **slides**: [1], [videos]
- Yann LeCun (Facebook & NYU): *From Machine Learning to Autonomous Intelligence* **slides**: [1], [2], [3], [videos]
- Marc Mézard (ENS Paris): *Belief Propagation, Message Passing & Sparse Models*
- Rémi Monasson (ENS Paris): *Replica Method for Computational Problems with Randomness: Principles and Illustrations* **slides**: [1], [2], [videos]
- Andrea Montanari (Stanford): *Neural Networks from a Nonparametric Viewpoint* [videos]
- Sara Solla (Northwestern Univ.): *Statistical Physics, Bayesian Inference and Neural Information Processing* **slides**: [1], [2], [3], [videos]
- Haim Sompolinsky (Harvard & Hebrew Univ.): *Statistical Mechanics of Machine Learning* [videos]
- Nathan Srebro (TTI Chicago): *Applying Statistical Learning Theory to Deep Learning* **slides**: [1], [2], [3], [4], [5], [6], [videos]
- Eric Vanden-Eijnden (NYU Courant): *Benefits of Overparametrization in Statistical Learning & Enhancing MCMC Sampling with Learning* [videos]
- Matthieu Wyart (EPFL): *Loss Landscape, Over-Parametrization and Curse of Dimensionality in Deep Learning* **slides**: [1], [videos]

**Topics:**

- Phase transitions in machine learning
- Computational complexity
- Dynamics of learning in high dimensions
- Message passing algorithms
- Challenges in machine learning
- Statistical physics of deep neural networks
- High-dimensional statistics
- Optimization and implicit regularisation
- Replica and cavity methods
- Probability theory and rigorous approaches
- Statistical inference in high dimensions
- Computational learning theory

**Organisers:**

Florent Krzakala and Lenka Zdeborova, EPFL

**Participants:**

Find here the list of participants in the school.

Click here for the conference poster.

Click here to access the school website.