Simons Institute | Overview of Statistical Learning Theory Part 2
Nati Srebro (Toyota Technological Institute at Chicago)
https://simons.berkeley.edu/talks/nati-srebro-toyota-technological-institute-chicago-2024-08-28
Modern Paradigms in Generalization Boot Camp
In this tutorial I will survey classical, mostly 20th-century, statistical learning theory, focusing on generalization by controlling capacity. We will discuss:
- Vapnik and Chervonenkis's Fundamental Theorem of Learning
- Scale-sensitive capacity control and margins
- Minimum Description Length / Occam's Rule / Structural Risk Minimization and PAC-Bayes
- Parallels with Stochastic Optimization
- Generalization and capacity control from optimization: online-to-batch, stochastic approximation, boosting, min norm and max margin.
We will ask how the classic theory fits with current interests, including interpolation learning, benign overfitting, and implicit bias.