Overview
This lecture, titled "Algorithmic Dependent Generalization Bounds," features Associate Professor Roi Livni from Tel Aviv University exploring one of machine learning's fundamental challenges: understanding how algorithms influence generalization. Discover why classical theories such as VC theory and PAC learning fail to explain the crucial role algorithms play in preventing overfitting. Examine Stochastic Convex Optimization as a framework for investigating this phenomenon, with a special focus on Gradient Descent and Stochastic Gradient Descent. Learn about the first tight sample complexity bound for Gradient Descent and new generalization lower bounds for SGD variants. Explore how these findings reshape our understanding of stability, regularization, and implicit bias in machine learning.

The lecture will be delivered in Hebrew on Thursday, May 8th, 2025, at 10:30 AM in room B220. Prof. Livni's honors include Best Paper Awards at FOCS 2020 and ICML 2024, as well as a recent ERC Starting Grant for his "FoG" project.
Syllabus
Thursday, May 8th, 2025, 10:30 AM, room B220
Taught by
HUJI Machine Learning Club