Overview
This Computer Science/Discrete Mathematics Seminar presents a lecture on "A Theory of Generalized Boosting," delivered by Nataly Brukhim of the Institute for Advanced Study. Explore how boosting, a fundamental machine learning method that converts weak learners into strong ones, can be extended beyond the traditional symmetric 0-1 loss. Discover recent research that develops a comprehensive theory of boosting for cost-sensitive and multi-objective loss functions in both the binary and multiclass settings. Learn about a game-theoretic perspective that reveals a nuanced taxonomy of learning guarantees, categorized as trivial, boostable, or intermediate, and about how these techniques extend to PAC learning itself. The result is a fine-grained characterization of PAC learning that identifies, for a given hypothesis class, the Pareto frontier of attainable guarantees. The session takes place on March 4, 2025, at 10:30am in Simonyi 101, with remote access available.
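As background for the generalization described above, here is a minimal sketch of classical boosting under the symmetric 0-1 loss: AdaBoost with single-feature threshold stumps as the weak learners. The toy dataset, function names, and parameters below are illustrative assumptions for exposition, not material from the lecture.

```python
# Minimal sketch of classical binary boosting (AdaBoost with decision stumps)
# under the symmetric 0-1 loss -- the baseline setting the talk generalizes.
# The toy data, names, and parameters are illustrative assumptions.
import numpy as np

def learn_stump(X, y, w):
    """Weak learner: the single-feature threshold stump with lowest weighted 0-1 error."""
    best = (np.inf, 0, 0.0, 1)                   # (weighted error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(s * (X[:, j] - t) >= 0, 1, -1)
                err = float(np.sum(w[pred != y]))
                if err < best[0]:
                    best = (err, j, t, s)
    return best

def stump_predict(X, j, t, s):
    return np.where(s * (X[:, j] - t) >= 0, 1, -1)

def adaboost(X, y, rounds=50):
    """Boost weak stumps into a strong classifier by reweighting hard examples."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform distribution over examples
    ensemble = []
    for _ in range(rounds):
        err, j, t, s = learn_stump(X, y, w)
        err = max(err, 1e-12)                    # guard against a perfect stump
        alpha = 0.5 * np.log((1.0 - err) / err)  # weight of this round's hypothesis
        pred = stump_predict(X, j, t, s)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(X, j, t, s) for a, j, t, s in ensemble)
    return np.where(score >= 0, 1, -1)

# Toy usage: a diagonal boundary that no single stump fits, but boosted stumps approximate well.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y)
print("training accuracy:", float(np.mean(predict(model, X) == y)))
```

Cost-sensitive and multi-objective boosting, as discussed in the lecture, replace the symmetric 0-1 error used here with losses that penalize different kinds of mistakes differently.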
Syllabus
March 4, 2025, 10:30am | Simonyi 101 and Remote Access
Taught by
Nataly Brukhim, Institute for Advanced Study