

PAC-Bayesian Approaches to Understanding Generalization in Deep Learning - Gintare Dziugaite

Institute for Advanced Study via YouTube

Overview

This course covers PAC-Bayesian approaches to understanding generalization in deep learning. Learners gain insight into PAC-Bayes risk bounds for Gibbs classifiers, generalization bounds, and bounds on deterministic classifiers, as well as optimal priors, distribution-dependent priors, and data-dependent oracle priors for neural networks. The teaching method combines theoretical discussion with empirical evaluation of the resulting bounds. The course is intended for anyone interested in the theoretical side of deep learning and in better understanding generalization in machine learning algorithms.
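For orientation only (this is the standard McAllester-style form of a PAC-Bayes bound, stated here as background rather than quoted from the lecture): with probability at least 1 - δ over an i.i.d. sample S of size m, simultaneously for all posteriors Q over hypotheses,

\[
\mathbb{E}_{h\sim Q}\big[R(h)\big] \;\le\; \mathbb{E}_{h\sim Q}\big[\hat{R}_S(h)\big] \;+\; \sqrt{\frac{\mathrm{KL}(Q\,\|\,P) + \ln\!\big(2\sqrt{m}/\delta\big)}{2m}},
\]

where R is the true risk, \hat{R}_S the empirical risk on S, and P a prior fixed before seeing the data. The distribution-dependent and data-dependent priors discussed in the syllabus below relax that last requirement in order to tighten the bound.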

Syllabus

Intro
Setup
Outline
PAC-Bayes yields risk bounds for Gibbs classifiers
PAC-Bayes generalization bounds
PAC-Bayes bounds on deterministic classifiers
Recap: Towards a nonvacuous bound on SGD
Can we exploit optimal priors?
Distribution-dependent priors (Lever et al. 2010)
Empirical evaluation of Lever et al.'s bounds
Distribution-dependent approximations of optimal priors via privacy
A question of interpretation
Data-dependent oracle priors for neural networks
Coupled data-dependent approximate oracle priors and posteriors
Gaussian network bounds for coupled data-dependent priors
Oracle access to optimal prior covariance
Directly optimizing the variational data-dependent PAC-Bayes generalization bound
Recap and Conclusion

Taught by

Institute for Advanced Study
