Federated Hyperparameter Tuning - Challenges, Baselines, and Connections

Stanford University via YouTube

Overview

This course explores the challenges of hyperparameter tuning in federated learning, focusing on how standard tuning approaches can be adapted to form baselines for the federated setting. Combining theoretical discussion with empirical demonstrations, it introduces FedEx, a new method that accelerates federated hyperparameter tuning by drawing a connection to weight sharing in neural architecture search. The course is intended for anyone interested in machine learning, federated learning, or hyperparameter optimization. Upon completion, learners will understand why hyperparameter tuning is difficult in federated settings, how the FedEx method works, and how it can outperform natural baselines across several benchmarks.
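To make the setting concrete, the sketch below simulates the kind of baseline the course contrasts FedEx against: federated averaging (FedAvg) where a shared learning rate is picked by a naive grid search. This is an illustrative toy on synthetic data, not the FedEx method or the course's actual code; all names and values here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression data split across 4 clients (synthetic, illustrative only).
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

def local_sgd(w, X, y, lr, steps):
    """A few steps of full-batch gradient descent on one client's squared-error loss."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fed_avg(local_lrs, rounds=20, local_steps=5):
    """FedAvg where each client trains with its own learning rate --
    the kind of local hyperparameter the talk discusses tuning."""
    w = np.zeros(2)
    for _ in range(rounds):
        updates = [local_sgd(w, X, y, lr, local_steps)
                   for (X, y), lr in zip(clients, local_lrs)]
        w = np.mean(updates, axis=0)  # server averages the client models
    return w

# Naive tuning baseline: try a few shared learning rates and keep the best
# on pooled validation data. (In a real federated deployment, validation
# data stays on-device -- one of the challenges the course covers.)
candidates = [0.001, 0.01, 0.1]
X_val = rng.normal(size=(100, 2))
y_val = X_val @ true_w
best_lr = min(candidates,
              key=lambda lr: np.mean((X_val @ fed_avg([lr] * 4) - y_val) ** 2))
w_best = fed_avg([best_lr] * 4)
```

Each grid-search candidate requires a full federated training run, which is exactly the resource limitation that motivates faster methods like FedEx.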

Syllabus

Introduction
Machine Learning
Outline
Hyperparameter Tuning Global vs Local
Hyperparameter Tuning Methods
Baseline Challenges
Success of Having
Issues
Resource Limitations
Local vs Global Validation
Baselines vs Bayesian Optimization
Neural Architecture Search
Architecture Search
Weight Sharing
Constraints
Federated Learning
Federated Averaging
Local Hyperparameters
Local Hyperparameter Tuning
Summary
Solution
Methods
Results
Key takeaways
Questions

Taught by

Stanford MedAI
