
The First Asynchronous SGD with Optimal Time Complexity - Seminar #126

Federated Learning One World Seminar via YouTube

Overview

Watch this 59-minute presentation from the Federated Learning One World Seminar series, in which Arto Maranjyan (KAUST) presents the first asynchronous Stochastic Gradient Descent (SGD) algorithm with optimal time complexity. Delivered on April 16, 2025, the talk covers the theoretical foundations of this result and its practical implications for distributed machine learning, where workers compute gradients at different speeds and updates are applied without waiting for the slowest machine. For more information, visit the seminar website or the speaker's personal webpage. The presentation is part of the FLOW Seminar series (#126) on federated learning.
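To make the setting concrete, here is a minimal toy sketch of asynchronous SGD in the sense described above: parameter updates are applied with gradients computed at stale (delayed) iterates, as happens when workers finish at different times. This is an illustrative simulation only, not the algorithm from the talk; the function names, step size, and delay model are all assumptions.

```python
import random

def async_sgd_sim(grad_fn, x0, steps=200, lr=0.05, max_delay=3, seed=0):
    """Toy simulation of asynchronous SGD on a scalar objective.

    Each update applies a gradient evaluated at a stale iterate, chosen
    a random number of steps (up to max_delay) in the past, mimicking
    workers whose gradient computations arrive late.
    """
    rng = random.Random(seed)
    history = [x0]  # past iterates, so stale gradients can be simulated
    x = x0
    for _ in range(steps):
        delay = rng.randint(0, max_delay)
        stale_x = history[max(0, len(history) - 1 - delay)]
        x = x - lr * grad_fn(stale_x)  # update with a possibly stale gradient
        history.append(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_final = async_sgd_sim(lambda x: 2 * (x - 3), x0=0.0)
```

Despite the staleness, the iterates still converge to the minimizer for a small enough step size; the interesting theoretical question, which the talk addresses, is how fast such methods can be in terms of wall-clock time.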

Syllabus

FLOW Seminar #126: Arto Maranjyan (KAUST) The First Asynchronous SGD with Optimal Time Complexity

Taught by

Federated Learning One World Seminar

