Overview
This course on Gradient Descent teaches the fundamental optimization method behind most Machine Learning techniques. Taking a step-by-step approach, it covers the main ideas behind Gradient Descent, how it optimizes one or more variables, loss functions, and the algorithm itself. The course assumes prior knowledge of Least Squares and Linear Regression, and is taught in a video tutorial format. It is intended for anyone who wants to understand the core optimization technique used in Machine Learning.
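The core idea the course builds toward, repeatedly stepping a parameter opposite the slope of a loss until it stops improving, can be sketched in a few lines. This is a minimal illustration, not material from the course; the function, learning rate, and step count are arbitrary choices.

```python
# Hypothetical sketch: gradient descent on f(x) = (x - 3)**2,
# whose minimum is at x = 3. Learning rate and step count are
# illustrative values, not ones used in the course.

def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to minimize a function."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # step size = learning rate * slope
    return x

# The derivative of f(x) = (x - 3)**2 is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

Because each step shrinks the distance to the minimum by a constant factor here, the estimate converges quickly; in practice the learning rate controls that trade-off between speed and stability.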
Syllabus
Awesome song and introduction
Main ideas behind Gradient Descent
Gradient Descent optimization of a single variable, part 1
An important note about why we use Gradient Descent
Gradient Descent optimization of a single variable, part 2
Review of concepts covered so far
Gradient Descent optimization of two or more variables
A note about Loss Functions
Gradient Descent algorithm
Stochastic Gradient Descent
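The final syllabus topic, Stochastic Gradient Descent, replaces the full gradient with the gradient of one randomly chosen sample per step. A hedged sketch, fitting a line by least squares (matching the course's stated prerequisites); the data, seed, and hyperparameters are illustrative assumptions, not values from the course:

```python
import random

# Hypothetical sketch of Stochastic Gradient Descent fitting a line
# y = slope * x + intercept by least squares. One randomly chosen
# sample per step approximates the full gradient.

random.seed(0)
data = [(x, 2.0 * x + 1.0) for x in range(1, 11)]  # points on y = 2x + 1

slope, intercept = 0.0, 0.0
learning_rate = 0.01

for step in range(10_000):
    x, y = random.choice(data)           # pick one sample at random
    error = (slope * x + intercept) - y  # residual for that sample
    # Gradients of the single-sample squared error (1/2) * error**2:
    slope -= learning_rate * error * x
    intercept -= learning_rate * error

print(slope, intercept)  # close to the true values 2.0 and 1.0
```

Because each update looks at only one point, SGD is much cheaper per step than full Gradient Descent on large datasets, at the cost of noisier steps.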
Taught by
StatQuest with Josh Starmer