Overview
This course will cover all aspects of Gradient Boost (GB) as it applies to regression and classification. In Part 1, we will discuss the main ideas behind GB for regression, including the different loss functions, the boosting process, and base-learner selection. In Part 2, we will get into the details of using GB for regression, such as hyperparameter tuning and evaluation techniques. In Part 3, we will introduce the fundamentals of GB for classification, with decision tree and logistic regression examples. Lastly, in Part 4, we will discuss in depth the details of using GB for classification, including cost-complexity pruning and tree-boosting strategies. Throughout the course, we will cover best practices and provide practical examples to help you gain a deeper understanding of the GB technique.
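To give a feel for the boosting process described above, here is a minimal sketch of gradient boosting for regression with a squared-error loss: start from the mean of the targets, then repeatedly fit a weak base learner (a one-split "stump" here, an assumption for simplicity) to the current residuals and add a scaled copy of it to the model. The function names (`fit_stump`, `gradient_boost`) and the learning-rate value are illustrative, not part of the course.

```python
import numpy as np

def fit_stump(x, residuals):
    """Fit a one-split regression stump to 1-D inputs by minimizing squared error."""
    best = None
    for t in np.unique(x):
        left, right = residuals[x <= t], residuals[x > t]
        if len(left) == 0 or len(right) == 0:
            continue  # skip splits that leave one side empty
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda xs: np.where(xs <= t, left_val, right_val)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """Return a prediction function built by boosting stumps on residuals."""
    f0 = y.mean()                       # initial prediction: the target mean
    pred = np.full_like(y, f0, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        residuals = y - pred            # negative gradient of squared-error loss
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = pred + lr * stump(x)     # small step toward the residuals
    def predict(xs):
        out = np.full(len(xs), f0)
        for s in stumps:
            out = out + lr * s(xs)
        return out
    return predict
```

Each round nudges the model toward whatever error remains, which is why many small steps (a low learning rate over many rounds) typically generalize better than a few large ones.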
Syllabus
Gradient Boost Part 1 (of 4): Regression Main Ideas.
Gradient Boost Part 2 (of 4): Regression Details.
Gradient Boost Part 3 (of 4): Classification.
Gradient Boost Part 4 (of 4): Classification Details.
Taught by
StatQuest with Josh Starmer