Overview
Learn how AdaBoost, a powerful ensemble method in machine learning, combines weak learners into a strong model in this 18-minute educational video. Explore the fundamental concepts of bagging and boosting before diving into AdaBoost's core mechanics. Master techniques for rescaling mistakes, combining learners effectively, and implementing weighted voting systems. Understand the relationships between probability, odds, and logit functions in the context of ensemble learning. Follow along with hands-on examples in the provided codelab, which demonstrates practical implementations of random forests and AdaBoost. Access complementary learning materials through the accompanying GitHub repository and explore additional resources in the Grokking Machine Learning Book.
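The mechanics mentioned above (rescaling the weights of misclassified points, weighting each learner by the log-odds of its accuracy, and combining learners by weighted vote) can be sketched from scratch. This is a minimal illustration, not the video's codelab: the function names (`train_adaboost`, `predict`) and the one-feature threshold-stump weak learner are assumptions made for this example.

```python
import math

def train_adaboost(X, y, n_rounds=5):
    """Toy AdaBoost with threshold-stump weak learners; labels must be in {-1, +1}."""
    n = len(X)
    w = [1.0 / n] * n              # start with uniform sample weights
    ensemble = []                  # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # exhaustively pick the stump with the lowest weighted error
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for pol in (1, -1):
                    preds = [pol if x[f] >= t else -pol for x in X]
                    err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, pol, preds)
        err, f, t, pol, preds = best
        err = min(max(err, 1e-10), 1 - 1e-10)    # clip to avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # learner weight: half the log-odds
        ensemble.append((alpha, f, t, pol))
        # rescale the mistakes: up-weight misclassified points, down-weight correct ones
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        total = sum(w)
        w = [wi / total for wi in w]             # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    """Weighted vote: sign of the alpha-weighted sum of stump outputs."""
    score = sum(a * (pol if x[f] >= t else -pol) for a, f, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

For example, on a 1-D dataset such as `X = [[0],[1],[2],[3],[4],[5]]` with `y = [1, 1, -1, -1, 1, 1]`, no single stump can separate the classes, but a few boosting rounds combine stumps that do. The `alpha` line is where probability, odds, and the logit meet: a learner that is right with probability `1 - err` gets a vote proportional to the logit of that probability.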
Syllabus
Introduction
Bagging and Boosting
AdaBoost
Rescaling the mistakes
Combining the learners
Probability, Odds, and the Logit
Weighted voting
Codelab
Taught by
Serrano.Academy