
Tree-Based Models in R

via DataCamp

Overview

In this course, you'll learn how to use tree-based models and ensembles for regression and classification.

This course covers everything from using a single tree for regression or classification to more advanced ensemble methods. You'll learn to implement bagged trees, Random Forests, and boosted trees with the Gradient Boosting Machine (GBM). These powerful techniques will allow you to build high-performance regression and classification models for your data.

Syllabus

Classification Trees
-This chapter covers supervised machine learning with classification trees.
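As a taste of what this looks like in practice, here is a minimal sketch of a classification tree in R. It assumes the rpart package and R's built-in iris data; the course's own datasets and exercises may differ.

library(rpart)

# Fit a classification tree predicting Species from the four flower measurements
tree_model <- rpart(Species ~ ., data = iris, method = "class")

# Predict class labels and check training accuracy
pred <- predict(tree_model, iris, type = "class")
mean(pred == iris$Species)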

Regression Trees
-In this chapter you'll learn how to use a single tree for regression, instead of classification.
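A minimal sketch of a regression tree, again with rpart and a built-in dataset (mtcars), chosen only for illustration:

library(rpart)

# method = "anova" fits a regression tree rather than a classification tree
reg_model <- rpart(mpg ~ ., data = mtcars, method = "anova")

# Predictions are numeric; compute the root mean squared error on the training data
pred <- predict(reg_model, mtcars)
sqrt(mean((pred - mtcars$mpg)^2))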

Bagged Trees
-In this chapter, you will learn about Bagged Trees, an ensemble method that combines many trees instead of relying on a single one.
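One common way to fit bagged trees in R is the ipred package (an assumption here; the course may use a different package). A minimal sketch:

library(ipred)

set.seed(42)
# Fit 25 bootstrap-aggregated classification trees and keep an out-of-bag error estimate
bag_model <- bagging(Species ~ ., data = iris, nbagg = 25, coob = TRUE)

# Printing the model reports the out-of-bag misclassification error
print(bag_model)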

Random Forests
-In this chapter, you will learn about the Random Forest algorithm, another tree-based ensemble method. Random Forest is a modified version of bagged trees with better performance. Here you'll learn how to train, tune and evaluate Random Forest models in R.
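A minimal sketch with the randomForest package (one common choice; the course may also use other packages for tuning):

library(randomForest)

set.seed(42)
# ntree = number of trees; mtry = variables tried at each split (a key tuning parameter)
rf_model <- randomForest(Species ~ ., data = iris, ntree = 500, mtry = 2)

# Printing the model shows the out-of-bag error rate and confusion matrix
print(rf_model)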

Boosted Trees
-In this chapter, you will learn about the boosting methodology, with a focus on the Gradient Boosting Machine (GBM) algorithm, another popular tree-based ensemble method. Here you'll learn how to train, tune, and evaluate GBM models in R.
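A minimal sketch with the gbm package on a built-in dataset, chosen only for illustration; the course's data and tuning grids will differ:

library(gbm)

set.seed(42)
gbm_model <- gbm(mpg ~ ., data = mtcars,
                 distribution = "gaussian",   # squared-error loss for regression
                 n.trees = 1000,
                 interaction.depth = 2,
                 shrinkage = 0.01)

# Pick the number of trees by the out-of-bag estimate, then predict
best_iter <- gbm.perf(gbm_model, method = "OOB", plot.it = FALSE)
pred <- predict(gbm_model, mtcars, n.trees = best_iter)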


Taught by

Gabriela de Queiroz and Erin LeDell
