Machine Learning with Tree-Based Models in R

Learn how to use tree-based models and ensembles to make classification and regression predictions with tidymodels.

4 Hours · 16 Videos · 58 Exercises
6,452 Learners · Statement of Accomplishment

Course Description

Tree-based machine learning models can reveal complex non-linear relationships in data and often dominate machine learning competitions. In this course, you'll use the tidymodels package to explore and build different tree-based models—from simple decision trees to complex random forests. You’ll also learn to use boosted trees, a powerful machine learning technique that uses ensemble learning to build high-performing predictive models. Along the way, you'll work with health and credit risk data to predict the incidence of diabetes and customer churn.
  1. Classification Trees (Free)

    Ready to build a real machine learning pipeline? Complete step-by-step exercises to learn how to create decision trees, split your data, and predict which patients are most likely to suffer from diabetes. Last but not least, you'll use performance measures to assess your models and judge your predictions. A minimal code sketch of this workflow follows the lesson list below.

    Lessons:
    Welcome to the course!
    Why tree-based methods?
    Specify that tree
    Train that model
    How to grow your tree
    Train/test split
    Avoiding class imbalances
    From zero to hero
    Predict and evaluate
    Make predictions
    Crack the matrix
    Are you predicting correctly?
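
    A minimal sketch of this chapter's workflow with tidymodels, assuming a hypothetical data frame diabetes_df with a factor outcome column diabetes (these names are illustrative, not taken from the course materials):

    library(tidymodels)

    set.seed(42)

    # Split the data, stratifying on the outcome so the training and test
    # sets keep similar class proportions
    diabetes_split <- initial_split(diabetes_df, prop = 0.8, strata = diabetes)
    train_data <- training(diabetes_split)
    test_data  <- testing(diabetes_split)

    # Specify and train a classification tree
    tree_spec <- decision_tree() %>%
      set_engine("rpart") %>%
      set_mode("classification")

    tree_fit <- tree_spec %>%
      fit(diabetes ~ ., data = train_data)

    # Predict on the test set, then assess with a confusion matrix and accuracy
    test_preds <- predict(tree_fit, new_data = test_data) %>%
      bind_cols(test_data)

    conf_mat(test_preds, truth = diabetes, estimate = .pred_class)
    accuracy(test_preds, truth = diabetes, estimate = .pred_class)
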
  2. Regression Trees and Cross-Validation

    Ready for some candy? Use a chocolate rating dataset to build regression trees and assess their performance using suitable error measures. You'll overcome the statistical uncertainty of a single train/test split by applying sweet techniques like cross-validation, then dive even deeper by mastering the bias-variance tradeoff.

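    A minimal sketch of a cross-validated regression tree with tidymodels, assuming a hypothetical chocolate_df data frame with a numeric rating column (names are illustrative):

    library(tidymodels)

    set.seed(42)

    # Five-fold cross-validation instead of a single train/test split
    chocolate_folds <- vfold_cv(chocolate_df, v = 5)

    # Specify a regression tree
    reg_tree_spec <- decision_tree() %>%
      set_engine("rpart") %>%
      set_mode("regression")

    # Fit on each fold and collect out-of-fold error measures (RMSE and MAE)
    cv_results <- fit_resamples(
      reg_tree_spec,
      rating ~ .,
      resamples = chocolate_folds,
      metrics = metric_set(rmse, mae)
    )

    collect_metrics(cv_results)
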
  3. Hyperparameters and Ensemble Models

    Time to get serious with tuning your hyperparameters and interpreting receiver operating characteristic (ROC) curves. In this chapter, you'll leverage the wisdom of the crowd with ensemble methods like bagging and random forests, and build ensembles that forecast which credit card customers are most likely to churn.

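    A sketch of hyperparameter tuning for a random forest scored by the area under the ROC curve, assuming a hypothetical churn_df data frame with a factor churn column and the ranger package installed (names are illustrative; the mtry values must not exceed the number of predictors):

    library(tidymodels)

    set.seed(42)

    churn_split <- initial_split(churn_df, strata = churn)
    churn_folds <- vfold_cv(training(churn_split), v = 5, strata = churn)

    # A random forest with two tunable hyperparameters
    rf_spec <- rand_forest(mtry = tune(), trees = 500, min_n = tune()) %>%
      set_engine("ranger") %>%
      set_mode("classification")

    # Evaluate every grid combination on the folds, scoring by ROC AUC
    rf_results <- tune_grid(
      rf_spec,
      churn ~ .,
      resamples = churn_folds,
      grid = expand_grid(mtry = c(2, 4, 8), min_n = c(5, 10, 20)),
      metrics = metric_set(roc_auc)
    )

    show_best(rf_results, metric = "roc_auc")
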
  4. Boosted Trees

    Ready for the high society of tree-based models? Apply gradient boosting to create powerful ensembles that often outperform the single trees and bagged models from earlier chapters. Learn how to fine-tune them and how to compare different models to pick a winner for production.

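    A sketch of a gradient-boosted ensemble via the xgboost engine, reusing the hypothetical churn data and folds from the previous sketch so its ROC AUC can be compared with the random forest's:

    library(tidymodels)

    # Gradient boosting with a tunable learning rate and tree depth
    boost_spec <- boost_tree(trees = 500, learn_rate = tune(), tree_depth = tune()) %>%
      set_engine("xgboost") %>%
      set_mode("classification")

    set.seed(42)
    boost_results <- tune_grid(
      boost_spec,
      churn ~ .,
      resamples = churn_folds,
      grid = expand_grid(learn_rate = c(0.01, 0.1), tree_depth = c(3, 6)),
      metrics = metric_set(roc_auc)
    )

    # Compare collect_metrics(boost_results) with the random forest results,
    # then finalize the winning specification and fit it on the training data
    best_params <- select_best(boost_results, metric = "roc_auc")
    final_boost <- finalize_model(boost_spec, best_params) %>%
      fit(churn ~ ., data = training(churn_split))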

This course is part of the following tracks

Machine Learning Fundamentals in R
Machine Learning Scientist with R
Supervised Machine Learning in R

Collaborators

Maggie Matsui
Justin Saddlemyer
James Chapman

Sandro Raabe

Data Scientist

Sandro is an aspiring Data Scientist, mathematician, teacher, and developer. He strongly believes that anyone, not only professionals, can create data applications using R's open interfaces. Having completed his studies in Germany, Oxford, Sydney, Pretoria, and online, he has gained professional experience in the finance and healthcare sectors, providing companies with data-driven insights to solve significant problems. As an active contributor to the open-source community, he created vistime, an R package for generating timeline plots.

Join over 13 million learners and start Machine Learning with Tree-Based Models in R today!
