Ensemble Methods in Python

Learn how to build advanced and effective machine learning models in Python using ensemble techniques such as bagging, boosting, and stacking.

4 Hours · 15 Videos · 52 Exercises · 4,728 Learners · 4,050 XP




Course Description

Continue your machine learning journey by diving into the wonderful world of ensemble learning methods! These are an exciting class of machine learning techniques that combine multiple individual models to boost performance and solve complex problems at scale across different industries. Ensemble techniques regularly win online machine learning competitions as well! In this course, you’ll learn all about these advanced ensemble techniques, such as bagging, boosting, and stacking. You’ll apply them to real-world datasets using cutting-edge Python machine learning libraries such as scikit-learn, XGBoost, CatBoost, and mlxtend.
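To give a flavor of two of these techniques, here is a minimal, illustrative sketch of bagging and boosting with scikit-learn. The wine dataset, model choices, and hyperparameters are our own assumptions for demonstration, not the course's exercises:

```python
from sklearn.datasets import load_wine
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)

# Bagging: train many trees on bootstrap samples of the data,
# then aggregate their predictions by majority vote.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=1)

# Boosting: build shallow trees sequentially, each one focusing on
# the examples its predecessors got wrong.
boosting = GradientBoostingClassifier(random_state=1)

print("bagging:", cross_val_score(bagging, X, y, cv=5).mean())
print("boosting:", cross_val_score(boosting, X, y, cv=5).mean())
```

Roughly speaking, bagging reduces variance by averaging many independently trained models, while boosting reduces bias by correcting mistakes step by step.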

  1.

    Combining Multiple Models

    Free

    Do you struggle to determine which of the models you built is the best for your problem? You should give up on that, and use them all instead! In this chapter, you'll learn how to combine multiple models into one using "Voting" and "Averaging". You'll use these to predict the ratings of apps on the Google Play Store, whether or not a Pokémon is legendary, and which characters are going to die in Game of Thrones!

    Play Chapter Now
    Introduction to ensemble methods (50 XP)
    Exploring Google apps data (50 XP)
    Predicting the rating of an app (100 XP)
    Voting (50 XP)
    Choosing the best model (100 XP)
    Assembling your first ensemble (100 XP)
    Evaluating your ensemble (100 XP)
    Averaging (50 XP)
    Journey to Westeros (50 XP)
    Predicting GoT deaths (100 XP)
    Soft vs. hard voting (100 XP)
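The "Voting" and "Averaging" ideas in this chapter can be sketched with scikit-learn's VotingClassifier: hard voting takes a majority vote over predicted labels, while soft voting averages predicted class probabilities. The iris dataset and the three base models below are illustrative assumptions, not the chapter's own exercises:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

estimators = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(random_state=42)),
    ("knn", KNeighborsClassifier()),
]

# Hard voting: each model casts one vote for a class label.
hard = VotingClassifier(estimators, voting="hard").fit(X_train, y_train)

# Soft voting: the models' predicted probabilities are averaged.
soft = VotingClassifier(estimators, voting="soft").fit(X_train, y_train)

print("hard voting:", hard.score(X_test, y_test))
print("soft voting:", soft.score(X_test, y_test))
```

Soft voting requires every base model to expose predict_proba; it tends to help when the models' probability estimates are well calibrated.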
  4.

    Stacking

    Get ready to see how things stack up! In this final chapter, you'll learn about the stacking ensemble method. You'll learn how to implement it from scratch as well as with the mlxtend library! You'll apply stacking to predict the edibility of North American mushrooms, and revisit the ratings of Google apps with this more advanced approach.

    Play Chapter Now
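The stacking recipe the chapter covers — base learners whose predictions feed a final meta-learner — can be sketched with scikit-learn's StackingClassifier. (The course builds stacking from scratch and with mlxtend's StackingClassifier; the scikit-learn equivalent below follows the same idea. The breast-cancer dataset and model choices are illustrative assumptions.)

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    # Level-0 base learners: their out-of-fold predictions become
    # the training features for the meta-learner.
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svc", SVC(probability=True, random_state=0)),
    ],
    # Level-1 meta-learner: learns how to combine the base predictions.
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_train, y_train)
print("stacked accuracy:", stack.score(X_test, y_test))
```

Using out-of-fold predictions (which StackingClassifier does internally via cross-validation) keeps the meta-learner from simply memorizing the base models' training-set fit.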

Datasets

App ratings · App reviews · Game of Thrones · Pokémon · SECOM (Semiconductor Manufacturing) · TMDb (The Movie Database)

Collaborators

Yashas Roy · Hillary Green-Lerman

Román de las Heras

Data Scientist at Sinch and Data Science Team Lead at Agile Solutions

Román de las Heras is a Data Scientist at Sinch and a Data Science Team Lead at Agile Solutions. He studied Systems Engineering and simultaneously earned a degree in Mathematics with a Computer Science orientation at the National Autonomous University of Honduras (UNAH). The ensemble of these two careers is what drove him into the world of data science. His daily work includes developing machine learning models, applying time series techniques to financial forecasting, training junior team members in the field, and doing ad hoc data analyses to present results and insights. He is also a passionate and experienced educator, as well as a strong believer in "learn by doing".

What do other learners have to say?

I've used other sites—Coursera, Udacity, things like that—but DataCamp's been the one that I've stuck with.

Devon Edwards Joseph
Lloyds Banking Group

DataCamp is the top resource I recommend for learning data science.

Louis Maiden
Harvard Business School

DataCamp is by far my favorite website to learn from.

Ronald Bowers
Decision Science Analytics, USAA