Continue your machine learning journey by diving into the wonderful world of ensemble learning methods! These are an exciting class of machine learning techniques that combine multiple individual algorithms to boost performance and solve complex problems at scale across different industries. Ensemble techniques regularly win online machine learning competitions as well! In this course, you'll learn all about these advanced ensemble techniques, such as bagging, boosting, and stacking. You'll apply them to real-world datasets using cutting-edge Python machine learning libraries such as scikit-learn, XGBoost, CatBoost, and mlxtend.
Combining Multiple Models
Do you struggle to determine which of the models you built is the best for your problem? You should give up on that, and use them all instead! In this chapter, you'll learn how to combine multiple models into one using "Voting" and "Averaging". You'll use these to predict the ratings of apps on the Google Play Store, whether or not a Pokémon is legendary, and which characters are going to die in Game of Thrones!

- Introduction to ensemble methods (50 xp)
- Exploring Google apps data (50 xp)
- Predicting the rating of an app (100 xp)
- Voting (50 xp)
- Choosing the best model (100 xp)
- Assembling your first ensemble (100 xp)
- Evaluating your ensemble (100 xp)
- Averaging (50 xp)
- Journey to Westeros (50 xp)
- Predicting GoT deaths (100 xp)
- Soft vs. hard voting (100 xp)
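The voting idea described above can be sketched in a few lines of scikit-learn. This is a minimal illustration on synthetic data (the course's Google Play, Pokémon, and Game of Thrones datasets aren't reproduced here), contrasting hard voting, where the majority class among the base models wins, with soft voting, where the models' predicted probabilities are averaged:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the course's classification datasets
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

estimators = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(max_depth=4, random_state=42)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# Hard voting: each model casts one vote for a class label
hard_vote = VotingClassifier(estimators=estimators, voting="hard").fit(X_train, y_train)
# Soft voting: average the models' predicted class probabilities
soft_vote = VotingClassifier(estimators=estimators, voting="soft").fit(X_train, y_train)

print(hard_vote.score(X_test, y_test))
print(soft_vote.score(X_test, y_test))
```

Soft voting often edges out hard voting when the base models produce well-calibrated probabilities, which is one of the trade-offs the chapter explores.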
Bagging

Bagging is the ensemble method behind powerful machine learning algorithms such as random forests. In this chapter you'll learn the theory behind this technique and build your own bagging models using scikit-learn.

- The strength of "weak" models (50 xp)
- Restricted and unrestricted decision trees (100 xp)
- "Weak" decision tree (50 xp)
- Bootstrap aggregating (50 xp)
- Training with bootstrapping (100 xp)
- A first attempt at bagging (100 xp)
- BaggingClassifier: nuts and bolts (50 xp)
- Bagging: the scikit-learn way (100 xp)
- Checking the out-of-bag score (100 xp)
- Bagging parameters: tips and tricks (50 xp)
- Exploring the UCI SECOM data (50 xp)
- A more complex bagging model (100 xp)
- Tuning bagging hyperparameters (100 xp)
Boosting

Boosting is a class of ensemble learning algorithms that includes award-winning models such as AdaBoost. In this chapter, you'll learn about AdaBoost and use it to predict the revenue of award-winning movies! You'll also learn about gradient boosting algorithms such as CatBoost and XGBoost.

- The effectiveness of gradual learning (50 xp)
- Introducing the movie database (50 xp)
- Exploring movie features (50 xp)
- Predicting movie revenue (100 xp)
- Boosting for predicted revenue (100 xp)
- Adaptive boosting: award winning model (50 xp)
- Your first AdaBoost model (100 xp)
- Tree-based AdaBoost regression (100 xp)
- Making the most of AdaBoost (100 xp)
- Gradient boosting (50 xp)
- Revisiting Google app reviews (50 xp)
- Sentiment analysis with GBM (100 xp)
- Gradient boosting flavors (50 xp)
- Movie revenue prediction with CatBoost (100 xp)
- Boosting contest: Light vs Extreme (100 xp)
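The gradual-learning idea behind AdaBoost, fitting weak learners one after another and re-weighting toward the examples the previous ones got wrong, can be sketched as a tree-based regression on synthetic data (the TMDb movie data isn't reproduced here; the example assumes scikit-learn ≥ 1.2 for the `estimator` parameter name):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for a revenue-style regression target
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=42)

# AdaBoost trains shallow trees sequentially, each one focusing more
# on the examples its predecessors predicted poorly
ada = AdaBoostRegressor(
    estimator=DecisionTreeRegressor(max_depth=3, random_state=42),
    n_estimators=100,
    learning_rate=0.5,
    random_state=42,
).fit(X, y)

print(ada.score(X, y))  # R-squared on the training data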
Stacking

Get ready to see how things stack up! In this final chapter you'll learn about the stacking ensemble method. You'll learn how to implement it using scikit-learn as well as with the mlxtend library! You'll apply stacking to predict the edibility of North American mushrooms, and revisit the ratings of Google apps with this more advanced approach.

- The intuition behind stacking (50 xp)
- Exploring the mushroom dataset (50 xp)
- Predicting mushroom edibility (100 xp)
- K-nearest neighbors for mushrooms (100 xp)
- Build your first stacked ensemble (50 xp)
- Applying stacking to predict app ratings (100 xp)
- Building the stacking classifier (100 xp)
- Stacked predictions for app ratings (100 xp)
- Let's mlxtend it! (50 xp)
- A first attempt with mlxtend (100 xp)
- Back to regression with stacking (100 xp)
- Mushrooms: a matter of life or death (100 xp)
- Ensembling it all together (50 xp)
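The stacking workflow above has a direct scikit-learn counterpart, `StackingClassifier`, sketched here on synthetic data (the mushroom and app-rating datasets aren't reproduced): base learners produce out-of-fold predictions, and a final meta-learner is trained on those predictions rather than on the raw features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a binary task such as mushroom edibility
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Base learners generate cross-validated (out-of-fold) predictions;
# the final_estimator learns how best to combine them
stack = StackingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(max_depth=4, random_state=42)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
).fit(X_train, y_train)

print(stack.score(X_test, y_test))
```

The mlxtend library covered in the chapter offers a similar interface (`StackingClassifier` / `StackingRegressor`) with its own options for passing base-model probabilities to the meta-learner.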
Datasets

- App ratings
- App reviews
- Game of Thrones
- Pokémon
- SECOM (Semiconductor Manufacturing)
- TMDb (The Movie Database)
Román de las Heras
Data Scientist at Goodwall
Román de las Heras is a Data Scientist at Goodwall. He studied Systems Engineering while simultaneously pursuing a degree in Mathematics with a Computer Science orientation at the National Autonomous University of Honduras (UNAH). The ensemble of these two careers is what drove him into the world of data science. His daily work includes developing machine learning models, applying time series techniques to financial forecasting, mentoring junior team members in the field, and doing ad-hoc data analyses to present results and insights. He is also a passionate and experienced educator, as well as a strong believer in "learning by doing".