
Extreme Gradient Boosting with XGBoost

Learn the fundamentals of gradient boosting and build state-of-the-art machine learning models using XGBoost to solve classification and regression problems.

4 Hours · 16 Videos · 49 Exercises · 42,159 Learners · 3750 XP · Machine Learning Scientist Track



Course Description

Do you know the basics of supervised learning and want to use state-of-the-art models on real-world datasets? Gradient boosting is currently one of the most popular techniques for efficiently modeling tabular datasets of all sizes. XGBoost is a fast, scalable implementation of gradient boosting; models built with it regularly win online data science competitions and are used at scale across industries. In this course, you'll learn how to use this powerful library alongside pandas and scikit-learn to build and tune supervised learning models. You'll work with real-world datasets to solve classification and regression problems.
1. Classification with XGBoost

    Free

    This chapter introduces the fundamental idea behind XGBoost: boosted learners. Once you understand how XGBoost works, you'll apply it to a common classification problem found in industry: predicting customer churn, that is, whether a customer will stop doing business with a company at some point in the future.

    Welcome to the course! (50 xp)
    Which of these is a classification problem? (50 xp)
    Which of these is a binary classification problem? (50 xp)
    Introducing XGBoost (50 xp)
    XGBoost: Fit/Predict (100 xp)
    What is a decision tree? (50 xp)
    Decision trees (100 xp)
    What is Boosting? (50 xp)
    Measuring accuracy (100 xp)
    Measuring AUC (100 xp)
    When should I use XGBoost? (50 xp)
    Using XGBoost (50 xp)
2. Regression with XGBoost

    After a brief review of supervised regression, you'll apply XGBoost to the regression task of predicting house prices in Ames, Iowa. You'll learn about the two kinds of base learners that XGBoost can use as its weak learners, and review how to evaluate the quality of your regression models.


In the following tracks

Machine Learning Scientist

Collaborators

Hugo Bowne-Anderson
Yashas Roy

Sergey Fogelson

Head of Data Science, TelevisaUnivision

I enjoy applying my quantitative skills to large-scale, data-intensive problems and mentoring junior colleagues. I am also an avid learner and am always trying to refine my programming chops and apply state-of-the-art analytical and statistical methods. In my current role as Head of Data Science at Univision, I build proprietary data products that allow us to efficiently engage with and grow our audience. I also enjoy sharing and communicating what knowledge I have. To that end, and when time permits, I teach data science courses at several NYC-area bootcamps, hacker academies, and universities. Prior to Univision, I built a state-of-the-art cross-platform media measurement solution at Viacom, automated back-office processes using machine learning for clients in the financial industry, and worked at small cybersecurity and digital advertising startups. I obtained my Ph.D. in Cognitive Neuroscience at Dartmouth College.
