
Hyperparameter Tuning in R

Learn how to tune your model's hyperparameters to get the best predictive results.

4 Hours · 14 Videos · 47 Exercises · 5,236 Learners · 3500 XP


Course Description

For many machine learning problems, simply running a model out of the box and getting a prediction is not enough; you want the best model with the most accurate predictions. One way to perfect your model is hyperparameter tuning: optimizing the settings of that specific model. In this course, you will work with the caret, mlr, and h2o packages to find the optimal combination of hyperparameters efficiently using grid search, random search, adaptive resampling, and automatic machine learning (AutoML). Furthermore, you will work with different datasets and tune different supervised learning models, such as random forests, gradient boosting machines, support vector machines, and even neural nets. Get ready to tune!

  1. Introduction to hyperparameters


    Why do we use the strange word "hyperparameter"? What makes it hyper? Here, you will understand what model parameters are and why they differ from hyperparameters in machine learning. You will then see why we want to tune them and how caret's default settings automatically include hyperparameter tuning.

    - Parameters vs hyperparameters (50 XP)
    - Model parameters vs. hyperparameters (100 XP)
    - Hyperparameters in linear models (50 XP)
    - What are the coefficients? (100 XP)
    - Recap of machine learning basics (50 XP)
    - Machine learning with caret (100 XP)
    - Resampling schemes (50 XP)
    - Hyperparameter tuning in caret (50 XP)
    - Hyperparameters in Stochastic Gradient Boosting (50 XP)
    - Changing the number of hyperparameters to tune (100 XP)
    - Tune hyperparameters manually (100 XP)
  2. Hyperparameter tuning with caret

    In this chapter, you will learn how to tune hyperparameters with a Cartesian grid. Then, you will implement faster and more efficient approaches: random search and adaptive resampling, which concentrate the search on values in the neighborhood of the optimal settings.

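The three strategies can be sketched with caret's trainControl() (a hedged sketch on the built-in iris data, not the course's exercises; the hyperparameter values are illustrative):

```r
library(caret)
set.seed(42)

# Cartesian grid: evaluate every combination in an explicit grid.
grid_fit <- train(Species ~ ., data = iris, method = "rf",
                  tuneGrid = expand.grid(mtry = c(2, 3, 4)),
                  trControl = trainControl(method = "cv", number = 5))

# Random search: sample candidate values instead of testing them all.
rand_fit <- train(Species ~ ., data = iris, method = "rf",
                  tuneLength = 5,
                  trControl = trainControl(method = "cv", number = 5,
                                           search = "random"))

# Adaptive resampling: discard unpromising candidates early and
# concentrate resamples on the neighborhood of the best settings.
adapt_ctrl <- trainControl(method = "adaptive_cv", number = 5, repeats = 2,
                           adaptive = list(min = 3, alpha = 0.05,
                                           method = "gls", complete = TRUE),
                           search = "random")
adapt_fit <- train(Species ~ ., data = iris, method = "rf",
                   tuneLength = 5, trControl = adapt_ctrl)
```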
  3. Hyperparameter tuning with mlr

    Here, you will use mlr, another machine learning package with very convenient hyperparameter tuning functions. You will define a Cartesian grid, perform a random search, and apply more advanced techniques. You will also learn different ways to plot and evaluate models with different hyperparameters.

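In mlr, the same workflow is expressed with a task, a learner, a parameter set, and a tuning control (a minimal sketch assuming the mlr and rpart packages are installed; the search space is illustrative, not the course's own):

```r
library(mlr)

task <- makeClassifTask(data = iris, target = "Species")
lrn  <- makeLearner("classif.rpart")

# Define the hyperparameter search space.
ps <- makeParamSet(
  makeIntegerParam("minsplit", lower = 2, upper = 20),
  makeNumericParam("cp", lower = 0.001, upper = 0.1)
)

# Swap the control object to switch strategies:
ctrl_grid <- makeTuneControlGrid(resolution = 5)  # Cartesian grid
ctrl_rand <- makeTuneControlRandom(maxit = 20)    # random search

rdesc <- makeResampleDesc("CV", iters = 3)
set.seed(42)
res <- tuneParams(lrn, task = task, resampling = rdesc,
                  par.set = ps, control = ctrl_rand)
res$x  # the best hyperparameter combination found
```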
  4. Hyperparameter tuning with h2o

    In this final chapter, you will use h2o, another machine learning package with very convenient hyperparameter tuning functions. You will use it to train different models and define a Cartesian grid. Then, you will implement a random search with stopping criteria. Finally, you will learn about AutoML, an h2o interface that allows for very fast and convenient model and hyperparameter tuning with just one function.

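A hedged sketch of the h2o workflow (it assumes the h2o package is installed and can start a local cluster; the grid values and time budgets are illustrative, not the course's own):

```r
library(h2o)
h2o.init()

iris_hf <- as.h2o(iris)
predictors <- setdiff(names(iris), "Species")

# A Cartesian grid turned into a random search with stopping criteria.
gbm_grid <- h2o.grid("gbm",
                     x = predictors, y = "Species",
                     training_frame = iris_hf,
                     hyper_params = list(max_depth  = c(3, 5, 7),
                                         learn_rate = c(0.01, 0.1)),
                     search_criteria = list(strategy = "RandomDiscrete",
                                            max_models = 5,
                                            max_runtime_secs = 60,
                                            seed = 42))

# AutoML: model selection and hyperparameter tuning in one call.
aml <- h2o.automl(x = predictors, y = "Species",
                  training_frame = iris_hf,
                  max_runtime_secs = 60)
aml@leaderboard  # models ranked by cross-validated performance
```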

In the following tracks

Machine Learning Scientist · Supervised Machine Learning


Chester Ismay, Hadrien Lacroix

Shirin Elsinghorst (formerly Glander)

Data Scientist @ codecentric

I'm Shirin, a biologist turned bioinformatician turned data scientist. During my PhD and postdoc, I worked with next-generation sequencing data to analyze diseases like arthritis. I then chose to become a data scientist for a German IT company called codecentric. In this capacity, I have worked on many different projects, e.g. building fraud detection models, creating a chatbot, implementing predictive maintenance, and more. My tool of choice for data analysis so far has been R, but I also work with Python. I am also very passionate about teaching and sharing knowledge, so I give workshops, speak at conferences and meetups, write blog posts, and organize the MünsteR R-users group.
