Hyperparameter Tuning in Python
Learn techniques for automated hyperparameter tuning in Python, including Grid, Random, and Informed Search.
4 hours · 13 videos · 44 exercises · 19,311 learners · Statement of Accomplishment
Course Description
As a data or machine learning scientist, you will find that building powerful machine learning models depends heavily on the set of hyperparameters used. But as models become increasingly complex, with ever more options, how do you efficiently find the best settings for your particular problem? The answer is hyperparameter tuning!
Hyperparameters vs. parameters
Gain practical experience using various methodologies for automated hyperparameter tuning in Python with Scikit-Learn. Learn the difference between hyperparameters and parameters and best practices for setting and analyzing hyperparameter values. This foundation will prepare you to understand the significance of hyperparameters in machine learning models.
Grid search
Master several hyperparameter tuning techniques, starting with Grid Search. Using credit card default data, you will practice conducting Grid Search to exhaustively search for the best hyperparameter combinations and interpret the results. You will be introduced to Random Search, and learn about its advantages over Grid Search, such as efficiency in large parameter spaces.
Informed search
In the final part of the course, you will explore advanced optimization methods, such as Bayesian and Genetic algorithms. These informed search techniques are demonstrated through practical examples, allowing you to compare and contrast them with uninformed search methods. By the end, you will have a comprehensive understanding of how to optimize hyperparameters effectively to improve model performance.
In the following Tracks:
- Machine Learning Scientist in Python
- Supervised Machine Learning in Python
Chapter 1: Hyperparameters and Parameters (Free)
In this introductory chapter you will learn the difference between hyperparameters and parameters. You will practice extracting and analyzing parameters and setting hyperparameter values for several popular machine learning algorithms. Along the way you will pick up best-practice tips & tricks for choosing which hyperparameters to tune and what values to set, and build learning curves to analyze your hyperparameter choices.
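To give a flavour of the chapter, here is a minimal sketch of the parameters-versus-hyperparameters distinction in scikit-learn. The dataset and model settings are illustrative stand-ins, not the course's exact exercises:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

# Any small classification dataset will do; this built-in one is just a stand-in.
X, y = load_breast_cancer(return_X_y=True)

# Parameters are learned from the data during fitting,
# e.g. the coefficients of a logistic regression.
log_reg = LogisticRegression(max_iter=10000).fit(X, y)
print("First three learned coefficients (parameters):", log_reg.coef_[0][:3])

# Hyperparameters are set by you before fitting,
# e.g. the number and depth of trees in a random forest.
rf = RandomForestClassifier(n_estimators=200, max_depth=4, random_state=42).fit(X, y)
print("Chosen hyperparameters:",
      rf.get_params()["n_estimators"], rf.get_params()["max_depth"])
```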
Exercises:
- Introduction & 'Parameters' (50 xp)
- Parameters in Logistic Regression (50 xp)
- Extracting a Logistic Regression parameter (100 xp)
- Extracting a Random Forest parameter (100 xp)
- Introducing Hyperparameters (50 xp)
- Hyperparameters in Random Forests (50 xp)
- Exploring Random Forest Hyperparameters (100 xp)
- Hyperparameters of KNN (100 xp)
- Setting & Analyzing Hyperparameter Values (50 xp)
- Automating Hyperparameter Choice (100 xp)
- Building Learning Curves (100 xp)
Chapter 2: Grid Search
This chapter introduces you to a popular automated hyperparameter tuning methodology called Grid Search. You will learn what it is, how it works and practice undertaking a Grid Search using Scikit Learn. You will then learn how to analyze the output of a Grid Search & gain practical experience doing this.
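As a taste of what the exercises build toward, here is a minimal GridSearchCV sketch. The estimator, parameter grid and dataset are illustrative assumptions rather than the course's exact credit card default setup:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset

# Exhaustively try every combination in the grid (2 x 3 = 6 models, each cross-validated).
param_grid = {"n_estimators": [100, 300], "max_depth": [2, 4, 8]}
grid = GridSearchCV(RandomForestClassifier(random_state=42),
                    param_grid, cv=5, scoring="accuracy")
grid.fit(X, y)

print("Best hyperparameters:", grid.best_params_)
print("Best cross-validated accuracy:", grid.best_score_)
# grid.cv_results_ holds the full output you will learn to analyze.
```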
Exercises:
- Introducing Grid Search (50 xp)
- Build Grid Search functions (100 xp)
- Iteratively tune multiple hyperparameters (100 xp)
- How Many Models? (50 xp)
- Grid Search with Scikit Learn (50 xp)
- GridSearchCV inputs (50 xp)
- GridSearchCV with Scikit Learn (100 xp)
- Understanding a grid search output (50 xp)
- Using the best outputs (50 xp)
- Exploring the grid search results (100 xp)
- Analyzing the best results (100 xp)
- Using the best results (100 xp)
Chapter 3: Random Search
In this chapter you will be introduced to another popular automated hyperparameter tuning methodology called Random Search. You will learn what it is, how it works and, importantly, how it differs from Grid Search. You will learn some advantages and disadvantages of this method and when to choose it over Grid Search. You will practice undertaking a Random Search with Scikit Learn as well as visualizing & interpreting the output.
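For comparison with the Grid Search sketch above, here is a minimal RandomizedSearchCV example. Again, the distributions and dataset are illustrative stand-ins rather than the course's exact code:

```python
from scipy.stats import randint
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset

# Instead of every combination, sample n_iter random combinations from the distributions.
param_distributions = {"n_estimators": randint(50, 500),
                       "max_depth": randint(2, 20)}
rand_search = RandomizedSearchCV(RandomForestClassifier(random_state=42),
                                 param_distributions,
                                 n_iter=10, cv=5, random_state=42)
rand_search.fit(X, y)

print("Best hyperparameters:", rand_search.best_params_)
print("Best cross-validated accuracy:", rand_search.best_score_)
```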
Exercises:
- Introducing Random Search (50 xp)
- Randomly Sample Hyperparameters (100 xp)
- Randomly Search with Random Forest (100 xp)
- Visualizing a Random Search (100 xp)
- Random Search in Scikit Learn (50 xp)
- RandomSearchCV inputs (50 xp)
- The RandomizedSearchCV Object (100 xp)
- RandomSearchCV in Scikit Learn (100 xp)
- Comparing Grid and Random Search (50 xp)
- Comparing Random & Grid Search (50 xp)
- Grid and Random Search Side by Side (100 xp)
Chapter 4: Informed Search
In this final chapter you will be given a taste of more advanced hyperparameter tuning methodologies known as "informed search". This includes a methodology known as Coarse To Fine as well as Bayesian & Genetic hyperparameter tuning algorithms. You will learn how informed search differs from uninformed search and gain practical skills with each of the mentioned methodologies, comparing and contrasting them as you go.
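As a preview of the Bayesian exercises, here is a minimal Hyperopt sketch that tunes a random forest with the Tree-structured Parzen Estimator. The search space, objective and dataset are illustrative assumptions, not the course's exact code:

```python
from hyperopt import fmin, tpe, hp, Trials
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset

# Search space: Hyperopt proposes new candidates based on how previous ones scored.
space = {"n_estimators": hp.quniform("n_estimators", 50, 500, 50),
         "max_depth": hp.quniform("max_depth", 2, 20, 1)}

def objective(params):
    model = RandomForestClassifier(n_estimators=int(params["n_estimators"]),
                                   max_depth=int(params["max_depth"]),
                                   random_state=42)
    # Hyperopt minimizes, so return the negative cross-validated accuracy.
    return -cross_val_score(model, X, y, cv=3).mean()

best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=20, trials=Trials())
print("Best hyperparameters found:", best)
```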
Exercises:
- Informed Search: Coarse to Fine (50 xp)
- Visualizing Coarse to Fine (100 xp)
- Coarse to Fine Iterations (100 xp)
- Informed Search: Bayesian Statistics (50 xp)
- Bayes Rule in Python (100 xp)
- Bayesian Hyperparameter tuning with Hyperopt (100 xp)
- Informed Search: Genetic Algorithms (50 xp)
- Genetic Hyperparameter Tuning with TPOT (100 xp)
- Analysing TPOT's stability (100 xp)
- Congratulations! (50 xp)
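The genetic-algorithm exercises above use TPOT, which evolves whole modelling pipelines over successive generations. A minimal sketch, assuming the classic TPOTClassifier API and an illustrative dataset:

```python
from tpot import TPOTClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A genetic algorithm keeps, crosses over and mutates the best-performing pipelines
# in each generation. Small settings here just keep the run quick.
tpot = TPOTClassifier(generations=3, population_size=10, cv=3,
                      verbosity=2, random_state=42)
tpot.fit(X_train, y_train)
print("Hold-out accuracy:", tpot.score(X_test, y_test))
```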
Instructor: Alex Scriven
Senior Data Scientist @ Atlassian
Alex is a Senior Data Scientist working for Atlassian in Sydney, Australia, with previous experience in government, agencies and startups. He also holds lecturing and research positions at the University of Technology Sydney and the University of New South Wales. He has built and delivered several Masters-level courses in machine learning & deep learning whilst researching applications of machine learning & data science in industry. Coming from a heavily commercial background, Alex greatly enjoys bridging the gap between cutting-edge technology and business applications.
Join over 14 million learners and start Hyperparameter Tuning in Python today!