Prerequisites
Extreme Gradient Boosting with XGBoost

1. Kaggle Competitions Process
In this first chapter, you will get exposure to the Kaggle competition process. You will train a model and prepare a CSV file ready for submission. You will learn the difference between Public and Private test splits, and how to prevent overfitting.
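The train-predict-submit workflow described above can be sketched as follows. This is a minimal illustration using scikit-learn and synthetic data; the column names (`id`, `target`) and the model choice are assumptions, since real competitions each define their own submission format.

```python
# Minimal sketch of a Kaggle submission workflow (illustrative;
# actual competition datasets and column names will differ).
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-ins for the competition's train and test data
X_train, y_train = make_classification(n_samples=200, random_state=0)
X_test, _ = make_classification(n_samples=50, random_state=1)

# Train a model and predict on the test set
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
preds = model.predict(X_test)

# A submission is typically a CSV with an id column and a prediction column
submission = pd.DataFrame({"id": range(len(preds)), "target": preds})
submission.to_csv("submission.csv", index=False)
```

The Public leaderboard scores only part of this file; the Private split, revealed at the end, scores the rest, which is why a model tuned only to the Public score can overfit.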
2. Dive into the Competition
Now that you know the basics of Kaggle competitions, you will learn how to study the specific problem at hand. You will practice EDA and establish robust local validation strategies. You will also learn about data leakage.
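A local validation strategy can be sketched with K-fold cross-validation, as below. This is an illustrative example on synthetic data; the estimator and number of folds are assumptions, not the course's specific setup.

```python
# Sketch of a local validation strategy using K-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = make_classification(n_samples=300, random_state=0)

# Shuffled K-fold gives a more reliable local score estimate than a
# single train/test split, which helps detect overfitting early
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
print(f"CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

A trustworthy local score that tracks the leaderboard is also the main defense against data leakage: a local score far above the leaderboard score is a common symptom of a leaky feature or split.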
3. Feature Engineering
You will now get exposure to different types of features. You will modify existing features and create new ones. You will also learn how to handle missing data appropriately.
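The three steps above (modify existing features, create new ones, treat missing data) can be sketched with pandas. The DataFrame and column names here are invented for illustration.

```python
# Sketch of simple feature engineering and missing-value treatment.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "price": [100.0, 250.0, np.nan, 80.0],
    "rooms": [2, 4, 3, 1],
    "city": ["NY", "LA", "NY", None],
})

# Treat missing data first: median for numerics, a sentinel for categoricals
df["price"] = df["price"].fillna(df["price"].median())
df["city"] = df["city"].fillna("missing")

# Modify an existing feature: log-transform a skewed numeric column
df["log_price"] = np.log1p(df["price"])

# Create a new feature by combining existing ones
df["price_per_room"] = df["price"] / df["rooms"]
```

Filling missing values before deriving new features keeps the derived columns free of NaNs.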
4. Modeling
Time to bring everything together and build some models! In this last chapter, you will build a base model before tuning some hyperparameters and improving your results with ensembles. You will then get some final tips and tricks to help you compete more efficiently.
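The progression described above (base model, then hyperparameter tuning, then an ensemble) might look like the following scikit-learn sketch. The models, parameter grid, and blending scheme are illustrative assumptions, not the course's exact recipe.

```python
# Sketch of the modeling progression: base model, small hyperparameter
# search, then a simple probability-averaging ensemble.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=400, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# 1. Base model with default parameters as a starting point
base = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# 2. Tune a couple of hyperparameters with a small grid search
grid = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    {"n_estimators": [50, 100], "max_depth": [2, 3]},
    cv=3,
).fit(X_tr, y_tr)

# 3. Blend: average the predicted probabilities of both models
blend = (base.predict_proba(X_val)[:, 1]
         + grid.predict_proba(X_val)[:, 1]) / 2
blend_preds = (blend > 0.5).astype(int)
```

Averaging predictions from diverse models often scores better than any single model, which is why ensembling is a staple of winning Kaggle solutions.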
Winning a Kaggle Competition in Python