Intermediate Regression with statsmodels in Python
4.2 (11 reviews)
Advanced
Learn to perform linear and logistic regression with multiple explanatory variables.
4 Hours | 14 Videos | 52 Exercises | 6,438 Learners
Course Description
Linear regression and logistic regression are the two most widely used statistical models and act like master keys, unlocking the secrets hidden in datasets. In this course, you'll build on the skills you gained in "Introduction to Regression in Python with statsmodels" as you learn about linear and logistic regression with multiple explanatory variables. Through hands-on exercises, you'll explore the relationships between variables in real-world datasets, including Taiwan house prices and customer churn. By the end of this course, you'll know how to include multiple explanatory variables in a model, discover how interactions between variables affect predictions, and understand how linear and logistic regression work.
1. Parallel Slopes (Free)
Extend your linear regression skills to parallel slopes regression, with one numeric and one categorical explanatory variable. This is the first step towards conquering multiple linear regression.
- Parallel slopes linear regression (50 xp)
- Fitting a parallel slopes linear regression (100 xp)
- Interpreting parallel slopes coefficients (100 xp)
- Visualizing each explanatory variable (100 xp)
- Visualizing parallel slopes (100 xp)
- Predicting parallel slopes (50 xp)
- Predicting with a parallel slopes model (100 xp)
- Visualizing parallel slopes model predictions (100 xp)
- Manually calculating predictions (100 xp)
- Assessing model performance (50 xp)
- Comparing coefficients of determination (100 xp)
- Comparing residual standard error (100 xp)
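A parallel slopes model like the one in this chapter can be sketched with statsmodels' formula API. The data below is simulated (the column names `price_twd_msq`, `n_convenience`, and `house_age_years` mirror the course's Taiwan house prices dataset, but the values are synthetic):

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols

# Simulated stand-in for the Taiwan house prices dataset (column names
# mirror the course data; the values here are synthetic).
rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "n_convenience": rng.integers(0, 10, n),
    "house_age_years": rng.choice(["0 to 15", "15 to 30", "30 to 45"], n),
})
df["price_twd_msq"] = (
    8
    + 0.8 * df["n_convenience"]                      # common slope
    - 2.0 * (df["house_age_years"] == "30 to 45")    # category shift
    + rng.normal(0, 1, n)
)

# "+ 0" drops the global intercept, so each house-age category gets its
# own intercept while all categories share one slope for n_convenience.
mdl = ols("price_twd_msq ~ n_convenience + house_age_years + 0", data=df).fit()
print(mdl.params)  # three intercepts plus one shared slope
```

The `+ 0` trick is what makes the coefficients easy to read: each `house_age_years[...]` parameter is that category's own intercept, rather than an offset from a baseline category.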
2. Interactions
Explore the effect of interactions between explanatory variables. Considering interactions allows for more realistic models that can have better predictive power. You'll also deal with Simpson's Paradox: a non-intuitive result that can arise when you have multiple explanatory variables.
- Models for each category (50 xp)
- One model per category (100 xp)
- Predicting multiple models (100 xp)
- Visualizing multiple models (100 xp)
- Assessing model performance (100 xp)
- One model with an interaction (50 xp)
- Specifying an interaction (100 xp)
- Interactions with understandable coeffs (100 xp)
- Making predictions with interactions (50 xp)
- Predicting with interactions (100 xp)
- Manually calculating predictions with interactions (100 xp)
- Simpson's Paradox (50 xp)
- Modeling eBay auctions (100 xp)
- Modeling each auction type (100 xp)
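The two ways of specifying an interaction covered in this chapter can be sketched as follows. The data is synthetic, simulated so that the slope of `y` on `x` genuinely differs between two groups:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols

# Synthetic data in which the slope of y on x differs by group,
# so an interaction term is needed to model it well.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "x": rng.uniform(0, 10, n),
    "group": rng.choice(["a", "b"], n),
})
true_slope = np.where(df["group"] == "a", 1.0, 2.5)
df["y"] = 3 + true_slope * df["x"] + rng.normal(0, 1, n)

# x * group expands to x + group + x:group (main effects plus interaction)
mdl = ols("y ~ x * group", data=df).fit()

# Reparameterization with understandable coefficients:
# one intercept and one slope per group.
mdl_readable = ols("y ~ group + group:x + 0", data=df).fit()
print(mdl_readable.params)
```

Both models make identical predictions; the second formula just reports the coefficients as a per-group intercept and a per-group slope instead of baselines plus offsets.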
3. Multiple Linear Regression
See how modeling and linear regression make it easy to work with more than two explanatory variables. Once you've mastered fitting linear regression models, you'll get to implement your own linear regression algorithm.
- Two numeric explanatory variables (50 xp)
- Interactive 3D scatter plot (50 xp)
- Visualizing three numeric variables (100 xp)
- Modeling two numeric explanatory variables (100 xp)
- Visualizing two numeric explanatory variables (100 xp)
- Including an interaction (100 xp)
- More than two explanatory variables (50 xp)
- Visualizing many variables (100 xp)
- Different levels of interaction (100 xp)
- Predicting again (100 xp)
- How linear regression works (50 xp)
- The sum of squares (50 xp)
- Linear regression algorithm (100 xp)
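The "implement your own linear regression algorithm" idea rests on one fact: the fitted coefficients are exactly the ones that minimize the sum of squared residuals. A minimal sketch of that, on simulated data and assuming SciPy is available for the optimizer:

```python
import numpy as np
import pandas as pd
from scipy.optimize import minimize
from statsmodels.formula.api import ols

# Simulated data with two numeric explanatory variables.
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = 1.5 + 2.0 * df["x1"] - 0.5 * df["x2"] + rng.normal(0, 0.5, n)

# The standard fit:
mdl = ols("y ~ x1 + x2", data=df).fit()

# "Rolling your own": choose the coefficients that minimize the sum of
# squared residuals, using a general-purpose numerical optimizer.
def sum_of_squares(coeffs):
    intercept, slope1, slope2 = coeffs
    residuals = df["y"] - (intercept + slope1 * df["x1"] + slope2 * df["x2"])
    return np.sum(residuals ** 2)

res = minimize(sum_of_squares, x0=[0.0, 0.0, 0.0])
print(res.x)  # essentially the same numbers as mdl.params
```

The optimizer lands on (almost) the same intercept and slopes as `ols`, which is the point of the exercise: `ols` is just a fast, exact solver for this particular minimization.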
4. Multiple Logistic Regression
Extend your logistic regression skills to multiple explanatory variables. You'll also learn about the logistic distribution, which underpins this form of regression, before implementing your own logistic regression algorithm.
- Multiple logistic regression (50 xp)
- Logistic regression with two explanatory variables (100 xp)
- Logistic regression prediction (100 xp)
- Visualizing multiple explanatory variables (100 xp)
- Confusion matrix (100 xp)
- The logistic distribution (50 xp)
- Cumulative distribution function (100 xp)
- Inverse cumulative distribution function (100 xp)
- Logistic distribution parameters (50 xp)
- How logistic regression works (50 xp)
- Likelihood & log-likelihood (50 xp)
- Logistic regression algorithm (100 xp)
- Congratulations! (50 xp)
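A logistic regression with two explanatory variables can be sketched with statsmodels' `logit`. The column names below mirror the course's customer churn dataset (`has_churned`, `time_since_last_purchase`, `time_since_first_purchase`), but the data itself is simulated:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import logit

# Synthetic churn data; the column names mirror the course's churn
# dataset, but the values are simulated.
rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "time_since_last_purchase": rng.normal(0, 1, n),
    "time_since_first_purchase": rng.normal(0, 1, n),
})
# Churn probability comes from the logistic CDF of a linear predictor.
lin_pred = (-0.5
            + 1.2 * df["time_since_last_purchase"]
            - 0.8 * df["time_since_first_purchase"])
df["has_churned"] = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

mdl = logit(
    "has_churned ~ time_since_last_purchase + time_since_first_purchase",
    data=df,
).fit(disp=0)

# Predicted probabilities above 0.5 become predicted churners.
predicted = (mdl.predict(df) > 0.5).astype(int)
accuracy = (predicted == df["has_churned"]).mean()
print(mdl.params)
```

Thresholding the predicted probabilities at 0.5 is what feeds the confusion matrix exercise: cross-tabulating `predicted` against `has_churned` gives the counts of true/false positives and negatives.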
In the following tracks
Statistics Fundamentals with Python

Collaborators
Maarten Van den Broeck
Senior Content Developer at DataCamp
Maarten is an aquatic ecologist and teacher by training and a data scientist by profession. He is also a certified Power BI and Tableau data analyst. After his career as a PhD researcher at KU Leuven, he wished that he had discovered DataCamp sooner. He loves to combine education and data science to develop DataCamp courses. In his spare time, he runs a symphonic orchestra.
Don’t just take our word for it
4.2 from 11 reviews
5 stars: 36%
4 stars: 55%
3 stars: 9%
2 stars: 0%
1 star: 0%
Yannick D. (2 months ago)
I like the process used by Datacamp to teach us each subject. The interface is really easy and efficient to use. Learning programming is fun.

Joanna K. (4 months ago)
Easy to understand, good job :)

Aldo M. (8 months ago)
Very good, useful

Yasser A. (9 months ago)
It's great. It needs a cheat sheet please!! :)
"I like the process used by Datacamp to teach us each subject. The interface is really easy and effecient to use. Learning programming is fun."
Yannick D.
"Easy to understand, good job :)"
Joanna K.
"Very good, useful"
Aldo M.
Join over 11 million learners and start Intermediate Regression with statsmodels in Python today!