In this course you'll learn all about using linear classifiers, specifically logistic regression and support vector machines, with scikit-learn. Once you've learned how to apply these methods, you'll dive into the ideas behind them and find out what really makes them tick. At the end of this course you'll know how to train, test, and tune these linear classifiers in Python. You'll also have a conceptual foundation for understanding many other machine learning algorithms.
Applying logistic regression and SVM
In this chapter you will learn the basics of applying logistic regression and support vector machines (SVMs) to classification problems. You'll use the scikit-learn library to fit classification models to real data.

- scikit-learn refresher (50 xp)
- KNN classification (100 xp)
- Comparing models (50 xp)
- Overfitting (50 xp)
- Applying logistic regression and SVM (50 xp)
- Running LogisticRegression and SVC (100 xp)
- Sentiment analysis for movie reviews (100 xp)
- Linear classifiers (50 xp)
- Which decision boundary is linear? (50 xp)
- Visualizing decision boundaries (100 xp)
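The chapter 1 workflow can be sketched as follows. This is an illustrative example, not the course's own exercises: the wine dataset and the specific hyperparameters stand in for whatever data the course uses.

```python
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

# Load a built-in dataset and hold out a test set.
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit each classifier and compare test accuracy.
scores = {}
for clf in (KNeighborsClassifier(n_neighbors=5),
            LogisticRegression(max_iter=5000),
            SVC()):
    clf.fit(X_train, y_train)
    scores[type(clf).__name__] = clf.score(X_test, y_test)
    print(type(clf).__name__, scores[type(clf).__name__])
```

All three models share the same `fit`/`predict`/`score` interface, which is what makes this kind of side-by-side comparison a one-liner per model in scikit-learn.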
Loss functions

In this chapter you will discover the conceptual framework behind logistic regression and SVMs. This will let you delve deeper into the inner workings of these models.

- Linear classifiers: the coefficients (50 xp)
- How models make predictions (50 xp)
- Changing the model coefficients (100 xp)
- What is a loss function? (50 xp)
- The 0-1 loss (50 xp)
- Minimizing a loss function (100 xp)
- Loss function diagrams (50 xp)
- Classification loss functions (50 xp)
- Comparing the logistic and hinge losses (100 xp)
- Implementing logistic regression (100 xp)
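The core idea of this chapter can be sketched directly: logistic regression's coefficients are the minimizer of a regularized logistic loss, so minimizing that loss yourself should recover scikit-learn's answer. This is a sketch under stated assumptions (breast cancer data, standardized features, no intercept), not the course's own exercise code.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
y_signed = 2 * y - 1  # relabel {0, 1} as {-1, +1}

C = 1.0  # inverse regularization strength, matching scikit-learn's convention

def regularized_logistic_loss(w):
    # 0.5 * ||w||^2 + C * sum over examples of log(1 + exp(-y_i * w.x_i));
    # np.logaddexp(0, z) computes log(1 + exp(z)) without overflow
    return 0.5 * w @ w + C * np.sum(np.logaddexp(0, -y_signed * (X @ w)))

# Minimize the loss directly with a general-purpose optimizer...
w_opt = minimize(regularized_logistic_loss, np.zeros(X.shape[1])).x

# ...and compare against scikit-learn minimizing the same objective.
lr = LogisticRegression(C=C, fit_intercept=False, max_iter=1000).fit(X, y)
print(np.max(np.abs(w_opt - lr.coef_.ravel())))  # small: same minimizer
```

Swapping `np.logaddexp(0, -z)` for `np.maximum(0, 1 - z)` in the loss would give the hinge loss used by SVMs, which is the comparison this chapter builds toward.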
Logistic regression

In this chapter you will delve into the details of logistic regression. You'll learn all about regularization and how to interpret model output.

- Logistic regression and regularization (50 xp)
- Regularized logistic regression (100 xp)
- Logistic regression and feature selection (100 xp)
- Identifying the most positive and negative words (100 xp)
- Logistic regression and probabilities (50 xp)
- Getting class probabilities (50 xp)
- Regularization and probabilities (100 xp)
- Visualizing easy and difficult examples (100 xp)
- Multi-class logistic regression (50 xp)
- Counting the coefficients (50 xp)
- Fitting multi-class logistic regression (100 xp)
- Visualizing multi-class logistic regression (100 xp)
- One-vs-rest SVM (100 xp)
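The chapter's main themes can be sketched in a few lines: in scikit-learn, `C` is the inverse regularization strength (smaller `C` means stronger regularization), `predict_proba` exposes class probabilities, and multi-class fitting produces one coefficient vector per class. The iris dataset and the specific `C` values here are illustrative, not taken from the course.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

weak = LogisticRegression(C=100, max_iter=1000).fit(X, y)    # weak regularization
strong = LogisticRegression(C=0.01, max_iter=1000).fit(X, y)  # strong regularization

# Stronger regularization shrinks the coefficients toward zero...
print(np.abs(weak.coef_).sum(), np.abs(strong.coef_).sum())

# ...and tends to produce less confident (closer-to-uniform) probabilities.
print(weak.predict_proba(X[:1]).round(3))
print(strong.predict_proba(X[:1]).round(3))

# Multi-class: one coefficient vector per class (3 classes x 4 features).
print(weak.coef_.shape)
```

For the feature-selection lessons, the relevant switch is `penalty='l1'` (with a compatible solver such as `'liblinear'`), which drives some coefficients exactly to zero.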
Support Vector Machines
In this chapter you will learn all about the details of support vector machines. You'll learn about tuning hyperparameters for these models and using kernels to fit non-linear decision boundaries.

- Support vectors (50 xp)
- Support vector definition (50 xp)
- Effect of removing examples (100 xp)
- Kernel SVMs (50 xp)
- GridSearchCV warm-up (100 xp)
- Jointly tuning gamma and C with GridSearchCV (100 xp)
- Comparing logistic regression and SVM (and beyond) (50 xp)
- An advantage of SVMs (50 xp)
- An advantage of logistic regression (50 xp)
- Using SGDClassifier (100 xp)
- Conclusion (50 xp)
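The hyperparameter-tuning workflow from this chapter can be sketched as a joint grid search over `gamma` and `C` for an RBF-kernel SVC. The digits dataset and the grid values below are illustrative stand-ins, not the course's own choices.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Search all combinations of gamma and C with 3-fold cross-validation.
param_grid = {"gamma": [0.0001, 0.001, 0.01], "C": [0.1, 1, 10]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=3)
search.fit(X_train, y_train)

print(search.best_params_)
print(search.score(X_test, y_test))  # best model, evaluated on held-out data
```

For the final lessons, `SGDClassifier` covers the same ground at scale: `loss="log_loss"` recovers logistic regression and `loss="hinge"` recovers a linear SVM, both trained by stochastic gradient descent.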
Prerequisites: Supervised Learning with scikit-learn
Mike Gelbart, Instructor, The University of British Columbia
Mike Gelbart is an Instructor in the Department of Computer Science at the University of British Columbia (UBC) in Vancouver, Canada. He also teaches in, and co-designed, the Master of Data Science program at UBC. Mike received his undergraduate degree in physics from Princeton University and his PhD from the machine learning group at Harvard University, working on hyperparameter optimization for machine learning.