Linear Classifiers in Python
In this course you will learn the details of linear classifiers like logistic regression and SVM.
4 Hours · 13 Videos · 44 Exercises · 45,860 Learners
In this course you'll learn all about using linear classifiers, specifically logistic regression and support vector machines, with scikit-learn. Once you've learned how to apply these methods, you'll dive into the ideas behind them and find out what really makes them tick. At the end of this course you'll know how to train, test, and tune these linear classifiers in Python. You'll also have a conceptual foundation for understanding many other machine learning algorithms.
Applying logistic regression and SVM
In this chapter you will learn the basics of applying logistic regression and support vector machines (SVMs) to classification problems. You'll use the scikit-learn library to fit classification models to real data.
- scikit-learn refresher (50 xp)
- KNN classification (100 xp)
- Comparing models (50 xp)
- Overfitting (50 xp)
- Applying logistic regression and SVM (50 xp)
- Running LogisticRegression and SVC (100 xp)
- Sentiment analysis for movie reviews (100 xp)
- Linear classifiers (50 xp)
- Which decision boundary is linear? (50 xp)
- Visualizing decision boundaries (100 xp)
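The fit/predict workflow this chapter teaches can be sketched as follows. This is a minimal illustration, not one of the course's own exercises: the wine dataset and the linear-kernel choice here are my assumptions.

```python
# A minimal sketch (not the course's own exercises) of fitting the two
# linear classifiers from this chapter with scikit-learn. The wine
# dataset is an illustrative choice, not the course's data.
from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both classifiers share scikit-learn's fit/predict/score interface.
logreg = LogisticRegression(max_iter=5000).fit(X_train, y_train)
svm = SVC(kernel="linear").fit(X_train, y_train)

print("LogisticRegression accuracy:", logreg.score(X_test, y_test))
print("SVC accuracy:", svm.score(X_test, y_test))
```

Swapping one estimator for the other changes only the constructor call, which is why the course can treat them with a single workflow.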
Loss functions
In this chapter you will discover the conceptual framework behind logistic regression and SVMs. This will let you delve deeper into the inner workings of these models.
- Linear classifiers: the coefficients (50 xp)
- How models make predictions (50 xp)
- Changing the model coefficients (100 xp)
- What is a loss function? (50 xp)
- The 0-1 loss (50 xp)
- Minimizing a loss function (100 xp)
- Loss function diagrams (50 xp)
- Classification loss functions (50 xp)
- Comparing the logistic and hinge losses (100 xp)
- Implementing logistic regression (100 xp)
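The three classification losses this chapter compares can be written in a few lines of NumPy. A small sketch, with assumptions: `margin` stands for y · (w·x) with labels y in {-1, +1}, and the variable names are mine, not the course's.

```python
# A small NumPy sketch of the losses compared in this chapter. "margin"
# stands for y * (w @ x) with labels y in {-1, +1}; the variable names
# are mine, not the course's.
import numpy as np

margin = np.linspace(-2, 2, 5)  # raw model output times the true label

zero_one = (margin <= 0).astype(float)  # 1 if misclassified: flat, hard to minimize
hinge = np.maximum(0, 1 - margin)       # surrogate loss used by SVMs
logistic = np.log(1 + np.exp(-margin))  # surrogate loss used by logistic regression

print(zero_one)
print(hinge)
print(logistic)
```

The 0-1 loss is a step function, so gradient-based minimization fails on it; the hinge and logistic losses are the smooth (or convex) surrogates that make the two models trainable.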
Logistic regression
In this chapter you will delve into the details of logistic regression. You'll learn all about regularization and how to interpret model output.
- Logistic regression and regularization (50 xp)
- Regularized logistic regression (100 xp)
- Logistic regression and feature selection (100 xp)
- Identifying the most positive and negative words (100 xp)
- Logistic regression and probabilities (50 xp)
- Getting class probabilities (50 xp)
- Regularization and probabilities (100 xp)
- Visualizing easy and difficult examples (100 xp)
- Multi-class logistic regression (50 xp)
- Counting the coefficients (50 xp)
- Fitting multi-class logistic regression (100 xp)
- Visualizing multi-class logistic regression (100 xp)
- One-vs-rest SVM (100 xp)
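The chapter's regularization story can be sketched briefly: in scikit-learn, `C` is the inverse regularization strength, so a smaller `C` shrinks the coefficients and makes `predict_proba` less confident. The dataset and the particular `C` values below are my illustrative choices, not the course's.

```python
# A hedged sketch of this chapter's regularization idea. In scikit-learn,
# C is the *inverse* regularization strength; the dataset and C values
# here are illustrative choices of mine, not the course's exercises.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)

weak = LogisticRegression(C=100.0, max_iter=10000).fit(X, y)   # weak regularization
strong = LogisticRegression(C=0.01, max_iter=10000).fit(X, y)  # strong regularization

# Stronger regularization shrinks the coefficient vector...
print(np.abs(weak.coef_).sum(), np.abs(strong.coef_).sum())

# ...and pulls predict_proba outputs toward 0.5.
print(weak.predict_proba(X[:1]))
print(strong.predict_proba(X[:1]))
```

Inspecting `coef_` this way is also the basis of the chapter's feature-selection and "most positive/negative words" exercises: large-magnitude coefficients mark the most influential features.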
Support Vector Machines
In this chapter you will learn the details of support vector machines: how to tune their hyperparameters and how to use kernels to fit non-linear decision boundaries.
- Support vectors (50 xp)
- Support vector definition (50 xp)
- Effect of removing examples (100 xp)
- Kernel SVMs (50 xp)
- GridSearchCV warm-up (100 xp)
- Jointly tuning gamma and C with GridSearchCV (100 xp)
- Comparing logistic regression and SVM (and beyond) (50 xp)
- An advantage of SVMs (50 xp)
- An advantage of logistic regression (50 xp)
- Using SGDClassifier (100 xp)
- Conclusion (50 xp)
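Jointly tuning gamma and C with GridSearchCV, as this chapter does, can be sketched as below. The dataset, the scaling pipeline, and the grid values are my illustrative choices, not the course's exercises.

```python
# A minimal sketch of jointly tuning gamma and C for an RBF-kernel SVM
# with GridSearchCV. The dataset, pipeline, and grid values are my
# illustrative assumptions, not the course's own exercises.
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
param_grid = {"svc__gamma": [0.001, 0.01, 0.1], "svc__C": [0.1, 1, 10]}

# 5-fold cross-validation over every (gamma, C) pair in the grid.
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)
print(search.best_score_)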
In the following tracks:
- Machine Learning Fundamentals with Python
- Machine Learning Scientist with Python

Prerequisites: Supervised Learning with scikit-learn
Mike Gelbart
Instructor, the University of British Columbia
Mike Gelbart is an Instructor in the Department of Computer Science at the University of British Columbia (UBC) in Vancouver, Canada. He also teaches in, and co-designed, the Master of Data Science program at UBC. Mike received his undergraduate degree in physics from Princeton University and his PhD from the machine learning group at Harvard University, working on hyperparameter optimization for machine learning.
Don’t just take our word for it
4.2 from 16 reviews
- Andrew G., about 1 month ago: "Really good course."
- Kayleigh W., about 1 month ago
- Isaac H., about 1 month ago: "The professor is incredibly clear. Concepts are explained in a simple and efficient way. Overall, one of the best courses I have taken on DataCamp!"
- 찬 박., 3 months ago: "It was good for reviewing and for putting things into practice."
- Megan S., 4 months ago: "Revised the application of Lasso and Ridge regularisation, understood linear classification through visualisation with linear boundaries, and learned about KNearestNeighbors, LogisticRegression, SVMs and their applications. Detailed exploration of how loss functions work. Great content from DataCamp as always."