
Mike Gelbart

Instructor, the University of British Columbia

Mike Gelbart is an Instructor in the Department of Computer Science at the University of British Columbia (UBC) in Vancouver, Canada. He also teaches in, and co-designed, the Master of Data Science program at UBC. Mike received his undergraduate degree in physics from Princeton University and his PhD from the machine learning group at Harvard University, working on hyperparameter optimization for machine learning.

Collaborator(s)
  • Nick Solomon
  • Kara Woo

Course Description

In this course you'll learn all about using linear classifiers, specifically logistic regression and support vector machines, with scikit-learn. Once you've learned how to apply these methods, you'll dive into the ideas behind them and find out what really makes them tick. At the end of this course you'll know how to train, test, and tune these linear classifiers in Python. You'll also have a conceptual foundation for understanding many other machine learning algorithms.

  1. Applying logistic regression and SVM

    Free

    In this chapter you will learn the basics of applying logistic regression and support vector machines (SVMs) to classification problems. You'll use the scikit-learn library to fit classification models to real data (the first sketch after this chapter list illustrates the kind of workflow involved).

  2. Loss functions

    In this chapter you will discover the conceptual framework behind logistic regression and SVMs. This will let you delve deeper into the inner workings of these models.

  3. Logistic regression

    In this chapter you will delve into the details of logistic regression. You'll learn all about regularization and how to interpret model output.

  4. Support Vector Machines

    In this chapter you will learn the details of support vector machines. You'll learn about tuning hyperparameters for these models and using kernels to fit non-linear decision boundaries (the second sketch after this chapter list illustrates one way to do this with scikit-learn).
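
The course's own datasets and exercises are not shown on this page, but a minimal sketch of the kind of scikit-learn workflow Chapter 1 describes might look like the following. The wine dataset, the scaling step, and the parameter values are illustrative assumptions, not taken from the course.

```python
# Minimal sketch of fitting linear classifiers with scikit-learn.
# Dataset and parameter values are illustrative assumptions only.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Logistic regression: C is the inverse of regularization strength.
# Scaling the features first helps the solver converge.
logreg = make_pipeline(StandardScaler(), LogisticRegression(C=1.0))
logreg.fit(X_train, y_train)
print("Logistic regression test accuracy:", logreg.score(X_test, y_test))

# Linear SVM: same fit/score interface, but it minimizes the hinge loss
# instead of the logistic loss.
svm = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
svm.fit(X_train, y_train)
print("Linear SVM test accuracy:", svm.score(X_test, y_test))
```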
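
Chapters 3 and 4 cover regularization, hyperparameter tuning, and kernels. A second sketch of what that might look like in scikit-learn is below; the parameter grid and dataset are again assumptions chosen purely for illustration.

```python
# Sketch of tuning an RBF-kernel SVM with cross-validated grid search.
# The parameter grid and dataset are illustrative assumptions only.
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([("scale", StandardScaler()), ("svc", SVC(kernel="rbf"))])

# C controls regularization strength; gamma controls how flexible
# (non-linear) the RBF decision boundary is allowed to be.
param_grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
print("Test accuracy:", search.score(X_test, y_test))
```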
