Linear algebra is one of the most important sets of tools in applied mathematics and data science. In this course, you’ll learn how to work with vectors and matrices, solve matrix-vector equations, perform eigenvalue/eigenvector analyses, and use principal component analysis to perform dimension reduction on real-world datasets. All analyses will be performed in R, one of the world’s most popular programming languages.
Introduction to Linear Algebra
In this chapter, you will learn about the key objects in linear algebra, such as vectors and matrices. You will understand why they are important and how they interact with each other.
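As a small taste of what the chapter covers, base R represents these objects natively; a minimal sketch (the particular numbers are just illustrative):

```r
# A vector in R is a sequence of numbers
v <- c(1, 2, 3)

# A matrix is filled column-by-column by default
A <- matrix(c(1, 2, 3, 4), nrow = 2)

# Matrices act on vectors via %*%, the matrix product
w <- A %*% c(1, 1)
```

Here `A` has columns (1, 2) and (3, 4), so multiplying by the vector (1, 1) sums the columns, giving (4, 6).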
Matrix-Vector Equations
Many machine learning algorithms boil down to solving a matrix-vector equation. In this chapter, you’ll learn what matrix-vector equations are trying to accomplish and how to solve them in R.
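For a concrete preview, base R’s `solve()` handles equations of the form Ax = b; a minimal sketch with made-up numbers:

```r
# Solve A x = b for the unknown vector x
A <- matrix(c(2, 1, 1, 3), nrow = 2)
b <- c(3, 5)
x <- solve(A, b)

# Check the answer: A %*% x should reproduce b
# (up to floating-point rounding)
residual <- A %*% x - b
```

This is the 2-by-2 system 2x₁ + x₂ = 3, x₁ + 3x₂ = 5, whose solution is x = (0.8, 1.4).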
Eigenvalues and Eigenvectors
Matrix operations can be complex. Eigenvalue/eigenvector analyses allow you to decompose these operations into simpler ones for applications such as image recognition, genomic analysis, and more!
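In R, this decomposition is a one-liner via `eigen()`; a minimal sketch using a diagonal matrix so the answer is easy to see:

```r
# eigen() returns both eigenvalues and eigenvectors
A <- matrix(c(2, 0, 0, 3), nrow = 2)  # diagonal, so eigenvalues are 2 and 3
e <- eigen(A)

e$values   # eigenvalues, sorted in decreasing order: 3, then 2
e$vectors  # one eigenvector per column
```

For each eigenpair, `A %*% v` equals `lambda * v`, which is the sense in which the matrix’s action has been reduced to simple scaling.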
Principal Component Analysis
“Big Data” is ubiquitous in data science and its applications. However, redundancy in these datasets can be problematic. In this chapter, we learn about principal component analysis and how it can be used in dimension reduction.
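To preview the idea, base R’s `prcomp()` performs PCA; a minimal sketch on simulated data where the second column is nearly a copy of the first (so one component should capture almost everything):

```r
# Two nearly redundant columns: y is essentially 2 * x plus small noise
set.seed(1)
x <- rnorm(100)
dat <- data.frame(x = x, y = 2 * x + rnorm(100, sd = 0.1))

# PCA on the standardized data
pca <- prcomp(dat, scale. = TRUE)

summary(pca)  # the first principal component explains most of the variance
```

Because the two columns carry almost the same information, nearly all of the variance lands on PC1, which is exactly the redundancy that dimension reduction exploits.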
Data Scientist at Pro Football Focus
Eric Eager is a data scientist for Pro Football Focus, where he analyzes data for all 32 National Football League teams and over 40 college football teams. Before joining PFF in 2018, he was a professor in the Department of Mathematics and Statistics at the University of Wisconsin – La Crosse, where he published over 20 papers in mathematical biology and the scholarship of teaching and learning while securing more than $300,000 in National Science Foundation funding for undergraduate mentorship.