Linear algebra is one of the most important sets of tools in applied mathematics and data science. In this course, you’ll learn how to work with vectors and matrices, solve matrix-vector equations, perform eigenvalue/eigenvector analyses, and use principal component analysis to do dimension reduction on real-world datasets. All analyses will be performed in R, one of the world’s most popular programming languages.
Introduction to Linear Algebra
In this chapter, you will learn about the key objects in linear algebra, such as vectors and matrices. You will understand why they are important and how they interact with each other.

- Motivations
- Creating Vectors in R
- The Algebra of Vectors
- Creating Matrices in R
- Matrix-Vector Operations
- Matrix-Vector Compatibility
- Matrix Multiplication as a Transformation
- Reflections
- Matrix-Matrix Calculations
- Matrix Multiplication Compatibility
- Matrix Multiplication - Order Matters
- Intro to The Matrix Inverse
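The basic objects covered in this chapter can be sketched in a few lines of base R. The specific values below are illustrative, not taken from the course exercises:

```r
# Create a vector and a matrix in base R (illustrative values)
v <- c(1, 2, 3)                   # a vector of length 3
A <- matrix(1:9, nrow = 3)        # a 3x3 matrix, filled column by column

# Matrix-vector multiplication uses %*%; dimensions must be compatible
A %*% v                           # a length-3 result: 30, 36, 42

# Matrix-matrix multiplication: order matters in general
B <- diag(3)                      # 3x3 identity matrix
all(A %*% B == B %*% A)           # TRUE here only because B is the identity
```

Note that `A * v` would perform elementwise multiplication, not the matrix-vector product; `%*%` is the operator for true matrix multiplication.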
Matrix-Vector Equations

Many machine learning algorithms boil down to solving a matrix-vector equation. In this chapter, you’ll learn what matrix-vector equations are trying to accomplish and how to solve them in R.

- Motivation for Solving Matrix-Vector Equations
- The Meaning of Ax = b
- Exploring WNBA Data
- Matrix-Vector Equations - Some Theory
- Why is a Matrix Not Invertible?
- Understanding a Linear System's Three Outcomes
- Understanding the Massey Matrix
- Adjusting the Massey Matrix
- Inverting the Massey Matrix
- Solving Matrix-Vector Equations
- An Analogy with Regular Algebra
- 2017 WNBA Ratings!
- Who Was the Champion?
- Other Considerations for Matrix-Vector Equations
- Other Methods for Matrix-Vector Equations
- Alternatives to the Regular Matrix Inverse
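In base R, the workhorse for Ax = b is `solve()`. A minimal sketch with a small invertible system (the values are illustrative, not the WNBA data used in the chapter):

```r
# Solve Ax = b for a small invertible system (illustrative values)
A <- matrix(c(2, 1, 1, 3), nrow = 2)  # filled column by column
b <- c(3, 5)

x <- solve(A, b)        # solves the system directly; here x is c(0.8, 1.4)
A %*% x                 # recovers b, confirming the solution

# solve(A) with no second argument returns the inverse of A, but
# solve(A, b) is numerically preferable to solve(A) %*% b
```

When the coefficient matrix is singular, as with an unadjusted ratings matrix, `solve()` throws an error; generalized inverses such as `MASS::ginv()` are one alternative (mentioned here as an assumption about what "Alternatives to the Regular Matrix Inverse" may cover).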
Eigenvalues and Eigenvectors
Matrix operations are complex. Eigenvalue/eigenvector analyses allow you to decompose these operations into simpler ones, with applications in image recognition, genomic analysis, and more!

- Intro to Eigenvalues and Eigenvectors
- Interpreting Scalar Multiplication
- Scaling Different Axes
- Definition of Eigenvalues and Eigenvectors
- Why "Eigen"?
- Finding Eigenvalues in R
- Scalar Multiples of Eigenvectors are Eigenvectors
- Computing Eigenvalues and Eigenvectors in R
- How Many Eigenvalues?
- Verifying the Math on Eigenvalues
- Computing Eigenvectors in R
- Some More on Eigenvalues and Eigenvectors
- Eigenvalue Ordering
- Markov Models for Allele Frequencies
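Base R's `eigen()` performs the decomposition described above. A minimal sketch using a diagonal matrix, whose eigenvalues can be read straight off the diagonal:

```r
# Eigen-decomposition in base R (illustrative matrix)
A <- matrix(c(2, 0, 0, 3), nrow = 2)  # diagonal, so the eigenvalues are 3 and 2
e <- eigen(A)

e$values     # eigenvalues, ordered by decreasing magnitude: 3, 2
e$vectors    # eigenvectors stored as columns, matched to e$values

# Verify the defining property A v = lambda v for the first pair
v1 <- e$vectors[, 1]
all.equal(as.vector(A %*% v1), e$values[1] * v1)   # TRUE
```

Note that `eigen()` returns eigenvalues in decreasing order of magnitude, which is why ordering is worth a dedicated exercise: the "first" eigenvector always corresponds to the dominant eigenvalue.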
Principal Component Analysis
“Big Data” is ubiquitous in data science and its applications. However, redundancy in these datasets can be problematic. In this chapter, we learn about principal component analysis and how it can be used in dimension reduction.

- Intro to the Idea of PCA
- What Does "Big Data" Mean?
- Finding Redundancies
- The Linear Algebra Behind PCA
- Covariance Explored
- Standardizing Your Data
- Variance/Covariance Calculations
- Eigenanalyses of Combine Data
- Where's the Variance?
- Performing PCA in R
- Scaling Data Before PCA
- Summarizing PCA in R
- Does Subsetting Change Things?
- Wrap-Up
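In base R, `prcomp()` performs PCA. A minimal sketch using the built-in iris measurements (an illustrative dataset, standing in for the Combine data used in the chapter):

```r
# PCA in base R on the four numeric iris columns (illustrative dataset)
X <- iris[, 1:4]

# scale. = TRUE standardizes each column first, which matters when
# variables are measured on different scales
pca <- prcomp(X, scale. = TRUE)

summary(pca)     # proportion of variance explained by each component
head(pca$x)      # the data projected onto the principal components

# With standardized data, the component variances sum to the number
# of original columns, here 4 (up to floating-point rounding)
sum(pca$sdev^2)
```

Dimension reduction then amounts to keeping only the first few columns of `pca$x`, which capture most of the variance while discarding the redundancy among correlated measurements.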
Prerequisites: Introduction to R
Eric Eager
VP of Research and Development at SumerSports
Eric Eager is the VP of R&D at SumerSports. Prior to joining Sumer in 2022, he ran R&D at Pro Football Focus, where he analyzed data for all 32 National Football League teams and over 130 college football teams, along with every major media entity and thousands of subscribers. From 2012 to 2018 he was a professor in the Department of Mathematics and Statistics at the University of Wisconsin – La Crosse, where he published over 25 papers in mathematical biology and the scholarship of teaching and learning while securing more than $300,000 in National Science Foundation funding for undergraduate mentorship.