Description
In this Linear Algebra course, we look at what linear algebra is and how it relates to vectors and matrices. We then cover what vectors and matrices are and how to work with them, including the trickier topic of eigenvalues and eigenvectors and how to use them to solve problems. Finally, we apply these ideas to datasets, for example rotating images of faces and extracting eigenvectors to investigate how the PageRank algorithm works.
Syllabus:
1. Introduction to Linear Algebra and to Mathematics for Machine Learning
- Introduction: Solving data science challenges with mathematics
- Motivations for linear algebra
- Getting a handle on vectors
- Operations with vectors
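The vector operations introduced in this module can be sketched in a few lines of numpy (illustrative values, not from the course):

```python
import numpy as np

# Two vectors in R^3 (illustrative values)
r = np.array([1.0, 2.0, 3.0])
s = np.array([4.0, 5.0, 6.0])

# Vector addition and scalar multiplication act component-wise
total = r + s      # elementwise sum: [5, 7, 9]
scaled = 2 * r     # each component doubled: [2, 4, 6]

print(total, scaled)
```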
2. Vectors are objects that move around space
- Modulus & inner product
- Cosine & dot product
- Projection
- Changing basis
- Basis, vector space, and linear independence
- Applications of changing basis
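The core formulas of this module (modulus, dot product, cosine, projection, and a change of basis) can be sketched as follows; the vectors and the orthogonal basis are illustrative choices, not taken from the course:

```python
import numpy as np

r = np.array([3.0, 4.0])
s = np.array([2.0, 0.0])

# Modulus (length) of r: |r| = sqrt(r . r)
modulus = np.sqrt(r @ r)   # 5.0

# Cosine rule for the angle: r . s = |r| |s| cos(theta)
cos_theta = (r @ s) / (np.linalg.norm(r) * np.linalg.norm(s))

# Scalar and vector projection of r onto s
scalar_proj = (r @ s) / np.linalg.norm(s)
vector_proj = ((r @ s) / (s @ s)) * s

# Changing basis: with an orthogonal basis b1, b2, the new coordinates
# of r are just projections onto each basis vector
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
coords = np.array([(r @ b1) / (b1 @ b1), (r @ b2) / (b2 @ b2)])

# Rebuilding r from its new coordinates recovers the original vector
rebuilt = coords[0] * b1 + coords[1] * b2
```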
3. Matrices in Linear Algebra: Objects that operate on Vectors
- Matrices, vectors, and solving simultaneous equation problems
- How matrices transform space
- Types of matrix transformation
- Composition or combination of matrix transformations
- Solving the apples and bananas problem: Gaussian elimination
- Going from Gaussian elimination to finding the inverse matrix
- Determinants and inverses
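Gaussian elimination, the central algorithm of this module, can be sketched as a small numpy routine; the "apples and bananas" prices below are illustrative numbers, not necessarily those used in the course:

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: reduce A to upper-triangular form
    for k in range(n):
        pivot = k + np.argmax(np.abs(A[k:, k]))  # largest pivot for stability
        A[[k, pivot]], b[[k, pivot]] = A[[pivot, k]], b[[pivot, k]]
        for i in range(k + 1, n):
            factor = A[i, k] / A[k, k]
            A[i, k:] -= factor * A[k, k:]
            b[i] -= factor * b[k]
    # Back substitution on the triangular system
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Illustrative prices: 2 apples + 3 bananas cost 8; 10 apples + 1 banana cost 13
A = np.array([[2.0, 3.0], [10.0, 1.0]])
b = np.array([8.0, 13.0])

# The system is solvable because det(A) = 2*1 - 3*10 = -28 is non-zero
x = gaussian_solve(A, b)
```

The same elimination steps, applied to an identity matrix alongside A, produce the inverse matrix, which is the bridge to the next topics in the list.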
4. Matrices make linear mappings
- Introduction: Einstein summation convention and the symmetry of the dot product
- Matrices changing basis
- Doing a transformation in a changed basis
- Orthogonal matrices
- The Gram–Schmidt process
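The Gram–Schmidt process from this module can be sketched directly: subtract from each column its components along the already-orthonormalised columns, then normalise. The input matrix here is an arbitrary illustrative example:

```python
import numpy as np

def gram_schmidt(A, tol=1e-12):
    """Orthonormalise the columns of A (classical Gram–Schmidt)."""
    Q = np.array(A, dtype=float)
    for j in range(Q.shape[1]):
        # Remove the components of column j along earlier orthonormal columns
        for i in range(j):
            Q[:, j] -= (Q[:, i] @ Q[:, j]) * Q[:, i]
        norm = np.linalg.norm(Q[:, j])
        if norm < tol:
            raise ValueError("columns are linearly dependent")
        Q[:, j] /= norm
    return Q

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
Q = gram_schmidt(A)
# Q now has orthonormal columns, so Q.T @ Q is the identity,
# the defining property of an orthogonal set of vectors
```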
5. Eigenvalues and Eigenvectors: Application to Data Problems
- What are eigenvalues and eigenvectors?
- Special eigen-cases
- Calculating eigenvectors
- Changing to the eigenbasis
- Eigenbasis example
- Introduction to PageRank
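The PageRank idea at the end of this module can be sketched with power iteration on a tiny made-up three-page web (the link structure below is an illustrative assumption, not a course dataset): repeatedly applying the link matrix converges to its eigenvector with eigenvalue 1, whose entries rank the pages.

```python
import numpy as np

# Link matrix for a tiny 3-page web (illustrative): column j holds the
# probabilities of moving from page j to each page, so columns sum to 1.
# Page A links to B and C; page B links to C; page C links to A.
L = np.array([
    [0.0, 0.0, 1.0],
    [0.5, 0.0, 0.0],
    [0.5, 1.0, 0.0],
])

# Power iteration: start from a uniform rank vector and apply L repeatedly;
# the result approaches the eigenvector of L with eigenvalue 1
r = np.ones(3) / 3
for _ in range(100):
    r = L @ r

# Cross-check against numpy's general eigensolver
vals, vecs = np.linalg.eig(L)
principal = np.real(vecs[:, np.argmax(np.real(vals))])
principal /= principal.sum()   # scale the ranks to sum to 1
```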