# Linear Algebra

Our Linear Algebra course builds on students' prior experience with matrices and determinants to develop their understanding of crucial linear algebra concepts. These include systems of equations, vector spaces, linear transformations, eigenvectors and eigenvalues, diagonalization, inner products, orthogonal projections, quadratic forms, and singular value decomposition. Real-world applications such as least-squares problems, Markov chains, and regression will also be explored.

This course is designed to give students a deep understanding of linear algebra, equipping them with the knowledge and skills necessary for advanced study in mathematics, computer science, physics, and engineering.

## Content

After briefly reviewing essential set theory, logic, and vector geometry, students explore matrices in depth. They will become fluent in Gaussian elimination, using it to solve systems of equations and compute inverse matrices.
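
To make this concrete, here is a minimal Python sketch, written for this description rather than taken from the course materials, of Gaussian elimination with partial pivoting followed by back substitution:

```python
# A minimal sketch of Gaussian elimination with partial pivoting,
# solving A x = b for a small square system (illustrative only).

def gaussian_solve(A, b):
    """Solve A x = b by forward elimination and back substitution."""
    n = len(A)
    # Build the augmented matrix [A | b] so row operations act on both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back substitution on the resulting row echelon form.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Example: x + 2y = 5 and 3x + 4y = 11 have the unique solution x = 1, y = 2.
print(gaussian_solve([[1.0, 2.0], [3.0, 4.0]], [5.0, 11.0]))
```

A production solver would also detect singular systems; the course treats those cases as separate topics.
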

As part of this course, students take a deep dive into vector spaces. They will explore vectors in N-dimensional space, linear independence, subspaces, bases, dimension, rank, nullity, and the rank-nullity theorem. They also generalize key concepts to abstract vector spaces and inner product spaces.

This course develops students' understanding of linear transformations, including images, kernels, linear maps between different vector spaces, and change-of-coordinates matrices.

Students will learn how to form and solve the characteristic equation of a matrix and fluently compute eigenvectors. They will study matrix diagonalization and its properties and apply the rotation-scaling theorem. They will also study generalized eigenvectors and Jordan canonical decomposition.
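
For a 2x2 matrix, the characteristic equation is the quadratic det(A - λI) = λ² - trace(A)·λ + det(A) = 0, so its eigenvalues follow directly from the quadratic formula. A short illustrative sketch (not course material; the function name is ours):

```python
# Finding the real eigenvalues of a 2x2 matrix [[a, b], [c, d]] directly
# from its characteristic equation (illustrative sketch).
import math

def eigenvalues_2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]] via the quadratic formula."""
    trace, det = a + d, a * d - b * c
    disc = trace * trace - 4 * det  # discriminant of the characteristic equation
    if disc < 0:
        raise ValueError("complex eigenvalues; see the rotation-scaling topics")
    root = math.sqrt(disc)
    return (trace + root) / 2, (trace - root) / 2

# [[2, 1], [1, 2]] has characteristic equation λ² - 4λ + 3 = 0,
# so its eigenvalues are 3 and 1.
print(eigenvalues_2x2(2, 1, 1, 2))
```
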

This course covers various aspects of orthogonality in vector spaces, including orthogonal sets, complements, and orthogonal matrices. In addition, students will learn about orthogonal projections and the Gram-Schmidt orthogonalization process.
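
The Gram-Schmidt process itself is short enough to sketch: each vector has its projections onto the previously built basis vectors subtracted away, then is normalized. A plain-Python illustration (not from the course materials):

```python
# A minimal sketch of the Gram-Schmidt orthogonalization process on
# lists of floats, producing an orthonormal basis (illustrative only).
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of linearly independent `vectors`."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            # Subtract the orthogonal projection of w onto q (q is a unit vector).
            coeff = dot(w, q)
            w = [wi - coeff * qi for wi, qi in zip(w, q)]
        norm = math.sqrt(dot(w, w))
        basis.append([wi / norm for wi in w])
    return basis

q1, q2 = gram_schmidt([[3.0, 1.0], [2.0, 2.0]])
# The resulting vectors are orthogonal unit vectors.
print(dot(q1, q2), dot(q1, q1))
```
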

Students will explore quadratic forms, including positive-definite and negative-definite quadratic forms, and the diagonalization of symmetric matrices. They will also classify quadratic curves and learn how to reduce a quadratic curve to its principal axes.
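
The classification rests on a simple fact: the form q(x, y) = ax² + bxy + cy² corresponds to the symmetric matrix [[a, b/2], [b/2, c]], and the signs of that matrix's (always real) eigenvalues determine definiteness. A hedged sketch of this idea, not taken from the course:

```python
# Classifying the quadratic form q(x, y) = a*x^2 + b*x*y + c*y^2 by the
# eigenvalue signs of its symmetric matrix [[a, b/2], [b/2, c]].
import math

def classify_form(a, b, c):
    trace, det = a + c, a * c - (b / 2) ** 2
    # For a symmetric matrix the discriminant (a - c)^2 + b^2 is nonnegative;
    # clamp at zero to guard against floating-point round-off.
    root = math.sqrt(max(trace * trace - 4 * det, 0.0))
    lam1, lam2 = (trace + root) / 2, (trace - root) / 2
    if lam1 > 0 and lam2 > 0:
        return "positive-definite"
    if lam1 < 0 and lam2 < 0:
        return "negative-definite"
    if lam1 * lam2 < 0:
        return "indefinite"
    return "semi-definite"

# x^2 + y^2 is positive-definite; x^2 - y^2 is indefinite.
print(classify_form(1, 0, 1), classify_form(1, 0, -1))
```
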

Finally, this course explores singular value decomposition and other applications of linear algebra, such as linear least-squares problems, Markov chains, and regression.
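
As a taste of the least-squares application: fitting a line y ≈ mx + k to data reduces to solving the 2x2 normal equations AᵀAw = Aᵀy for the design matrix A with rows [xᵢ, 1]. An illustrative sketch (not course material) with the normal equations written out by hand:

```python
# A linear least-squares fit y ≈ slope*x + intercept via the normal
# equations, written out explicitly for the 2x2 case (illustrative only).

def least_squares_line(xs, ys):
    """Best-fit slope and intercept in the least-squares sense."""
    n = len(xs)
    # Entries of the 2x2 normal-equations system A^T A w = A^T y.
    sxx, sx = sum(x * x for x in xs), sum(xs)
    sxy, sy = sum(x * y for x, y in zip(xs, ys)), sum(ys)
    det = sxx * n - sx * sx  # nonzero unless all x-values coincide
    slope = (sxy * n - sx * sy) / det
    intercept = (sxx * sy - sx * sxy) / det
    return slope, intercept

# Points lying exactly on y = 2x + 1 are recovered exactly.
print(least_squares_line([0.0, 1.0, 2.0], [1.0, 3.0, 5.0]))
```
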

Upon successful completion of this course, students will be able to:

- Apply essential concepts from set theory and logic.
- Compute equations of lines and planes, and convert between their vector, parametric, and Cartesian forms.
- Compute determinants using Laplace expansions and row and column operations, apply Cramer's rule and the properties of determinants, and work with partitioned and block matrices.
- Perform Gaussian elimination, including reducing matrices to row echelon form, solving NxM systems of equations, and interpreting the solutions.
- Compute the inverse of a square matrix using row operations, work with elementary matrices, and apply the invertible matrix theorem.
- Compute LU factorizations and use them to solve systems of linear equations.
- Work with vectors in N-dimensional space, including computing the span of a set of vectors and determining linear independence.
- Determine whether a subset of a vector space forms a subspace, and compute the column and null spaces of a matrix.
- Work fluently with bases, dimension, rank, nullity, and the rank-nullity theorem.
- Generalize these concepts to abstract vector spaces.
- Find the kernel and image of a linear transformation, apply the rank-nullity theorem to linear transformations, and analyze linear maps between different vector spaces and their matrix representations.
- Find eigenvalues and eigenvectors, diagonalize matrices, and apply the Cayley-Hamilton theorem.
- Extend eigenvector techniques to generalized eigenvectors and Jordan canonical decomposition.
- Work with complex vectors and matrices, including finding complex eigenvalues of real matrices, rotation-scaling matrices, and block diagonalization.
- Work with inner product spaces.
- Apply orthogonality in vector spaces, including orthogonal complements, orthogonal matrices, and the four fundamental subspaces of a matrix.
- Compute orthogonal projections and projection matrices, and carry out the Gram-Schmidt process and QR factorization.
- Analyze quadratic forms and their applications, including the diagonalization of symmetric matrices and the spectral theorem.
- Compute singular value decompositions and use them in applications, including the pseudoinverse matrix.
- Apply linear algebra concepts to real-world problems such as linear least-squares problems, Markov chains, and linear regression.
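
To make the Markov-chain objective concrete, here is a small sketch (the matrix and values are illustrative, not from the course) that approximates a steady-state vector by repeatedly applying the transition matrix to a starting distribution:

```python
# Approximating the steady-state vector of a Markov chain by power
# iteration: repeatedly apply the column-stochastic transition matrix.

def step(P, v):
    """One step of the chain: the matrix-vector product P v."""
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

def steady_state(P, iterations=100):
    """Approximate the steady-state vector of transition matrix P."""
    v = [1.0 / len(P)] * len(P)  # start from the uniform distribution
    for _ in range(iterations):
        v = step(P, v)
    return v

# Each column sums to 1: column j holds the transition probabilities out
# of state j. The steady state of this chain is (5/6, 1/6).
P = [[0.9, 0.5],
     [0.1, 0.5]]
print(steady_state(P))
```
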
### 1. Preliminaries (25 topics)

#### 1.1. Introduction to Set Theory

- 1.1.1. Special Sets
- 1.1.2. Statements and Predicates
- 1.1.3. Equivalent Sets
- 1.1.4. The Constructive Definition of a Set
- 1.1.5. The Conditional Definition of a Set
- 1.1.6. Describing Sets Using Set-Builder Notation
- 1.1.7. Describing Planar Regions Using Set-Builder Notation
- 1.1.8. Cardinality of Finite Sets
- 1.1.9. Infinite Sets
- 1.1.10. Subsets

#### 1.2. Set Operations

- 1.2.1. The Difference of Sets
- 1.2.2. Set Complements
- 1.2.3. The Cartesian Product
- 1.2.4. Visualizing Cartesian Products
- 1.2.5. Indexed Sets
- 1.2.6. Sets and Functions

#### 1.3. Vector Geometry

- 1.3.1. The Vector Equation of a Line
- 1.3.2. The Parametric Equations of a Line
- 1.3.3. The Cartesian Equation of a Line
- 1.3.4. The Vector Equation of a Plane
- 1.3.5. The Cartesian Equation of a Plane
- 1.3.6. The Parametric Equations of a Plane
- 1.3.7. The Intersection of Two Planes
- 1.3.8. The Shortest Distance Between a Plane and a Point
- 1.3.9. The Intersection Between a Line and a Plane

### 2. Matrices (36 topics)

#### 2.4. Determinants

- 2.4.1. Cramer's Rule for 2x2 Systems of Linear Equations
- 2.4.2. Cramer's Rule for 3x3 Systems
- 2.4.3. The Determinant of an NxN Matrix
- 2.4.4. Finding Determinants Using Laplace Expansions
- 2.4.5. Basic Properties of Determinants
- 2.4.6. Further Properties of Determinants
- 2.4.7. Row and Column Operations on Determinants
- 2.4.8. Conditions When a Determinant Equals Zero
- 2.4.9. Finding Determinants Using Row and Column Operations
- 2.4.10. Partitioned and Block Matrices

#### 2.5. Gaussian Elimination

- 2.5.1. Systems of Equations as Augmented Matrices
- 2.5.2. Row Echelon Form
- 2.5.3. Solving Systems of Equations Using Back Substitution
- 2.5.4. Elementary Row Operations
- 2.5.5. Creating Rows or Columns Containing Zeros Using Gaussian Elimination
- 2.5.6. Solving 2x2 Systems of Equations Using Gaussian Elimination
- 2.5.7. Solving 2x2 Singular Systems of Equations Using Gaussian Elimination
- 2.5.8. Solving 3x3 Systems of Equations Using Gaussian Elimination
- 2.5.9. Identifying the Pivot Columns of a Matrix
- 2.5.10. Solving 3x3 Singular Systems of Equations Using Gaussian Elimination
- 2.5.11. Reduced Row Echelon Form
- 2.5.12. Gaussian Elimination for NxM Systems of Equations

#### 2.6. Elementary Matrices

- 2.6.1. Elementary 2x2 Matrices
- 2.6.2. Row Operations on 2x2 Matrices as Products of Elementary Matrices
- 2.6.3. Elementary 3x3 Matrices
- 2.6.4. Row Operations on 3x3 Matrices as Products of Elementary Matrices

#### 2.7. The Inverse of a Matrix

- 2.7.1. The Invertible Matrix Theorem in Terms of 2x2 Systems of Equations
- 2.7.2. Finding the Inverse of a 2x2 Matrix Using Row Operations
- 2.7.3. Finding the Inverse of a 3x3 Matrix Using Row Operations
- 2.7.4. Finding the Inverse of an NxN Square Matrix Using Row Operations
- 2.7.5. Matrices With Easy-to-Find Inverses

#### 2.8. LU Factorization

- 2.8.1. Triangular Matrices
- 2.8.2. LU Factorization of 2x2 Matrices
- 2.8.3. LU Factorization of 3x3 Matrices
- 2.8.4. LU Factorization of NxN Matrices
- 2.8.5. Solving Systems of Equations Using LU Factorization

### 3. Vector Spaces (27 topics)

#### 3.9. Vectors in N-Dimensional Space

- 3.9.1. Vectors in N-Dimensional Euclidean Space
- 3.9.2. Linear Combinations of Vectors in N-Dimensional Euclidean Space
- 3.9.3. Linear Span of Vectors in N-Dimensional Euclidean Space
- 3.9.4. Linear Dependence and Independence

#### 3.10. Subspaces of N-Dimensional Space

- 3.10.1. Subspaces of N-Dimensional Space
- 3.10.2. Subspaces of N-Dimensional Space: Geometric Interpretation
- 3.10.3. The Column Space of a Matrix
- 3.10.4. The Null Space of a Matrix

#### 3.11. Bases of N-Dimensional Space

- 3.11.1. Finding a Basis of a Span
- 3.11.2. Finding a Basis of the Column Space of a Matrix
- 3.11.3. Finding a Basis of the Null Space of a Matrix
- 3.11.4. Expressing the Coordinates of a Vector in a Given Basis
- 3.11.5. Writing Vectors in Different Bases
- 3.11.6. The Change-of-Coordinates Matrix
- 3.11.7. Changing a Basis Using the Change-of-Coordinates Matrix

#### 3.12. Dimension and Rank in N-Dimensional Space

- 3.12.1. The Dimension of a Span
- 3.12.2. The Rank of a Matrix
- 3.12.3. The Dimension of the Null Space of a Matrix
- 3.12.4. The Invertible Matrix Theorem in Terms of Dimension, Rank and Nullity
- 3.12.5. The Rank-Nullity Theorem

#### 3.13. Abstract Vector Spaces

- 3.13.1. Introduction to Abstract Vector Spaces
- 3.13.2. Defining Abstract Vector Spaces
- 3.13.3. Linear Independence in Abstract Vector Spaces
- 3.13.4. Subspaces of Abstract Vector Spaces
- 3.13.5. Bases in Abstract Vector Spaces
- 3.13.6. The Coordinates of a Vector Relative to a Basis in Abstract Vector Spaces
- 3.13.7. Dimension in Abstract Vector Spaces

### 4. Linear Transformations (10 topics)

#### 4.14. Linear Transformations

- 4.14.1. The Standard Matrix of a Linear Transformation in Terms of the Standard Basis
- 4.14.2. The Kernel of a Linear Transformation
- 4.14.3. The Image and Rank of a Linear Transformation
- 4.14.4. The Image of a Linear Transformation in the Cartesian Plane
- 4.14.5. The Invertible Matrix Theorem in Terms of Linear Transformations
- 4.14.6. The Rank-Nullity Theorem in Terms of Linear Transformations

#### 4.15. Linear Maps and Their Matrix Representations

- 4.15.1. The Matrix of a Linear Transformation Relative to a Basis
- 4.15.2. Connections Between Matrix Representations of a Linear Transformation
- 4.15.3. Linear Maps Between Two Different Vector Spaces
- 4.15.4. The Matrix of a Linear Map Relative to Two Bases

### 5. Diagonalization of Matrices (28 topics)

#### 5.16. Eigenvectors and Eigenvalues

- 5.16.1. The Eigenvalues and Eigenvectors of a 2x2 Matrix
- 5.16.2. Calculating the Eigenvalues of a 2x2 Matrix
- 5.16.3. Calculating the Eigenvectors of a 2x2 Matrix
- 5.16.4. The Characteristic Equation of a Matrix
- 5.16.5. The Cayley-Hamilton Theorem and Its Applications
- 5.16.6. Calculating the Eigenvectors of a 3x3 Matrix With Distinct Eigenvalues
- 5.16.7. Calculating the Eigenvectors of a 3x3 Matrix in the General Case
- 5.16.8. The Invertible Matrix Theorem in Terms of Eigenvalues

#### 5.17. Diagonalization

- 5.17.1. Diagonalizing a 2x2 Matrix
- 5.17.2. Properties of Diagonalization
- 5.17.3. Diagonalizing a 3x3 Matrix With Distinct Eigenvalues
- 5.17.4. Diagonalizing a 3x3 Matrix in the General Case

#### 5.18. Real Matrices With Complex Eigenvalues

- 5.18.1. Vectors Over the Complex Numbers
- 5.18.2. Matrices Over the Complex Numbers
- 5.18.3. Finding Complex Eigenvalues of Real 2x2 Matrices
- 5.18.4. Finding Complex Eigenvectors of Real 2x2 Matrices
- 5.18.5. Rotation-Scaling Matrices
- 5.18.6. Reducing Real 2x2 Matrices to Rotation-Scaling Form
- 5.18.7. Block Diagonalization of NxN Matrices

#### 5.19. Generalized Eigenvectors

- 5.19.1. Nilpotent and Idempotent Matrices
- 5.19.2. Generalized Eigenvectors
- 5.19.3. Ranks of Generalized Eigenvectors
- 5.19.4. Finding Generalized Eigenvectors of Specific Ranks

#### 5.20. Jordan Canonical Decomposition

- 5.20.1. Jordan Blocks and Jordan Matrices
- 5.20.2. Jordan Canonical Form of a 2x2 Matrix
- 5.20.3. Jordan Canonical Decomposition of a 2x2 Matrix
- 5.20.4. Jordan Canonical Form of a 3x3 Matrix
- 5.20.5. Jordan Canonical Decomposition of a 3x3 Matrix

### 6. Projections (25 topics)

#### 6.21. Inner Products

- 6.21.1. The Dot Product in N-Dimensional Euclidean Space
- 6.21.2. The Norm of a Vector in N-Dimensional Euclidean Space
- 6.21.3. Inner Product Spaces
- 6.21.4. The Inner Product in Vector Spaces Over the Complex Numbers
- 6.21.5. The Norm of a Vector in Inner Product Spaces

#### 6.22. Orthogonality

- 6.22.1. Orthogonal Vectors in Euclidean Spaces
- 6.22.2. Orthogonal Vectors in Inner Product Spaces
- 6.22.3. The Cauchy-Schwarz Inequality and the Angle Between Two Vectors
- 6.22.4. The Pythagorean Theorem and the Triangle Inequality
- 6.22.5. Orthogonal Complements
- 6.22.6. Orthogonal Sets in Euclidean Spaces
- 6.22.7. Orthogonal Sets in Inner Product Spaces
- 6.22.8. Orthogonal Matrices
- 6.22.9. Orthogonal Linear Transformations
- 6.22.10. The Four Fundamental Subspaces of a Matrix

#### 6.23. Orthogonal Projections

- 6.23.1. Projecting Vectors Onto One-Dimensional Subspaces
- 6.23.2. The Components of a Vector With Respect to an Orthogonal or Orthonormal Basis
- 6.23.3. Projecting Vectors Onto Subspaces in Euclidean Spaces (Orthogonal Bases)
- 6.23.4. Projecting Vectors Onto Subspaces in Euclidean Spaces (Arbitrary Bases)
- 6.23.5. Projecting Vectors Onto Subspaces in Euclidean Spaces (Arbitrary Bases): Applications
- 6.23.6. Projection Matrices, Linear Transformations and Their Properties
- 6.23.7. Projecting Vectors Onto Subspaces in Inner Product Spaces

#### 6.24. Orthogonalization Processes

- 6.24.1. The Gram-Schmidt Process for Two Vectors
- 6.24.2. The Gram-Schmidt Process in the General Case
- 6.24.3. QR Factorization

### 7. Quadratic Forms and Singular Value Decomposition (20 topics)

#### 7.25. Diagonalization of Symmetric Matrices

- 7.25.1. Symmetric Matrices
- 7.25.2. Diagonalization of 2x2 Symmetric Matrices
- 7.25.3. Diagonalization of 3x3 Symmetric Matrices
- 7.25.4. The Spectral Theorem

#### 7.26. Quadratic Forms

- 7.26.1. Bilinear Forms
- 7.26.2. Quadratic Forms
- 7.26.3. Change of Variables in Quadratic Forms
- 7.26.4. Finding the Canonical Form of a Quadratic Form Using Lagrange's Method
- 7.26.5. Finding the Canonical Form of a Quadratic Form Using Orthogonal Transformations
- 7.26.6. Positive-Definite and Negative-Definite Quadratic Forms

#### 7.27. Quadratic Forms in Euclidean Space

- 7.27.1. Reducing a Quadratic Curve to Its Principal Axes
- 7.27.2. Classifying Quadratic Curves

#### 7.28. Singular Value Decomposition

- 7.28.1. Constrained Optimization of Quadratic Forms
- 7.28.2. Constrained Optimization of Quadratic Forms: Determining Where Extrema Are Attained
- 7.28.3. The Singular Values of a Matrix
- 7.28.4. Computing the Singular Values of a Matrix
- 7.28.5. Singular Value Decomposition of 2x2 Matrices
- 7.28.6. Singular Value Decomposition of 2x2 Matrices With Zero or Repeated Eigenvalues
- 7.28.7. Singular Value Decomposition of Larger Matrices
- 7.28.8. Singular Value Decomposition and the Pseudoinverse Matrix

### 8. Applications of Linear Algebra (9 topics)

#### 8.29. Linear Least-Squares Problems

- 8.29.1. The Least-Squares Solution of a Linear System (Without Collinearity)
- 8.29.2. The Least-Squares Solution of a Linear System (With Collinearity)
- 8.29.3. Finding a Least-Squares Solution Using QR Factorization
- 8.29.4. Weighted Least-Squares

#### 8.30. Markov Chains

- 8.30.1. Markov Chains
- 8.30.2. Steady-State Vectors

#### 8.31. Linear Regression

- 8.31.1. Linear Regression
- 8.31.2. Polynomial Regression
- 8.31.3. Multiple Linear Regression