UE23MA242B: Mathematics for Machine Learning

This is a basic course on matrix theory and linear algebra. Emphasis is placed on topics useful in the computer science discipline, including systems of equations, vector spaces, eigenvalues, similarity, and positive definite matrices. The course provides hands-on experience with basic programming concepts, using Python/R to solve problems in these areas.

Course Objectives

  • The first goal of the course is to teach students how to use linear algebra as a powerful tool for computation.
  • The second goal is to show how these computations can be conceptualized in a geometric framework.
  • The third goal is to give a gentle introduction to the theory of abstract vector spaces.
  • The fourth goal is to visualize solutions to linear systems of equations using different approaches in Python.

Course Outcomes

  • Solve systems of linear equations using matrix transformations, interpret the nature of the solutions, visualize the consistency of a linear system of equations, and compute the inverse of a matrix.
  • Demonstrate the ability to work within vector spaces, distil vector-space properties, and understand the concepts of the four fundamental subspaces, linear span, linear independence, dimension, and basis.
  • Learn the concepts of orthogonal vectors and orthogonal subspaces; apply the Gram-Schmidt process to find an orthonormal basis of a subspace; and understand eigenvalues, eigenvectors, and diagonalization of a matrix.
  • Apply the concepts of positive definite matrices and the singular value decomposition to application problems.

Course Content

U1: Matrices, Gaussian Elimination and Vector Spaces

Introduction, The Geometry of Linear Equations, Gaussian Elimination, Singular Cases, Elimination Matrices, Triangular Factors (LU Decomposition and Cholesky's Method) and Row Exchanges, Inverses and Transposes, Inverse by the Gauss-Jordan Method, Vector Spaces and Subspaces (definitions only)

Applications:

  1. Basic operations with matrices in Python
  2. Matrix operations and image manipulation
  3. Matrix multiplication, inversion, and photo filters
  4. Solving linear systems
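The LU factorization and linear-system topics above can be explored directly in Python. A minimal sketch using NumPy/SciPy (the matrix values are arbitrary illustrations, not course data):

```python
import numpy as np
from scipy.linalg import lu, solve

# An arbitrary 3x3 system Ax = b for illustration
A = np.array([[2.0, 1.0, 1.0],
              [4.0, -6.0, 0.0],
              [-2.0, 7.0, 2.0]])
b = np.array([5.0, -2.0, 9.0])

# Triangular factors with row exchanges: SciPy returns A = P @ L @ U
P, L, U = lu(A)
assert np.allclose(P @ L @ U, A)

# Solve the system (Gaussian elimination under the hood) and verify
x = solve(A, b)
assert np.allclose(A @ x, b)
```

`scipy.linalg.lu` exposes the permutation matrix `P` explicitly, which makes the "Triangular Factors and Row Exchanges" topic easy to inspect by hand.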

Self-Learning Component: Algebra of Matrices.

U2: Four Fundamental Subspaces & Linear Transformations

Linear Independence, Basis and Dimensions, Row reduced Echelon form, The Four Fundamental Subspaces, Rank-Nullity theorem. Linear Transformations, Algebra of Linear transformations.

Applications:

  1. Systems of linear equations and college football team ranking (with an example of the Big 12)
  2. Convolution, inner product, and image processing revisited
  3. Norms, angles, and your movie choices
  4. Interpolation, extrapolation, and climate change

Self-Learning Component: Examples of Vector Spaces and Subspaces.

U3: Orthogonalization, Eigenvalues and Eigenvectors

Orthogonal Vectors and Subspaces, Orthogonal Bases, Cosines and Projections onto Lines, Projections and Least Squares. Orthogonalization, The Gram-Schmidt Orthogonalization process, Introduction to Eigenvalues and Eigenvectors, Properties of Eigenvalues and Eigenvectors, Cayley-Hamilton theorem (statement only), Diagonalization of a Matrix
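The Gram-Schmidt process listed above can be sketched in a few lines of Python (classical Gram-Schmidt on the columns of a matrix; the input vectors are arbitrary illustrations):

```python
import numpy as np

def gram_schmidt(V):
    """Classical Gram-Schmidt: independent columns of V -> orthonormal columns of Q."""
    Q = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        v = V[:, j].astype(float)
        for i in range(j):
            # Subtract the projection of V[:, j] onto each earlier q
            v = v - (Q[:, i] @ V[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

# Three independent vectors in R^3 (arbitrary example)
V = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(V)
assert np.allclose(Q.T @ Q, np.eye(3))  # columns are orthonormal
```

In practice `np.linalg.qr` computes the same orthonormal basis more stably; the loop above is only meant to mirror the textbook algorithm step by step.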

Applications:

  1. Orthogonal matrices and 3D graphics
  2. Discrete dynamical systems, linear transformations of the plane, and the Chaos Game
  3. Projections, eigenvectors
  4. Matrix eigenvalues and Google's PageRank algorithm
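The PageRank application is a direct use of eigenvalues: the rank vector is the dominant eigenvector of a column-stochastic "Google matrix". A minimal sketch on a hypothetical four-page link graph (the link structure and damping factor 0.85 are illustrative assumptions):

```python
import numpy as np

# Hypothetical 4-page web: entry (i, j) = probability of following a link
# from page j to page i; each column sums to 1 (column-stochastic).
A = np.array([[0.0, 0.5, 0.5, 0.0],
              [1/3, 0.0, 0.0, 0.5],
              [1/3, 0.0, 0.0, 0.5],
              [1/3, 0.5, 0.5, 0.0]])

d = 0.85                                   # damping factor
G = d * A + (1 - d) / 4 * np.ones((4, 4))  # Google matrix

# Power iteration converges to the eigenvector for eigenvalue 1
r = np.full(4, 0.25)
for _ in range(100):
    r = G @ r

assert np.allclose(G @ r, r)   # r is a fixed point: the PageRank vector
assert np.isclose(r.sum(), 1.0)
```

Power iteration works here because the damping term makes all entries of `G` positive, so the Perron-Frobenius theorem guarantees a unique dominant eigenvector.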

U4: Singular Value Decomposition

Symmetric Matrices, Quadratic Forms, Definitions of Positive Definite, Negative Definite, Positive Semi-definite, Negative Semi-definite, and Indefinite Forms and Matrices, Tests for Positive Definiteness, Singular Values and Singular Vectors, Image Processing by Linear Algebra, Principal Component Analysis (PCA via the SVD), Minimizing Multivariate Functions, Backpropagation and Stochastic Gradient Descent.

Applications:

  1. Principal Component Analysis, and face recognition algorithms
  2. Social networks, clustering, and eigenvalue problems
  3. Singular Value Decomposition and image compression
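PCA via the SVD, as listed above, can be sketched on synthetic data (the data set below is a made-up illustration: 3-D points lying near a 2-D plane, so a rank-2 SVD reconstruction captures almost everything):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: 100 points in R^3 near a 2D plane, plus small noise
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 3)) \
    + 0.01 * rng.normal(size=(100, 3))
Xc = X - X.mean(axis=0)              # center the data first

# PCA via the SVD: rows of Vt are the principal directions
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rank-2 reconstruction from the two largest singular values
X2 = (U[:, :2] * s[:2]) @ Vt[:2]
rel_err = np.linalg.norm(Xc - X2) / np.linalg.norm(Xc)
assert rel_err < 0.1                 # almost all variance is retained
```

The same truncation idea, applied to a matrix of pixel intensities instead of data points, is the image-compression application in item 3.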

Prerequisites: UE23CS151A