Linear Algebra: Matrices

Linear algebra is a branch of mathematics that studies vectors, vector spaces, and linear transformations.

System of Equations

Augmented Matrix

The augmented matrix for the above system of equations is:

Gauss Elimination

Using Gauss Elimination, we perform row operations to simplify the augmented matrix. Our goal is to transform it into an upper triangular form.

Step 1: Fix R1 and modify R2 and R3

Performing these operations, the updated matrix becomes:

Step 2: Simplify further

Divide R2 by 5 to normalize the pivot:

Next, eliminate the second element in R3:

Step 3: Normalize R3

The system is now ready for back substitution to find the solution.
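Because the worked matrices are not reproduced above, here is a minimal sketch of the same procedure (forward elimination followed by back substitution), assuming NumPy is available and using an illustrative 3x3 system rather than the one from these notes:

```python
import numpy as np

# Hypothetical 3x3 system A x = b; these values are illustrative only,
# not the system worked out in the notes.
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

# Forward elimination on the augmented matrix [A | b]:
# zero out the entries below each pivot to reach upper triangular form.
M = np.hstack([A, b.reshape(-1, 1)])
n = len(b)
for k in range(n):
    for i in range(k + 1, n):
        factor = M[i, k] / M[k, k]      # multiplier l_ik
        M[i, k:] -= factor * M[k, k:]   # R_i <- R_i - l_ik * R_k

# Back substitution on the resulting upper triangular system.
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = (M[i, -1] - M[i, i + 1:n] @ x[i + 1:n]) / M[i, i]

print(x)  # -> [ 2.  3. -1.]
```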


Types of Solutions

  1. Consistent Systems: at least one solution.
    • Unique Solution: exactly one solution.
    • Infinitely Many Solutions: an infinite number of solutions.
  2. Inconsistent Systems: no solution.

Row Echelon Form (REF)

  • All rows consisting entirely of zeros are at the bottom.
  • The first nonzero element (pivot) in each row is to the right of the pivot in the row above it.
  • Below each pivot, all entries are zero.

Reduced Row Echelon Form (RREF)

  • The matrix is in REF.
  • Each pivot is 1.
  • Each pivot is the only nonzero entry in its column.
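As a quick way to check these definitions, SymPy's Matrix.rref() returns the reduced row echelon form together with the pivot columns. The matrix below is an arbitrary illustration, not the example discussed next:

```python
from sympy import Matrix

# Arbitrary augmented matrix used purely to illustrate REF/RREF;
# the example matrix from these notes is not reproduced here.
M = Matrix([[1, 2, -1, 3],
            [2, 4,  1, 8],
            [0, 1,  1, 2]])

rref_M, pivot_cols = M.rref()   # rref() returns (RREF matrix, pivot column indices)
print(rref_M)
print(pivot_cols)
```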

Example

The given matrix represents an augmented matrix for a system of linear equations. It is written as:

We aim to solve this system using Gaussian Elimination and also find its Row Echelon Form (REF) and Reduced Row Echelon Form (RREF).

Step 1: Fix R1 (First Row)

Keep R1 as it is. Eliminate the first element in R2 and R3 by performing the following row operations:

Performing these operations gives:

Step 2: Normalize R2 and Eliminate Below It

Next, eliminate the second element in R3 using R2:

This gives:

At this stage, the matrix is in Row Echelon Form (REF):

Reduced Row Echelon Form (RREF)

Each pivot must be 1, and all elements above and below the pivots must be 0. Proceed as follows:

Step 1: Normalize R3

Divide R3 by its pivot:

Step 2: Eliminate Above R3's Pivot

Using R3, eliminate the fourth element in R1 and R2:

Performing these gives:

Step 3: Eliminate Above R2's Pivot

Using R2, eliminate the second element in R1:

This gives the final Reduced Row Echelon Form (RREF):


Geometry of Linear Equations

Example

  • In the Row Picture, the solution is found by intersecting lines (or planes in higher dimensions).
  • In the Column Picture, the solution is a combination of vectors that equals the result vector.

Row Picture

The Row Picture represents each equation as a line in the plane. The solution to the system of equations is the point where these lines intersect. For the system:

The intersection point of the two lines is the solution, and it satisfies both equations.

Column Picture

The Column Picture focuses on expressing the equations as combinations of column vectors. For a system like:

  • The first column vector is multiplied by the first unknown, $x$.
  • The second column vector is multiplied by the second unknown, $y$.
  • The right-hand side is the result vector $b$.

The equation can be rewritten as:

Geometrically, this means the result vector $b$ can be expressed as a linear combination of the two column vectors, with the unknowns $x$ and $y$ serving as the coefficients of this combination.
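A small numerical sketch of the two pictures, using an illustrative 2x2 system (the specific numbers from these notes are not reproduced here):

```python
import numpy as np

# Illustrative 2x2 system:
#   x + 2y = 3
#  2x -  y = 1
A = np.array([[1.0,  2.0],
              [2.0, -1.0]])
b = np.array([3.0, 1.0])

# Row picture: each row is a line; the solution is their intersection point.
x = np.linalg.solve(A, b)
print(x)  # intersection point of the two lines -> [1. 1.]

# Column picture: the same x gives the coefficients that combine the
# columns of A into b: x[0]*col1 + x[1]*col2 == b.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(np.allclose(combo, b))  # True
```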


Upper Triangular Form

Given the system of equations $Ax = b$, it can be transformed into the form $Ux = c$, where $U$ is an upper triangular matrix.

Breakdown of Gaussian Elimination

Gaussian Elimination is a method used to solve a system of linear equations by transforming the matrix $A$ into an upper triangular matrix $U$. This is achieved through a series of row operations:

  • Curable (Row Operations): These are row operations that can transform the matrix into upper triangular form.
    • Swapping rows.
    • Multiplying a row by a non-zero scalar.
    • Adding a multiple of one row to another row.
  • Incurable (Singular): If at any point the matrix becomes singular (a row becomes all zeros or the determinant becomes zero), the system is classified as incurable. This indicates that the system does not have a unique solution (it may have no solution or infinitely many solutions).

When the matrix is in upper triangular form, solving the system is straightforward using back substitution.
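A minimal back-substitution sketch, assuming the system has already been reduced to a nonsingular upper triangular $U$ (the values below are illustrative):

```python
import numpy as np

# Back substitution on an upper triangular system U x = c.
# U and c are illustrative; any nonsingular upper triangular U works.
U = np.array([[2.0, 1.0, -1.0],
              [0.0, 0.5,  0.5],
              [0.0, 0.0, -1.0]])
c = np.array([8.0, 1.0, 1.0])

n = len(c)
x = np.zeros(n)
for i in range(n - 1, -1, -1):            # solve for the last unknown first
    x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]

print(x)                      # [ 2.  3. -1.]
print(np.allclose(U @ x, c))  # True
```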

Example

Consider the following system of equations, represented by the matrix equation $Ax = b$:

We can apply Gaussian elimination to transform $A$ into an upper triangular matrix $U$. Let’s perform the row operations:

  1. Eliminate the first column below the pivot (2 in the first row, first column):
    • Subtract the appropriate multiple of row 1 from row 2.
    • Subtract the appropriate multiple of row 1 from row 3.

After applying these row operations, we obtain the upper triangular matrix $U$:

Now, the system is in the form $Ux = c$, where:

The system is now easy to solve using back substitution:

  1. From the third row:
  2. From the second row:
  3. From the first row:

Thus, the solution is:

Singular Example (Incurable)

Consider the system:

Using Gaussian elimination, we would eventually encounter a row of zeros (i.e., a singular matrix), which indicates the system is either inconsistent or has infinitely many solutions. Specifically, after performing row operations:

The system has no unique solution, and we can conclude that it is incurable (the matrix is singular).
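A small sketch of how a zero pivot reveals the singular ("incurable") case during elimination; the matrix is illustrative, not the system above:

```python
import numpy as np

# Illustrative singular matrix: the second row is twice the first, so
# elimination produces a zero pivot (the "incurable" case in these notes).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

M = A.copy()
singular = False
for k in range(len(M)):
    if np.isclose(M[k, k], 0.0):      # zero pivot with no row below to swap in
        singular = True
        break
    for i in range(k + 1, len(M)):
        M[i, k:] -= (M[i, k] / M[k, k]) * M[k, k:]

print(singular)          # True -> no unique solution
print(np.linalg.det(A))  # ~0.0, confirming singularity
```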


Matrix Notation and Multiplication

Elementary Matrices and Elimination

An elementary matrix represents a single row operation applied to the identity matrix $I$. Performing a row operation on a matrix $A$ is equivalent to multiplying $A$ by an elementary matrix from the left.

For example:

  • $E$: This represents an elementary matrix that performs a specific row operation (e.g., row addition, scaling, or swapping).

The elimination process transforms a matrix $A$ into an upper triangular matrix $U$. This can be expressed as:

$E_k \cdots E_2 E_1 A = U$

where $E_1, E_2, \dots, E_k$ are the elementary matrices that perform the necessary row operations to reduce $A$ to upper triangular form $U$.
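A minimal illustration of this idea: build one elementary matrix from the identity and apply it from the left. The matrix and multiplier are illustrative, not taken from these notes:

```python
import numpy as np

# One elementary matrix per row operation: start from the identity and
# apply the operation to it. The numbers below are illustrative.
A = np.array([[2.0, 1.0],
              [6.0, 8.0]])

E21 = np.eye(2)
E21[1, 0] = -3.0   # encodes R2 <- R2 - 3*R1 (multiplier 6/2 = 3)

U = E21 @ A        # multiplying from the left performs the row operation
print(U)           # [[2. 1.]
                   #  [0. 5.]]
```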


Cost of Elimination

  1. Left-hand side (the matrix $A$): For an $n \times n$ matrix, the cost is dominated by the forward elimination step, which involves approximately $\frac{n^3}{3}$ multiplications and subtractions.
  2. Right-hand side (the vector $b$): The same sequence of elementary row operations must also be applied to the right-hand side; each right-hand side contributes roughly $n^2$ additional operations.

Triangular Factors and Row Exchanges

The matrix $A$ can be factored into a lower triangular matrix $L$ and an upper triangular matrix $U$ through Gaussian elimination, $A = LU$:

  • $L$ is a lower triangular matrix with ones on the diagonal, representing the multipliers used during the elimination process.
  • $U$ is an upper triangular matrix resulting from the elimination.
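A quick way to see this factorization numerically is SciPy's lu(). Because it uses partial pivoting, the identity it verifies is $A = PLU$ rather than plain $A = LU$; the matrix below is illustrative:

```python
import numpy as np
from scipy.linalg import lu

# Illustrative matrix; scipy.linalg.lu returns P, L, U with A = P @ L @ U.
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])

P, L, U = lu(A)
print(np.allclose(A, P @ L @ U))  # True
print(L)                          # unit lower triangular (after pivoting)
print(U)                          # upper triangular
```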

Permutation Matrix

A permutation matrix is a square matrix obtained by permuting the rows of the identity matrix. Here’s an example of a permutation matrix:

$P$ is a permutation matrix if its rows are the rows of the identity matrix in a different order. A product of permutation matrices is again a permutation matrix, and the inverse of a permutation matrix is also a permutation matrix.

The matrix has been formed by swapping the first and second rows of the identity matrix. This is an example of a permutation matrix, where the rows of the identity matrix are rearranged in a specific order.

Properties of a Permutation Matrix

  1. Orthogonality: A permutation matrix is always orthogonal, meaning that its inverse is equal to its transpose: $P^{-1} = P^T$.

  2. Product of Permutation Matrices: When multiplying permutation matrices, the result is another permutation matrix. The product of two permutation matrices corresponds to applying one permutation after another.
  3. Determinant: The determinant of a permutation matrix is either $+1$ or $-1$, depending on whether the number of row swaps is even or odd: $+1$ if the number of swaps is even, and $-1$ if the number of swaps is odd.
  4. Inverse: The inverse of a permutation matrix is another permutation matrix, namely the one that "reverses" the row swaps performed by the original matrix. For the example $P$ above (a single row swap), $P^{-1} = P$, since performing the same swap again restores the original row order.
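These properties can be checked numerically; the sketch below uses an illustrative 3x3 permutation matrix:

```python
import numpy as np

# A 3x3 permutation matrix that swaps the first and second rows of I.
P = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

print(np.allclose(np.linalg.inv(P), P.T))  # True: P^{-1} = P^T (orthogonal)
print(np.linalg.det(P))                    # -1.0: one swap -> odd permutation

# The product of two permutation matrices is again a permutation matrix:
Q = np.eye(3)[[2, 0, 1]]                   # cyclic permutation of the rows of I
print(P @ Q)
```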