Diagonalization of Matrices
Diagonalization Theory: Part 1 and Part 2
The Diagonalization Theorem (Part 1)
Statement: If a square matrix $A$ has an eigenvector basis, then $A$ can be expressed as a special product of matrices: $A = PDP^{-1}$.
$P$ is a matrix whose columns are the eigenvectors of $A$. These eigenvectors form the eigenvector basis.
$D$ is a diagonal matrix. Its diagonal entries are the eigenvalues of $A$, repeated according to their algebraic multiplicities.
Correspondence: The $i$-th column of $P$ (an eigenvector) must correspond to the $i$-th diagonal entry of $D$ (its respective eigenvalue).
Benefit: This factorization simplifies calculations involving powers of $A$.
For example, $A^3 = (PDP^{-1})(PDP^{-1})(PDP^{-1}) = PD^3P^{-1}$ (the intermediate $P^{-1}P$ terms cancel out); in general, $A^k = PD^kP^{-1}$.
Calculating $D^k$ for a diagonal matrix is straightforward: if $D = \mathrm{diag}(d_1, \ldots, d_n)$, then $D^k = \mathrm{diag}(d_1^k, \ldots, d_n^k)$.
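The power formula $A^k = PD^kP^{-1}$ can be checked numerically; the matrix below is an assumed example chosen for this sketch, not one from the notes:

```python
import numpy as np

# Illustrative 2x2 matrix (assumed for this sketch; any diagonalizable A works).
A = np.array([[4.0, 1.0],
              [0.0, 2.0]])

# Eigendecomposition: columns of P are eigenvectors, the eigenvalues fill D.
eigvals, P = np.linalg.eig(A)

# A^5 via the factorization A^k = P D^k P^{-1}:
# raising the diagonal matrix to a power just raises each diagonal entry.
A5 = P @ np.diag(eigvals**5) @ np.linalg.inv(P)

# Compare with direct repeated multiplication.
A5_direct = np.linalg.matrix_power(A, 5)
print(np.allclose(A5, A5_direct))  # True
```

For a large exponent $k$ this costs one eigendecomposition plus two matrix products, instead of $k-1$ products.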
Example 1: Triangular Matrix
Given a $3 \times 3$ upper triangular matrix $A$ with diagonal entries $\lambda_1, \lambda_2, \lambda_3$.
Eigenvalues: Since $A$ is a triangular matrix, its eigenvalues are its diagonal entries $\lambda_1, \lambda_2, \lambda_3$.
Eigenvectors: (Provided) $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$, one for each eigenvalue.
Factorization:
$P = [\mathbf{v}_1 \ \mathbf{v}_2 \ \mathbf{v}_3]$ (columns are the eigenvectors).
$D = \mathrm{diag}(\lambda_1, \lambda_2, \lambda_3)$ (diagonal entries are the corresponding eigenvalues).
$P^{-1}$ must be calculated separately (it is not immediately obvious).
Example 2: Markov Chain Transition Matrix
Eigenvectors: $\mathbf{v}_1$ for the eigenvalue $\lambda = 1$ (every stochastic matrix has $1$ as an eigenvalue; its eigenvector gives the steady state) and $\mathbf{v}_2$ for the second eigenvalue $\lambda_2$.
Factorization: $A = PDP^{-1}$ with $P = [\mathbf{v}_1 \ \mathbf{v}_2]$ and $D = \mathrm{diag}(1, \lambda_2)$. The matrix $P^{-1}$ would need to be calculated explicitly.
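A small numerical sketch with a hypothetical 2-state transition matrix (the notes' actual matrix is not shown, so the entries below are assumed):

```python
import numpy as np

# Hypothetical column-stochastic transition matrix (each column sums to 1).
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

eigvals, P = np.linalg.eig(T)

# A stochastic matrix always has eigenvalue 1; the eigenvector for it,
# rescaled to sum to 1, is the steady-state distribution.
i = np.argmin(np.abs(eigvals - 1.0))
steady = P[:, i] / P[:, i].sum()

print(np.allclose(steady, [2/3, 1/3]))  # True for this particular T
```

Diagonalization explains the long-run behavior: in $T^k = PD^kP^{-1}$, every eigenvalue with $|\lambda| < 1$ decays as $k$ grows, leaving only the $\lambda = 1$ component.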
The Diagonalization Theorem (Part 2)
Statement: Let $A$ be an $n \times n$ matrix. If $A$ can be expressed as $A = PDP^{-1}$ where $P$ is an invertible matrix and $D$ is a diagonal matrix, then:
The columns of $P$ form an eigenvector basis for $A$.
The corresponding eigenvalues are the diagonal entries of $D$.
Significance: This allows us to construct a matrix with pre-chosen eigenvalues and eigenvectors. It is the converse of Part 1.
Example: Constructing a Matrix
Choose eigenvectors: $\mathbf{v}_1$ and $\mathbf{v}_2$. (These must be linearly independent to form a basis for $\mathbb{R}^2$.)
Choose eigenvalues: $\lambda_1$ and $\lambda_2$.
Construct $P = [\mathbf{v}_1 \ \mathbf{v}_2]$ and $D = \mathrm{diag}(\lambda_1, \lambda_2)$.
The matrix $A = PDP^{-1}$ will then have these exact chosen eigenvalues and eigenvectors.
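The construction can be sketched numerically; the eigenvectors and eigenvalues below are assumed illustrative choices, not values from the notes:

```python
import numpy as np

# Pre-chosen eigenvector basis for R^2 (must be linearly independent)
# and pre-chosen eigenvalues -- all assumed for illustration.
v1 = np.array([1.0, 1.0])   # eigenvector for lambda_1
v2 = np.array([1.0, -1.0])  # eigenvector for lambda_2
lam1, lam2 = 3.0, -2.0

P = np.column_stack([v1, v2])
D = np.diag([lam1, lam2])
A = P @ D @ np.linalg.inv(P)  # Part 2: A has exactly these eigenpairs

# Verify A v = lambda v for each chosen pair.
print(np.allclose(A @ v1, lam1 * v1), np.allclose(A @ v2, lam2 * v2))
```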
Core Concepts and Terminology
Diagonalizable Matrix: A square matrix $A$ is called diagonalizable if it can be expressed in the form $A = PDP^{-1}$, where $P$ is invertible and $D$ is diagonal.
Equivalence: A square matrix $A$ is diagonalizable if and only if there exists an eigenvector basis with respect to $A$. This means the existence of the factorization $A = PDP^{-1}$ and the existence of an eigenvector basis are two sides of the same coin.
Rephrased Equivalence: An $n \times n$ matrix is diagonalizable if and only if it has $n$ linearly independent eigenvectors.
Note: All eigenvectors must have real entries (i.e., be vectors in $\mathbb{R}^n$) for the matrix multiplication to be valid.
Diagonalization of $A$: The process of writing $A$ in the form $PDP^{-1}$.
Diagonalization vs. Row Reduction: These are not the same. Row reduction typically changes the eigenvalues of a matrix, while diagonalization preserves them while transforming the matrix into a simpler form via its eigenvectors.
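A quick numerical illustration of this distinction (the matrix is an assumed example): a single elementary row operation already changes the spectrum.

```python
import numpy as np

# Assumed example: triangular, so its eigenvalues are 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# One elementary row operation: scale row 2 by 1/3.
R = A.copy()
R[1] = R[1] / 3.0  # R is now [[2, 1], [0, 1]]

# The spectrum changes: {2, 3} for A, but {1, 2} for R.
print(np.allclose(sorted(np.linalg.eigvals(A).real), [2.0, 3.0]))  # True
print(np.allclose(sorted(np.linalg.eigvals(R).real), [1.0, 2.0]))  # True
```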
Determining if a Matrix is Diagonalizable
Case 1: Matrix with Distinct Eigenvalues
Principle: Eigenvectors corresponding to distinct eigenvalues are linearly independent.
Theorem: If an $n \times n$ matrix has $n$ distinct eigenvalues, then it is diagonalizable.
Example: a $2 \times 2$ upper triangular matrix $A$ with distinct diagonal entries $\lambda_1 \neq \lambda_2$.
Eigenvalues: $\lambda_1$ and $\lambda_2$ (read off the diagonal of the triangular matrix).
Since $A$ is a $2 \times 2$ matrix and has two distinct eigenvalues, it is immediately diagonalizable. We don't need to find the eigenvectors explicitly to know this.
Example: A triangular matrix whose diagonal entries are all distinct is diagonalizable, because those entries are its distinct eigenvalues.
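This sufficient condition is easy to check numerically; `has_distinct_eigenvalues` below is a helper written for this sketch (note it is only sufficient, not necessary):

```python
import numpy as np

def has_distinct_eigenvalues(A, tol=1e-9):
    """Sufficient check: n distinct eigenvalues => A is diagonalizable.
    (Repeated eigenvalues require the GM = AM test instead.)"""
    lam = np.sort(np.linalg.eigvals(A))
    return bool(np.all(np.abs(np.diff(lam)) > tol))

# Triangular matrix: eigenvalues are the diagonal entries (values assumed).
A = np.array([[2.0, 5.0],
              [0.0, 7.0]])
print(has_distinct_eigenvalues(A))  # True: 2 != 7, so A is diagonalizable
```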
Case 2: Matrix with Repeated Eigenvalues
If a matrix does not have distinct eigenvalues, it might still be diagonalizable, but further investigation is required.
Example (Not Diagonalizable): a matrix of the form $A = \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}$.
Eigenvalue: $\lambda$ (from the diagonal of the triangular matrix). Its algebraic multiplicity is $2$ (since both diagonal entries of this $2 \times 2$ matrix equal $\lambda$).
Finding Eigenvectors: For $\lambda$, we solve $(A - \lambda I)\mathbf{x} = \mathbf{0}$. This is $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.
This implies $x_2 = 0$ while $x_1$ is a free variable.
The eigenvectors are of the form $t \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ (for $t \neq 0$).
Conclusion: There is only one linearly independent eigenvector (e.g., $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$) for this matrix. We cannot form a basis of $\mathbb{R}^2$ consisting of eigenvectors. Thus, $A$ is not diagonalizable.
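The eigenvector count can be verified numerically via the rank of $A - \lambda I$ (taking $\lambda = 5$ as an assumed concrete value):

```python
import numpy as np

# Shear-type matrix with one repeated eigenvalue (lambda = 5 assumed here).
lam = 5.0
A = np.array([[lam, 1.0],
              [0.0, lam]])

# Geometric multiplicity = dim Nul(A - lam*I) = n - rank(A - lam*I).
gm = A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2))
print(gm)  # 1, while the algebraic multiplicity is 2 => not diagonalizable
```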
Key Criterion: Geometric vs. Algebraic Multiplicity
Algebraic Multiplicity (AM): The number of times an eigenvalue appears as a root of the characteristic polynomial.
Geometric Multiplicity (GM): The dimension of the eigenspace corresponding to an eigenvalue ($\dim \mathrm{Nul}(A - \lambda I)$). This is the maximum number of linearly independent eigenvectors for that eigenvalue.
Bounds: For any eigenvalue $\lambda$, we always have $1 \leq \mathrm{GM}(\lambda) \leq \mathrm{AM}(\lambda)$.
Theorem (General Diagonalizability Condition): A square matrix $A$ is diagonalizable if and only if for every eigenvalue $\lambda$ of $A$, its geometric multiplicity equals its algebraic multiplicity ($\mathrm{GM}(\lambda) = \mathrm{AM}(\lambda)$).
This means each eigenspace must be as large as possible: its dimension must equal the algebraic multiplicity of its eigenvalue.
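The GM = AM criterion can be turned into a rough numerical test; `is_diagonalizable` is a sketch written for these notes, and its tolerance-based eigenvalue grouping can be fragile for ill-conditioned matrices:

```python
import numpy as np

def is_diagonalizable(A, tol=1e-8):
    """Numerical sketch of the GM = AM criterion for a square matrix A."""
    lam = np.linalg.eigvals(A)
    n = A.shape[0]
    for mu in lam:
        # Algebraic multiplicity: how many computed eigenvalues cluster at mu.
        am = int(np.sum(np.abs(lam - mu) < tol))
        # Geometric multiplicity: dimension of the eigenspace Nul(A - mu*I).
        gm = n - np.linalg.matrix_rank(A - mu * np.eye(n))
        if gm != am:
            return False
    return True

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 2.0]])))  # True
print(is_diagonalizable(np.array([[2.0, 1.0], [0.0, 2.0]])))  # False
```

The second matrix is the shear example from above: its repeated eigenvalue has AM = 2 but GM = 1, so the test correctly reports it as not diagonalizable.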