Ch 2.3 - Characterizations of Invertible Matrices
The Invertible Matrix Theorem (Theorem 8)
Core Principle: Let A be a square n \times n matrix. The following statements are equivalent; that is, for a given A, they are either all true or all false.
List of Equivalent Statements (Characterizations of an Invertible Matrix); several of these are checked numerically in the sketch after the list:
A is an invertible matrix.
A is row equivalent to the identity matrix (I_n). This means A can be reduced to the identity matrix through elementary row operations.
A has n pivot positions. This indicates that every column and every row has a pivot.
The equation Ax = 0 has only the trivial solution (x = 0).
The columns of A form a linearly independent set.
The linear transformation T: \mathbb{R}^n \rightarrow \mathbb{R}^n defined by T(x) = Ax is one-to-one. This means that if T(u) = T(v), then u = v.
The equation Ax = b has at least one solution for each b in \mathbb{R}^n. This implies that the system is always consistent.
The columns of A span \mathbb{R}^n. This means that any vector in \mathbb{R}^n can be written as a linear combination of the columns of A.
The linear transformation T: \mathbb{R}^n \rightarrow \mathbb{R}^n maps \mathbb{R}^n onto \mathbb{R}^n. This means the range of T is all of \mathbb{R}^n.
There is an n \times n matrix C such that CA = I. Here, C is a left inverse of A.
There is an n \times n matrix D such that AD = I. Here, D is a right inverse of A.
A^T (the transpose of A) is an invertible matrix.
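As a quick numerical illustration (a sketch using NumPy; the 3 \times 3 matrix below is an arbitrary choice, not an example from the text), several of the equivalent statements can be checked at once: full rank corresponds to n pivot positions and linearly independent columns, Ax = b has a solution for any b, Ax = 0 has only the trivial solution, and A^T is also invertible.

```python
import numpy as np

# Arbitrary 3x3 matrix chosen for illustration (det = 8, so invertible).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# n pivot positions <=> rank A = n (pivots / linearly independent columns).
print(np.linalg.matrix_rank(A) == n)                               # True

# Ax = b is consistent (and uniquely solvable) for any b.
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))                                       # True

# Ax = 0 has only the trivial solution x = 0.
print(np.allclose(np.linalg.solve(A, np.zeros(n)), np.zeros(n)))   # True

# A^T is invertible as well; in fact (A^T)^{-1} = (A^{-1})^T.
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))         # True
```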
Further Implications and Corollaries of The Invertible Matrix Theorem
Unique Solution for Ax = b: If A is invertible, then Ax = b has a unique solution for each b in \mathbb{R}^n, namely x = A^{-1}b. Conversely, if Ax = b has a unique solution for each b, then A is row equivalent to the identity matrix (I_n) and is therefore invertible.
Invertibility from AB = I: Let A and B be square matrices. If AB = I (the identity matrix), then both A and B are invertible, with B = A^{-1} and A = B^{-1}. This is a direct consequence of Theorem 8: a one-sided inverse of a square matrix is automatically a two-sided inverse.
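A minimal NumPy sketch of these two corollaries (the 2 \times 2 matrix below is an arbitrary illustrative choice): the unique solution of Ax = b is A^{-1}b, and a square B with AB = I also satisfies BA = I.

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])      # arbitrary invertible 2x2 matrix (det = 10)
b = np.array([1.0, 0.0])

# Unique solution of Ax = b is x = A^{-1} b.
x = np.linalg.inv(A) @ b
print(np.allclose(A @ x, b))                     # True

# If AB = I for square A and B, then B = A^{-1} and BA = I as well.
B = np.linalg.inv(A)                             # B constructed so that AB = I
print(np.allclose(A @ B, np.eye(2)))             # True: AB = I
print(np.allclose(B @ A, np.eye(2)))             # True: BA = I, so A = B^{-1}
```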
Classification of Square Matrices
The Invertible Matrix Theorem effectively categorizes the set of all n \times n matrices into two mutually exclusive groups:
Invertible (Nonsingular) Matrices: These are matrices that fulfill all the equivalent statements outlined in Theorem 8.
Noninvertible (Singular) Matrices: These are matrices that fail every one of the equivalent statements in Theorem 8; failing any single statement means failing them all.
Properties of Singular Matrices
The negation of any statement within the Invertible Matrix Theorem directly describes a characteristic of an n \times n singular matrix.
Examples of Properties of an n \times n Singular Matrix (several are illustrated in the sketch after this list). Such a matrix:
Is not row equivalent to the identity matrix (I_n).
Does not have n pivot positions; it will have fewer than n pivots.
Has linearly dependent columns.
Gives a homogeneous equation Ax = 0 with nontrivial (nonzero) solutions.
Defines a linear transformation T(x) = Ax that is not one-to-one.
Has columns that do not span \mathbb{R}^n.
Defines a linear transformation T(x) = Ax that does not map \mathbb{R}^n onto \mathbb{R}^n; some vectors in the codomain are not in the range of T.
Never gives a unique solution of Ax = b: for each b in \mathbb{R}^n, the equation has either no solution or infinitely many solutions.
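The following sketch (NumPy, with an arbitrarily chosen singular matrix whose second column is twice the first) illustrates several of these failures at once: rank below n, a nontrivial null vector, and a right-hand side b for which Ax = b has no solution.

```python
import numpy as np

# Singular 3x3 matrix: the second column is twice the first (chosen for illustration).
A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 1.0],
              [3.0, 6.0, 2.0]])
n = A.shape[0]

# Fewer than n pivot positions <=> rank A < n.
print(np.linalg.matrix_rank(A))                  # 2, not 3

# Ax = 0 has a nontrivial solution, e.g. x = (-2, 1, 0).
x0 = np.array([-2.0, 1.0, 0.0])
print(np.allclose(A @ x0, 0))                    # True

# The columns do not span R^3: for b = (1, 0, 0), Ax = b is inconsistent.
b = np.array([1.0, 0.0, 0.0])
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)     # least-squares "best attempt"
print(np.allclose(A @ x_ls, b))                  # False: no exact solution exists
```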
Scope and Applicability of The Invertible Matrix Theorem
Exclusively for Square Matrices: The Invertible Matrix Theorem applies only to square matrices, i.e., n \times n matrices with the same number of rows and columns.
Limitation Example: If the columns of a non-square matrix, such as a 3 \times 2 matrix, are linearly independent, the Invertible Matrix Theorem cannot be used to draw conclusions about the existence or non-existence of solutions of Ax = b. The statements of the theorem are not equivalent for non-square matrices, even when individual conditions happen to hold.
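To make the limitation concrete, here is a hedged sketch (the 3 \times 2 matrix is an arbitrary choice): its two columns are linearly independent, yet Ax = b has no solution for some b in \mathbb{R}^3, because two columns cannot span \mathbb{R}^3.

```python
import numpy as np

# A 3x2 matrix with linearly independent columns (chosen for illustration).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# The columns are linearly independent: rank equals the number of columns.
print(np.linalg.matrix_rank(A) == A.shape[1])    # True

# Yet Ax = b is inconsistent for b = (0, 0, 1): two columns cannot span R^3,
# so the Invertible Matrix Theorem's conclusions do not transfer to non-square A.
b = np.array([0.0, 0.0, 1.0])
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)     # least-squares "best attempt"
print(np.allclose(A @ x_ls, b))                  # False: no exact solution
```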
Invertible Linear Transformations
Relationship with Matrix Multiplication: Matrix multiplication can be interpreted as the composition of linear transformations. The identity A^{-1}Ax = x expresses how an inverse transformation "undoes" the original transformation.
Definition of an Invertible Linear Transformation: A linear transformation T: \mathbb{R}^n \rightarrow \mathbb{R}^n is defined as invertible if there exists another function S: \mathbb{R}^n \rightarrow \mathbb{R}^n such that:
S(T(x)) = x for all x \in \mathbb{R}^n
T(S(x)) = x for all x \in \mathbb{R}^n
The function S is often referred to as the inverse transformation of T.
Theorem 9: Invertibility of Transformation and Standard Matrix: Let T: \mathbb{R}^n \rightarrow \mathbb{R}^n be a linear transformation, and let A be its standard matrix. Then T is invertible if and only if A is an invertible matrix. This theorem establishes a direct equivalence between the invertibility of a linear transformation and the invertibility of its corresponding standard matrix.
Inverse Transformation Formula: If T is invertible, the unique inverse linear transformation S is given by S(x) = A^{-1}x. Applying the inverse matrix A^{-1} to a vector x performs the inverse transformation.
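A short NumPy sketch of Theorem 9 and the formula S(x) = A^{-1}x (the matrix and vector below are arbitrary illustrative choices): because the standard matrix A is invertible, the transformation T(x) = Ax is undone by S(x) = A^{-1}x in both orders.

```python
import numpy as np

# Standard matrix of T (arbitrary invertible 2x2 matrix) and a test vector.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])     # rotation by 90 degrees; det = 1, so invertible
A_inv = np.linalg.inv(A)
x = np.array([3.0, 4.0])

T = lambda v: A @ v             # T(x) = Ax
S = lambda v: A_inv @ v         # S(x) = A^{-1} x, the inverse transformation

# S undoes T and T undoes S (checked here on one vector).
print(np.allclose(S(T(x)), x))  # True: S(T(x)) = x
print(np.allclose(T(S(x)), x))  # True: T(S(x)) = x
```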