What is a Linear Map?
A function L:\mathbb{F}^n\rightarrow\mathbb{F}^m is a linear map if it satisfies:
L(u+v) = L(u)+L(v) (additivity)
L(\alpha u)=\alpha L(u) (homogeneity)
\forall u,v \in \mathbb{F}^n,\alpha\in\mathbb{F}
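A quick numerical sanity check of both properties for a map of the form L(x)=Ax; a minimal Python/NumPy sketch with an arbitrarily chosen matrix and vectors (not taken from the cards):

```python
import numpy as np

# Arbitrary example map L(x) = A x
A = np.array([[1., 2., 0.],
              [0., 1., 1.]])
L = lambda x: A @ x

u = np.array([1., -2., 3.])
v = np.array([0., 5., -1.])
alpha = 2.5

print(np.allclose(L(u + v), L(u) + L(v)))        # additivity holds
print(np.allclose(L(alpha * u), alpha * L(u)))   # homogeneity holds
```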
How is a matrix associated with a linear map?
Every linear map L:\mathbb{F}^n\rightarrow\mathbb{F}^m corresponds to a matrix A\in\mathbb{F}^{m\times n} such that L(x)=Ax. Conversely, every matrix defines a linear map via matrix multiplication.
What is the Standard Matrix of a linear map?
If T:\mathbb{F}^n\rightarrow\mathbb{F}^m and the standard basis is used, then the standard matrix of T is the matrix whose columns are T(e_1), T(e_2),…,T(e_n).
What are common examples of linear maps?
Examples include:
Identity map: T(v)=v
Zero map: T(v)=0
Projections: T(x,y,z)= (x,y,0)
Reflections and rotations (e.g. in 2D)
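To connect these examples with the standard-matrix card above, here is a small sketch (assuming NumPy) that builds the standard matrix of the projection T(x,y,z)=(x,y,0) from its action on the standard basis vectors:

```python
import numpy as np

def T(v):
    # Projection onto the xy-plane: T(x, y, z) = (x, y, 0)
    x, y, z = v
    return np.array([x, y, 0.])

# Columns of the standard matrix are T(e_1), T(e_2), T(e_3)
e = np.eye(3)
std_matrix = np.column_stack([T(e[:, j]) for j in range(3)])
print(std_matrix)
# [[1. 0. 0.]
#  [0. 1. 0.]
#  [0. 0. 0.]]
```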
What is the Image of a linear map L(x)=Ax?
The image (or range) is the set of all outputs Im(A)=\{Ax\ | \ x\in\mathbb{F}^n\}. It is the subspace of \mathbb{F}^m spanned by the columns of A.
What is the Kernel of a matrix A?
The kernel is the set of all solutions to the homogeneous equation Ax=0. It is a subspace of \mathbb{F}^n.
What is the Rank of a matrix A?
The rank of A is the dimension of its image (column space). It equals the number of linearly independent columns of A.
What is the Nullity of a matrix A?
The nullity of A is the dimension of its kernel: the number of linearly independent solutions to Ax=0.
State the Rank-Nullity Theorem.
If A\in\mathbb{F}^{m\times n}, then rank(A) + nullity(A)=n. This relates the dimension of the domain \mathbb{F}^n to the image and kernel of A.
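A minimal SymPy check of the Rank-Nullity Theorem, using the same 2×3 matrix that appears in the kernel/image examples further down:

```python
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 1]])       # m = 2, n = 3

rank = A.rank()
nullity = len(A.nullspace())     # dimension of ker(A)
print(rank, nullity, rank + nullity)   # 2 1 3  -> rank + nullity = n
```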
How do you compute kernel and image from a matrix?
For kernel:
Solve Ax=0 using row reduction (null space)
For image:
The pivot columns of A (identified from its row echelon form) span the image (column space)
What is the kernel of A= \left[ \begin{array}{ccc} 1 & 1 & 0 \\ 0 & 1 & 1 \end{array} \right]?
Solve \begin{cases}x_1+x_2=0\\ x_2+x_3=0\end{cases} \Rightarrow x = x_3\cdot(1,-1,1). So the kernel is 1-dimensional: Span((1,-1,1))
What is the image of A= \left[ \begin{array}{ccc} 1 & 1 & 0 \\ 0 & 1 & 1 \end{array} \right]?
Since there are 2 pivots, the rank is 2. The image is a 2-dimensional subspace of \mathbb{F}², i.e. all of \mathbb{F}². The columns of A that correspond to pivots span the image
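The same kernel and image can be read off with SymPy; a short sketch, assuming SymPy is available:

```python
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 1]])

print(A.nullspace())    # [Matrix([[1], [-1], [1]])]  -> kernel = Span((1, -1, 1))
print(A.columnspace())  # the two pivot columns of A  -> image = F^2
print(A.rank())         # 2
```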
When is a linear map Invertible?
A linear map L:\mathbb{F}^n\rightarrow\mathbb{F}^n is invertible if it is bijective, which holds iff its matrix is invertible (i.e. has full rank and non-zero determinant).
When is a square matrix invertible?
A square matrix A\in\mathbb{F}^{n\times n} is invertible iff:
det(A)\neq 0
rank(A)=n
ker(A) = \{0\}
A^{-1} exists such that AA^{-1}=A^{-1}A=I
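These conditions can be checked numerically; a minimal NumPy sketch for an arbitrarily chosen invertible 2×2 matrix:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

print(np.linalg.det(A))                      # 3.0, non-zero
print(np.linalg.matrix_rank(A))              # 2, full rank
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))     # True: A A^{-1} = I
print(np.allclose(A_inv @ A, np.eye(2)))     # True: A^{-1} A = I
```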
What is an Isomorphism?
A linear map that is invertible (bijective). Two vector spaces V and W are isomorphic if there exists an isomorphism T:V\rightarrow W. This implies dim(V)=dim(W).
What is the Identity Map?
I:V\rightarrow V, defined by I(v)=v \ \forall \ v\in V. It is a linear map and acts as the neutral element for composition: T\circ I=I\circ T=T.
What is the matrix of a linear map with respect to a basis?
Let B=\{v_1,…,v_n\} be a basis of V, and T:V\rightarrow V. Then the matrix [T]_B is defined by [T]_B = [[T(v_1)]_B|…|[T(v_n)]_B], where [T(v_j)]_B are the coordinate vectors of T(v_j) in basis B.
What is a change of basis matrix?
Let B and C be bases of V. The change of basis matrix from B to C, denoted P_{C\leftarrow B}, is the matrix such that [x]_C = P_{C\leftarrow B}\cdot[x]_B
How does a matrix change under a change of basis?
If A represents a linear map in basis B, and P is the change-of-basis matrix whose columns are the vectors of the new basis B’ written in B-coordinates, then the matrix in the new basis is A’=P^{-1}AP. This is called a similarity transformation.
How do you compute a change of basis matrix?
Let B=\{v_1,…,v_n\}, C=\{w_1,…,w_n\}. Express each v_j as a linear combination of the w_i. The columns of the matrix are these coordinate vectors: P_{C\leftarrow B}=[[v_1]_C \ … \ [v_n]_C]
Give an example of a change of basis in \mathbb{R}²
Let B=\{(1,0),(1,1)\}, C=\{(1,1),(0,1)\}. To compute P_{C\leftarrow B}, write each v_i\in B in terms of C:
(1,0)=1(1,1)+(-1)(0,1)\Rightarrow[1,-1]^T
(1,1)=1(1,1)+0(0,1)\Rightarrow[1,0]^T
So P_{C\leftarrow B}= \left[ \begin{array}{cc} 1 & 1 \\ -1 & 0 \end{array} \right]
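The example can be reproduced numerically: with the basis vectors as columns of matrices B and C, each column of P_{C\leftarrow B} is obtained by solving C p_j = v_j. A sketch assuming NumPy:

```python
import numpy as np

B = np.column_stack([(1., 0.), (1., 1.)])   # basis B as columns
C = np.column_stack([(1., 1.), (0., 1.)])   # basis C as columns

P = np.linalg.solve(C, B)                   # columns are [v_j]_C
print(P)    # [[ 1.  1.]
            #  [-1.  0.]]

# Check [x]_C = P [x]_B on an arbitrary coordinate vector
x_B = np.array([2., 3.])
print(np.allclose(C @ (P @ x_B), B @ x_B))  # True
```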
Why is change of basis useful?
It simplifies matrix representations (e.g. diagonalization or Jordan form), helps understand the structure of a linear map, and allows us to work in more convenient coordinate systems.
What is an Eigenvector of a matrix?
A non-zero vector v\in\mathbb{F}^n is an eigenvector of a square matrix A\in\mathbb{F}^{n\times n} if Av=\lambda v for some scalar \lambda\in\mathbb{F}. Geometrically, v is not rotated off its own line: it is only scaled by \lambda.
What is an Eigenvalue of a matrix?
A scalar \lambda is an eigenvalue of a square matrix A if there exists a non-zero vector v such that Av=\lambda v. The set of all such v forms the eigenspace associated with \lambda.
What is the Eigenspace of an eigenvalue \lambda?
The eigenspace is the subspace E_\lambda = ker(A-\lambda I). It contains all vectors v satisfying Av=\lambda v.
What is the characteristic polynomial of a matrix A\in\mathbb{F}^{n\times n}?
It is the polynomial p_A(\lambda)=det(A-\lambda I). The roots of this polynomial are the eigenvalues of A.
What is the Algebraic Multiplicity of an eigenvalue?
It is the multiplicity of the eigenvalue as a root of the characteristic polynomial. For example, if (\lambda-2)³ is a factor, then the algebraic multiplicity of 2 is 3.
What is the Geometric Multiplicity of an eigenvalue?
It is the dimension of the eigenspace associated with \lambda, i.e. dim(ker(A-\lambda I)). It satisfies 1\leq geom. mult.(\lambda)\leq alg. mult.(\lambda).
How are eigenvalues found?
Solve the characteristic equation det(A-\lambda I)=0. Each solution \lambda is an eigenvalue. For each eigenvalue, solve (A-\lambda I)v=0 to find the eigenvectors.
Steps to compute eigenvalues and eigenvectors:
Compute det(A-\lambda I) (characteristic polynomial)
Solve det(A-\lambda I)=0 (eigenvalues \lambda)
For each \lambda, solve (A-\lambda I)v=0 (eigenspace)
Find the eigenvalues of A=\left[ \begin{array}{cc} 2 & 1 \\ 1 & 2 \end{array} \right]
det(A-\lambda I) = (2-\lambda)² - 1 = \lambda² - 4\lambda + 3. Roots: \lambda = 1,3. These are the eigenvalues.
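The same eigenvalues drop out of NumPy's eigensolver; a quick check of this example:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))                 # [1. 3.]

# Verify A v = lambda v for each eigenpair
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))      # True, True
```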
Can a matrix have repeated eigenvalues?
Yes. An eigenvalue may have algebraic multiplicity > 1. For example, \lambda = 2 might be a double root. Diagonalizability depends on whether there are enough linearly independent eigenvectors (i.e. if geometric multiplicity = algebraic multiplicity).
What does it mean for a matrix to be diagonalizable?
A matrix A is diagonalizable if there exists an invertible matrix P such that P^{-1}AP=D where D is diagonal. This happens iff A has a full basis of eigenvectors (i.e. n linearly independent eigenvectors for an n\times n matrix).
What does the diagonal matrix D contain after diagonalization?
The eigenvalues of A on the diagonal, in the same order as their corresponding eigenvectors in the columns of P.
What are the necessary and sufficient conditions for diagonalizability?
A has n linearly independent eigenvectors \Leftrightarrow diagonalizable.
This is guaranteed if all eigenvalues are distinct.
Still possible if eigenvalues are repeated, but only if geometric multiplicity = algebraic multiplicity for each one.
What does it mean for a matrix A to be diagonalizable?
A matrix A\in\mathbb{F}^{n\times n} is diagonalizable if there exists an invertible matrix P such that P^{-1}AP=D where D is a diagonal matrix. This means A is similar to a diagonal matrix.
What is the structure of the diagonal matrix D in diagonalization?
The diagonal entries of D are the eigenvalues of A. If the columns of P are eigenvectors v_1, …, v_n then D=diag(\lambda_1,…,\lambda_n)
When is a matrix diagonalizable?
A matrix is diagonalizable iff it has n linearly independent eigenvectors. This holds automatically if
All eigenvalues are distinct, or
Geometric multiplicity = algebraic multiplicity for each eigenvalue
Can a matrix be diagonalizable with repeated eigenvalues?
Yes, but only if the geometric multiplicity of each eigenvalue equals its algebraic multiplicity.
How do you diagonalize a matrix A\in\mathbb{F}^{n\times n}?
Find eigenvalues by solving det(A-\lambda I)=0.
For each eigenvalue \lambda, find a basis for ker(A-\lambda I).
If you find n linearly independent eigenvectors, form the matrix P whose columns are these eigenvectors.
Construct D=P^{-1}AP
Diagonalize A= \left[ \begin{array}{cc} 4 & 1 \\ 0 & 2 \end{array} \right]
det(A-\lambda I)=(4-\lambda)(2-\lambda)\Rightarrow\lambda=4,2
ker(A-4I)=Sp((1,0)),ker(A-2I)=Sp((-1,2))
P=\left[\begin{array}{cc}1&-1\\0&2\end{array}\right], D=\left[\begin{array}{cc}4&0\\0&2\end{array}\right]
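A quick numerical verification of this diagonalization (assuming NumPy): with the P from the card, P^{-1}AP should come out diagonal with 4 and 2 on the diagonal.

```python
import numpy as np

A = np.array([[4., 1.],
              [0., 2.]])
P = np.array([[1., -1.],
              [0.,  2.]])

D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))   # [[4. 0.]
                         #  [0. 2.]]
```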
What is the characteristic polynomial of a matrix A?
The characteristic polynomial of an n\times n matrix A is defined as p_A(\lambda)=det(A-\lambda I). Its roots are the eigenvalues of A.
What is the degree of the characteristic polynomial of an n\times n matrix?
It is a degree-n polynomial in \lambda, and the coefficient of \lambda^n is always (-1)^n.
What is the trace of a matrix, and how does it relate to eigenvalues?
The trace of a matrix A, denoted tr(A), is the sum of its diagonal entries. For any square matrix (in particular diagonalizable ones), tr(A)=\sum^n_{i=1}\lambda_i, the sum of the eigenvalues counted with multiplicity.
What is the determinant of a matrix in terms of its eigenvalues?
det(A)=\prod^n_{i=1}\lambda_i, where \lambda_i are the eigenvalues of A counted with multiplicity.
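Both identities are easy to check numerically; a minimal NumPy sketch with an arbitrary symmetric 2×2 matrix:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
eigenvalues = np.linalg.eigvals(A)                        # [3., 1.] (order may vary)

print(np.isclose(np.trace(A), eigenvalues.sum()))         # True: tr(A) = sum of eigenvalues
print(np.isclose(np.linalg.det(A), eigenvalues.prod()))   # True: det(A) = product of eigenvalues
```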
How do you compute the characteristic polynomial of a matrix A?
Form A-\lambda I
Compute det(A-\lambda I) using cofactor expansion or row reduction.
Simplify to a polynomial in \lambda.
Compute the characteristic polynomial of A=\left[ \begin{array}{cc} 1 & 2 \\ 2 & 1 \end{array}\right]
det(A-\lambda I)=det\left[\begin{array}{cc}1-\lambda&2\\2&1-\lambda\end{array}\right]=(1-\lambda)²-4=\lambda²-2\lambda-3
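SymPy reproduces the same polynomial (its charpoly uses det(\lambda I - A), which for even n agrees with det(A-\lambda I)); a short sketch:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[1, 2],
               [2, 1]])

p = A.charpoly(lam)
print(p.as_expr())                 # lambda**2 - 2*lambda - 3
print(sp.solve(p.as_expr(), lam))  # [-1, 3]  -> the eigenvalues
```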
What is the determinant of a square matrix?
The determinant is a scalar value computed from a square matrix A\in\mathbb{F}^{n\times n} that encodes important properties such as invertibility, volume scaling, and orientation. It is denoted det(A) or |A|.
When is a matrix invertible in terms of its determinant?
A matrix A is invertible iff det(A)\neq0. If det(A)=0, the matrix is singular and has no inverse.
What is the determinant of a triangular matrix?
For any upper or lower triangular matrix, the determinant is the product of the diagonal entries.
What effect does a row operation have on the determinant?
Swapping rows \rightarrow changes sign of the determinant
Multiplying a row by c\rightarrow multiplies determinant by c
Adding a multiple of one row to another \rightarrow no change in determinant
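Each of these effects can be observed directly; a minimal NumPy sketch with an arbitrary 2×2 matrix:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
d = np.linalg.det(A)                                   # -2

swapped = A[[1, 0], :]                                 # swap the two rows
print(np.isclose(np.linalg.det(swapped), -d))          # True: sign flips

scaled = A.copy(); scaled[0] *= 5                      # multiply row 1 by c = 5
print(np.isclose(np.linalg.det(scaled), 5 * d))        # True: scales by c

added = A.copy(); added[1] += 7 * A[0]                 # add 7*(row 1) to row 2
print(np.isclose(np.linalg.det(added), d))             # True: unchanged
```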
How is the determinant used in geometry?
The determinant of a matrix whose columns are vectors gives the oriented volume of the parallelepiped formed by the vectors. In \mathbb{R}², it gives signed area; in \mathbb{R}³, signed volume.
What is the determinant of a 2\times2 matrix?
det\left[\begin{array}{cc} a&b\\c&d\end{array}\right]=ad-bc
What is the determinant of a 3\times 3 matrix?
Use cofactor expansion det(A)=a(ei-fh)-b(di-fg)+c(dh-eg) for A=\left[\begin{array}{ccc}a&b&c\\d&e&f\\g&h&i\end{array}\right]
How do you compute the determinant of an n\times n matrix?
Use cofactor expansion along a row or column
For larger matrices, reduce to triangular form and multiply the diagonal entries.
Be cautious of row operation effects on determinant
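For small matrices the cofactor expansion can also be written out directly and compared against a library routine; a sketch, assuming NumPy:

```python
import numpy as np

def det_cofactor(M):
    # Cofactor expansion along the first row (exponential cost, only for small matrices)
    n = M.shape[0]
    if n == 1:
        return M[0, 0]
    total = 0.0
    for j in range(n):
        minor = np.delete(np.delete(M, 0, axis=0), j, axis=1)
        total += (-1) ** j * M[0, j] * det_cofactor(minor)
    return total

A = np.array([[1., 2., 3.],
              [0., 4., 5.],
              [1., 0., 6.]])
print(det_cofactor(A), np.linalg.det(A))   # both ~ 22.0
```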
What’s the relationship between determinant and eigenvalues?
det(A)=\prod^n_{i=1}\lambda_i where \lambda_i are the eigenvalues (with multiplicity) of A.
What is the Cayley-Hamilton Theorem?
Every square matrix A\in\mathbb{F}^{n\times n} satisfies its own characteristic polynomial. If p_A(\lambda)=det(A-\lambda I), then p_A(A)=0.
What does it mean for a matrix to satisfy a polynomial?
A matrix A satisfies a polynomial f(\lambda) if f(A)=0, where powers of A and scalar multiples are evaluated using standard matrix operations (and the constant term becomes a multiple of I).
What is the Cayley-Hamilton theorem for a 2\times 2 matrix?
Let A=\left[\begin{array}{cc}2&1\\0&3\end{array}\right]. The characteristic polynomial is (2-\lambda)(3-\lambda)=\lambda²-5\lambda+6. Then A²-5A+6I=0.
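Verifying this example numerically takes one line of matrix arithmetic; a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])

# Characteristic polynomial: lambda^2 - 5*lambda + 6, so A^2 - 5A + 6I should vanish
print(A @ A - 5 * A + 6 * np.eye(2))   # [[0. 0.]
                                       #  [0. 0.]]
```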
Why is the Cayley-Hamilton theorem useful?
It allows you to express powers of A in terms of lower powers, useful in:
Computing A^k for large k
Finding minimal polynomials
Proving diagonalizability
Controlling recurrence relations
How do you verify Cayley-Hamilton theorem for a matrix?
Find the characteristic polynomial p(\lambda)
Substitute A into p: replace \lambda^k \rightarrow A^k, and the constant term c becomes cI.
Show the resulting matrix expression evaluates to 0
What is the minimal polynomial of a matrix?
The minimal polynomial m_A(x) is the monic polynomial of lowest degree such that m_A(A)=0. It divides any other polynomial satisfied by A, including the characteristic polynomial.
How is the minimal polynomial related to the characteristic polynomial?
The minimal polynomial m_A(x) divides the characteristic polynomial p_A(x), and has the same roots (possibly with lower multiplicity).
How is the minimal polynomial useful?
It determines the structure of the matrix:
If m_A(x) splits into distinct linear factors, then A is diagonalizable
Helps compute functions of A (e.g. e^A, A^n)
What is the minimal polynomial of A=\left[\begin{array}{cc}2&1\\0&2\end{array}\right]?
p_A(x)=(x-2)². Since A-2I=\left[\begin{array}{cc}0&1\\0&0\end{array}\right]\neq 0 but (A-2I)²=0, the minimal polynomial is m_A(x)=(x-2)² (so A is not diagonalizable).
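The two claims behind this answer are easy to check numerically (A-2I is non-zero while its square vanishes); a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 2.]])
N = A - 2 * np.eye(2)

print(N)       # [[0. 1.], [0. 0.]]  -> non-zero, so m_A(x) is not (x - 2)
print(N @ N)   # [[0. 0.], [0. 0.]]  -> zero, so m_A(x) = (x - 2)^2
```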
A matrix A is diagonalizable iff…
The minimal polynomial splits into distinct linear factors, i.e. all eigenvalues have algebraic = geometric multiplicity.