Linear Independence
A set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others.
Linearly Dependent
A set of vectors is linearly dependent if at least one of the vectors is a linear combination of the others.
Zero Vector in a Set
If a set of vectors includes the zero vector, it is automatically linearly dependent.
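The three cards above reduce to one computable test: vectors are linearly independent exactly when the rank of the matrix they form equals the number of vectors. A minimal NumPy sketch (the helper name `is_independent` is illustrative, not from the cards):

```python
import numpy as np

def is_independent(vectors):
    """Vectors (given as rows) are linearly independent
    iff the matrix rank equals the number of vectors."""
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == len(vectors)

independent = is_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
dependent = is_independent([[1, 2], [2, 4]])    # second row = 2 * first row
with_zero = is_independent([[1, 0], [0, 0]])    # contains the zero vector
```

Note that the set containing the zero vector comes out dependent, matching the rule on the card.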
Orthogonal Vectors
Two vectors are orthogonal if their dot product is zero.
Orthonormal Vectors
A set of vectors is orthonormal if each vector has unit length and they are all mutually orthogonal.
Orthogonal Matrix
A square matrix is orthogonal if its columns (and rows) are orthonormal vectors, i.e., ( Q^T Q = I ).
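Both conditions above are quick to verify numerically: a zero dot product for orthogonality, and ( Q^T Q = I ) for an orthogonal matrix. A small sketch with hand-picked example vectors:

```python
import numpy as np

# Two vectors are orthogonal iff their dot product is zero.
u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, -1.0, 0.0])
orthogonal = np.isclose(u @ v, 0.0)

# A square matrix with orthonormal columns satisfies Q^T Q = I.
# Dividing by sqrt(2) normalizes each column to unit length.
Q = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
is_orthogonal_matrix = np.allclose(Q.T @ Q, np.eye(2))
```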
Jordan Canonical Form
A block-diagonal form of a square matrix built from Jordan blocks, one or more per eigenvalue; the change-of-basis matrix uses generalized eigenvectors when the matrix is not diagonalizable.
Diagonalizable Matrix
An n×n matrix is diagonalizable if it has n linearly independent eigenvectors, so it is similar to a diagonal matrix: ( A = PDP^{-1} ), with D holding the eigenvalues.
Similar Matrices
Two matrices are similar if they represent the same linear transformation under different bases, i.e., ( A = PBP^{-1} ).
Properties of Similar Matrices
Similar matrices have the same characteristic polynomial, eigenvalues, and determinant.
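A worked sketch of diagonalization as a similarity transformation: factor a matrix into its eigenvectors and eigenvalues, rebuild it as ( PDP^{-1} ), and confirm one shared property of similar matrices (the determinant equals the product of the eigenvalues). The example matrix is an arbitrary symmetric 2×2:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: columns of P are eigenvectors, D holds eigenvalues.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# A and D are similar: A = P D P^{-1}.
reconstructed = P @ D @ np.linalg.inv(P)

# Similar matrices share the determinant: det(A) = det(D) = product of eigenvalues.
same_det = np.isclose(np.linalg.det(A), np.prod(eigvals))
```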
Singular Value Decomposition (SVD)
( A = U Σ V^T ), where U and V are orthogonal matrices and Σ is a diagonal matrix containing the singular values.
Singular Values
The non-negative square roots of the eigenvalues of ( A^T A ); they measure how much A stretches space along its principal directions.
U and V in SVD
The columns of U are eigenvectors of ( AA^T ); the columns of V are eigenvectors of ( A^T A ).
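The SVD cards can be checked directly: factor a small matrix, rebuild it from the factors, and confirm that the singular values are the square roots of the eigenvalues of ( A^T A ). The example matrix is chosen for round numbers:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

# Full SVD: U is 3x3, Vt is 2x2, s holds the singular values (descending).
U, s, Vt = np.linalg.svd(A)
Sigma = np.zeros_like(A)
Sigma[:2, :2] = np.diag(s)
reconstructed = U @ Sigma @ Vt

# Singular values = sqrt of eigenvalues of A^T A (eigvalsh returns ascending order).
eigs = np.linalg.eigvalsh(A.T @ A)
singular_values_match = np.allclose(np.sort(s), np.sqrt(eigs))
```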
QR Factorization
A = QR, where Q is an orthogonal matrix and R is an upper triangular matrix.
Q in QR Factorization
Q contains orthonormal columns that span the column space of A.
R in QR Factorization
R is an upper triangular matrix whose entries are the projection coefficients produced by the Gram-Schmidt process, so that A = QR.
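The QR cards can likewise be verified numerically: compute the factorization, then check that Q has orthonormal columns, that R is upper triangular, and that the product recovers A. NumPy's `qr` returns the reduced factorization by default:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Reduced QR: Q is 3x2 with orthonormal columns, R is 2x2 upper triangular.
Q, R = np.linalg.qr(A)

has_orthonormal_columns = np.allclose(Q.T @ Q, np.eye(2))
is_upper_triangular = np.allclose(R, np.triu(R))
reconstructed = Q @ R
```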