This flashcard set covers core concepts from linear algebra including matrix inversion, Cramer's rule, vector space properties, eigenvalues, and inner products as detailed in the lecture notes.
Adjoint Matrix (adj(A))
The transpose of the cofactor matrix C, calculated as adj(A) = Cᵀ.
Inverse of a Matrix (A⁻¹)
For a matrix A with det(A) ≠ 0, it is defined as A⁻¹ = (1/det(A))·adj(A).
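The adjugate formula above can be sketched in NumPy. This is an illustrative implementation, not from the lecture notes; the matrix values are made up, and the `adjugate` helper is a hypothetical name.

```python
# Sketch (assumed example, not from the notes): A^-1 = (1/det(A))·adj(A),
# where adj(A) is the transpose of the cofactor matrix.
import numpy as np

def adjugate(A):
    """Transpose of the cofactor matrix of a square matrix A."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # (i, j) minor: delete row i and column j, then take the determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # adj(A) = C^T

A = np.array([[2.0, 1.0], [5.0, 3.0]])  # made-up invertible matrix, det(A) = 1
A_inv = adjugate(A) / np.linalg.det(A)
print(A_inv)
```

The result agrees with `np.linalg.inv(A)`; the adjugate route is mainly of theoretical interest, since computing n² minors is far slower than elimination for large matrices.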
Cramer’s Rule
A technique for solving systems of linear equations using determinants, where each variable is a ratio of determinants: x = Dx/D, y = Dy/D, z = Dz/D. Here D = det(A) (which must be non-zero) and Dx is the determinant of A with its x-column replaced by the constant terms.
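The column-replacement step can be sketched as follows; the 3×3 system is a made-up example (not from the notes) whose solution is (1, 2, 3).

```python
# Sketch: Cramer's rule for Ax = b. For each variable, replace the matching
# column of A with b and take the ratio of determinants, e.g. x = D_x / D.
import numpy as np

A = np.array([[2.0, 1.0, -1.0],
              [1.0, 3.0,  2.0],
              [1.0, 0.0,  1.0]])
b = np.array([1.0, 13.0, 4.0])  # made-up constants; solution is (1, 2, 3)

D = np.linalg.det(A)  # must be non-zero for Cramer's rule to apply
solution = []
for col in range(3):
    Ai = A.copy()
    Ai[:, col] = b                          # replace the col-th column with b
    solution.append(np.linalg.det(Ai) / D)  # x = D_x / D, y = D_y / D, ...

print(solution)
```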
Linearly Independent Vectors
A set of vectors v1,v2,…,vn where the equation k1v1+k2v2+⋯+knvn=0 is satisfied only by the trivial solution k1=k2=⋯=kn=0.
Linearly Dependent Vectors
A set of vectors where at least one vector can be expressed as a linear combination or scalar multiple of the others, or if the set contains the zero vector.
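One common way to test the two definitions above is via rank: n vectors are independent exactly when the matrix with those vectors as columns has rank n. A minimal sketch with made-up vectors (v3 = v1 + v2, so the set is dependent):

```python
# Sketch: testing linear independence by comparing rank to the number of vectors.
import numpy as np

# Made-up example: v3 = v1 + v2, so only rank 2 out of 3 columns.
V = np.column_stack([[1, 0, 2], [0, 1, 1], [1, 1, 3]])
rank = np.linalg.matrix_rank(V)
independent = (rank == V.shape[1])  # full column rank <=> only trivial solution
print(independent)
```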
Basis
A set of linearly independent vectors that span a specific solution space or subspace.
Dimension (dim)
The number of vectors contained within a basis for a given space.
Rank (Rank(A))
The number of non-zero rows present in the reduced row echelon form (R) of a matrix A.
Nullity (Nullity(A))
The value determined by subtracting the rank from the total number of columns in the matrix (Nullity=number of columns−Rank).
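The rank–nullity relationship on the card above can be checked numerically. Note that `np.linalg.matrix_rank` computes rank via the SVD rather than row reduction, but it returns the same number; the matrix is a made-up example.

```python
# Sketch: Nullity(A) = (number of columns) - Rank(A).
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],   # 2x the first row, so it adds no rank
              [1, 0, 1]])
rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank
print(rank, nullity)
```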
Characteristic Equation
The equation defined by det(λI−A)=0, used to determine the eigenvalues of a matrix A.
Eigenvalues (λ)
The scalar values produced by solving the characteristic equation det(λI−A)=0.
Eigenvectors
Non-zero vectors x that satisfy the system (λI−A)x=0 for a given eigenvalue λ.
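The three cards above can be tied together in a short sketch: for a 2×2 matrix the characteristic equation det(λI − A) = 0 expands to λ² − tr(A)λ + det(A) = 0, and each eigenvector x of np.linalg.eig satisfies (λI − A)x = 0. The matrix is a made-up example.

```python
# Sketch: eigenvalues as roots of det(λI − A) = 0, checked against np.linalg.eig.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # made-up matrix with eigenvalues 2 and 5

# Characteristic polynomial of a 2x2 matrix: λ² − tr(A)·λ + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = sorted(np.roots(coeffs))
print(roots)

# Each eigenvector x satisfies (λI − A)x = 0 for its eigenvalue λ
eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose((lam * np.eye(2) - A) @ x, 0)
```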
Diagonalization (A = PDP⁻¹)
A process where a matrix A is decomposed into a matrix of eigenvectors P and a diagonal matrix of eigenvalues D.
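The factorization can be verified directly from the eigen-decomposition; this reuses a made-up 2×2 matrix for illustration.

```python
# Sketch: A = P D P⁻¹, with P's columns the eigenvectors of A and
# D the diagonal matrix of the corresponding eigenvalues.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])         # made-up diagonalizable matrix
eigvals, P = np.linalg.eig(A)      # columns of P are eigenvectors
D = np.diag(eigvals)
reconstructed = P @ D @ np.linalg.inv(P)
print(np.allclose(reconstructed, A))
```

Diagonalization requires P to be invertible, i.e. the eigenvectors must form a basis; not every square matrix is diagonalizable.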
Orthogonality (⟨u,v⟩ = 0)
Two vectors (or matrices) u and v are considered orthogonal with respect to an inner product if ⟨u,v⟩ = 0.
Standard Inner Product on M22
An inner product for 2×2 matrices U and V, defined as ⟨U,V⟩ = tr(UᵀV).
Norm (∥U∥)
The magnitude of a vector or matrix, ∥U∥ = √⟨U,U⟩; for the standard inner product this is the square root of the sum of the squares of the entries.
Distance (d(U,V))
The magnitude of the difference between two vectors, defined as ∥U−V∥.
Weighted Euclidean Inner Product
An inner product in Rn defined with a positive weight for each component: ⟨u,v⟩ = w1u1v1 + w2u2v2 + ⋯ + wnunvn.
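The last four cards can be checked together in one sketch: the trace inner product on M22, the norm and distance it induces, and a weighted Euclidean inner product on R². All matrices, vectors, and weights below are made up for illustration.

```python
# Sketch: trace inner product <U,V> = tr(U^T V) on 2x2 matrices,
# the induced norm ||U|| = sqrt(<U,U>) and distance d(U,V) = ||U - V||,
# and a weighted Euclidean inner product on R^2.
import numpy as np

U = np.array([[1.0, 2.0], [3.0, 4.0]])   # made-up matrices
V = np.array([[0.0, 1.0], [1.0, 0.0]])

inner = np.trace(U.T @ V)                       # <U, V> = tr(U^T V) = 5
norm_U = np.sqrt(np.trace(U.T @ U))             # ||U|| = sqrt(1+4+9+16)
dist = np.sqrt(np.trace((U - V).T @ (U - V)))   # d(U, V) = ||U - V||

# Weighted Euclidean inner product with made-up positive weights w = (2, 3)
u, v, w = np.array([1.0, 2.0]), np.array([3.0, 1.0]), np.array([2.0, 3.0])
weighted = np.sum(w * u * v)                    # w1*u1*v1 + w2*u2*v2

print(inner, norm_U, dist, weighted)
```

Note that tr(UᵀV) equals the sum of entrywise products, so this inner product treats a 2×2 matrix exactly like a vector in R⁴.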