det(AᵀB) =
det(BᵀA)
det(A + B) DOES NOT =
det(A) + det(B) in general
every orthonormal basis is
an orthogonal basis
If B is the RREF of a matrix A and det(B) = 0
then det(A) = 0
Row operations DO change
the determinant of a matrix (a row swap flips the sign, scaling a row scales it; only row replacement leaves it unchanged)
det(2A) =
2ᵐ × det(A)
if m is the number of rows
det(AB) =
det(A)det(B)
if det(A) ≠ 0 we know that
A is invertible,
thus a solution to Ax = b exists and is unique
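The determinant rules on these cards can be checked numerically. A minimal NumPy sketch, using two arbitrary 2×2 matrices chosen here as examples (not from the cards):

```python
import numpy as np

# Arbitrary example matrices (assumption: any invertible 2x2 pair works).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 0.0], [1.0, 3.0]])

# det(A^T B) = det(B^T A)
print(np.isclose(np.linalg.det(A.T @ B), np.linalg.det(B.T @ A)))  # True

# det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # True

# det(2A) = 2^m det(A), where m is the number of rows (here m = 2)
m = A.shape[0]
print(np.isclose(np.linalg.det(2 * A), 2**m * np.linalg.det(A)))  # True

# det(A) != 0, so A is invertible and Ax = b has a unique solution
print(np.linalg.det(A) != 0)  # True
```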
for a set of vectors to span Pn
it must contain n+1 linearly independent vectors, since dim(Pn) = n+1 (pairwise non-proportionality alone is not enough)
for a matrix/set of vectors to be linearly independent
det(A) ≠ 0, where A is the square matrix with the vectors as columns
a set of vectors is a basis for Pn if
they span Pn and are linearly independent
a set of vectors is an orthogonal set if
every dot product between distinct vectors = 0;
nonzero orthogonal vectors are automatically linearly independent
if x→ is NOT in the subspace W, then
x→ − projW x→ ≠ 0
if Av = λv for a nonzero vector v,
then v is an eigenvector with eigenvalue λ
Algebraic multiplicity
multiplicity of the eigenvalue as a root of the characteristic polynomial
geometric multiplicity
dimension of eigenspace (number of linearly independent eigenvectors)
When is a matrix diagonalizable?
when geometric multiplicity = algebraic multiplicity for all eigenvalues:
n linearly independent eigenvectors for nxn matrix
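The diagonalizability test above can be sketched in NumPy by counting linearly independent eigenvectors. The two matrices below are assumed examples: one defective, one symmetric:

```python
import numpy as np

# Defective matrix (assumed example): eigenvalue 1 has algebraic
# multiplicity 2 but geometric multiplicity 1, so NOT diagonalizable.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
_, V1 = np.linalg.eig(A)
# Diagonalizable iff there are n linearly independent eigenvectors,
# i.e. the eigenvector matrix has full rank.
diag_A = np.linalg.matrix_rank(V1) == A.shape[0]
print(diag_A)  # False

# A real symmetric matrix is always diagonalizable.
S = np.array([[1.0, 2.0], [2.0, 1.0]])
_, V2 = np.linalg.eig(S)
diag_S = np.linalg.matrix_rank(V2) == S.shape[0]
print(diag_S)  # True
```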
Every real symmetric matrix
is orthogonally diagonalizable
to check if a matrix is orthogonally diagonalizable,
check if A = Aᵀ (symmetric)
Steps for orthogonal diagonalization:
find eigenvalues
find eigenvectors
normalize eigenvectors
form P with the normalized eigenvectors as columns, D with the eigenvalues on the diagonal (A = PDPᵀ)
Singular value decomp (SVD)
A = UΣVᵀ where U and V are orthogonal, Σ is diagonal with the singular values;
every m×n matrix has one
finding rank from SVD
rank = number of nonzero singular values (the square roots of the eigenvalues of AᵀA, always nonnegative)
Properties of U and V in the SVD of A:
UUᵀ = I and UᵀU = I
VVᵀ = I and VᵀV = I
their columns are orthonormal
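These SVD facts can be verified numerically. A sketch with an assumed rank-1 example matrix (second column is twice the first):

```python
import numpy as np

# Assumed example: 3x2 matrix of rank 1 (column 2 = 2 * column 1).
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

U, s, Vt = np.linalg.svd(A)   # A = U @ diag(s) @ Vt
rank = np.sum(s > 1e-10)      # rank = number of nonzero singular values
print(rank)                   # 1

# U and V are orthogonal: U^T U = I and V^T V = I
print(np.allclose(U.T @ U, np.eye(3)))    # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))  # True

# Singular values are the square roots of the eigenvalues of A^T A
eigs_desc = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
print(np.allclose(s**2, eigs_desc))       # True
```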
two linear systems are equivalent if
they have the same solution set
consistent system
either has one solution or infinitely many solutions
inconsistent system
has no solution
rows represent
equations (m)
columns represent
variables (n)
free variables indicate that
there are many solutions, not a unique solution
If b is in the span of the columns of A, you can also say that
b is in the column space of A
can a 3×2 matrix span R³?
No; it has only 2 columns, so its column space is at most 2-dimensional and cannot be all of R³;
A cannot have a pivot in every row
can a 2×3 matrix span R²?
yes because it can have a pivot in every row
rank(A) is equal to
the number of nonzero rows in its echelon form B
and/or
the number of pivot positions (pivot columns) in B
Rank-Nullity theorem
rank(A) + nullity(A) = number of columns of A
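A quick NumPy check of the theorem, using an assumed 3×4 example whose third row is the sum of the first two:

```python
import numpy as np

# Assumed example: 3x4 matrix, m = 3 equations, n = 4 variables.
A = np.array([[1.0, 0.0, 2.0, 3.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 1.0, 3.0, 4.0]])  # row 3 = row 1 + row 2

rank = np.linalg.matrix_rank(A)      # 2 (row 3 is dependent)
nullity = A.shape[1] - rank          # rank-nullity: nullity = n - rank
print(rank, nullity)                 # 2 2
print(rank + nullity == A.shape[1])  # True
```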
the basis for Col(A) is determined by
the columns of the original A that correspond to pivot columns (leading 1s) in the RREF
the basis for Nul(A) is determined by
solving Ax = 0 using the RREF; write the general solution in parametric vector form
(NulA = the set of all vectors x such that Ax = 0)
ex: the x for a 3×4 matrix will have four variables (x1, x2, x3, x4)
null space, NulA is
the set of all vectors x such that Ax = 0
For Ax = b to have a unique solution
A is invertible
A is row equivalent to I
There is a pivot in every column of A
the columns of A are linearly independent
det(A) ≠ 0
0 is not an eigenvalue of A
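The equivalent conditions above can be spot-checked in NumPy with an assumed invertible 2×2 system:

```python
import numpy as np

# Assumed example: invertible 2x2 system Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])

# det(A) != 0 and 0 is not an eigenvalue -> unique solution exists
print(np.linalg.det(A) != 0)              # True
print(np.all(np.linalg.eigvals(A) != 0))  # True

x = np.linalg.solve(A, b)  # the unique solution
print(np.allclose(A @ x, b))              # True
```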
A has orthonormal columns if
AᵀA = I
A is orthogonally diagonalizable if
A = Aᵀ ; A is symmetric
if A = UΣVᵀ (SVD), what is the size of the original matrix A?
same as the size of Σ