is V a vector space?
check that the sum of any two vectors of V stays in V, that any scalar multiple of a vector of V stays in V, and that V contains the zero vector
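a quick NumPy sketch of the closure checks (the candidate subspace W and the vectors are my own example):

```python
import numpy as np

# Candidate subspace W = {(x, y, z) : x + y + z = 0} -- a plane through the origin.
def in_W(v):
    return np.isclose(v.sum(), 0.0)

u = np.array([1.0, -2.0, 1.0])   # in W
w = np.array([3.0, 0.0, -3.0])   # in W

assert in_W(u + w)        # closed under addition
assert in_W(5.0 * u)      # closed under scalar multiplication
assert in_W(np.zeros(3))  # the zero vector is in W
```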
prove a map is linear
two conditions, mirroring the vector space axioms: T(v) = T(u) + T(w) whenever v = u + w (additivity), and T(a*v) = a*T(v) (compatibility with scalar multiplication)
how many l.i vectors are there?
put the vectors as the columns of a matrix A and solve Ax = 0. The [0] vector is always a solution; the vectors are l.i exactly when it is the only one, i.e. when rank(A) = number of columns (= number of vectors).
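the rank test above, as a NumPy sketch (the vectors are my own example; v3 is built dependent on purpose):

```python
import numpy as np

# Stack the vectors as columns of A; they are linearly independent
# exactly when rank(A) equals the number of columns (vectors).
v1, v2, v3 = [1, 0, 1], [0, 1, 1], [1, 1, 2]   # v3 = v1 + v2
A = np.column_stack([v1, v2, v3])

rank = np.linalg.matrix_rank(A)
print(rank, A.shape[1])    # rank 2 < 3 columns -> the vectors are dependent
```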
rank of a matrix
row reduce the matrix A; the rank is the number of non-zero rows (= number of pivots)
dim of a set of vectors: the number of l.i vectors among them (= rank of the matrix they form)
basis of a subspace
a basis of a subspace is a set of l.i vectors that spans it. If F is spanned by v1, v2, v3 but only v1, v3 are l.i, then the basis is {v1, v3}. Check with null(A) (columns = vectors): if the solution is not only [0], drop dependent columns of A and re-check.
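extracting the pivot columns can be sketched in NumPy (the helper `pivot_columns` and the vectors are my own example, not a library function):

```python
import numpy as np

def pivot_columns(A, tol=1e-10):
    """Indices of columns that increase the rank when added left to right."""
    idx, rank = [], 0
    for j in range(A.shape[1]):
        r = np.linalg.matrix_rank(A[:, :j + 1])
        if r > rank:
            idx.append(j)
            rank = r
    return idx

# columns: v1, v2 = 2*v1 (dependent), v3
A = np.column_stack([[1, 0, 1], [2, 0, 2], [0, 1, 1]])
print(pivot_columns(A))   # [0, 2] -> basis {v1, v3}
```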
coordinates from one basis to another
for U the arrival basis matrix and V the departure basis matrix (columns = basis vectors), A the matrix of the map in canonical coordinates:
Yc = A Xc
Xc = V x(b)
Yc = U y(b)
combining we get —> U y(b) = A V x(b)
y(b) = U^(-1) A V x(b)
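a NumPy sketch of the final formula (the basis matrices U, V and the map A are arbitrary example values):

```python
import numpy as np

V = np.array([[1.0, 1.0], [0.0, 1.0]])   # departure basis (columns)
U = np.array([[2.0, 0.0], [1.0, 1.0]])   # arrival basis (columns)
A = np.array([[1.0, 2.0], [3.0, 4.0]])   # the map in canonical coordinates

x_b = np.array([1.0, 1.0])               # coordinates in the departure basis
y_b = np.linalg.inv(U) @ A @ V @ x_b     # coordinates in the arrival basis

# sanity check: both sides agree back in canonical coordinates
assert np.allclose(U @ y_b, A @ (V @ x_b))
```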
Grassmann formula
dim(F + G) = dim F + dim G - dim(F inter G)
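the formula gives dim(F inter G) without computing the intersection itself; a NumPy sketch (the two planes are my own example):

```python
import numpy as np

F = np.column_stack([[1, 0, 0], [0, 1, 0]])   # the xy-plane
G = np.column_stack([[0, 1, 0], [0, 0, 1]])   # the yz-plane

dim_F = np.linalg.matrix_rank(F)
dim_G = np.linalg.matrix_rank(G)
dim_sum = np.linalg.matrix_rank(np.hstack([F, G]))   # dim(F + G)
dim_inter = dim_F + dim_G - dim_sum                  # Grassmann formula
print(dim_inter)   # 1 -> the intersection is the y-axis
```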
how to determine dim of subspace
the number of vectors in the basis is what defines the dim!!
1: line
2: plane
3: 3d
operations with subspaces
intersection and sum of 2 subspaces
direct sum
the sum F + G is direct (written F ⊕ G) when F inter G = {0}
when are F and G complementary?
F and G are complementary when F + G = E and F inter G = {0}: putting a basis of F and a basis of G together gives vectors that are all linearly independent and span E, so by Grassmann dim(F inter G) = 0
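checking complementarity by stacking the bases, as a NumPy sketch (the plane and line are my own example):

```python
import numpy as np

F = np.column_stack([[1, 0, 0], [0, 1, 0]])   # a plane in R^3
G = np.column_stack([[0, 0, 1]])              # a line not inside that plane

B = np.hstack([F, G])
# full rank 3 -> the stacked vectors are l.i and span E,
# so F + G = E and F inter G = {0}
assert np.linalg.matrix_rank(B) == 3
```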
change vector basis (1 —> 2)
w = B x(b) = D x(d) | the same vector written in two bases (columns of B and D are the basis vectors)
x(d) = D^(-1) B x(b)
this gives the coordinates of w in basis d, which is known
diagonalization
if the matrix of f in basis B = (v1, v2, …, vn) is diagonal
then f(vi) = lambda(i)*vi: the basis vectors are eigenvectors
orthogonal complement
F⊥ is the orthogonal complement of F when F inter F⊥ = {0} and F + F⊥ = E
Nullspace(F) =
the set of vectors x such that Fx = 0
equivalently, all v with f(v) = 0, for the map f represented by F
Image of F
the set of all vectors Fx, i.e. the span of the columns of F (its dim = rank F)
dim of null(f)
dim E (the departure vector space, = number of columns of A) - rank A
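the rank-nullity count as a NumPy sketch (the rank-1 matrix is my own example):

```python
import numpy as np

# dim null(A) = (number of columns of A) - rank(A)
A = np.array([[1, 2, 3],
              [2, 4, 6]])        # second row = 2 * first row, so rank 1

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
print(n - rank)                  # 2 = dimension of the nullspace
```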
endomorphism
a function f: E —> H with E = H
endomorphisms are represented by square matrices A: f(x) = A x
composing n times: f^n(x) = A^n x
diagonalizing a matrix A:
define the characteristic polynomial of A: det(A - lambda*Id)
find roots (= eigenvalues)
for each eigenvalue, find the corresponding eigenvector with null(A-lambda*I) for specific value of lambda
diagonal matrix
D is the diagonal matrix [(lambda 1, 0), (0, lambda 2)]
we diagonalize A by finding V (columns = eigenvectors) and D:
A= VDV^-1
D= V^-1 AV
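both identities can be checked with NumPy's eigendecomposition (the matrix A is my own example, chosen diagonalizable):

```python
import numpy as np

# np.linalg.eig returns the eigenvalues and a matrix V whose columns
# are the corresponding eigenvectors, so D = V^-1 A V is diagonal.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, V = np.linalg.eig(A)
D = np.linalg.inv(V) @ A @ V
assert np.allclose(D, np.diag(eigvals))                          # D = V^-1 A V
assert np.allclose(A, V @ np.diag(eigvals) @ np.linalg.inv(V))   # A = V D V^-1
```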
orthonormal bases
a basis is orthonormal if every v has unit length 1
and the vectors are pairwise orthogonal (scalar product = 0); orthogonal vectors are automatically l.i, but l.i vectors need not be orthogonal
check if Q is orthogonal
Q*Qt= Id.
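the Q Qᵀ = Id check as a NumPy sketch (a rotation matrix is used as the example, since rotations are orthogonal):

```python
import numpy as np

# Columns of a rotation matrix are orthonormal, so Q @ Q.T == Id.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q @ Q.T, np.eye(2))
```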
isotropy principle
let U = a1v1 + … + anvn
and W = b1v1 + … + bnvn, in an orthonormal basis v1, …, vn
then the scalar product <U, W> = a1b1 + … + anbn
orthogonal projection for dim 1
projection of v onto the line spanned by u: (<u,v> / ||u||²) * u
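the dim-1 formula as a NumPy sketch (the helper `proj_onto` and the vectors are my own example):

```python
import numpy as np

def proj_onto(u, v):
    """Orthogonal projection of v onto the line spanned by u."""
    return (np.dot(u, v) / np.dot(u, u)) * u   # (<u,v> / ||u||^2) * u

u = np.array([1.0, 1.0])
v = np.array([3.0, 1.0])
p = proj_onto(u, v)
print(p)                                   # [2. 2.]
assert np.isclose(np.dot(v - p, u), 0.0)   # the residual is perpendicular to u
```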
orthogonal projection (dim 1) onto the perpendicular subspace
pi_G⊥(v) = v - pi_G(v)
orthogonal projections in general
solve the normal equations: At v = At A alpha
the projection is then alpha1 * v1 + alpha2 * v2 + …
where the vi are the generating vectors of the subspace (the columns of A)
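the normal equations in NumPy (the subspace generators and the vector v are my own example):

```python
import numpy as np

# Normal equations: A.T @ v = A.T @ A @ alpha; the projection of v
# onto the column space of A is then A @ alpha = alpha_1*v1 + alpha_2*v2.
A = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])   # generators v1, v2
v = np.array([1.0, 2.0, 0.0])

alpha = np.linalg.solve(A.T @ A, A.T @ v)
p = A @ alpha

assert np.allclose(A.T @ (v - p), np.zeros(2))   # residual is orthogonal to the subspace
```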