Dot Product
A mathematical operation that takes two equal-length sequences of numbers (usually coordinate vectors) and returns a single number. It is computed as the sum of the products of corresponding entries: $u \cdot v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$.
Vector Theorems
Assume $u, v, w \in \mathbb{R}^n$ and $c$ is a scalar. Then:
$u \cdot v = v \cdot u$
$(u + v) \cdot w = (u \cdot w) + (v \cdot w)$
$(cu) \cdot v = c(u \cdot v) = u \cdot (cv)$
$u \cdot u \ge 0$, and $u \cdot u = 0$ if and only if $u = 0$
$(c_1 u_1 + \cdots + c_n u_n) \cdot v = c_1(u_1 \cdot v) + \cdots + c_n(u_n \cdot v)$
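As a quick numeric check of the definition and these properties, here is a minimal NumPy sketch; the vectors u, v, w and the scalar c are arbitrary illustrative choices:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])
w = np.array([0.0, 5.0, 1.0])
c = 3.0

print(np.dot(u, v))   # 1*4 + 2*(-1) + 3*2 = 8.0, the sum of entrywise products

print(np.dot(u, v) == np.dot(v, u))                               # commutativity
print(np.isclose(np.dot(u + v, w), np.dot(u, w) + np.dot(v, w)))  # distributivity
print(np.isclose(np.dot(c * u, v), c * np.dot(u, v)))             # scalar pull-out
```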
Norm of a vector
The length of a vector $v = [v_1, \dots, v_n]$ in $\mathbb{R}^n$ is defined by $\|v\| = \sqrt{v \cdot v} = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$.
Unit Vector
A vector with a length of 1, often used to indicate direction. It is obtained by dividing a vector by its norm: for any nonzero vector $v$ in $\mathbb{R}^n$, let $u = v/\|v\|$; then $u$ is a unit vector.
Finding a unit-vector basis for $W = \mathrm{span}\{[0, 1, 2]\}$ when the given basis vector is not a unit vector
Solve for the magnitude: $\|v\| = \sqrt{0^2 + 1^2 + 2^2} = \sqrt{5}$, so $u = v/\|v\| = [0, 1/\sqrt{5}, 2/\sqrt{5}]$.
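A minimal NumPy sketch of this normalization, using the vector [0, 1, 2] from the card:

```python
import numpy as np

v = np.array([0.0, 1.0, 2.0])
u = v / np.linalg.norm(v)   # divide v by its norm sqrt(5)

print(u)                    # [0.         0.4472136  0.89442719]
print(np.linalg.norm(u))    # 1.0 (up to round-off): u is a unit vector
```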
Is u the only unit vector that is a basis for W?
No: $-u$ is also a unit vector in $W$, namely $-u = [0, -1/\sqrt{5}, -2/\sqrt{5}]$.
Distance between two vectors
$\mathrm{dist}(u, v) = \|u - v\|$
Orthogonal
Two vectors $u$ and $v$ in $\mathbb{R}^n$ are orthogonal to each other if $u \cdot v = 0$.
Zero vector Orthogonal
The zero vector is orthogonal to every vector: for any $v$, $0 \cdot v = 0$.
Pythagorean Theorem for Vectors
Two vectors $u$ and $v$ are orthogonal if and only if $\|u + v\|^2 = \|u\|^2 + \|v\|^2$.
Vector orthogonal to a subspace
Let $z$ be a vector in $\mathbb{R}^n$ and $W$ a subspace of $\mathbb{R}^n$. We say $z$ is orthogonal to $W$ if $z \cdot w = 0$ for every $w$ in $W$. A vector $x$ is in $W^\perp$ if and only if $x$ is orthogonal to every vector in $W$. $W^\perp$ is itself a subspace of $\mathbb{R}^n$: $0$ is in $W^\perp$, and if $x, y$ are in $W^\perp$, then $x + y$ is in $W^\perp$ because $(x + y) \cdot w = x \cdot w + y \cdot w = 0$.
Orthogonal Complement / $W^\perp$
The set of all vectors that are orthogonal to $W$ is called the orthogonal complement of $W$ and is denoted by $W^\perp$.
Relationship between $(\mathrm{Row}\,A)^\perp$, $(\mathrm{Col}\,A)^\perp$ and $\mathrm{Nul}\,A$
Let $A$ be an $m \times n$ matrix. Then $(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$ and $(\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T$.
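A quick NumPy check of this relationship, using a hand-picked illustrative matrix A and a vector x chosen from its null space:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, -2.0, 1.0])   # A @ x = 0, so x is in Nul(A)

print(A @ x)     # [0. 0.]
print(A[0] @ x)  # 0.0: x is orthogonal to the first row of A
print(A[1] @ x)  # 0.0: x is orthogonal to the second row of A
```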
Finding angles between vectors
Let $\theta$ be the angle between two vectors $u$ and $v$. Then $\cos\theta = \dfrac{u \cdot v}{\|u\|\,\|v\|}$ for $0 \le \theta \le \pi$.
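A short sketch of this computation in NumPy; the vectors are arbitrary examples, and the clip guards against floating-point values slightly outside [-1, 1]:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards against round-off

print(np.degrees(theta))   # 45.0
```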
Orthogonal Set
The set $\{u_1, \dots, u_n\}$ in $\mathbb{R}^n$ is an orthogonal set if $u_i \cdot u_j = 0$ whenever $i \neq j$.
Orthogonal sets and linear independence
Let $S = \{v_1, \dots, v_n\}$ be an orthogonal set of nonzero vectors in $\mathbb{R}^n$. Then $S$ is a linearly independent set, and $S$ is a basis for $V = \mathrm{span}\,S$.
orthogonal basis
The set $S$ is called an orthogonal basis for $W$ if
$S$ is a basis for $W$, and
$S$ is an orthogonal set.
An orthogonal basis makes calculations easier.
Theorem for Linear Combinations
Let $\{u_1, \dots, u_n\}$ be an orthogonal basis of the subspace $W$. Let $y$ be an element of $W$ with $y = c_1 u_1 + \cdots + c_n u_n$. Then $c_i = \dfrac{y \cdot u_i}{u_i \cdot u_i}$.
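A minimal sketch of computing these weights for an illustrative orthogonal basis of $\mathbb{R}^2$:

```python
import numpy as np

u1 = np.array([1.0, 1.0])
u2 = np.array([1.0, -1.0])   # u1 . u2 = 0: an orthogonal basis of R^2
y = np.array([3.0, 5.0])

c1 = np.dot(y, u1) / np.dot(u1, u1)   # 8/2 = 4.0
c2 = np.dot(y, u2) / np.dot(u2, u2)   # -2/2 = -1.0

print(c1 * u1 + c2 * u2)   # [3. 5.], which recovers y
```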
Checking whether a set is an orthogonal basis
Check that all vectors are pairwise orthogonal (and nonzero); this means the set is linearly independent, which implies it is an orthogonal basis for $\mathbb{R}^n$ when it contains $n$ vectors.
Orthogonal projection
Given a nonzero vector $u$ in $\mathbb{R}^n$ and any vector $y$ in $\mathbb{R}^n$, we can decompose $y$ as the sum of two vectors: one along the direction of $u$, denoted $\hat{y}$ and called the projection of $y$ onto $u$, and one orthogonal to $u$, denoted $z$. The norm of $z$ is the distance from $y$ to the subspace spanned by $u$.
orthogonal projection equation
$y = \hat{y} + z$, where $z = y - \hat{y}$
$z$ is orthogonal to $u$; $\hat{y}$ is parallel to $u$
$\hat{y} = \dfrac{y \cdot u}{u \cdot u}\,u$
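A minimal NumPy sketch of this decomposition; y and u are arbitrary examples:

```python
import numpy as np

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

y_hat = (np.dot(y, u) / np.dot(u, u)) * u   # projection of y onto u
z = y - y_hat                               # component orthogonal to u

print(y_hat)         # [8. 4.]
print(z)             # [-1.  2.]
print(np.dot(z, u))  # 0.0: z is orthogonal to u
```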
orthonormal set
A set $S = \{u_1, \dots, u_t\}$ is called an orthonormal set if it is an orthogonal set and each vector in $S$ is a unit vector.
finding orthonormal basis
Find an orthogonal basis, then divide each vector by its norm $\|v\|$.
orthogonal matrix
An orthogonal matrix is a square invertible matrix $U$ such that $U^{-1} = U^T$; the columns of $U$ form an orthonormal basis for $\mathrm{Col}\,U$.
Orthogonal decomposition
Let $W$ be a subspace of $\mathbb{R}^n$. Then each $y$ in $\mathbb{R}^n$ can be written uniquely in the form $y = \hat{y} + z$, where $\hat{y}$ is in $W$ and $z$ is in $W^\perp$. If $\{u_1, \dots, u_t\}$ is any orthogonal basis of $W$, then $\hat{y} = \dfrac{y \cdot u_1}{u_1 \cdot u_1} u_1 + \cdots + \dfrac{y \cdot u_t}{u_t \cdot u_t} u_t$.
Finding the orthogonal decomposition of $y$ with respect to $W = \mathrm{span}\{u_1, u_2\}$
Check that $u_1$ and $u_2$ are orthogonal; this means $\{u_1, u_2\}$ is a linearly independent set, so it is an orthogonal basis for $W$. Then compute $\hat{y} = \dfrac{y \cdot u_1}{u_1 \cdot u_1} u_1 + \dfrac{y \cdot u_2}{u_2 \cdot u_2} u_2$ and solve for $z = y - \hat{y}$.
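A sketch of this procedure in NumPy, with an illustrative orthogonal basis $\{u_1, u_2\}$ for a plane $W$ in $\mathbb{R}^3$:

```python
import numpy as np

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

print(np.dot(u1, u2))   # 0.0, so {u1, u2} is an orthogonal basis for W

y_hat = (np.dot(y, u1) / np.dot(u1, u1)) * u1 \
      + (np.dot(y, u2) / np.dot(u2, u2)) * u2   # projection of y onto W
z = y - y_hat                                   # component in W-perp

print(y_hat)                          # [-0.4  2.   0.2]
print(np.dot(z, u1), np.dot(z, u2))   # both 0.0 (up to round-off)
```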
Theorem: Best Approximation
Let $W$ be a subspace of $\mathbb{R}^n$, $y$ any vector in $\mathbb{R}^n$, and $\hat{y}$ the orthogonal projection of $y$ onto $W$. Then $\hat{y}$ is the closest point in $W$ to $y$, in the sense that $\|y - \hat{y}\| < \|y - v\|$ for all $v$ in $W$ distinct from $\hat{y}$.
Shortest distance from y to W
$\|z\| = \|y - \hat{y}\|$
Another equation for $\hat{y}$
If the columns of $U$ form an orthonormal basis for $W$, then $\hat{y} = U U^T y$.
Gram-Schmidt Process
The Gram-Schmidt process is a simple algorithm for producing an orthogonal or orthonormal basis for any nonzero subspace of $\mathbb{R}^n$.
How to handle a basis that is not orthogonal
Let $L = \mathrm{span}\{x_1\}$ and let $\hat{x}_2 = \mathrm{proj}_L x_2 = \dfrac{x_2 \cdot x_1}{x_1 \cdot x_1} x_1$. Then $v_2 = x_2 - \hat{x}_2$ is orthogonal to $x_1$.
Theorem: Gram-Schmidt Process
Given a basis $\{x_1, \dots, x_t\}$ for a subspace $W$ of $\mathbb{R}^n$, define
$v_1 = x_1$
$v_2 = x_2 - \dfrac{x_2 \cdot v_1}{v_1 \cdot v_1} v_1$
$v_3 = x_3 - \dfrac{x_3 \cdot v_1}{v_1 \cdot v_1} v_1 - \dfrac{x_3 \cdot v_2}{v_2 \cdot v_2} v_2$
$\vdots$
$v_t = x_t - \dfrac{x_t \cdot v_1}{v_1 \cdot v_1} v_1 - \cdots - \dfrac{x_t \cdot v_{t-1}}{v_{t-1} \cdot v_{t-1}} v_{t-1}$
Then $\{v_1, \dots, v_t\}$ is an orthogonal basis of $W$.
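A compact Python sketch of the process exactly as stated; the input basis is an arbitrary example:

```python
import numpy as np

def gram_schmidt(xs):
    """Return an orthogonal basis spanning the same subspace as xs."""
    vs = []
    for x in xs:
        v = x.astype(float)
        for u in vs:
            v = v - (np.dot(x, u) / np.dot(u, u)) * u  # subtract projection of x onto each earlier v
        vs.append(v)
    return vs

x1 = np.array([1.0, 1.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0])
v1, v2 = gram_schmidt([x1, x2])

print(v1)              # [1. 1. 1.]
print(v2)              # approximately [-0.667  0.333  0.333]
print(np.dot(v1, v2))  # 0.0 (up to round-off): orthogonal
```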
Theorem: QR Factorization
If $A$ is an $m \times n$ matrix with linearly independent columns, then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix whose columns form an orthonormal basis for $\mathrm{Col}\,A$ and $R$ is an $n \times n$ upper triangular invertible matrix with positive entries on its diagonal. Let $A = [x_1 \cdots x_n]$. If $x_1, \dots, x_n$ are linearly independent, then $\mathrm{Col}\,A = \mathrm{span}\{x_1, \dots, x_n\}$.
Applying the Gram-Schmidt process to obtain the QR factorization
We may apply the Gram-Schmidt process to construct an orthogonal basis $v_1, \dots, v_n$ for $\mathrm{Col}\,A$ and normalize it to get an orthonormal basis $\{u_1, \dots, u_n\}$ for $\mathrm{Col}\,A$. Then let $Q = [u_1 \cdots u_n]$. Since $Q^T Q = I$, we have $R = Q^T A$ and $A = QR$.
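In practice, numpy.linalg.qr computes this factorization directly; a quick sketch with an illustrative A follows. Note that NumPy's sign convention may produce negative diagonal entries in R; negating the matching column of Q and row of R recovers the positive-diagonal form.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

Q, R = np.linalg.qr(A)   # reduced QR: Q is 3x2 with orthonormal columns, R is 2x2

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q^T Q = I
print(np.allclose(Q @ R, A))            # True: A = QR
```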
Least Squares Problem
$Ax = b$: if $A$ is an $m \times n$ matrix and $b$ is in $\mathbb{R}^m$, a least squares solution of $Ax = b$ is an $\hat{x}$ in $\mathbb{R}^n$ such that $\|b - A\hat{x}\| \le \|b - Ax\|$ for all $x$ in $\mathbb{R}^n$.
If $Ax = b$ is consistent, then a solution of $Ax = b$ is a least squares solution.
If $Ax = b$ is inconsistent, then we need to find $\hat{x}$ such that $\|b - A\hat{x}\| = \min_{x \in \mathbb{R}^n} \|b - Ax\|$.
Least Squares solution
$A^T A x = A^T b$
The set of least squares solutions of $Ax = b$ coincides with the nonempty set of solutions of the normal equations $A^T A x = A^T b$.
Find the least squares solution
Check whether $Ax = b$ is consistent first; if it is inconsistent, solve the normal equations $A^T A \hat{x} = A^T b$.
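A sketch of solving the normal equations in NumPy with illustrative A and b (np.linalg.lstsq solves the same problem directly):

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

# Ax = b is inconsistent here, so solve the normal equations A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)   # [1. 2.]
```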
Is the least squares solution unique?
Not necessarily. $Ax = b$ has a unique least squares solution exactly when $A^T A$ is invertible.
Theorem
The following statements are equivalent:
$Ax = b$ has a unique least squares solution for each $b$ in $\mathbb{R}^m$
The columns of $A$ are linearly independent
$A^T A$ is invertible; in that case, $\hat{x} = (A^T A)^{-1} A^T b$
Least squares error for Ax=b
When a least squares solution $\hat{x}$ is used to produce $A\hat{x}$ as an approximation to $b$, the distance from $b$ to $A\hat{x}$, namely $\|b - A\hat{x}\|$, is called the least squares error of the approximation.
Alternative calculation
A least squares solution satisfies $A\hat{x} = \hat{b} = \mathrm{proj}_{\mathrm{Col}\,A}\,b$; if we find $\hat{b}$, then we may solve $Ax = \hat{b}$.
If $A$ is an $m \times n$ matrix with linearly independent columns, then $A$ can be written as $A = QR$, and the unique least squares solution of $Ax = b$ is given by $\hat{x} = R^{-1} Q^T b$.
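A sketch of this QR-based solve, reusing the same illustrative A and b as in the normal-equations example above:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

Q, R = np.linalg.qr(A)
x_hat = np.linalg.solve(R, Q.T @ b)   # x_hat = R^{-1} Q^T b

print(x_hat)   # [1. 2.], matching the normal-equations solution
```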