Section 6: Vectors

1
New cards

Dot Product

A mathematical operation that takes two equal-length sequences of numbers (usually coordinate vectors) and returns a single number. It is computed as the sum of the products of corresponding entries: u·v = u1v1 + u2v2 + ... + unvn
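
For instance, a minimal numpy sketch of this definition (the example vectors are made up):

    import numpy as np

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0, 6.0])

    # Sum of products of corresponding entries: 1*4 + 2*5 + 3*6 = 32
    print(np.dot(u, v))                      # 32.0
    print(sum(a * b for a, b in zip(u, v)))  # same value, computed entry by entry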

2
New cards

Vector Theorems

Assume u, v, w are in R^n and c is a scalar. Then:

  1. u·v = v·u

  2. (u+v)·w = (u·w) + (v·w)

  3. (cu)·v = c(u·v) = u·(cv)

  4. u·u >= 0, and u·u = 0 if and only if u = 0

  5. (c1u1 + ... + cnun)·v = c1(u1·v) + ... + cn(un·v)
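
A quick numerical spot-check of properties 1-4 on made-up vectors (a sketch, not a proof):

    import numpy as np

    u, v, w = np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.0, 5.0])
    c = 2.5

    assert np.isclose(u @ v, v @ u)                # 1. commutativity
    assert np.isclose((u + v) @ w, u @ w + v @ w)  # 2. distributivity
    assert np.isclose((c * u) @ v, c * (u @ v))    # 3. scalars factor out
    assert u @ u >= 0                              # 4. u·u is nonnegative
    print("all properties hold on this example")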

3
New cards

Norm of a vector

The length of a vector v = [v1, ..., vn] in R^n is defined by ||v|| = sqrt(v·v) = sqrt(v1² + v2² + ... + vn²)

4
New cards

Unit Vector

A vector with a length of 1, often used to indicate direction. It is obtained by dividing a vector by its norm: for any nonzero vector v in R^n, let u = v/||v||; then u is a unit vector.
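
A minimal numpy sketch of computing a norm and normalizing, using v = [0, 1, 2] (the example from the next card):

    import numpy as np

    v = np.array([0.0, 1.0, 2.0])

    norm = np.linalg.norm(v)        # sqrt(0² + 1² + 2²) = sqrt(5)
    u = v / norm                    # unit vector in the direction of v
    print(norm, np.linalg.norm(u))  # 2.236..., 1.0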

5
New cards

Finding a unit vector basis for W when the given basis vector is not a unit vector, W = span{[0,1,2]}

Solve for the magnitude: ||v|| = sqrt(0² + 1² + 2²) = sqrt(5), so v/||v|| = [0, 1/sqrt(5), 2/sqrt(5)]

6
New cards

Is u the only unit vector that is a basis for W?

No, because -u is also a unit vector in W: -u = [0, -1/sqrt(5), -2/sqrt(5)]

7
New cards

Distance between two vectors

dist(u, v) = ||u - v||

8
New cards

Orthogonal

Two vectors u and v in R^n are orthogonal to each other if u·v = 0

9
New cards

Zero vector Orthogonal

The zero vector is orthogonal to every vector: for any v, 0·v = 0

10
New cards

Pythagorean Theorem for Vectors

Two vectors u and v are orthogonal if and only if ||u+v||² = ||u||² + ||v||²

11
New cards

Vector orthogonal to a subspace

Let z be a vector in R^n and W a subspace of R^n. We say z is orthogonal to W if z·w = 0 for every w in W. A vector x is in W^⊥ if and only if x is orthogonal to every vector in W. W^⊥ is a subspace of R^n: 0 is in W^⊥, and if x, y are in W^⊥ then x+y is in W^⊥ because (x+y)·w = x·w + y·w = 0 (and similarly (cx)·w = c(x·w) = 0).

12
New cards

Orthogonal Complement / W^⊥

The set of all vectors that are orthogonal to W is called the orthogonal complement of W and is denoted by W^⊥

13
New cards

Relationship between Row(A)^⊥, Col(A)^⊥ and Null(A)

Let A be an m×n matrix. Then Row(A)^⊥ = Null(A) and Col(A)^⊥ = Null(A^T)
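
A small numerical illustration of Row(A)^⊥ = Null(A); the matrix A and the null-space vector x are hand-picked for this sketch:

    import numpy as np

    A = np.array([[1.0, 0.0, -1.0],
                  [0.0, 1.0,  2.0]])
    x = np.array([1.0, -2.0, 1.0])  # a vector in Null(A)

    print(A @ x)                   # [0. 0.]  -> x is in Null(A)
    print([row @ x for row in A])  # [0.0, 0.0] -> x is orthogonal to each row of A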

14
New cards

Finding angles between vectors

Let theta be the angle between two vectors u and v. Then cos(theta) = (u·v)/(||u|| ||v||) for 0 <= theta <= pi
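
A minimal numpy sketch of the angle formula (example vectors made up; the clip guards against floating-point values slightly outside [-1, 1]):

    import numpy as np

    u = np.array([1.0, 0.0])
    v = np.array([1.0, 1.0])

    cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    print(np.degrees(theta))  # 45.0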

15
New cards

Orthogonal Set

The set {u1, ..., un} in R^n is an orthogonal set if ui·uj = 0 whenever i ≠ j

16
New cards

Orthogonal sets and linear independence

Let S = {v1, ..., vn} be an orthogonal set of nonzero vectors in R^n. Then S is a linearly independent set, and S is a basis for V = span{S}.

17
New cards

orthogonal basis

The set S is called an orthogonal basis for W if

  1. S is a basis for W

  2. S is an orthogonal set

An orthogonal basis makes calculations easier.

18
New cards

Theorem for linear combinations

Let {u1, ..., un} be an orthogonal basis of the subspace W, and let y be an element of W with y = c1u1 + ... + cnun. Then ci = (y·ui)/(ui·ui)
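
A sketch of the coefficient formula on a made-up orthogonal basis of R^2:

    import numpy as np

    u1 = np.array([1.0, 1.0])
    u2 = np.array([1.0, -1.0])  # u1 · u2 = 0, so {u1, u2} is an orthogonal basis
    y = np.array([3.0, 5.0])

    c1 = (y @ u1) / (u1 @ u1)   # 8/2 = 4
    c2 = (y @ u2) / (u2 @ u2)   # -2/2 = -1
    print(c1 * u1 + c2 * u2)    # [3. 5.] = y, as the theorem predicts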

19
New cards

Verifying an orthogonal basis

Check that all vectors are pairwise orthogonal; this means the set is linearly independent, which implies it is an orthogonal basis for R^n (when it has n vectors).

20
New cards

Orthogonal projection

Given a nonzero vector u in R^n and any vector y in R^n, we can decompose y as the sum of two vectors: one along the direction of u, denoted yhat, which is called the projection of y onto u; the other orthogonal to u, denoted z. The norm of z is the distance from y to the subspace spanned by u.

21
New cards

orthogonal projection equation

y = yhat + z where z = y - yhat

z is orthogonal to u, yhat is parallel to u

yhat = ((y·u)/(u·u)) u
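
A minimal numpy sketch of this decomposition (example vectors made up):

    import numpy as np

    y = np.array([7.0, 6.0])
    u = np.array([4.0, 2.0])

    y_hat = ((y @ u) / (u @ u)) * u  # projection of y onto u
    z = y - y_hat                    # component orthogonal to u
    print(y_hat, z, z @ u)           # z @ u is 0 (up to rounding)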

22
New cards

orthonormal set

A set S = {u1, ..., ut} is called an orthonormal set if it is an orthogonal set and each vector in S is a unit vector.

23
New cards

finding orthonormal basis

Find an orthogonal set, then divide each vector by its norm ||v||.

24
New cards

orthogonal matrix

An orthogonal matrix is a square invertible matrix U such that U^-1 = U^T; the columns of U form an orthonormal basis for Col(U).

25
New cards

Orthogonal decomposition

Let W be a subspace of R^n. Then each y in R^n can be written uniquely in the form y = yhat + z, where yhat is in W and z is in W^⊥. If {u1, ..., ut} is any orthogonal basis of W, then yhat = ((y·u1)/(u1·u1)) u1 + ... + ((y·ut)/(ut·ut)) ut

26
New cards

Find orthogonal decomposition of two vectors

Check that u1 and u2 are orthogonal; this means {u1, u2} is a linearly independent set, so it is an orthogonal basis for W. Solve for yhat: yhat = ((y·u1)/(u1·u1)) u1 + ((y·u2)/(u2·u2)) u2, then solve for z, where z = y - yhat.
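
A sketch of this procedure on a made-up orthogonal basis {u1, u2} of a plane in R^3:

    import numpy as np

    u1 = np.array([2.0, 5.0, -1.0])
    u2 = np.array([-2.0, 1.0, 1.0])
    assert np.isclose(u1 @ u2, 0.0)  # confirm orthogonality first

    y = np.array([1.0, 2.0, 3.0])
    y_hat = ((y @ u1) / (u1 @ u1)) * u1 + ((y @ u2) / (u2 @ u2)) * u2
    z = y - y_hat                    # z is in W-perp
    print(y_hat, z)
    print(z @ u1, z @ u2)            # both ~0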

27
New cards

Theorem Best Approximation

Let W be a subspace of R^n, y any vector in R^n, and yhat the orthogonal projection of y onto W. Then yhat is the closest point in W to y, in the sense that ||y - yhat|| < ||y - v|| for all v in W distinct from yhat.

28
New cards

Shortest distance from y to W

||z|| = ||y-yhat||

29
New cards

another equation for yhat

If the columns of U form an orthonormal basis for W, then yhat = UU^T y

30
New cards

Gram-Schmidt Process

The Gram-Schmidt process is a simple algorithm for producing an orthogonal or orthonormal basis for any nonzero subspace of R^n.

31
New cards

How to fix a basis that is not orthogonal

Let L = span{x1} and let x2hat = proj_L x2 = ((x2·x1)/(x1·x1)) x1. Then x2 - x2hat is orthogonal to x1.

32
New cards

Theorem: Gram-Schmidt process

Given a basis {x1, ..., xt} for a subspace W of R^n, define v1 = x1, v2 = x2 - ((x2·v1)/(v1·v1)) v1, v3 = x3 - ((x3·v1)/(v1·v1)) v1 - ((x3·v2)/(v2·v2)) v2, ..., vt = xt - ((xt·v1)/(v1·v1)) v1 - ... - ((xt·vt-1)/(vt-1·vt-1)) vt-1. Then {v1, ..., vt} is an orthogonal basis of W.
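
A minimal implementation sketch of the process as stated (classical Gram-Schmidt; the input matrix is a made-up example whose columns are assumed linearly independent):

    import numpy as np

    def gram_schmidt(X):
        """Orthogonalize the columns of X (assumed linearly independent)."""
        V = []
        for x in X.T:                             # iterate over columns of X
            v = x.astype(float)
            for u in V:
                v = v - ((x @ u) / (u @ u)) * u   # subtract projection onto each earlier v
            V.append(v)
        return np.column_stack(V)

    X = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [1.0, 1.0, 1.0]])
    V = gram_schmidt(X)
    print(np.round(V.T @ V, 10))  # off-diagonal entries are 0: columns are orthogonal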

33
New cards

Theorem QR Factorization

If A is an m×n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m×n matrix whose columns form an orthonormal basis for Col(A) and R is an n×n upper triangular invertible matrix with positive entries on its diagonal. Let A = [x1 ... xn]; if x1, ..., xn are linearly independent, then Col(A) = span{x1, ..., xn}.

34
New cards

Applying the Gram-Schmidt process to obtain A = QR

We may apply the Gram-Schmidt process to construct an orthogonal basis v1, ..., vn for Col(A), then normalize to get an orthonormal basis {u1, ..., un} for Col(A). Then Q = [u1 ... un]. Since Q^T Q = I, we get Q^T A = Q^T (QR) = R, so A = QR.
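
numpy's built-in QR can be used to check this factorization; note it may flip signs relative to a hand Gram-Schmidt computation (example matrix made up):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    Q, R = np.linalg.qr(A)          # Q has orthonormal columns, R is upper triangular

    print(np.round(Q.T @ Q, 10))    # identity: columns of Q are orthonormal
    print(np.round(Q @ R - A, 10))  # zero matrix: A = QR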

35
New cards

Least Squares Problem

Ax = b. If A is an m×n matrix and b is in R^m, a least-squares solution of Ax = b is an xhat in R^n such that ||b - Axhat|| <= ||b - Ax|| for all x in R^n.

36
New cards

If Ax = b is inconsistent, then xhat satisfies ||b - Axhat|| = min ||b - Ax|| over x in R^n.

37
New cards

If Ax = b is consistent, then the solution of Ax = b is the least-squares solution.

If Ax = b is inconsistent, then we need to find xhat such that ||b - Axhat|| = min ||b - Ax||.

38
New cards

Least Squares solution

A^T A xhat = A^T b (the normal equations)

39
New cards

The set of least-squares solutions of Ax = b coincides with the nonempty set of solutions of the normal equations A^T A x = A^T b.

40
New cards

Find the least squares solution

Check for consistency first; if the system is inconsistent, solve the normal equations A^T A xhat = A^T b.
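
A minimal sketch of solving the normal equations with numpy (A and b are a made-up inconsistent system):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([6.0, 0.0, 0.0])

    # Solve the normal equations A^T A x = A^T b
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)
    print(x_hat)                                 # [ 5. -3.]
    print(np.linalg.lstsq(A, b, rcond=None)[0])  # numpy's solver agrees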

41
New cards

Is the least squares solution unique?

Not always. Ax = b has a unique least-squares solution if and only if A^T A is invertible.

42
New cards

Theorem

The following statements are equivalent:

  1. Ax = b has a unique least-squares solution for each b in R^m

  2. The columns of A are linearly independent

  3. A^T A is invertible, and then xhat = (A^T A)^-1 A^T b

43
New cards

Least squares error for Ax=b

When a least-squares solution xhat is used to produce Axhat as an approximation to b, the distance from b to Axhat, ||b - Axhat||, is called the least-squares error of the approximation.

44
New cards

Alternative calculation

A least-squares solution satisfies Axhat = bhat = proj_Col(A) b; if we find bhat, then we may solve Ax = bhat.

If A is an m×n matrix and the columns of A are linearly independent, then A can be written as A = QR, and the unique least-squares solution of Ax = b is given by xhat = R^-1 Q^T b.
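
A sketch of the QR route on the same made-up system as before; solving Rx = Q^T b avoids forming R^-1 explicitly:

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([6.0, 0.0, 0.0])

    Q, R = np.linalg.qr(A)
    x_hat = np.linalg.solve(R, Q.T @ b)  # xhat = R^-1 Q^T b
    print(x_hat)                         # [ 5. -3.], matching the normal-equations answer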
