6.3 - Matrix Algebra

16 Terms

1
New cards

What is an orthogonal projection of y onto W?

The unique vector y(hat) ∈ W such that y−y(hat) is orthogonal to W and y(hat) is the closest vector in W to y.

Imagine shining a flashlight straight down onto a plane.

The shadow of y that lands on the plane is y(hat).

You’re forcing y onto W in the most direct (perpendicular) way.

2
New cards

Orthogonal Decomposition Theorem statement?

Every vector y can be uniquely written as y = y(hat) + z, where y(hat) ∈ W and z ∈ W(perpendicular)

Every vector y can be split into:

1) The part that lives in the subspace (the shadow).

2) The part that sticks straight out of it.

Like splitting a force into horizontal + vertical components.

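A small numeric sketch of this split (hypothetical numbers, using NumPy; here W is the line spanned by u):

    import numpy as np

    # Hypothetical example: W is the line in R^2 spanned by u.
    y = np.array([3.0, 4.0])
    u = np.array([1.0, 0.0])

    y_hat = (y @ u) / (u @ u) * u   # the part that lives in W (the shadow)
    z = y - y_hat                   # the part that sticks straight out of W
    print(y_hat, z, z @ u)          # [3. 0.] [0. 4.] 0.0 -> z is perpendicular to W
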
3
New cards

Formula for projection when {u1, …, up} is an orthogonal basis of W?

proj_W(y) = y(hat) = (y·u1)/(u1·u1) u1 + … + (y·up)/(up·up) up

You're measuring how much of y points in each direction of the basis.

The dot product asks:

  • How aligned is y with this basis vector?

You scale each basis vector by how strongly y points toward it.

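A minimal sketch of this formula with a made-up orthogonal basis of a plane in R^3 (vectors chosen only for illustration):

    import numpy as np

    # Hypothetical orthogonal basis of a plane W in R^3 (u1 . u2 = 0).
    u1 = np.array([1.0, 1.0, 0.0])
    u2 = np.array([1.0, -1.0, 0.0])
    y  = np.array([2.0, 3.0, 5.0])

    # proj_W(y) = (y.u1)/(u1.u1) u1 + (y.u2)/(u2.u2) u2
    y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
    print(y_hat)   # [2. 3. 0.]

Each coefficient records how strongly y points toward that basis vector, exactly as described above.
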
4
New cards

How do you compute the perpendicular component?

z = y - y(hat)

You’re literally subtracting the shadow from the original vector. What’s left is the part that didn’t land on the subspace.

5
New cards

Why is the decomposition y = y(hat) + z unique?

Because the only vector in both W and W(perpendicular) is 0.

A vector can’t point in a direction and also be perfectly perpendicular to that direction unless it’s the zero vector.

So the split is one of a kind.

6
New cards

If y is already in W, what is proj_W(y)?

proj_W(y) = y

If y is already in the plane / line, its shadow on the plane is just itself.

7
New cards

If y is in W(perpendicular), what is proj_W(y)?

proj_W(y) = 0

If y sticks straight up out of the plane, its shadow is just the origin. It has no component pointing along W.

8
New cards

What does the Best Approximation Theorem say?

y(hat) = proj_W(y) is the closest point in W to y. It minimizes ||y - v|| over all v ∈ W.

Here v is any vector in W.

The perpendicular drop to the subspace is the shortest distance.

Any other point in W is farther away because you’d have to move sideways too.
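
A quick numeric check of the "closest point" claim (hypothetical numbers; W is the xy-plane in R^3):

    import numpy as np

    y     = np.array([1.0, 2.0, 3.0])
    y_hat = np.array([1.0, 2.0, 0.0])    # projection of y onto the xy-plane
    other = np.array([4.0, -1.0, 0.0])   # some other point of W

    print(np.linalg.norm(y - y_hat))     # 3.0
    print(np.linalg.norm(y - other))     # about 5.196 -- farther, as the theorem says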

9
New cards

What is the distance from y to W?

dist(y, W) = ||y - y(hat)|| = ||y - proj_W(y)||

Distance from a point to a subspace = length of the leftover perpendicular piece.

Just like distance from a point to a line is the height of the perpendicular.
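
A short sketch of the distance computation (made-up vectors; W is the xy-plane in R^3):

    import numpy as np

    u1 = np.array([1.0, 0.0, 0.0])
    u2 = np.array([0.0, 1.0, 0.0])
    y  = np.array([1.0, 2.0, 7.0])

    y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
    print(np.linalg.norm(y - y_hat))   # 7.0 = length of the leftover perpendicular piece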

10
New cards

What is the matrix formula for projection when U has orthonormal columns?

U has orthonormal columns = each column of U is orthogonal to every other column and has length 1. (If the columns are only orthogonal but not unit length, normalize them first; the simple formula below needs unit vectors.)

proj_W(y) = U U^T y

U^T y tells you how much y lines up with each basis vector.

Multiplying back by U reconstructs the projected vector.

It’s:

1) measure alignment

2) Rebuild the shadow
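
A minimal sketch of the matrix version (hypothetical U with orthonormal columns spanning the xy-plane in R^3):

    import numpy as np

    U = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    y = np.array([2.0, 3.0, 5.0])

    weights = U.T @ y       # step 1: measure alignment with each column
    y_hat   = U @ weights   # step 2: rebuild the shadow, proj_W(y) = U U^T y
    print(y_hat)            # [2. 3. 0.]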

11
New cards

What does U contain in the projection matrix formula?

Columns of U = orthonormal basis vectors u1, …, up.

Think of U as a stack of directions that span the subspace

The projection asks:

  • How much of y lies in each of these directions?

12
New cards

Why does U U^T y work for projection?

The projection is a linear combination of the columns of U with weights U^T y.

It’s like recording how much of y points in each direction (using U^T),

then rebuilding the shadow from those amounts (using U).

13
New cards

How do you check if your projection is correct?

Verify: (y - y(hat)) · ui = 0 for all basis vectors ui.

The leftover part must stick straight out of the subspace. If not, the projection is wrong.
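
A sketch of this check in code (same hypothetical basis as the earlier examples):

    import numpy as np

    u1 = np.array([1.0, 1.0, 0.0])
    u2 = np.array([1.0, -1.0, 0.0])
    y  = np.array([2.0, 3.0, 5.0])

    y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
    residual = y - y_hat
    print(np.isclose(residual @ u1, 0.0))   # True
    print(np.isclose(residual @ u2, 0.0))   # True -> the leftover sticks straight out of W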

14
New cards

What must be true about the basis vectors to use the simple projection formula?

They must be orthogonal (not just linearly independent).

If the basis vectors point in odd directions, the individual projections overlap and interfere. Orthogonal basis vectors don’t mix with each other.

15
New cards

What changes if your basis isn’t orthogonal?

You must first apply Gram-Schmidt to make an orthogonal basis.

You must straighten out the basis so each direction is cleanly separate. Otherwise the projections won’t isolate clean components.
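
A two-vector Gram-Schmidt sketch (made-up non-orthogonal basis of a plane in R^3):

    import numpy as np

    v1 = np.array([1.0, 1.0, 0.0])
    v2 = np.array([1.0, 0.0, 1.0])   # not orthogonal to v1

    u1 = v1                                    # keep the first vector
    u2 = v2 - (v2 @ u1) / (u1 @ u1) * u1       # subtract the part of v2 along u1
    print(u1 @ u2)                             # 0.0 -> the new basis is orthogonal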

16
New cards

When is the error ||y - v|| minimized?

When v = y(hat), the orthogonal projection of y onto W.