Linear Algebra - Chapter 6


33 Terms

1

What is an inner product in ℝⁿ?

The inner product of vectors u and v is u·v = u₁v₁ + u₂v₂ + … + uₙvₙ.
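
As a quick numeric check, here is a minimal NumPy sketch (the vectors are made up for illustration):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])

# Inner product: u·v = u1*v1 + u2*v2 + u3*v3
print(np.dot(u, v))   # 1*4 + 2*(-5) + 3*6 = 12.0
```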

2

What does it mean for two vectors to be orthogonal?

They are orthogonal if their dot product is zero: u·v = 0.

3

What is the norm / length of a vector v?

‖v‖ = sqrt(v·v) = sqrt(v₁² + v₂² + … + vₙ²).

4

What is the distance between vectors u and v?

distance(u, v) = ‖u − v‖ = sqrt((u₁ − v₁)² + (u₂ − v₂)² + … + (uₙ − vₙ)²).
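
Both the norm and the distance are a single call in NumPy (example vectors made up):

```python
import numpy as np

u = np.array([7.0, 1.0])
v = np.array([4.0, 5.0])

# Norm: ||v|| = sqrt(v·v)
print(np.linalg.norm(v))      # sqrt(16 + 25) ≈ 6.40

# Distance: ||u − v||
print(np.linalg.norm(u - v))  # sqrt((7-4)² + (1-5)²) = 5.0
```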

5

What is an orthogonal set?

A set {v₁, …, vₖ} where vᵢ·vⱼ = 0 for all i ≠ j.

6

What is an orthonormal set?

A set of vectors that are orthogonal and each has norm 1.

7

What is an orthonormal basis?

A basis whose vectors are orthonormal.

8

What is the projection of v onto a non-unit vector u?

projᵤ(v) = [(v·u) / (u·u)]u.
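
For instance, projecting onto a non-unit vector (a small NumPy sketch with made-up vectors):

```python
import numpy as np

u = np.array([2.0, 0.0])   # non-unit direction vector
v = np.array([3.0, 4.0])

# proj_u(v) = [(v·u) / (u·u)] u; the division by u·u handles u not being a unit vector
proj = (np.dot(v, u) / np.dot(u, u)) * u
print(proj)                # [3. 0.]
```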

9

How do you compute proj_W(v) when W has an orthonormal basis {u₁, …, uₖ}?

proj_W(v) = (v·u₁)u₁ + … + (v·uₖ)uₖ: sum the projections of v onto each basis vector (no division by uᵢ·uᵢ is needed, since each uᵢ is a unit vector).
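
A minimal NumPy sketch of that sum, using the standard basis vectors e₁ and e₂ as a (deliberately simple) orthonormal basis for the xy-plane in ℝ³; the vector v is made up:

```python
import numpy as np

u1 = np.array([1.0, 0.0, 0.0])   # orthonormal basis for W
u2 = np.array([0.0, 1.0, 0.0])
v = np.array([2.0, 3.0, 5.0])

# proj_W(v) = (v·u1)u1 + (v·u2)u2; no division needed since ||u1|| = ||u2|| = 1
proj_W = np.dot(v, u1) * u1 + np.dot(v, u2) * u2
print(proj_W)                    # [2. 3. 0.]
```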

10

What is the orthogonal decomposition theorem?

Each y in ℝⁿ can be written uniquely as y = ŷ + z, where ŷ = proj_W(y) lies in the subspace W and z = y − ŷ lies in W⊥.

11

What is the best approximation to y by vectors in a subspace W?

ŷ = proj_W(y), the closest point in W to y: ‖y − ŷ‖ < ‖y − v‖ for every v in W other than ŷ.

12

W⊥ (the orthogonal complement of W)

All vectors orthogonal to every vector in W.

13

What is a least-squares solution to Ax = b?

A vector x̂ that minimizes ‖Ax − b‖.

14

What is the geometric meaning of Ax̂ in least squares?

Ax̂ is the projection of b onto Col(A).
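
As a sanity check on that geometry, here is a short NumPy sketch with made-up A and b: the residual b − Ax̂ comes out orthogonal to every column of A, which is exactly what it means for Ax̂ to be the projection of b onto Col(A).

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Least-squares solution: minimises ||Ax − b||
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

# The residual is orthogonal to Col(A), so Aᵀ(b − Ax̂) ≈ 0
residual = b - A @ x_hat
print(A.T @ residual)   # ~[0. 0.] up to floating-point error
```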

15

What is an orthogonal matrix?

A square matrix whose columns form an orthonormal set: QᵀQ = I, so Q⁻¹ = Qᵀ.

16

What property do orthogonal matrices preserve?

Lengths and angles: ‖Qv‖ = ‖v‖ and (Qu)·(Qv) = u·v.

17

What is the error vector in least squares?

e = b − Ax̂, and e is orthogonal to Col(A).

18

What are the normal equations?

AᵀA x = Aᵀb.

19

What are the steps for solving a least-squares problem using the normal equations?

  1. Compute AᵀA and Aᵀb separately.

  2. Substitute them into AᵀA x̂ = Aᵀb and solve for x̂.

  3. Compute b̂ = Ax̂.

  4. Compute the error e = b − b̂ (a worked sketch follows these steps).
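
To make the four steps concrete, here is a minimal NumPy sketch; the matrix A and vector b are made-up example data, not from the deck:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Step 1: compute AᵀA and Aᵀb separately
AtA = A.T @ A
Atb = A.T @ b

# Step 2: solve AᵀA x̂ = Aᵀb for x̂
x_hat = np.linalg.solve(AtA, Atb)   # [ 5. -3.]

# Step 3: compute the fitted vector b̂ = A x̂
b_hat = A @ x_hat                   # [ 5.  2. -1.]

# Step 4: compute the error e = b − b̂
e = b - b_hat                       # [ 1. -2.  1.]
print(x_hat, b_hat, e)
```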

20

What is a unit vector?

A unit vector is a vector whose length is 1. 

21

What is an orthogonal complement?

The orthogonal complement of a subspace W (within a vector space V) is the set of all vectors in V that are orthogonal to every vector in W.

22

What are the properties of an orthogonal complement?

W⊥ is itself a subspace; a vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans W; and (W⊥)⊥ = W.

23

What is the formula needed to normalise a vector?

(1 / ‖u‖)u, i.e. divide u by its own length.
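
In NumPy terms (made-up vector):

```python
import numpy as np

u = np.array([3.0, 4.0])

# Normalise: multiply u by 1/||u||
unit = u / np.linalg.norm(u)
print(unit)                   # [0.6 0.8]
print(np.linalg.norm(unit))   # 1.0
```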

24

State the Pythagorean theorem for orthogonal vectors.

Two vectors u and v are orthogonal if and only if ‖u + v‖² = ‖u‖² + ‖v‖².

25

State the orthogonal complement properties for a matrix A.

(Row A)⊥ = Nul A and (Col A)⊥ = Nul Aᵀ.

26

What is the application of the Gram-Schmidt process?

To take a set of linearly independent vectors and transform it into an orthogonal (or orthonormal) set that spans the same space.

27

What are the steps of the Gram-Schmidt process?

Set v₁ equal to the first basis vector. For each later vector xᵢ, subtract from xᵢ its projection onto each of the previously constructed vectors v₁, …, vᵢ₋₁. Finally, normalise each vector if an orthonormal set is needed (a sketch follows below).
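
A compact Python sketch of those steps, assuming NumPy and a made-up pair of independent vectors (this version subtracts each projection as it goes, i.e. the modified Gram-Schmidt variant, and normalises only at the end):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn linearly independent vectors into an orthonormal set
    spanning the same space."""
    ortho = []
    for x in vectors:
        v = x.astype(float)
        # Subtract the component of v along each previously built vector
        for u in ortho:
            v -= (np.dot(v, u) / np.dot(u, u)) * u
        ortho.append(v)
    # Normalise at the end to get an orthonormal set
    return [v / np.linalg.norm(v) for v in ortho]

basis = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
q1, q2 = gram_schmidt(basis)
print(np.dot(q1, q2))   # ~0.0: the outputs are orthogonal
```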

28

What do X, β, and y represent in the equation Xβ = y?

X is the design matrix, β is the parameter vector containing β₀ and β₁ (for the least-squares line), and y is the vector of observed values.

29

How do you construct the design matrix X?

Set the first column to all 1's; set the following columns to the x-coordinates of the given data set.
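
For example, with a hypothetical data set of four x-coordinates:

```python
import numpy as np

x = np.array([2.0, 5.0, 7.0, 8.0])   # x-coordinates of the data set

# First column all 1's, second column the x-coordinates
X = np.column_stack([np.ones_like(x), x])
print(X)
# [[1. 2.]
#  [1. 5.]
#  [1. 7.]
#  [1. 8.]]
```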

30

What is the equation for the residual vector e?

e = y − Xβ̂, where β̂ is the least-squares solution.

31

What does the residual vector represent?

The residual vector is the difference between y (the observed vector) and ŷ = Xβ̂, the projection of y onto Col(X); its length ‖e‖ measures how far the observations lie from the fitted values.

32

What is the linear model equation?

y = Xβ + e.

33

What is the linear model normal equation?

XᵀXβ = Xᵀy.
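
Putting the last few cards together, here is a short NumPy sketch that fits a least-squares line to made-up data via the normal equations:

```python
import numpy as np

# Hypothetical data points (xᵢ, yᵢ)
x = np.array([2.0, 5.0, 7.0, 8.0])
y = np.array([1.0, 2.0, 3.0, 3.0])

X = np.column_stack([np.ones_like(x), x])   # design matrix

# Normal equations: XᵀXβ = Xᵀy
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)         # [β₀, β₁]: intercept and slope of y = β₀ + β₁x

e = y - X @ beta    # residual vector e = y − Xβ̂
print(X.T @ e)      # ~[0. 0.]: residuals orthogonal to Col(X)
```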