6.6 Machine Learning and Linear Models


10 Terms

1

If at least two of the data points have different x-values, then the two columns of the design matrix X are linearly independent, meaning the least-squares solution is ?

unique
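
A quick NumPy check of this fact (a sketch; the 2-point design matrices below are made up for illustration):

    import numpy as np

    # Two data points with different x-values: the columns of X are independent
    X_good = np.array([[1.0, 1.0],
                       [1.0, 2.0]])
    print(np.linalg.matrix_rank(X_good))  # 2: full column rank, unique solution

    # All x-values equal: the second column is a multiple of the first
    X_bad = np.array([[1.0, 3.0],
                      [1.0, 3.0]])
    print(np.linalg.matrix_rank(X_bad))   # 1: dependent columns, no unique solution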

2

To find the equation of the least-squares line that best fits given data points, create a design matrix X that holds the coefficients of the equations obtained by plugging in the data points, then create a vector y with the y-coordinates of the data points. Next, solve the formula

β̂ = (XᵀX)⁻¹Xᵀy
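
A minimal NumPy sketch of this recipe (the data values below are made up for illustration):

    import numpy as np

    # Data points (x_i, y_i); illustrative values
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.1, 3.9, 6.2, 7.8])

    # Design matrix: a column of 1s (intercept) and a column of x-values
    X = np.column_stack([np.ones_like(x), x])

    # Normal-equation solution: beta_hat = (X^T X)^{-1} X^T y
    beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
    print(beta_hat)  # [intercept, slope]

    # Same answer from NumPy's built-in least-squares solver
    print(np.linalg.lstsq(X, y, rcond=None)[0])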

3

Let X be the design matrix used to find the least-squares line to fit data (x₁,y₁), …, (xₙ,yₙ). Use a theorem in Section 6.5 to show that the normal equations have a unique solution if and only if the data include at least two data points with different x-coordinates.

If two data points have different x-coordinates, then the two columns of the design matrix X cannot be multiples of each other and hence are linearly independent. By Theorem 14 in Section 6.5, the normal equations have a unique solution. Conversely, if all the x-coordinates are equal, the column of x-values is a multiple of the column of 1s, so the columns are linearly dependent and, by the same theorem, the normal equations do not have a unique solution.
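
Theorem 14 (XᵀX is invertible exactly when the columns of X are linearly independent) can be seen numerically (a sketch with made-up data):

    import numpy as np

    # Distinct x-coordinates: X^T X is invertible
    X = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 5.0]])
    print(np.linalg.det(X.T @ X))          # nonzero

    # All x-coordinates equal: X^T X is singular
    X_dep = np.array([[1.0, 4.0],
                      [1.0, 4.0],
                      [1.0, 4.0]])
    print(np.linalg.det(X_dep.T @ X_dep))  # 0 (up to floating-point error)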

4

Parallelogram law

∥u+v∥² + ∥u−v∥² = 2∥u∥² + 2∥v∥² (expanding each squared norm produces the cross terms +2u·v and −2u·v, which cancel)
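
A numerical spot-check with random vectors (NumPy sketch):

    import numpy as np

    rng = np.random.default_rng(0)
    u = rng.standard_normal(3)
    v = rng.standard_normal(3)

    lhs = np.linalg.norm(u + v)**2 + np.linalg.norm(u - v)**2
    rhs = 2 * np.linalg.norm(u)**2 + 2 * np.linalg.norm(v)**2
    print(np.isclose(lhs, rhs))  # True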

5

Suppose the distance from u to v equals the distance from u to −v. Does it follow that u is orthogonal to v?

Yes. dist(u,v) = dist(u,−v) means ∥u−v∥² = ∥u+v∥². Expanding both sides gives ∥u∥² − 2u·v + ∥v∥² = ∥u∥² + 2u·v + ∥v∥², so 4u·v = 0, i.e. u·v = 0 and u is orthogonal to v.
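
A quick check of both directions (the vectors below are illustrative):

    import numpy as np

    def dists(u, v):
        # distance from u to v, and from u to -v
        return np.linalg.norm(u - v), np.linalg.norm(u + v)

    # Orthogonal pair: the two distances agree
    print(dists(np.array([1.0, 0.0]), np.array([0.0, 2.0])))

    # Non-orthogonal pair: they differ
    print(dists(np.array([1.0, 0.0]), np.array([1.0, 1.0])))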

6

True or False?
(a) For an m × n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.
(b) A square matrix with orthogonal columns is an orthogonal matrix.
(c) If U and V are orthogonal matrices, then UV is an orthogonal matrix.
(d) If the columns of an n × n matrix U are orthonormal, then the linear transformation x ↦ Ux preserves lengths.

a) True

b) False; the columns must be orthonormal, not just orthogonal

c) True; UᵀU = I and VᵀV = I, so (UV)ᵀ(UV) = VᵀUᵀUV = VᵀV = I

d) True
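
Parts (c) and (d) can be spot-checked numerically, for example with 2 × 2 rotation matrices (a sketch; the angles are arbitrary):

    import numpy as np

    def rotation(theta):
        # 2x2 rotation matrices are orthogonal
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    U, V = rotation(0.3), rotation(1.1)

    # (c) the product of orthogonal matrices is orthogonal: (UV)^T (UV) = I
    print(np.allclose((U @ V).T @ (U @ V), np.eye(2)))  # True

    # (d) multiplication by an orthogonal U preserves lengths
    x = np.array([3.0, -4.0])
    print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))  # True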

7

When asked to verify that y − proj_L(y) is a vector orthogonal to the line L, just:

take the dot product of y − proj_L(y) with a vector u that spans L and show that it equals 0
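
A NumPy illustration (the line direction u and the point y below are made up):

    import numpy as np

    u = np.array([2.0, 1.0])        # direction vector spanning the line L
    y = np.array([3.0, 4.0])

    proj = (y @ u) / (u @ u) * u    # projection of y onto L
    residual = y - proj

    print(residual @ u)             # 0: the residual is orthogonal to L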

8

Choose any vector w ∈ L different from the projection of y, and verify that ∥y − proj_L(y)∥ < ∥y − w∥

choose a vector in L such as w = [0 0]ᵀ (the zero vector lies on L when L passes through the origin), then calculate ∥y − proj_L(y)∥ and ∥y − w∥ and compare
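
Continuing the same kind of example with w = 0 (this sketch assumes L passes through the origin, so the zero vector lies on L):

    import numpy as np

    u = np.array([2.0, 1.0])            # direction of L
    y = np.array([3.0, 4.0])
    proj = (y @ u) / (u @ u) * u        # proj_L(y)
    w = np.zeros(2)                     # a point of L other than proj_L(y)

    print(np.linalg.norm(y - proj))     # about 2.236
    print(np.linalg.norm(y - w))        # 5.0 (strictly larger)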

9

Let y ∈ W, where W is a subspace of R³. Find proj_W(y) and proj_W⊥(y). [This is a conceptual question that you can answer without calculations!]

proj_W(y) = y

proj_W⊥(y) = 0
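
A sanity check of this (a sketch: W here is the xy-plane in R³, spanned by two orthonormal columns, and y is constructed to lie in W):

    import numpy as np

    # Orthonormal basis for W, stored as columns
    U = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])

    y = U @ np.array([2.0, -3.0])      # y is in W by construction

    proj_W = U @ (U.T @ y)             # projection onto W (orthonormal columns)
    print(np.allclose(proj_W, y))      # True: proj_W(y) = y
    print(np.allclose(y - proj_W, 0))  # True: proj_W-perp(y) = 0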

10

The formula for projecting a vector y onto a unit vector u is (y · u)u. Show that this can be written as (uuᵀ)y. [Remark: The matrix uuᵀ is a rank-1 projection matrix that projects y onto u.]

Since y · u = uᵀy is a 1 × 1 scalar, (y · u)u = u(uᵀy) = (uuᵀ)y by associativity of matrix multiplication.
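
Both forms computed side by side (u below is an arbitrary unit vector chosen for illustration):

    import numpy as np

    u = np.array([0.6, 0.8])          # a unit vector
    y = np.array([2.0, 5.0])

    form1 = (y @ u) * u               # (y . u) u
    P = np.outer(u, u)                # u u^T, a rank-1 projection matrix
    form2 = P @ y                     # (u u^T) y

    print(np.allclose(form1, form2))  # True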