Linear algebra final


46 Terms

1

How do you know if a square matrix is diagonalizable

guaranteed if it has n linearly independent eigenvectors. this is the case if at least one of the following holds:

  1. it has n distinct eigenvalues, OR

  2. each of its eigenvalues' algebraic multiplicity is equal to its geometric multiplicity, OR

  3. it is symmetric, OR

  4. some combination of these is met (see the numerical check below)
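
Not part of the original deck: a minimal numpy sketch of this check (the helper name and tolerance are my own) that tests for n linearly independent eigenvectors.

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    # A (n x n) is diagonalizable iff it has n linearly independent
    # eigenvectors, i.e. iff the matrix of eigenvectors has full rank.
    eigvals, eigvecs = np.linalg.eig(A)
    return np.linalg.matrix_rank(eigvecs, tol=tol) == A.shape[0]

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 3.0]])))  # True: 2 distinct eigenvalues
print(is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]])))  # False: a shear is defective
```

Floating-point eigenvectors make this a heuristic, not a proof; the rank tolerance is a judgment call.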

2

geometric multiplicity

the dimension of the eigenspace corresponding to an eigenvalue, representing the number of linearly independent eigenvectors associated with that eigenvalue.

3

algebraic multiplicity

the number of times an eigenvalue appears as a root of the characteristic polynomial of a matrix.

4

geometric and algebraic multiplicity relationship

algebraic multiplicity (AM) must be greater than or equal to geometric multiplicity (GM). if AM = GM for an eigenvalue, that eigenvalue is non-defective. If AM = GM for every eigenvalue, the matrix is diagonalizable

5

det of a rotation

ALWAYS 1

6

det of a reflection

ALWAYS -1

7

definition and determinant of a shear

vertical shear by a factor of k: (x,y) goes to (x, y+kx). determinant ALWAYS 1

8

Projection determinant

1 if it is the identity projection (projects onto the entire space)

0 if not (if it collapses its input onto something with a smaller dimension)

9

reflection across L eigenvectors and eigenvalues

values: 1 and -1

vectors: parallel to L for 1, perpendicular to L for -1

10

projection onto L eigenvectors and eigenvalues

values: 0 and 1

vectors: perpendicular to L for 0, parallel to L for 1

11

Rotation by angle x eigenvectors and eigenvalues

IF 0<x<180 (NOT inclusive): NO real eigenvalues or eigenvectors

IF x=180: value -1, every vector is an eigenvector (every vector flips to its negative)

IF x=0: value 1, every vector is an eigenvector (everything stays the same)

12

shear eigenvector and eigenvalue

value: 1 only

vector: the direction in which nothing moves (for a horizontal shear, the eigenvector is (1,0))

13

symmetric matrix

equal to its transpose. always diagonalizable (in fact orthogonally diagonalizable, with real eigenvalues)

14

injective

distinct inputs map to distinct outputs. number of columns equals rank

15

surjective

the range is the entire codomain. number of rows equals rank

16

How can you get to [T]b (transformation matrix T in basis B) from [T]e

If e is not the standard basis:

[T]b = S(e to b) * [T]e * S(b to e)

If e is the standard basis:

[T]b = B^(-1) [T]e B

Note:

the change of basis matrix from standard to B is B^(-1).

The change of basis matrix from B to standard is B
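
A minimal numpy sketch of this card (the matrices are made-up examples), verifying that B^(-1) [T]e B really expresses T in B-coordinates:

```python
import numpy as np

T_e = np.array([[1.0, 1.0],   # T in the standard basis (a horizontal shear)
                [0.0, 1.0]])
B = np.array([[2.0, 1.0],     # basis B, one basis vector per column
              [0.0, 1.0]])

T_b = np.linalg.inv(B) @ T_e @ B   # [T]b = B^(-1) [T]e B

# Check: transform in B-coordinates, convert to standard, compare.
x_b = np.array([1.0, 2.0])    # some vector written in B-coordinates
x_e = B @ x_b                 # the same vector in standard coordinates
assert np.allclose(B @ (T_b @ x_b), T_e @ x_e)
```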

17

what does it mean to have a transformation matrix T in basis B?

Each column of [T]b tells us what linear combination of the vectors of B you need to get each transformed basis vector. For example, the first column of [T]b is the combination of b1, b2, b3, and b4 (in a 4-dimensional example) that you need to get to T(b1)

18

What is the change of basis matrix S from basis A to basis B? (2 ways to say it, say both)

S=B^(-1)A

To get S, we write each vector of A in the coordinates of the new basis, B.

What linear combination of the vectors of B do you need to get to each of the vectors in A?
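
A small numpy sketch of this card with made-up bases (columns of each matrix are the basis vectors):

```python
import numpy as np

A = np.array([[1.0, 1.0],    # basis A, one vector per column
              [0.0, 1.0]])
B = np.array([[2.0, 0.0],    # basis B, one vector per column
              [0.0, 1.0]])

S = np.linalg.inv(B) @ A     # S = B^(-1) A, change of basis from A-coords to B-coords

x_a = np.array([3.0, 4.0])   # coordinates of a vector relative to A
x_b = S @ x_a                # coordinates of the same vector relative to B
assert np.allclose(A @ x_a, B @ x_b)   # both name the same vector in R^2
```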

19

is it invertible?

yes if and only if the determinant is nonzero

20

What is the least squares solution to Ax=b (what does that mean)

least squares solution: when there is no exact solution, we find the x that brings Ax as close as possible to b (minimizing the length of b - Ax)

21

We have a vector v that is NOT in the range (column space) of a matrix A and we are trying to solve Ax=v. Since v isn’t in the column space, there is no solution. What do we do?

Least squares solution: if I am looking for a solution x to Ax=v but v isn't in the column space of A, then I need to find another vector u that IS in the column space of A but that is as close to v as possible. To do this, I orthogonally project v onto the column space of A, and the resulting vector is my new target, u.

Then isolate x from the normal equations (A is the transformation; Ax=u, the vector that is super close to v but not quite; v is still v, the vector we can't quite get to): x = (A^T A)^(-1) (A^T v)

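A minimal numpy sketch of this card (the data is made up): v lies outside the column space of A, and the normal-equations formula recovers the least squares x.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
v = np.array([1.0, 2.0, 0.0])                # not in the column space of A

x_hat = np.linalg.inv(A.T @ A) @ (A.T @ v)   # x = (A^T A)^(-1) A^T v
u = A @ x_hat                                # projection of v onto col(A)

# numpy's built-in least squares solver agrees (and is more stable):
assert np.allclose(x_hat, np.linalg.lstsq(A, v, rcond=None)[0])
```
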
22

Least squares shorter version: we are looking for a solution to Ax=v. v isn't in the column space.

find the orthogonal projection of v onto the column space of A. that gives you u. then solve Ax=u if you want x.

23

that long formula with the transposes and inverses is the formula for projecting onto the column space of a matrix whose columns are not orthogonal. how do you project a vector onto a matrix with orthogonal columns?

project your vector onto each column of the matrix and add the projections up
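
In symbols (my notation, not the deck's): for an orthogonal basis u_1, ..., u_k of the subspace W,

$$\operatorname{proj}_W(v) = \sum_{i=1}^{k} \frac{v \cdot u_i}{u_i \cdot u_i}\, u_i$$

and the denominators are all 1 when the u_i are orthonormal.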

24

What is Q, what is R in QR decomposition

A=QR

Q - a matrix with orthonormal columns that span the column space of A (an orthonormal basis for col A)

R - an upper triangular matrix (so that if you want to get back to A from Q you can)

25

You have an orthogonal matrix. What is its inverse equal to

its transpose

26

Steps for QR factorization

  1. do Gram-Schmidt on A to get an orthonormal basis with the same span, Q

  2. R=Q^(T)*A (R is equal to Q transpose times A, because Q inverse is the same as Q transpose, because Q is orthogonal)

  3. that's all (see the sketch below)
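
A minimal numpy sketch of these steps (numpy's qr uses Householder reflections rather than literal Gram-Schmidt, but the output satisfies the same equations):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])   # made-up matrix with independent columns

Q, R = np.linalg.qr(A)       # step 1: Q has orthonormal columns, same span as A

assert np.allclose(R, Q.T @ A)   # step 2: R = Q^T A, since Q^T undoes Q
assert np.allclose(A, Q @ R)     # and A = QR, so we can get back to A
```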

27

How does QR help with least squares problems

With A = QR, the normal equations A^T A x = A^T v reduce to R x = Q^T v, which is quick to solve by back-substitution because R is triangular.

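A short derivation of that reduction (assuming A has independent columns, so R is invertible): substitute A = QR into the normal equations and use Q^T Q = I.

$$A^{\mathsf{T}}A\hat{x} = A^{\mathsf{T}}v \;\Rightarrow\; R^{\mathsf{T}}Q^{\mathsf{T}}QR\,\hat{x} = R^{\mathsf{T}}Q^{\mathsf{T}}v \;\Rightarrow\; R^{\mathsf{T}}R\,\hat{x} = R^{\mathsf{T}}Q^{\mathsf{T}}v \;\Rightarrow\; R\hat{x} = Q^{\mathsf{T}}v$$
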
28

what does gram schmidt give you

an orthonormal basis that spans the same subspace

29

gram schmidt steps

let q1, q2, ... be the new orthonormal basis and let a1, a2, ... be the old non-orthonormal basis

q1=a1

q2=a2 - projection of a2 onto q1

q3=a3 - projection of a3 onto q1 - projection of a3 onto q2

finally, normalize each q (divide it by its length) so the basis is orthonormal rather than just orthogonal (sketch below)
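
A minimal Python sketch of these steps (this is the "modified" variant, which subtracts projections from the running vector; each vector is normalized as it is produced):

```python
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for a in vectors:
        q = a.astype(float)
        for b in basis:
            q = q - (q @ b) * b              # subtract the projection onto each earlier q
        basis.append(q / np.linalg.norm(q))  # normalize: orthogonal -> orthonormal
    return basis

q1, q2 = gram_schmidt([np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])])
print(round(q1 @ q2, 10))  # ~0.0: the new basis vectors are orthogonal
```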

30

dimension of a subspace plus the dimension of its orthogonal complement?

n (this tracks because any basis for the subspace plus any basis for its orthogonal complement gives you a basis for Rn)

31

what is the case for any diagonalizable matrix if you do a specific change of basis?

any diagonalizable matrix is diagonal in some basis (namely, a basis of its eigenvectors)

32

How do you diagonalize a matrix

Write it as PDP^-1

  1. find all its eigenvalues by setting det(A-lambda I)=0 and solving for all the lambdas

  2. find all its eigenvectors by solving (A-lambda I)(x)=0 for each lambda

  3. P is a matrix whose columns are the eigenvectors. D is a diagonal matrix of the eigenvalues in the same order (see the sketch below)
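
A minimal numpy sketch of these steps (made-up matrix; np.linalg.eig performs steps 1 and 2 in one call):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2

eigvals, P = np.linalg.eig(A)       # steps 1-2: eigenvalues and eigenvectors
D = np.diag(eigvals)                # step 3: eigenvalues on the diagonal, same order

assert np.allclose(A, P @ D @ np.linalg.inv(P))   # A = P D P^(-1)
```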

33

Requirements for V to be a subspace of S

  1. V contains the zero vector

  2. V is closed under vector addition and scalar multiplication

34

rank nullity for T: V to W

dim(im)+dim(ker)= dim V

OR

rank(T) + nullity(T) = dim V

35

det (CB)

detC*detB

36

det(C^-1)

1/detC

37

writing a vector as a linear combination of two others that are orthogonal

add up the projections of your vector onto each of the orthogonal vectors

38

writing a vector as a linear combination of two others that are not orthogonal

Sometimes not possible.

If writing v as a combo of u and w:

  1. make a matrix with columns [u w | v]

  2. rref it

  3. read the coefficients off the augmented column of the RREF (if the system is inconsistent, v is not a combination of u and w)

39

S=AB. What is S inverse?

B^-1 A^-1 (flip the order!)

40

what does det=0 tell us (3 things)

  1. matrix is not invertible

  2. it collapses its input into a lower dimension

  3. the system Ax=b either has no solutions or infinitely many solutions

41

A subspace is defined by a given set of equations. How would you make a basis for that subspace?

  1. put your equations into a matrix (one column per variable)

  2. RREF the matrix

  3. identify all your free (non-pivot) variables and assign each one a parameter

  4. write all your variables in terms of those parameters (write all your pivots in terms of the free variables)

  5. make a new matrix where each column collects the coefficients of one parameter. That's your basis! (see the sketch below)
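
In code, the same computation is a null-space problem. A minimal sketch (assuming scipy is available; the example plane x + y + z = 0 is made up):

```python
import numpy as np
from scipy.linalg import null_space

M = np.array([[1.0, 1.0, 1.0]])   # one equation in three variables: x + y + z = 0

basis = null_space(M)             # columns form a basis for the solution subspace
print(basis.shape)                # (3, 2): the plane is 2-dimensional
assert np.allclose(M @ basis, 0.0)
```

The same routine answers the next card too: stack a basis of W as the rows of a matrix and take its null space to get W perp.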

42

how do you find the orthogonal complement of a subspace W

  1. Find a basis for W; call it B

  2. write a new matrix A. Each row of A should be a column of B

  3. solve Ax=0 for x (find the kernel of A)

  4. the solutions (a basis for the kernel of A) form a basis for W perp

43

Change of basis matrix from C to B

B[x]b=C[x]c

[x]b=B^(-1)C [x]c

44

Relationship between kernel and image and injectivity/surjectivity

if dim ker = 0, then it is injective

if image=codomain, it is surjective

45

how do you find [x]b (vector x in basis B)?

what linear combination of the vectors of basis b gets you to x?

  1. sometimes you can figure it out by inspection (you can see that 2b1+3b2 gives you x, so [x]b is <2,3>)

  2. if B is orthonormal, you can just project x onto each of the vectors of B (if B is only orthogonal, also divide each projection by that vector's squared length)

  3. OTHERWISE: set up a linear system where each of b1, b2, b3, etc. is a column of an augmented matrix and x is the augmented column. You are trying to solve for the coefficients of b1, b2, b3 that will get you to x. (see the sketch below)
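
A minimal numpy sketch of option 3 (made-up basis; np.linalg.solve plays the role of row-reducing the augmented matrix):

```python
import numpy as np

B = np.array([[1.0, 1.0],    # basis vectors b1, b2 as columns
              [0.0, 2.0]])
x = np.array([3.0, 4.0])

x_b = np.linalg.solve(B, x)  # solves B @ x_b = x for the coefficients
print(x_b)                   # [1. 2.]  i.e. x = 1*b1 + 2*b2
assert np.allclose(B @ x_b, x)
```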

46

angle between 2 vectors formula

cos(theta) = (u . v) / (|u| |v|), so theta = arccos((u . v) / (|u| |v|))
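
A quick numpy check of the formula (made-up vectors; the clip guards against rounding pushing the cosine just outside [-1, 1]):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
print(theta)  # 45.0
```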