linear quiz/ test 3


35 Terms

1
New cards

FALSE: if A and B are row equivalent, then they have the same eigenvalues

Not true: row operations do not preserve eigenvalues, so row-equivalent matrices need not share them.

2
New cards

geometric multiplicity is ALWAYS

LESS THAN OR EQUAL TO THE ALGEBRAIC MULTIPLICITY

3
New cards

algebraic multiplicity

the number of times an eigenvalue repeats, i.e. the number of times it appears as a root of the characteristic polynomial.

4
New cards

geometric multiplicity

dimension of the eigenspace (also equal to the number of free variables in (A − λI)x = 0)
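The two multiplicities can be seen side by side in a minimal sketch (plain Python, no libraries, with a hypothetical 2x2 example A = [[2, 1], [0, 2]]): the characteristic polynomial is (2 − t)^2, so λ = 2 has algebraic multiplicity 2, while its eigenspace is only 1-dimensional.

```python
def char_poly_2x2(A, t):
    """Evaluate det(A - t*I) for a 2x2 matrix A."""
    a, b = A[0]
    c, d = A[1]
    return (a - t) * (d - t) - b * c

def rank_2x2(M):
    """Rank of a 2x2 matrix (enough for this sketch)."""
    if all(M[i][j] == 0 for i in range(2) for j in range(2)):
        return 0
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return 2 if det != 0 else 1

A = [[2, 1], [0, 2]]
assert char_poly_2x2(A, 2) == 0   # 2 is an eigenvalue, repeated twice

M = [[A[0][0] - 2, A[0][1]],
     [A[1][0], A[1][1] - 2]]      # A - 2I
geo_mult = 2 - rank_2x2(M)        # dim of null space = n - rank
print(geo_mult)                   # 1: geometric mult. 1 <= algebraic mult. 2
```

This is the "less than or equal" rule from the card above in action: 1 ≤ 2, and the gap is exactly why this A is not diagonalizable.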

5
New cards

Diagonalizable

A square matrix A is said to be diagonalizable if A is similar to a diagonal matrix, that is, if A = PDP^-1 for some invertible matrix P and some diagonal matrix D

- an n×n matrix is diagonalizable exactly when the dimensions of its eigenspaces add up to n (for a square matrix the number of rows and the number of columns are the same n, so "columns AND rows" are both correct); see the last question in 4.4 on page 4, titled the Diagonalization Theorem, for a great example

6
New cards

Know That

the eigenvalues of a triangular matrix are the entries on its main diagonal
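A quick check of the triangular rule (plain Python, hypothetical upper triangular example A = [[3, 5], [0, 7]]): det(A − tI) = (3 − t)(7 − t), whose roots are exactly the diagonal entries.

```python
def char_poly_2x2(A, t):
    """Evaluate det(A - t*I) for a 2x2 matrix A."""
    a, b = A[0]
    c, d = A[1]
    return (a - t) * (d - t) - b * c

A = [[3, 5], [0, 7]]  # upper triangular

# Scan small integers for roots of the characteristic polynomial.
eigenvalues = [t for t in range(-10, 11) if char_poly_2x2(A, t) == 0]
print(eigenvalues)  # [3, 7], the main-diagonal entries
```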

7
New cards

the sum of the algebraic multiplicities is equal to the

size n of the n×n matrix, which also equals the degree of the matrix's characteristic polynomial

8
New cards

When the algebraic multiplicity is equal to 1, the geometric multiplicity is ALWAYS ALSO

EQUAL TO 1

9
New cards

know that when a matrix has linearly INDEPENDENT columns, AKA is INVERTIBLE...

0 can NOT be one of its eigenvalues

10
New cards

on the contrary, when a matrix has linearly DEPENDENT columns, AKA is NOT INVERTIBLE

0 IS one of its eigenvalues (since det(A) = 0, zero is a root of the characteristic polynomial) 4.3 pg 3
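The link between invertibility and the eigenvalue 0 can be sketched numerically (plain Python, hypothetical 2x2 examples): 0 is an eigenvalue exactly when det(A − 0·I) = det(A) = 0.

```python
def det_2x2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

invertible = [[1, 2], [3, 4]]   # det = -2, columns independent
singular   = [[1, 2], [2, 4]]   # det =  0, second column = 2 * first

# det(A - 0*I) == det(A), so "0 is an eigenvalue" means det(A) == 0.
print(det_2x2(invertible) == 0)  # False: 0 is NOT an eigenvalue
print(det_2x2(singular) == 0)    # True: 0 IS an eigenvalue
```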

11
New cards

Eigenvalue

A scalar λ (lambda) such that Ax = λx has a solution for some nonzero vector x
12
New cards

eigenvector

a nonzero vector x such that Ax = λx for some scalar λ
13
New cards

FOR a matrix to be DIAGONALIZABLE, the TOTAL number of linearly independent eigenvectors (add up the dimension of each eigenspace) should equal the number of rows and columns

i.e. if you get 3 total eigenvectors for a matrix with 3 rows and 3 columns, then that matrix is DIAGONALIZABLE!!!

(A hint for this: if an n×n matrix has n DISTINCT eigenvalues, it is automatically diagonalizable, since each distinct eigenvalue contributes at least one independent eigenvector) 4.4 pg. 3

14
New cards

remember that the D matrix should correspond to the P matrix:

the D matrix writes out the eigenvalues diagonally, surrounded by all zeros, in the same order as their eigenvectors appear as columns of P.

15
New cards

A diagonalizable matrix has, for every eigenvalue, an algebraic multiplicity EQUAL TO its geometric multiplicity

read again 4.4 pg 4

16
New cards

PLEASE DON'T FORGET UR AUG. matrices

DON'T FORGET TO WRITE UR BASIS AS { } and ur EIGENSPACE AS = span([ ]) !!!! READ CAREFULLY AND USE CORRECT NOTATION

17
New cards

How to find the degree of a polynomial

to find the degree of a polynomial in factored form, add up ALL of the exponents (even the implicit 1s)

(1 + λ)^3 (1 − λ)^2 (λ − 2): degree = 3 + 2 + 1 = 6
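The exponent-summing rule can be sketched in a couple of lines (plain Python): represent the factored polynomial as (factor, exponent) pairs, where each factor here is degree 1.

```python
# (1 + x)^3 (1 - x)^2 (x - 2): each base factor has degree 1,
# so the total degree is the sum of the exponents (the last one is 1).
factors = [("1 + x", 3), ("1 - x", 2), ("x - 2", 1)]
degree = sum(exp for _, exp in factors)
print(degree)  # 3 + 2 + 1 = 6
```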

18
New cards

how do I know that I am working with a complex eigenvector???

You'll know because the characteristic polynomial you get will be impossible to factor (there are no integers that multiply to 5 and add up to 4) and you'll have to resort to the quadratic formula

ex: (λ^2 − 4λ + 5) is impossible to factor, and working out the quadratic formula gives its complex eigenvalues
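The same example can be run through the quadratic formula in plain Python: the discriminant of t^2 − 4t + 5 is negative, and `cmath.sqrt` turns that into a complex-conjugate pair of eigenvalues.

```python
import cmath

a, b, c = 1, -4, 5                 # t^2 - 4t + 5
disc = b * b - 4 * a * c           # negative -> complex roots
root1 = (-b + cmath.sqrt(disc)) / (2 * a)
root2 = (-b - cmath.sqrt(disc)) / (2 * a)
print(disc)          # -4
print(root1, root2)  # (2+1j) (2-1j): a conjugate pair
```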

19
New cards

The det (A^T) is equal to the regular det(A)

the determinant of the TRANSPOSE and the determinant of the regular A matrix ARE THE SAME

20
New cards

Thm. 4.18: if λ is an eigenvalue of A with eigenvector x, then λ^k is automatically an eigenvalue of A^k w/ corresponding eigenvector x (for any positive integer k)

if A is invertible, this extends to negative powers: for example, λ^-1 is automatically an eigenvalue of A^-1
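A minimal check of the power rule (plain Python, hypothetical 2x2 example where x is an eigenvector of A for λ = 2): applying A twice multiplies x by λ^2.

```python
def matvec(A, x):
    """Multiply matrix A by vector x."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[2, 1], [0, 3]]
x = [1, 0]                       # eigenvector for eigenvalue 2: A x = 2 x
assert matvec(A, x) == [2, 0]

A2x = matvec(A, matvec(A, x))    # A^2 applied to x
print(A2x)                       # [4, 0] = 2^2 * x, as Thm. 4.18 predicts
```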

21
New cards

KNOW THAT A CAN BE SUBSTITUTED WITH ITS eigenvalue λ (when a polynomial in A acts on the corresponding eigenvector)!!!!

Additional problems #2

22
New cards

show that λ̄ (lambda with a bar over it, the complex conjugate) is an eigenvalue of A

Know that Ā (A with a bar) is equal to the regular matrix A when A has real entries (SEE PHOTOS)

23
New cards

orthogonal

vectors whose dot product is zero

24
New cards

how to check if something is orthonormal

A list of vectors is called orthonormal if:

1) the vectors are pairwise orthogonal (each dot product equals out to zero)

AND 2) each vector has a length of 1
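The two-part check can be sketched directly (plain Python, hypothetical vectors u1 and u2): test every pair's dot product and every vector's length.

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthonormal(vectors, tol=1e-12):
    n = len(vectors)
    # 1) pairwise orthogonal: every distinct pair has dot product 0
    pairwise_orthogonal = all(
        abs(dot(vectors[i], vectors[j])) < tol
        for i in range(n) for j in range(i + 1, n)
    )
    # 2) unit length: every vector has length 1
    unit_length = all(abs(math.sqrt(dot(v, v)) - 1) < tol for v in vectors)
    return pairwise_orthogonal and unit_length

u1 = [1 / math.sqrt(2), 1 / math.sqrt(2)]
u2 = [1 / math.sqrt(2), -1 / math.sqrt(2)]
print(is_orthonormal([u1, u2]))           # True: orthogonal AND unit length
print(is_orthonormal([[1, 1], [1, -1]]))  # False: orthogonal but length sqrt(2)
```

The second example is why both conditions matter: an orthogonal set fails the orthonormal test until each vector is scaled to length 1.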

25
New cards

if you ever have a test question asking what the dot product of two distinct vectors from an orthogonal basis of the same subspace is, the answer is ALWAYS zero

!!!! likely will be a MCQ (see pg. 3 of 5.2)

26
New cards

WHEN y is already in the subspace, the orthogonal projection of y onto (W for example) is y itself, so your answer WILL BE A LINEAR COMBINATION of the basis vectors

see pg 2 5.2

27
New cards

"find the orthogonal projection of y onto W"

ŷ = (y•u1 / u1•u1)(u1) + (y•u2 / u2•u2)(u2), where {u1, u2} is an orthogonal basis for W; see pg 2 of 5.2
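The projection formula can be checked numerically; a minimal sketch (plain Python with exact fractions, hypothetical orthogonal basis {u1, u2} for the xy-plane in R^3):

```python
from fractions import Fraction

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def proj(y, basis):
    """y_hat = sum over u in basis of (y.u / u.u) * u."""
    y_hat = [Fraction(0)] * len(y)
    for u in basis:
        coeff = Fraction(dot(y, u), dot(u, u))   # the fraction y.u / u.u
        y_hat = [yh + coeff * ui for yh, ui in zip(y_hat, u)]
    return y_hat

u1, u2 = [1, 1, 0], [1, -1, 0]   # orthogonal: dot(u1, u2) == 0
y = [3, 5, 7]
y_hat = proj(y, [u1, u2])
print(y_hat)  # [3, 5, 0]: y with its z-component dropped
```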

28
New cards

"find a basis for W⊥"

being asked to find a basis for (col A)⊥ = Nul(A^T), AKA transpose A and solve A^T x = 0; see pg 2 of 5.2

29
New cards

An orthogonal basis for a subspace W of R^n is

a basis for W that is also an orthogonal set (nonzero vectors are proven to be an orthogonal set when their pairwise dot products equal zero; that proves the vectors are l.i., and in turn, when the # of vectors matches the dimension of the space, i.e. 3 l.i. vectors in R^3, the vectors must be a basis for R^3) see pg 1 of 5.1

30
New cards

vectors must be LINEARLY INDEPENDENT in order to make up an orthogonal basis for R^n

- a vector that forces linear dependence, like the zero vector, can't be among them

- the dot product of each pair of distinct vectors must equal zero

31
New cards

Question: find the orthogonal decomposition of y with respect to W

y = ŷ + (y − ŷ): the found ŷ (the projection of y onto W) plus z = y − ŷ, which lies in W⊥
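The decomposition y = ŷ + (y − ŷ) can be verified directly; a sketch (plain Python, reusing a hypothetical orthogonal basis {u1, u2} for W) that checks both that the pieces add back to y and that z lands in W⊥:

```python
from fractions import Fraction

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

u1, u2 = [1, 1, 0], [1, -1, 0]           # orthogonal basis for W
y = [3, 5, 7]

# y_hat: projection of y onto W, built term by term
y_hat = [Fraction(0)] * 3
for u in (u1, u2):
    c = Fraction(dot(y, u), dot(u, u))
    y_hat = [yh + c * ui for yh, ui in zip(y_hat, u)]

z = [yi - yhi for yi, yhi in zip(y, y_hat)]          # z = y - y_hat
print([yhi + zi for yhi, zi in zip(y_hat, z)] == y)  # True: y = y_hat + z
print(dot(z, u1) == 0 and dot(z, u2) == 0)           # True: z is in W-perp
```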

32
New cards

!!!!KNOW THAT THE INVERSE OF AN ORTHOGONAL MATRIX/SET OF VECTORS IS THE SAME AS ITS TRANSPOSE

IF YOU'RE ASKED TO WRITE THE INVERSE OF A MATRIX AND YOU ALREADY KNOW THAT IT IS ORTHOGONAL, THEN WRITE ITS TRANSPOSE AS THE ANSWER
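The transpose-is-inverse fact means Q^T Q = I; a minimal check (plain Python with exact fractions, hypothetical 2x2 orthogonal matrix built from a 3-4-5 triple):

```python
from fractions import Fraction

f = Fraction
Q = [[f(3, 5), f(-4, 5)],
     [f(4, 5), f(3, 5)]]           # columns are orthonormal

QT = [[Q[j][i] for j in range(2)] for i in range(2)]   # transpose

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(matmul(QT, Q))  # the identity [[1, 0], [0, 1]], so Q^T = Q^-1
```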

33
New cards

the basis of a SPAN MUST HAVE THE EXACT SAME # OF VECTORS AS THE DIMENSION OF THE SPACE

if asked for a basis for R², you better write a basis with exactly TWO VECTORS (a basis is different than a spanning set: a spanning set needs a minimum of 2 vectors for R², but a basis needs exactly 2 vectors!!!)

34
New cards

The conjugate of a complex eigenvalue of a real matrix is also an eigenvalue of the same MATRIX!!!! i.e. (1+2i) and (1−2i) are both eigenvalues of the same matrix

basically saying that a real matrix with eigenvalue (1+2i) also has (1−2i) as an eigenvalue, because complex eigenvalues come in conjugate pairs

35
New cards

for a matrix with linearly independent columns

the equation Ax=0 has only the trivial solution