Linear Final

41 Terms

1
New cards

det(AᵀB) =

det(BᵀA)

2
New cards

det(A+B) DOES NOT =

det(A) + det(B)

3
New cards

every orthonormal basis is

an orthogonal basis

4
New cards

If B is the RREF of a matrix A and det(B) = 0

then det(A) = 0

5
New cards

Row operations DO change

the determinant of a matrix

6
New cards

det(2A) =

det(A) × 2ᵐ

  • where m is the number of rows of A

7
New cards

det(AB) = 

det(A)det(B)
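A minimal NumPy sketch (arbitrary example matrices, chosen only for illustration) that checks these determinant rules numerically:

import numpy as np

np.random.seed(0)
m = 3
A = np.random.rand(m, m)
B = np.random.rand(m, m)

# det(A^T B) = det(B^T A)
print(np.isclose(np.linalg.det(A.T @ B), np.linalg.det(B.T @ A)))              # True

# det(AB) = det(A)det(B)
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))   # True

# det(2A) = 2^m det(A), where m is the number of rows
print(np.isclose(np.linalg.det(2 * A), 2**m * np.linalg.det(A)))               # True

# det(A + B) is generally NOT det(A) + det(B)
print(np.isclose(np.linalg.det(A + B), np.linalg.det(A) + np.linalg.det(B)))   # False (in general)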

8
New cards

if det(A) ≠ 0, we know that

A is invertible

  • thus a solution exists and is unique

9
New cards

for a set of vectors to span Pₙ

it must contain n+1 linearly independent vectors (pairwise non-proportionality alone is not enough), since dim Pₙ = n + 1

10
New cards

for a matrix/set of vectors to be linearly independent

det(A) CANNOT = 0

11
New cards

a set of vectors is a basis for Pₙ if

they span Pₙ and are linearly independent
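A small worked check (my own example): to verify that {1, 1 + t, t²} is a basis for P₂, write each polynomial as a coefficient vector in R³ and confirm that the 3×3 coefficient matrix has nonzero determinant:

import numpy as np

# columns hold the coefficients (constant, t, t^2) of 1, 1 + t, t^2
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# nonzero det => the 3 vectors are linearly independent => they span P2 (dim P2 = 3)
print(np.linalg.det(A))    # 1.0, so {1, 1 + t, t^2} is a basis for P2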

12
New cards

a set of vectors is an orthogonal set if

all of the pairwise dot products of the vectors equal 0

(a set of nonzero orthogonal vectors is automatically linearly independent)

13
New cards

if x is NOT in the subspace W, then

x - proj_W(x) ≠ 0
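A NumPy sketch of this test (W and x are arbitrary choices here): build an orthonormal basis Q for W, compute proj_W(x) = QQᵀx, and check whether x - proj_W(x) is zero:

import numpy as np

# W = span of the columns of B (a plane in R^3, chosen for illustration)
B = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
Q, _ = np.linalg.qr(B)            # orthonormal basis for W

x = np.array([1.0, 2.0, 3.0])     # not in W (its third component is nonzero)
proj = Q @ (Q.T @ x)              # proj_W(x)

print(x - proj)                   # [0. 0. 3.] -- nonzero, so x is NOT in W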

14
New cards

if Av = λv for a nonzero vector v,

then v is an eigenvector with eigenvalue λ

15
New cards

Algebraic multiplicity

the multiplicity of the eigenvalue as a root of the characteristic polynomial

16
New cards

geometric multiplicity

dimension of eigenspace (number of linearly independent eigenvectors)

17
New cards

When is a matrix diagonalizable?

when geometric multiplicity = algebraic multiplicity for all eigenvalues:

n linearly independent eigenvectors for an n×n matrix
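To see the two multiplicities differ, here is a SymPy sketch with an arbitrary example matrix that is NOT diagonalizable:

from sympy import Matrix

A = Matrix([[2, 1],
            [0, 2]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis)
for val, alg_mult, basis in A.eigenvects():
    print(val, alg_mult, len(basis))    # 2 2 1: geometric multiplicity 1 < algebraic 2

print(A.is_diagonalizable())            # False: fewer than n independent eigenvectors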

18
New cards

Every real symmetric matrix

is orthogonally diagonalizable

19
New cards

to check if a matrix is orthogonally diagonalizable,

check if A = Aᵀ (symmetric)

20
New cards

Steps for orthogonal diagonalization:

  1. find eigenvalues

  2. find eigenvectors

  3. normalize eigenvectors

  4. form P with the normalized eigenvectors as columns and D with the eigenvalues on the diagonal (so A = PDPᵀ)
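A NumPy sketch of these steps for an arbitrary symmetric matrix (np.linalg.eigh returns the eigenvalues together with orthonormal eigenvectors, so steps 2 and 3 come for free):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # symmetric

eigvals, P = np.linalg.eigh(A)             # steps 1-3: eigenvalues + orthonormal eigenvectors
D = np.diag(eigvals)                       # step 4: eigenvalues on the diagonal of D

print(np.allclose(A, P @ D @ P.T))         # True: A = P D P^T
print(np.allclose(P.T @ P, np.eye(2)))     # True: P is orthogonal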

21
New cards

Singular value decomp (SVD)

A = UΣVᵀ, where U and V are orthogonal and Σ is diagonal with the singular values on its diagonal.

  • every m×n matrix has one
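A NumPy sketch on an arbitrary 2×3 matrix, checking that A = UΣVᵀ:

import numpy as np

A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])           # any m x n matrix has an SVD

U, s, Vt = np.linalg.svd(A)                # s holds the singular values
Sigma = np.zeros(A.shape)                  # Sigma has the same shape as A
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))      # True: A = U Sigma V^T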

22
New cards

finding rank from SVD

rank = the number of nonzero singular values (the singular values are the square roots of the eigenvalues of AᵀA, and are always nonnegative)
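A quick check on an arbitrary rank-1 example: count the nonzero singular values and compare against the square root of the largest eigenvalue of AᵀA:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])            # second row = 2 x first row, so rank 1

s = np.linalg.svd(A, compute_uv=False)     # singular values only
rank = int(np.sum(s > 1e-10))              # count the nonzero ones (with a tolerance)

print(rank, np.linalg.matrix_rank(A))      # 1 1

# singular values are the square roots of the eigenvalues of A^T A (all nonnegative)
eigs = np.linalg.eigvalsh(A.T @ A)
print(np.isclose(s.max(), np.sqrt(eigs.max())))    # True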

23
New cards

Properties of U and V in the SVD of A:

  • UUᵀ = I and UᵀU = I

  • VVᵀ = I and VᵀV = I

  • their columns are orthonormal

24
New cards

two linear systems are the same if

they have the same solution set

25
New cards

consistent system

either has one solution or infinitely many solutions

26
New cards

inconsistent system

has no solution

27
New cards

rows represent

equations (m)

28
New cards

columns represent

variables (n)

29
New cards

free variables indicate that

there are many solutions, not a unique solution

30
New cards

If b is in the span of the columns of A, you can also say that

b is in the column space of A

31
New cards

can a 3×2 matrix span R³?

No; its two columns span at most a 2-dimensional subspace of R³

  • A cannot have a pivot in every row

32
New cards

can a 2×3 matrix span R²?

yes because it can have a pivot in every row

33
New cards

rank(A) is equal to

  • the number of nonzero rows in its echelon form B

and/or

  • the number of pivot positions (pivot columns) in B

34
New cards

Rank-Nullity theorem

rank(A) + nullity(A) = number of columns of A
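A quick numerical check of the theorem on an arbitrary 3×4 matrix (scipy.linalg.null_space returns an orthonormal basis of Nul(A), so its column count is the nullity):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [1.0, 0.0, 1.0, 0.0]])       # 3 x 4, so n = 4 columns

rank = np.linalg.matrix_rank(A)            # dim Col(A)
nullity = null_space(A).shape[1]           # dim Nul(A)

print(rank, nullity)                       # 2 2
print(rank + nullity == A.shape[1])        # True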

35
New cards

the basis for Col(A) is determined by

the columns of A corresponding to the pivot columns (those with a leading 1) in its RREF

36
New cards

the basis for Nul(A) is determined by

solving RREF(A)x = 0; write the general solution in parametric vector form

(Nul A = the set of all vectors x such that Ax = 0)

ex: for a 3×4 matrix A, x has four variables (x1, x2, x3, x4)
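A SymPy sketch (arbitrary example matrix) that reads off both bases at once: rref() reports the pivot columns, so the corresponding columns of A give a basis for Col(A), and nullspace() gives the parametric-vector-form basis for Nul(A):

from sympy import Matrix

A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 8],
            [0, 0, 1, 2]])                 # 3 x 4 example

rref_A, pivot_cols = A.rref()              # RREF and the indices of the pivot columns
print(pivot_cols)                          # (0, 2): columns 1 and 3 of A form a basis for Col(A)

# basis for Nul(A): one vector per free variable (x2 and x4 here)
for v in A.nullspace():
    print(v.T)                             # [-2, 1, 0, 0] and [-3, 0, -2, 1]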

37
New cards

null space, NulA is

the set of all vectors x such that Ax = 0

38
New cards

For Ax=b to have a unique solution

  • A is invertible

  • A is row equivalent to I

  • There is a pivot in every column of A

  • the columns of A are linearly independent

  • det(A) does NOT = 0

  • 0 is not an eigenvalue of A
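A minimal NumPy check of the first and fifth bullets on an arbitrary 2×2 system: the determinant is nonzero, so A is invertible and np.linalg.solve returns the unique solution:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

print(np.linalg.det(A))        # nonzero (about 5), so A is invertible
x = np.linalg.solve(A, b)      # the unique solution of Ax = b
print(np.allclose(A @ x, b))   # True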

39
New cards

A has orthonormal columns if

AᵀA = I

40
New cards

A is orthogonally diagonalizable if

A = Aᵀ; A is symmetric

41
New cards

if A = UΣVᵀ (SVD), what is the size of the original matrix A?

the same as the size of Σ (in the full SVD, Σ is m×n when A is m×n)