Linear Algebra

51 Terms

1
New cards

solution set

The set of all vectors that satisfy a given condition, e.g., all x with Ax = b

2
New cards

If W is a subspace of Rn, then:

  • W⊥ is also a subspace of Rn

  • (W⊥)⊥ = W

  • dim(W) + dim(W⊥) = n
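
A quick numerical illustration of the last property (my own sketch, not part of the deck): if W is the column space of some matrix A, then W⊥ = Nul(Aᵀ), so both dimensions can be computed and compared. NumPy and SciPy are assumed; A is an arbitrary example.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])                     # columns span a 2-dimensional W inside R3 (n = 3)

dim_W = np.linalg.matrix_rank(A)               # dim(W) = rank of A
dim_W_perp = null_space(A.T).shape[1]          # dim(W⊥), since W⊥ = Nul(Aᵀ)

print(dim_W, dim_W_perp, dim_W + dim_W_perp)   # 2 1 3  ->  dim(W) + dim(W⊥) = n
```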

3
New cards
4
New cards

Row(A)⊥ =

Nul(A)

5
New cards

Nul(A)⊥ =

Row(A)

6
New cards

Col(A)⊥ =

Nul(Aᵀ)

7
New cards

Nul(Aᵀ)⊥ =

Col(A)
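
Cards 4-7 can be checked numerically. A minimal sketch, assuming NumPy and SciPy are available; the matrix A is an arbitrary example, not one from the deck.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1, so Nul(A) is 2-dimensional

N = null_space(A)                    # columns form an orthonormal basis of Nul(A)
print(np.allclose(A @ N, 0))         # True: every row of A is orthogonal to Nul(A), i.e. Row(A)⊥ = Nul(A)

N_left = null_space(A.T)             # basis of Nul(Aᵀ), the left null space
print(np.allclose(A.T @ N_left, 0))  # True: every column of A is orthogonal to Nul(Aᵀ), i.e. Col(A)⊥ = Nul(Aᵀ)
```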

8
New cards
9
New cards

when two vectors are orthogonal

when the dot product of the two vectors is 0

10
New cards

If A is n x n, then the geometric multiplicity of an eigenvalue is always…

less than or equal to the algebraic multiplicity

11
New cards

If A is n x n, then the algebraic multiplicity of an eigenvalue is always…

greater than or equal to the geometric multiplicity

12
New cards

Characteristic polynomial equation

f(λ) = det(A − λI)

For a 2 x 2 matrix this simplifies to f(λ) = λ² − Tr(A)λ + det(A).
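
A short SymPy check of both formulas on an arbitrary 2 x 2 example (my own sketch, assuming SymPy is installed):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, 1],
               [2, 3]])

full = (A - lam * sp.eye(2)).det()              # det(A − λI): works for any size
shortcut = lam**2 - A.trace() * lam + A.det()   # 2 x 2 shortcut: λ² − Tr(A)λ + det(A)

print(sp.expand(full))                          # lambda**2 - 7*lambda + 10
print(sp.simplify(full - shortcut) == 0)        # True: the two formulas agree for 2 x 2
```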

13
New cards

finding the inverse of a square matrix A of any size

  • write the matrix with the identity matrix (the standard basis vectors for that dimension) right next to it: [A | I]

  • row reduce A to RREF, mimicking every row operation on the identity half; once the left half becomes I, the right half is A⁻¹
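
A minimal SymPy sketch of this recipe (the matrix is an arbitrary invertible example): row reduce [A | I] and read A⁻¹ off the right half.

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [5, 3]])
n = A.shape[0]

augmented = A.row_join(sp.eye(n))   # [A | I]
rref, _ = augmented.rref()          # RREF applies the same row operations to both halves
A_inv = rref[:, n:]                 # once the left half is I, the right half is A⁻¹

print(A_inv)                        # Matrix([[3, -1], [-5, 2]])
print(A * A_inv == sp.eye(n))       # True
```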

14
New cards

The column space of an m x n matrix is a subspace of

Rm (the null space and row space are subspaces of Rn)

15
New cards

A matrix is invertible if

det(A) ≠ 0; equivalently, the associated matrix transformation is one-to-one (1-1) and onto

16
New cards

Properties of Projection Matrices

Let W be a subspace of Rn, define T:Rn→Rn by T(x) = x_W (the orthogonal projection of x onto W), and let B be the standard matrix for T. Then:

  1. Col(B)=W.

  2. Nul(B)=W⊥.

  3. B² = B.

  4. If W ≠ {0}, then 1 is an eigenvalue of B and the 1-eigenspace for B is W.

  5. If W ≠ Rn, then 0 is an eigenvalue of B and the 0-eigenspace for B is W⊥.

  6. B is similar to the diagonal matrix with m ones and n−m zeros on the diagonal, where m=dim(W).
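
These properties can be verified numerically. A sketch of my own (not from the deck), assuming NumPy: W is taken to be the column space of an arbitrary matrix A, and B is built with the standard formula B = A(AᵀA)⁻¹Aᵀ, which requires the columns of A to be linearly independent.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                 # columns span a 2-dimensional subspace W of R3

B = A @ np.linalg.inv(A.T @ A) @ A.T       # standard matrix of orthogonal projection onto W

print(np.allclose(B @ B, B))               # property 3: B² = B
print(np.round(np.linalg.eigvalsh(B), 10)) # ≈ [0, 1, 1]: m ones and n−m zeros (properties 4-6)
print(np.linalg.matrix_rank(B))            # 2 = dim(W), matching Col(B) = W (property 1)
```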

17
New cards

Properties of Orthogonal Projections

Let W be a subspace of Rn, and define T:Rn→Rn by T(x) = x_W (the orthogonal projection of x onto W). Then:

  1. T is a linear transformation.

  2. T(x)=x if and only if x is in W.

  3. T(x)=0 if and only if x is in W⊥.

  4. T◦T=T.

  5. The range of T is W.

18
New cards

geometric multiplicity

dim(eigenspace), i.e., the number of linearly independent eigenvectors for that eigenvalue

19
New cards

algebraic multiplicity

the number of times the eigenvalue appears as a root of the characteristic polynomial

ex:

f(λ) = (λ + 3)(λ + 3)

the eigenvalue −3 has algebraic multiplicity 2
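
A small SymPy illustration of both multiplicities (my own example matrix, not from the deck); the eigenvalue 2 below has algebraic multiplicity 2 but geometric multiplicity 1.

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [0, 2]])

print(A.eigenvals())                    # {2: 2} -> eigenvalue 2 has algebraic multiplicity 2
for val, alg_mult, vects in A.eigenvects():
    print(val, alg_mult, len(vects))    # 2 2 1 -> geometric multiplicity (indep. eigenvectors) is 1
```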

20
New cards

Trace of a matrix

the sum of the diagonal entries of a matrix

21
New cards

Rank-nullity theorem

rank(A) + nullity(A) = n; equivalently, for any consistent system of linear equations, (dimension of the column space) + (dimension of the solution set) = (number of variables).
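
A quick numerical check of the theorem on an arbitrary example matrix, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 3.0]])    # 2 x 4, so there are n = 4 variables

rank = np.linalg.matrix_rank(A)         # dimension of the column space
nullity = null_space(A).shape[1]        # dimension of the solution set of Ax = 0

print(rank, nullity, rank + nullity)    # 2 2 4  ->  rank + nullity = n
```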

22
New cards

nullity

dimension of the null space

23
New cards

column space

the span of the columns of a matrix

24
New cards

span

the set of all linear combinations of a given set of vectors

25
New cards

Homogeneous system

The system Ax = 0 (i.e., b = 0 in Ax = b); it is always consistent, and if it has more than one solution it has infinitely many.

26
New cards

vector equation

x₁v₁ + x₂v₂ + … + xₙvₙ = b (a linear combination of vectors set equal to a vector b)
27
New cards

matrix equation

Ax = b

28
New cards

linearly independent

  • There is a pivot in every column of the matrix whose columns are the vectors

  • Ax = 0 has only the trivial solution x = 0 (no free variables)
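
In practice this can be tested by computing a rank: the columns are linearly independent exactly when there is a pivot in every column, i.e. when rank(A) equals the number of columns. A sketch with arbitrary example vectors, assuming NumPy:

```python
import numpy as np

independent = np.array([[1.0, 0.0],
                        [0.0, 1.0],
                        [1.0, 1.0]])                   # columns are not multiples of each other
dependent = np.column_stack([independent[:, 0],
                             2 * independent[:, 0]])   # second column = 2 * first column

for A in (independent, dependent):
    print(np.linalg.matrix_rank(A) == A.shape[1])      # True, then False
```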

29
New cards

Linearly dependent

A set of vectors is automatically linearly dependent if:

  • there are more vectors than entries in each vector, i.e., the matrix has more columns than rows (n > m), and/or

  • one of the vectors in the set is the zero vector

30
New cards

subspace

a subset of vectors that is:

  • contains the zero vector (in particular, it is not empty)

  • closed under addition

  • closed under scalar multiplication

31
New cards

null space

  • the set of all vectors x such that Ax = 0

  • it is always a subspace (of Rn, for an m x n matrix A)

32
New cards

injective transformation (1-1)

a transformation such that every y in the codomain has at most one x in the domain with T(x) = y

33
New cards

Injective transformation (1-1) properties

  • Each vector in W has at most ONE corresponding vector in V

    • the standard matrix has a pivot in every column

  • Not every vector in W has to have a corresponding vector in V

  • must have at least as many rows as columns: m >= n

  • the columns of the standard matrix must be linearly independent

34
New cards

surjective (onto)

a transformation where every point in the codomain has at least one corresponding point in the domain (every b is T(x) for some x)

35
New cards

surjective (onto) properties

  • the range of T is the entire codomain (every point in the codomain is hit by some input)

  • multiple vectors in the domain can be mapped to the same point in the codomain

  • there is a pivot in every row of the standard matrix
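
Both the one-to-one and onto criteria (cards 33 and 35) come down to where the pivots are, so they can be checked from the rank of the standard matrix. A sketch with arbitrary example matrices, assuming NumPy:

```python
import numpy as np

def one_to_one_and_onto(A):
    m, n = A.shape
    rank = np.linalg.matrix_rank(A)
    return rank == n, rank == m        # pivot in every column? pivot in every row?

tall = np.array([[1.0, 0.0],
                 [0.0, 1.0],
                 [1.0, 1.0]])          # 3 x 2: can be one-to-one, never onto R3
wide = np.array([[1.0, 0.0, 2.0],
                 [0.0, 1.0, 3.0]])     # 2 x 3: can be onto, never one-to-one

print(one_to_one_and_onto(tall))       # (True, False)
print(one_to_one_and_onto(wide))       # (False, True)
```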

36
New cards

transformation no-no’s

  • have a variable added to a nonzero constant

  • have one or more of the output components be a nonzero constant

  • have one or more of the output components involve an absolute value or a power of a variable (e.g., |x| or x²)

37
New cards

basis

  • a minimal set of vectors with the same span as the subspace (the number of vectors in a basis is the dimension)

    • examples of subspaces: the solution set of Ax = 0, the column space, etc.

  • the vectors in a basis are linearly independent

  • often found by writing the solution set in parametric (vector) form; the vectors that appear in the parametric form are a basis
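
A minimal SymPy sketch of that last point (the matrix is an arbitrary example): nullspace() returns exactly the vectors that appear in the parametric vector form of the solutions of Ax = 0, and those vectors are a basis of the solution set.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0, -1],
               [0, 0, 1,  2]])          # already in RREF; x2 and x4 are free

basis = A.nullspace()                   # one basis vector per free variable
for v in basis:
    print(v.T)                          # Matrix([[-2, 1, 0, 0]]) and Matrix([[1, 0, -2, 1]])
```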

38
New cards

Co-domain

the space a transformation maps into (the Rm in T: Rn → Rm); the outputs actually reached form the range, which can be smaller

39
New cards

rank

The rank of a matrix A, written rank(A), is the dimension of the column space Col(A)

40
New cards

dimension

The number of vectors in a basis of a subspace (equivalently, the number of linearly independent vectors needed to span it)

ex: if the solution set of Ax = 0 is a plane through the origin in R3, its dimension is 2

41
New cards

trivial solution

the 0 vector

42
New cards

:

“such that”

43
New cards

∃

“there exists”

44
New cards

Closed under addition

If u and v are in V, then u+v is also in V.

45
New cards

unique sol

there is exactly one x with Ax = b for the given b

46
New cards

Closed under scalar multiplication

  • if u is in V, then cu is also in V for every scalar c in R

  • you can probe this by multiplying a vector in V by some real number and seeing whether the result is still in V

  • if there is even one vector and scalar whose product leaves V, then V is not closed under scalar multiplication

47
New cards

Invertible matrix

a square matrix A for which there is another square matrix B with AB = BA = I (the identity matrix)

48
New cards

A matrix is invertible if:

  • the matrix is square

  • there is a matrix B with AB = BA = I (A and B are commutable and their product is the identity)

49
New cards

commutable

the product of 2 matrices is the same no matter the order

50
New cards

invertible matrix theorem

For an n x n matrix A and the matrix transformation T(x)=Ax, T:Rn → Rn, the following statements are equivalent (they are either all true or all false):

  1. A is invertible.

  2. T is invertible.

  3. Nul(A)={0}.

  4. The columns of A are linearly independent.

  5. The columns of A span Rn.

  6. Ax=b has a unique solution for each b in Rn.

  7. T is one-to-one and onto.

  8. det(A) ≠ 0 (equivalently, det(T) ≠ 0).
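
A few of these equivalent conditions, checked numerically for the same arbitrary invertible example (my own sketch, assuming NumPy and SciPy):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

print(np.linalg.det(A) != 0)                   # statement 8: det(A) ≠ 0
print(np.linalg.matrix_rank(A) == A.shape[0])  # statements 4-5: columns independent and span Rn
print(null_space(A).shape[1] == 0)             # statement 3: Nul(A) = {0}

b = np.array([1.0, 2.0])
print(np.linalg.solve(A, b))                   # statement 6: the unique solution of Ax = b, here [ 1. -1.]
```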

51
New cards