Linear Algebra Final Exam Review


89 Terms

1
New cards

vector

A quantity that has magnitude and direction

2
New cards

dimension of a vector

Number of components in a vector.

3
New cards

vector space

A set of vectors that is closed under vector addition and scalar multiplication.

4
New cards

dimension of a matrix

The number of rows and columns of a matrix, written in the form rows×columns.

5
New cards

Vector Addition

adding or combining quantities that have magnitude and direction

6
New cards

Vector Multiplication

not commutative:

a × b ≠ b × a (for the cross product, a × b = −(b × a))

7
New cards

Linear Combination

A sum of scalar multiples of vectors. The scalars are called the weights.

8
New cards

span of vectors

Set of all linear combinations of vectors.

9
New cards

dot product

multiplication of a vector by a vector

results in a scalar quantity

u · v = ||u|| ||v|| cos θ, where θ is the angle between the vectors
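A quick NumPy sketch (the example vectors are made up) confirming that u · v matches ||u|| ||v|| cos θ:

```python
import numpy as np

# Illustrative vectors, not from the cards.
u = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])

dot = np.dot(u, v)                                   # scalar result of u . v
cos_theta = dot / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)                         # angle in radians

# Identity check: u . v = ||u|| ||v|| cos(theta)
assert np.isclose(dot, np.linalg.norm(u) * np.linalg.norm(v) * np.cos(theta))
print(dot, np.degrees(theta))                        # 11.0 and about 10.3 degrees
```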

10
New cards

unit vector

v/||v|| (a vector of length 1 in the direction of v)

11
New cards

norm of a unit vector

||u|| = sqrt(u·u), which equals 1 for a unit vector

12
New cards

distance between two vectors

d(u,v) = ||u-v||

13
New cards

orthogonal vectors

two vectors whose dot product equals zero

14
New cards

projection of u onto v

proj_v(u) = ((u·v)/||v||^2) v

15
New cards

orth_v(u)

u − proj_v(u), the component of u orthogonal to v
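A minimal NumPy sketch of these two cards (vectors invented for illustration): the projection and the orthogonal component are perpendicular and rebuild u:

```python
import numpy as np

u = np.array([2.0, 3.0])   # illustrative vectors
v = np.array([4.0, 0.0])

proj = (np.dot(u, v) / np.dot(v, v)) * v   # proj_v(u) = ((u.v)/||v||^2) v
orth = u - proj                            # orth_v(u) = u - proj_v(u)

assert np.isclose(np.dot(orth, v), 0.0)    # orthogonal component is perp to v
assert np.allclose(proj + orth, u)         # the two pieces sum back to u
print(proj, orth)                          # [2. 0.] [0. 3.]
```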

16
New cards

Diagonal Matrix

a square matrix whose entries not on the main diagonal are all zero

17
New cards

identity matrix

a square matrix with 1s on the main diagonal and 0s elsewhere; multiplying any matrix by it leaves that matrix unchanged (AI = IA = A)

18
New cards

transpose of a matrix

Swap the rows and columns: the (i, j) entry of A^T is the (j, i) entry of A

19
New cards

dimension of A*B (matrix multiplication)

if A is m×n and B is n×p, then AB is m×p (rows of A by columns of B)

20
New cards

a system of equations can have

one unique solution

no solution

infinite solutions

21
New cards

homogeneous system

a system of the form Ax = 0; always has the zero vector (the trivial solution) as a solution

22
New cards

rref

Same as REF, plus:

3. The pivot in each non-zero row equals 1

4. Each pivot is the only non-zero entry in its column

23
New cards

REF of matrix

1. any row containing entirely zeros are at bottom

2. each leading entry of a row is in a column to the right of the leading entry of the row above it

24
New cards

Inverse Matrix

The matrix A^-1 that, when multiplied with A, yields the identity matrix: AA^-1 = A^-1A = I.

25
New cards

A is not invertible if...

1. A is not a square matrix

2. A is a zero matrix

3. A has a zero row or zero column

(for a square matrix, A is invertible if and only if det(A) ≠ 0)

26
New cards

Determinant of a matrix

For a 2×2 matrix [[a, b], [c, d]]: det(A) = ad − bc

27
New cards

Steps to take an inverse of a matrix

use the Gauss-Jordan Elimination method by augmenting your matrix A with the identity matrix [A|I] and performing row operations to transform it into [I|A⁻¹]

28
New cards

finding inverse of a 2x2 matrix

[[a, b], [c, d]]

Determinant: |A| = (ad) - (bc).

Adjoint: Swap a and d, then negate b and c: [[d, -b], [-c, a]].

Inverse: A⁻¹ = (1/|A|) * [[d, -b], [-c, a]]
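The same 2×2 recipe transcribed into Python (the example matrix is made up); np.linalg.inv serves only as a cross-check:

```python
import numpy as np

def inverse_2x2(A):
    # A^-1 = (1/(ad - bc)) * [[d, -b], [-c, a]]
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix: det = 0, no inverse exists")
    return (1.0 / det) * np.array([[d, -b], [-c, a]])

A = np.array([[2.0, 1.0], [5.0, 3.0]])     # made-up example, det = 1
assert np.allclose(inverse_2x2(A), np.linalg.inv(A))
assert np.allclose(A @ inverse_2x2(A), np.eye(2))   # A A^-1 = I
```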

29
New cards

determinant and row operations: switching two rows

multiplies the determinant by −1

30
New cards

determinant and row operations: multiplying a row by a scalar

multiplies the determinant by the scalar

31
New cards

determinant and row operations: adding a multiple of one row to another

does not change determinant

32
New cards

det(A^T)

det(A)

33
New cards

det(A^-1)

det(A)^-1, i.e. 1/det(A)

34
New cards

determinant of upper/lower triangular matrix or diagonal matrix

the product of all diagonal entries

35
New cards

cofactor expansion steps:

select any row or column of a matrix, multiply each element in that row/column by its corresponding cofactor, and then sum these products
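A recursive sketch of the procedure, expanding along the first row (the test matrix is arbitrary). Fine for exam-sized matrices; production code would use LU factorization instead:

```python
import numpy as np

def det_cofactor(A):
    # Cofactor expansion along row 0: sum of (-1)^j * a_0j * det(minor_0j).
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in A[1:]]  # drop row 0, col j
        total += (-1) ** j * A[0][j] * det_cofactor(minor)
    return total

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]     # illustrative 3x3 matrix
assert np.isclose(det_cofactor(A), np.linalg.det(np.array(A)))
print(det_cofactor(A))                      # -3.0
```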

36
New cards

transformation of a matrix

the function T_A : R^k → R^n given by T_A(v) = Av

also called a mapping

37
New cards

domain

Set of all possible input values.

38
New cards

codomain

the set of all possible outputs for a function

39
New cards

Range(T_A) or Image(T_A)

set of actual outputs

40
New cards

Range(T_A) for linear systems

the vectors b for which the system Ax = b is consistent

41
New cards

Range(T_A) for linear combinations

If T_A is the transformation associated to A, then range(T_A) is the span of the columns of A, i.e., the set of all linear combinations of the columns of A.

42
New cards

is b in the range of T_A?

asking whether there is a v such that T_A(v) = b

(so augment b to A and then solve)

if there is a solution, it is in range

43
New cards

linear transformation: rotation

R(θ) = [[cos θ, −sin θ], [sin θ, cos θ]]

Clockwise rotation: just flip the sign of θ
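A short NumPy check (θ picked arbitrarily) that R(θ) rotates counterclockwise and that negating θ reverses it:

```python
import numpy as np

def rotation(theta):
    # R(theta) = [[cos t, -sin t], [sin t, cos t]]
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

theta = np.pi / 2                          # 90 degrees, for illustration
assert np.allclose(rotation(theta) @ [1.0, 0.0], [0.0, 1.0])  # x-axis -> y-axis
# Clockwise rotation is R(-theta), the inverse of R(theta):
assert np.allclose(rotation(-theta) @ rotation(theta), np.eye(2))
```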

44
New cards

reflection across x axis

[[1, 0], [0, −1]]

45
New cards

reflection through origin

[[−1, 0], [0, −1]]

46
New cards

reflection across y axis

[[−1, 0], [0, 1]]

47
New cards

reflection across the line y = x

[[0, 1], [1, 0]]

48
New cards

scaling

[[s_x, 0], [0, s_y]]

s_x scales horizontally

s_y scales vertically

49
New cards

horizontal shear

[[1, k], [0, 1]]

50
New cards

vertical shear

[[1, 0], [k, 1]]

51
New cards

projection onto x axis

[[1, 0], [0, 0]]

52
New cards

projection onto y axis

[[0, 0], [0, 1]]

53
New cards

translation

[[1, 0, t_x], [0, 1, t_y], [0, 0, 1]] (acting on homogeneous coordinates)

t_x = how far you move a point in the x-direction (horizontal shift).

t_y = how far you move a point in the y-direction (vertical shift).

54
New cards

properties of linear transformations

Additivity: T(u + v) = T(u) + T(v) → the transformation distributes over vector addition.

Homogeneity (scalar multiplication): T(c·u) = c·T(u) → scaling before or after the transformation gives the same result.

Preserves the origin: T(0) = 0 → linear transformations always map the zero vector to itself.

Matrix representation: every linear transformation T: R^n → R^m can be written as T(x) = Ax, where A is an m×n matrix.

Composition corresponds to matrix multiplication: if T_1(x) = Ax and T_2(x) = Bx, then (T_2 ∘ T_1)(x) = B(Ax) = (BA)x.

Invertibility: a linear transformation is invertible if and only if its matrix is invertible (determinant ≠ 0) → the inverse transformation corresponds to the inverse matrix.

Determinant meaning:

|det(A)| = scaling factor of area/volume.

det(A) < 0 → orientation is flipped (reflection).

det(A) = 0 → the transformation squashes space into a lower dimension (not invertible).

55
New cards

kernel(T) or null space of A

vectors mapped to 0 by T

so Av=0

56
New cards

a transformation is onto if

Av = b has at least one solution for every b in the codomain

57
New cards

one-to-one

Av = b has at most one solution for every b

58
New cards

vector subspace

a collection or set U of vectors in V, such that:

• adding two vectors in U yields a vector in U

• scaling a vector in U yields a vector in U

59
New cards

basis

linearly independent spanning set

60
New cards

linearly independent

the only solution to

c1v1+c2v2+⋯+cnvn=0

is when all coefficients are zero:

c1=c2=⋯=cn=0

Square matrix → check determinant.

Non‑square matrix → check rank (full column rank = independent).

Always → solve Ax=0 if unsure. (rref should show that each x variable is 0)
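The rank test from this card in NumPy (the vectors are invented): stack them as columns and compare the rank to the number of columns:

```python
import numpy as np

# Columns of A are the vectors being tested (made-up example).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

rank = np.linalg.matrix_rank(A)
independent = (rank == A.shape[1])     # full column rank <=> independent
print(rank, independent)               # 2 False: col 3 = col 1 + col 2
```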

61
New cards

dimension of a subspace

the number of vectors in a basis for that subspace

62
New cards

let V be a subspace of R^n with dim(V) = m

basis must have exactly m vectors

spanning set must have at least m vectors

linearly independent set must have at most m vectors

63
New cards

column space

Put the matrix A into reduced row echelon form (RREF).

Identify the pivot columns (the columns that contain leading 1's).

The corresponding columns in the original matrix (not the reduced one!) form a basis for the column space.

Dimension: The number of pivot columns = rank of A

64
New cards

row space

Row reduce A to RREF.

The non‑zero rows of the RREF form a basis for the row space.

Dimension: The number of non‑zero rows = rank of A
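SymPy makes both recipes mechanical; a sketch with an assumed matrix. rref() returns the reduced matrix together with the pivot-column indices:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 0, 1]])             # illustrative matrix, rank 2

R, pivot_cols = A.rref()               # RREF plus pivot column indices
col_basis = [A[:, j] for j in pivot_cols]       # columns of the ORIGINAL A
row_basis = [R[i, :] for i in range(A.rank())]  # non-zero rows of the RREF

print(pivot_cols)                      # (0, 1): rank = 2 pivot columns
```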

65
New cards

rank(A) =

dim(col(A)) = dim(row(A)) = dim(col(A^T)) = rank(A^T)

66
New cards

rank-nullity theorem

rank(A) + nullity(A) = number of columns, where rank is number of pivot columns, and nullity is number of free variables

67
New cards

change of basis

Form the change of basis matrix

Put the new basis vectors as columns in a matrix P. Example:

P = [[1, 1], [1, −1]]

Convert coordinates from new basis to old basis

If a vector has coordinates [v]B in the new basis, then in the standard basis:

[v]_standard = P[v]_B

Convert coordinates from old basis to new basis

Use the inverse:

[v]_B = P^-1 [v]_standard
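The two conversions numerically, reusing the card's example P (the coordinates are made up):

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [1.0, -1.0]])            # new basis vectors as columns

v_B = np.array([2.0, 3.0])             # coordinates in the new basis (made up)
v_std = P @ v_B                        # [v]_standard = P [v]_B
v_back = np.linalg.solve(P, v_std)     # [v]_B = P^-1 [v]_standard

assert np.allclose(v_back, v_B)
print(v_std)                           # [5. -1.]
```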

68
New cards

Gram-Schmidt Process

Start with a set of linearly independent vectors {v1,v2,...,vn}

Define u1=v1

u2 = v2 − proj_u1(v2)

u3 = v3 − proj_u1(v3) − proj_u2(v3)

then {u1, u2, ..., un} is an orthogonal basis for V
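A direct transcription of the process (the input vectors are invented); each new u subtracts the projections onto the u's already built:

```python
import numpy as np

def gram_schmidt(vectors):
    # Orthogonalize a list of linearly independent vectors.
    basis = []
    for v in vectors:
        u = v.astype(float).copy()
        for b in basis:
            u -= (np.dot(v, b) / np.dot(b, b)) * b   # subtract proj_b(v)
        basis.append(u)
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]  # made up
u1, u2 = gram_schmidt(vs)
assert np.isclose(np.dot(u1, u2), 0.0)   # result is orthogonal
print(u1, u2)                            # [1. 1. 0.] [ 0.5 -0.5  1. ]
```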

69
New cards

Least squares

Set up the normal equations: A^T A x = A^T b

Solve for x

Equivalent to projecting b onto the column space of A
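A sketch with an invented overdetermined system; solving the normal equations matches np.linalg.lstsq:

```python
import numpy as np

# 3 equations, 2 unknowns (made-up data), so Ax = b has no exact solution.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

x = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations A^T A x = A^T b
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_ref)
print(x)                                   # best-fit intercept and slope
```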

70
New cards

inner products

⟨u, v⟩ = u^T v

71
New cards

properties of inner products

Symmetry: ⟨u,v⟩=⟨v,u⟩

Linearity: ⟨au+bv,w⟩=a⟨u,w⟩+b⟨v,w⟩

Positive definiteness: ⟨v,v⟩≥0, equality only if v=0

Defines length: ∥v∥=sqrt(⟨v,v⟩)

Defines orthogonality: ⟨u,v⟩=0

Defines distance: d(u, v) = ||u − v|| = sqrt(⟨u − v, u − v⟩)

72
New cards

eigen problems

Solve det(A − λI) = 0 for the eigenvalues λ

For each λ, solve (A−λI)v=0 for eigenvectors.
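In NumPy both steps collapse into one call (the matrix is chosen for illustration); each pair satisfies Av = λv:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # made-up symmetric example

eigvals, eigvecs = np.linalg.eig(A)     # eigvecs[:, i] pairs with eigvals[i]
for i, lam in enumerate(eigvals):
    v = eigvecs[:, i]
    assert np.allclose(A @ v, lam * v)  # (A - lambda I) v = 0
print(eigvals)                          # [3. 1.]
```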

73
New cards

eigenvalues of triangular matrix

just the entries on the diagonal

74
New cards

product of eigenvalues

the determinant of A

75
New cards

diagonalisation

Find eigenvalues and eigenvectors.

Form matrix P with eigenvectors as columns.

Then A = PDP^-1, where D is diagonal with the eigenvalues.
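The same three steps in NumPy, reusing the eigen example above, plus one payoff (cheap matrix powers):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # same illustrative matrix as above

eigvals, P = np.linalg.eig(A)           # columns of P are the eigenvectors
D = np.diag(eigvals)                    # diagonal matrix of eigenvalues

assert np.allclose(A, P @ D @ np.linalg.inv(P))     # A = P D P^-1
# Powers become easy: A^5 = P D^5 P^-1
assert np.allclose(np.linalg.matrix_power(A, 5),
                   P @ np.diag(eigvals**5) @ np.linalg.inv(P))
```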

76
New cards

Markov Process

Transition matrix P with nonnegative entries, columns (or rows) sum to 1.

State vector evolves as x_{k+1} = P x_k

Long-term behavior: look at the steady state x such that Px = x
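A sketch with an invented column-stochastic 2-state matrix: iterate x_{k+1} = P x_k until it settles, then check Px = x:

```python
import numpy as np

P = np.array([[0.9, 0.2],
              [0.1, 0.8]])              # made-up probabilities, columns sum to 1

x = np.array([0.5, 0.5])                # initial state distribution
for _ in range(200):
    x = P @ x                           # x_{k+1} = P x_k

assert np.allclose(P @ x, x)            # steady state reached: P x = x
print(x)                                # approx [0.667, 0.333]
```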

77
New cards

SVD

Factor A into A = UΣV^T

U: orthogonal matrix (columns = left singular vectors).

Σ: diagonal with singular values (the square roots of the eigenvalues of A^T A).

V: orthogonal matrix (columns = right singular vectors).
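NumPy returns the factorization directly (example matrix arbitrary); note it hands back V^T, not V, and the singular values as a 1-D array:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])              # made-up example

U, s, Vt = np.linalg.svd(A)             # s holds the singular values
assert np.allclose(A, U @ np.diag(s) @ Vt)          # A = U Sigma V^T
# Singular values = sqrt of the eigenvalues of A^T A:
evals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # descending order
assert np.allclose(s**2, evals)
```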

78
New cards

QR factorization

Apply Gram-Schmidt to columns of A.

Get orthonormal matrix Q.

Upper triangular matrix R satisfies A=QR
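One call in NumPy (matrix invented); Q has orthonormal columns and R is upper triangular:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])              # made-up tall matrix

Q, R = np.linalg.qr(A)                  # reduced QR factorization
assert np.allclose(Q.T @ Q, np.eye(2))  # columns of Q are orthonormal
assert np.allclose(np.triu(R), R)       # R is upper triangular
assert np.allclose(A, Q @ R)            # A = QR
```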

79
New cards

transition matrix P from B2 to B1

P = B1^-1 * B2 (where B1 and B2 have their basis vectors as columns)

80
New cards

Guaranteed eigenvector of A^n

the original eigenvector x, now with eigenvalue λ^n

81
New cards

If λ is an eigenvalue of A

λ^k is an eigenvalue of A^k

82
New cards

Suppose A is m×n, B is n×p, and C is p×q

then ABC is m×q

83
New cards

if v and w are parallel, then proj_v(w) = ?

w

84
New cards

(AB)^T

B^TA^T

85
New cards

If A*B = 0 and A is invertible

then B is 0

86
New cards

det(2A)

2^n · det(A), for A an n×n matrix

87
New cards

det(A^-1)

1/det(A)

88
New cards

What is the angle between a vector v and its unit vector v/||v||?

Zero, since they are in the same direction
