REAL LA FINAL

Last updated 2:08 AM on 4/19/26
185 Terms

1
New cards

Solution of a System

A solution is a set of values for the variables that satisfies every equation in the system.

2
New cards

Consistent vs Inconsistent

A system is consistent if it has at least one solution; it is inconsistent if it has no solution.

3
New cards

Possible Numbers of Solutions

A linear system has either 0 solutions, exactly 1 solution, or infinitely many solutions.

4
New cards

Coefficient Matrix

The coefficient matrix records only the coefficients of the variables in a linear system.

5
New cards

Augmented Matrix

The augmented matrix [A | b] is formed by attaching the constants column b to the coefficient matrix A.

6
New cards

REF

REF means row echelon form: all nonzero rows are above zero rows, each leading entry is to the right of the one above it, and entries below each leading entry are zero.

7
New cards

RREF

RREF means reduced row echelon form: it is REF and each leading entry is 1 and is the only nonzero entry in its column.
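As a quick sanity check of the REF/RREF definitions above, SymPy (assumed available) can row reduce a matrix directly; the matrix here is a made-up example.

```python
from sympy import Matrix

A = Matrix([[1, 2, -1],
            [2, 4, 0],
            [0, 0, 1]])

# rref() returns the reduced row echelon form and the pivot column indices
R, pivot_cols = A.rref()
```

Every leading entry of R is 1 and is the only nonzero entry in its column, matching the definition.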

8
New cards

Elementary Row Operations

The three row operations are: swap two rows; multiply a row by a nonzero scalar; add a multiple of one row to another.

9
New cards

Equivalent Systems

Two linear systems are equivalent if they have the same solution set.

10
New cards

Pivot

A pivot is a leading entry in a row of an echelon form matrix.

11
New cards

Basic Variable

A basic variable corresponds to a pivot column.

12
New cards

Free Variable

A free variable corresponds to a non-pivot column and can take arbitrary parameter values.

13
New cards

General Solution

The general solution writes all variables in terms of the free variables, often in parametric vector form.

14
New cards

Homogeneous System

A homogeneous system has the form Ax = 0 and always has at least the trivial solution x = 0.

15
New cards

Parametric Vector Form

Write the solution as x = p + s v_1 + t v_2 + ⋯, where p is a particular solution and each vector v_i is multiplied by a free parameter.

16
New cards

Matrix Size

If a matrix has m rows and n columns, its size is m×n.

17
New cards

Matrix Addition

Matrix addition is defined only for matrices of the same size and is done entry-by-entry.

18
New cards

Scalar Multiplication

To multiply a matrix by a scalar, multiply every entry by that scalar.

19
New cards

Matrix Multiplication

If A is m×n and B is n×p, then AB is defined and has size m×p.

20
New cards

Identity Matrix

The identity matrix I_n is the n×n matrix with ones on the diagonal and zeros elsewhere.

21
New cards

Inverse of a Matrix

A square matrix A is invertible if there exists a matrix A^{-1} such that AA^{-1} = A^{-1}A = I.

22
New cards

How to Solve with an Inverse

If A is invertible, the solution to Ax = b is x = A^{-1}b.
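A minimal NumPy sketch of this card, with a made-up 2×2 system (in practice np.linalg.solve is preferred over forming the inverse explicitly):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# x = A^{-1} b, valid because A is invertible (det(A) = 5 != 0)
x = np.linalg.inv(A) @ b
```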

23
New cards

Invertible Matrix Theorem — Core Ideas

For an n×n matrix A, these are equivalent: A is invertible; Ax = b has a unique solution for every b; Ax = 0 has only the trivial solution; A is row equivalent to I_n; det(A) ≠ 0; rank(A) = n; the columns of A span R^n; the columns of A are linearly independent.

24
New cards

Determinant of a 2×2 Matrix

If A has rows (a, b) and (c, d), then det(A) = ad − bc.

25
New cards

Meaning of Determinant

For a square matrix, det(A) ≠ 0 means the matrix is invertible; det(A) = 0 means it is not invertible.

26
New cards

Determinant Row Facts

Swapping two rows changes the sign of the determinant; multiplying a row by c multiplies the determinant by c; adding a multiple of one row to another does not change the determinant.

27
New cards

Determinant of a Triangular Matrix

If A is upper or lower triangular, then det(A) is the product of the diagonal entries.

28
New cards

Cofactor

The cofactor of entry a_{ij} is C_{ij} = (−1)^{i+j} M_{ij}, where M_{ij} is the minor obtained by deleting row i and column j.

29
New cards

Adjoint Matrix

The adjoint matrix is adj(A) = C^T, where C is the cofactor matrix.

30
New cards

Inverse by Adjoint

If det(A) ≠ 0, then A^{-1} = (1/det(A)) adj(A).

31
New cards

Cramer's Rule

If A is invertible, the solution of Ax = b satisfies x_i = det(A_i)/det(A), where A_i is obtained by replacing column i of A with b.
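Cramer's rule translates almost line-for-line into NumPy. This is a sketch with a made-up system; the helper name cramer_solve is ours, not a library function.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's rule (A must be invertible)."""
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b          # replace column i of A with b
        x[i] = np.linalg.det(Ai) / d
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
```

Cramer's rule is mainly of theoretical interest; for numerical work, Gaussian elimination (np.linalg.solve) is cheaper and more stable.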

32
New cards

Vector in R^n

A vector in R^n is an ordered n-tuple, written as ⟨x_1, x_2, …, x_n⟩.

33
New cards

Vector Addition

For vectors u, v ∈ R^n, add componentwise: u + v = ⟨u_1 + v_1, …, u_n + v_n⟩.

34
New cards

Scalar Multiple of a Vector

For scalar c and vector v, cv = ⟨cv_1, …, cv_n⟩.

35
New cards

Norm of a Vector

The norm of x = (x_1, …, x_n) is ‖x‖ = √(x_1^2 + ⋯ + x_n^2).

36
New cards

Vector Space

A vector space is a set with vector addition and scalar multiplication satisfying the vector space axioms.

37
New cards

Subspace Test

A nonempty subset S of a vector space is a subspace if it is closed under addition and scalar multiplication.

38
New cards

Fast Way to Show Not a Subspace

Show either 0 ∉ S, or closure under addition fails, or closure under scalar multiplication fails.

39
New cards

Linear Combination

A vector x is a linear combination of v_1, …, v_k if x = c_1v_1 + ⋯ + c_kv_k for some scalars.

40
New cards

Span

The span of {v_1, …, v_k} is the set of all linear combinations of those vectors.

41
New cards

How to Test if x ∈ Span{v_1, …, v_k}

Solve c_1v_1 + ⋯ + c_kv_k = x; if the system is consistent, then x is in the span.
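The span test above can be sketched in NumPy with a least-squares solve: x is in the span exactly when the best combination reproduces x with zero residual. The vectors here are made up for illustration.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
x  = np.array([2.0, 3.0, 5.0])

# Stack the v_i as columns and solve V c = x in the least-squares sense
V = np.column_stack([v1, v2])
c, _, _, _ = np.linalg.lstsq(V, x, rcond=None)

# x is in Span{v1, v2} iff the combination reproduces x exactly
in_span = np.allclose(V @ c, x)
```

Here x = 2 v_1 + 3 v_2, so the system is consistent and in_span is True.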

42
New cards

Linearly Independent

Vectors v_1, …, v_k are linearly independent if c_1v_1 + ⋯ + c_kv_k = 0 implies c_1 = ⋯ = c_k = 0.

43
New cards

Linearly Dependent

Vectors are linearly dependent if there is a nontrivial solution to c_1v_1 + ⋯ + c_kv_k = 0.

44
New cards

Quick Test for Two Vectors

Two vectors are linearly dependent iff one is a scalar multiple of the other.

45
New cards

Dependence Theorem

A set is linearly dependent iff at least one vector in the set can be written as a linear combination of the others.

46
New cards

Basis

A basis for a vector space is a set of vectors that is both linearly independent and spanning.

47
New cards

Standard Basis of R^n

The standard basis is e_1, e_2, …, e_n, where each e_i has a 1 in position i and zeros elsewhere.

48
New cards

Dimension

The dimension of a vector space is the number of vectors in any basis for that space.

49
New cards

Basis Shortcut in R^n

If a set has exactly n vectors in an n-dimensional space, then proving either independence or spanning is enough to conclude it is a basis.

50
New cards

Too Many Vectors Rule

Any set with more than n vectors in an n-dimensional vector space is linearly dependent.

51
New cards

Row Space

The row space of a matrix is the span of its row vectors.

52
New cards

Column Space

The column space of a matrix is the span of its column vectors.

53
New cards

Rank of a Matrix

The rank of a matrix is dim(col A) = dim(row A) and equals the number of pivots.

54
New cards

Basis for Row Space

A basis for the row space is given by the nonzero rows of an REF or RREF of the matrix.

55
New cards

Basis for Column Space

Find the pivot columns in an REF or RREF, then take the corresponding original columns of A.
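The column-space recipe above is easy to run with SymPy (assumed available); the matrix is a made-up example whose middle column is a multiple of the first.

```python
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])

# rref() tells us which columns are pivot columns...
_, pivot_cols = A.rref()

# ...but the basis uses the ORIGINAL columns of A, not the RREF columns
basis = [A.col(j) for j in pivot_cols]
```

Column 1 is skipped because it equals 2 times column 0, so it adds nothing to the span.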

56
New cards

Null Space

The null space of A is Nul(A) = {x : Ax = 0}.

57
New cards

Nullity

The nullity of A is dim(Nul(A)).

58
New cards

Rank-Nullity Theorem

If A is m×n, then rank(A) + nullity(A) = n.
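The theorem can be checked directly in SymPy, computing rank and nullity independently (nullity as the number of basis vectors of the null space). The 2×4 matrix is a made-up example.

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 0]])

rank = A.rank()                  # number of pivots
nullity = len(A.nullspace())     # dimension of Nul(A), found independently

# Rank-nullity: rank + nullity equals the number of columns n
check = (rank + nullity == A.cols)
```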

59
New cards

Coordinate Vector

If B = {v_1, …, v_n} is an ordered basis and x = c_1v_1 + ⋯ + c_nv_n, then [x]_B = ⟨c_1, …, c_n⟩.

60
New cards

How to Find [x]_B

Solve c_1v_1 + ⋯ + c_nv_n = x and use the coefficients as the coordinates.

61
New cards

Transition Matrix

If P is the transition matrix from basis B to basis C, then P[x]_B = [x]_C.

62
New cards

How to Find a Transition Matrix

Place the new basis and old basis into [C | B] and row reduce to [I | P] to get the transition matrix from B to C.
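A sketch of the [C | B] → [I | P] recipe in SymPy, using made-up bases of R^2 stored as matrix columns:

```python
from sympy import Matrix

B = Matrix([[1, 1],    # old basis B: columns (1,0) and (1,1)
            [0, 1]])
C = Matrix([[1, 0],    # new basis C: columns (1,1) and (0,1)
            [1, 1]])

# Form [C | B] and row reduce to [I | P]
M = C.row_join(B)
R, _ = M.rref()
P = R[:, 2:]           # right half is the transition matrix from B to C
```

As a check, the first B-basis vector (1, 0) has B-coordinates (1, 0), and P times (1, 0) gives its C-coordinates (1, −1), which indeed satisfy 1·(1,1) + (−1)·(0,1) = (1,0).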

63
New cards

Linear Transformation

A function T: R^n → R^m is linear if T(u + v) = T(u) + T(v) and T(cu) = cT(u).

64
New cards

Key Properties of Linear Transformations

If T is linear, then T(0) = 0, T(−u) = −T(u), T(u − v) = T(u) − T(v), and T(cu + dv) = cT(u) + dT(v).

65
New cards

Matrix Transformation

Every m×n matrix A defines a linear transformation T(x) = Ax from R^n to R^m.

66
New cards

Standard Matrix of T

The standard matrix of T: R^n → R^m is A = [T(e_1) T(e_2) ⋯ T(e_n)].
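Building the standard matrix column by column from the images of the standard basis vectors, sketched in NumPy with a made-up linear map:

```python
import numpy as np

def T(x):
    # A made-up linear map T: R^2 -> R^2, T(x1, x2) = (x1 + 2*x2, 3*x2)
    return np.array([x[0] + 2 * x[1], 3 * x[1]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Standard matrix: columns are T(e1) and T(e2)
A = np.column_stack([T(e1), T(e2)])
```

Once A is built, T(x) and A @ x agree for every x, which is the whole point of the construction.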

67
New cards

Composition of Linear Transformations

If S has matrix A_S and T has matrix A_T, then T∘S has matrix A_T A_S.

68
New cards

Kernel of a Linear Transformation

The kernel is ker(T) = {x : T(x) = 0}.

69
New cards

Image of a Linear Transformation

The image is im(T) = {T(x) : x ∈ domain}.

70
New cards

Kernel and Null Space

If A is the standard matrix of T, then ker(T) = Nul(A).

71
New cards

Image and Column Space

If A is the standard matrix of T, then im(T) = col(A).

72
New cards

One-to-One Test

A linear transformation is one-to-one iff ker(T) = {0}.

73
New cards

Onto Test

A linear transformation T: R^n → R^m is onto iff rank(T) = m.

74
New cards

Invertible Linear Transformation

A linear transformation T: R^n → R^n is invertible iff it is one-to-one, iff it is onto, iff its standard matrix is invertible.

75
New cards

Eigenvalue

A scalar λ is an eigenvalue of A if there exists a nonzero vector x such that Ax = λx.

76
New cards

Eigenvector

A nonzero vector x satisfying Ax = λx is an eigenvector corresponding to eigenvalue λ.

77
New cards

Characteristic Equation

Eigenvalues are found from det(A − λI) = 0.

78
New cards

How to Find Eigenvectors

For each eigenvalue λ, solve (A − λI)x = 0.
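In practice NumPy finds eigenvalues and eigenvectors in one call; the matrix here is a made-up lower-triangular example, so its eigenvalues should be its diagonal entries 2 and 3.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# Columns of eigvecs are eigenvectors; eigvals[i] pairs with eigvecs[:, i]
eigvals, eigvecs = np.linalg.eig(A)
```

Each pair satisfies the defining equation A v = λ v, equivalently (A − λI)v = 0.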

79
New cards

Eigenspace

The eigenspace for λ is E_λ = {x : (A − λI)x = 0}.

80
New cards

Triangular Matrix Eigenvalues

If A is triangular, its eigenvalues are the diagonal entries.

81
New cards

Distinct Eigenvalues Rule

If an n×n matrix has n distinct eigenvalues, then it has n linearly independent eigenvectors.

82
New cards

Similar Matrices

Matrices A and B are similar if B = P^{-1}AP for some invertible P.

83
New cards

Diagonalizable

A matrix A is diagonalizable if it is similar to a diagonal matrix.

84
New cards

Diagonalization Test

An n×n matrix is diagonalizable iff it has n linearly independent eigenvectors.

85
New cards

How to Build P and D for Diagonalization

Place linearly independent eigenvectors as the columns of P, and place the matching eigenvalues in the corresponding diagonal positions of D so that A = PDP^{-1}.

86
New cards

Why Diagonalization Is Useful

If A = PDP^{-1}, then A^k = PD^kP^{-1}, and powers of diagonal matrices are easy because each diagonal entry is raised to the power k.
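A NumPy sketch of computing a matrix power through diagonalization, with a made-up matrix that has distinct eigenvalues (so it is guaranteed diagonalizable):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.linalg.eig returns eigenvalues and a matrix P whose columns are eigenvectors
evals, P = np.linalg.eig(A)

# A^5 = P D^5 P^{-1}; D^5 just raises each diagonal entry to the 5th power
A5 = P @ np.diag(evals ** 5) @ np.linalg.inv(P)
```

For large k this replaces k − 1 matrix multiplications with two, plus elementwise powers on the diagonal.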

87
New cards

Complex Number Standard Form

A complex number has the form a + bi, where a, b ∈ R and i^2 = −1.

88
New cards

Complex Conjugate

The conjugate of z = a + bi is z̄ = a − bi.

89
New cards

Modulus of a Complex Number

The modulus is |z| = √(a^2 + b^2) for z = a + bi.

90
New cards

Division of Complex Numbers

To divide z/w, multiply numerator and denominator by the conjugate of the denominator.

91
New cards

Polar Form

A complex number can be written as z = r(cos θ + i sin θ), where r = |z|.

92
New cards

Euler Form

The same polar form can be written as z = re^{iθ}.

93
New cards

De Moivre's Theorem

If z = r(cos θ + i sin θ), then z^n = r^n(cos(nθ) + i sin(nθ)).
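De Moivre's theorem can be verified numerically with Python's cmath module; r, θ, and n here are made-up values. With r = 2 and θ = π/6, z^3 should land at 8e^{iπ/2} = 8i.

```python
import cmath

r, theta, n = 2.0, cmath.pi / 6, 3
z = r * (cmath.cos(theta) + 1j * cmath.sin(theta))

# Left side: direct power. Right side: De Moivre's formula.
lhs = z ** n
rhs = r ** n * (cmath.cos(n * theta) + 1j * cmath.sin(n * theta))
```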

94
New cards

Complex Eigenvalues

Some real matrices have complex eigenvalues, often found when the characteristic polynomial has no real roots.

95
New cards

Inner Product in R^n

The standard inner product is u·v = u_1v_1 + ⋯ + u_nv_n.

96
New cards

Orthogonal Vectors

Two vectors are orthogonal if u·v = 0.

97
New cards

Norm from Inner Product

The norm can be written as ‖v‖ = √(v·v).

98
New cards

Orthogonal Complement

If W is a subspace of R^n, then W^⊥ = {x : x·w = 0 for all w ∈ W}.

99
New cards

Orthogonal Set

A set of nonzero vectors is orthogonal if every pair of distinct vectors has dot product zero.

100
New cards

Orthogonal Set Independence

Any orthogonal set of nonzero vectors is linearly independent.