Solution of a System
A solution is a set of values for the variables that satisfies every equation in the system.
Consistent vs Inconsistent
A system is consistent if it has at least one solution; it is inconsistent if it has no solution.
Possible Numbers of Solutions
A linear system has either 0 solutions, exactly 1 solution, or infinitely many solutions.
Coefficient Matrix
The coefficient matrix records only the coefficients of the variables in a linear system.
Augmented Matrix
The augmented matrix [A∣b] is formed by attaching the constants column b to the coefficient matrix A.
REF
REF means row echelon form: all nonzero rows are above zero rows, each leading entry is to the right of the one above it, and entries below each leading entry are zero.
RREF
RREF means reduced row echelon form: it is REF and each leading entry is 1 and is the only nonzero entry in its column.
Elementary Row Operations
The three row operations are: swap two rows; multiply a row by a nonzero scalar; add a multiple of one row to another.
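The three operations can be sketched in Python on a matrix stored as a list of row lists (the helper names are illustrative, not from any library):

```python
# Illustrative sketch of the three elementary row operations on a
# matrix stored as a list of row lists.

def swap_rows(M, i, j):
    M[i], M[j] = M[j], M[i]

def scale_row(M, i, c):
    assert c != 0  # scaling by zero is not an elementary operation
    M[i] = [c * x for x in M[i]]

def add_multiple(M, i, j, c):
    # replace row i by (row i) + c * (row j)
    M[i] = [a + c * b for a, b in zip(M[i], M[j])]

M = [[1, 2], [3, 4]]
swap_rows(M, 0, 1)         # [[3, 4], [1, 2]]
scale_row(M, 0, 2)         # [[6, 8], [1, 2]]
add_multiple(M, 1, 0, -1)  # [[6, 8], [-5, -6]]
```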
Equivalent Systems
Two linear systems are equivalent if they have the same solution set.
Pivot
A pivot is a leading entry in a row of an echelon form matrix.
Basic Variable
A basic variable corresponds to a pivot column.
Free Variable
A free variable corresponds to a non-pivot column and can take arbitrary parameter values.
General Solution
The general solution writes all variables in terms of the free variables, often in parametric vector form.
Homogeneous System
A homogeneous system has the form Ax=0 and always has at least the trivial solution x=0.
Parametric Vector Form
Write the solution as x=p+sv1+tv2+⋯, where p is a particular solution and the vectors multiply free parameters.
Matrix Size
If a matrix has m rows and n columns, its size is m×n.
Matrix Addition
Matrix addition is defined only for matrices of the same size and is done entry-by-entry.
Scalar Multiplication
To multiply a matrix by a scalar, multiply every entry by that scalar.
Matrix Multiplication
If A is m×n and B is n×p, then AB is defined and has size m×p.
Identity Matrix
The identity matrix In is the n×n matrix with ones on the diagonal and zeros elsewhere.
Inverse of a Matrix
A square matrix A is invertible if there exists a matrix A−1 such that AA−1=A−1A=I.
How to Solve with an Inverse
If A is invertible, the solution to Ax=b is x=A−1b.
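As a small illustration, x=A−1b can be computed directly from the 2×2 inverse formula A−1=(1/(ad−bc))[[d,−b],[−c,a]]; this hand-coded helper is a sketch for the 2×2 case only (real work should use row reduction or a library solver):

```python
# Solve a 2x2 system Ax = b via the explicit inverse formula.
# Illustrative only; not how larger systems should be solved.

def solve_2x2(A, b):
    (a, bb), (c, d) = A
    det = a * d - bb * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    # x = A^{-1} b, with A^{-1} = (1/det) * [[d, -bb], [-c, a]]
    x1 = (d * b[0] - bb * b[1]) / det
    x2 = (-c * b[0] + a * b[1]) / det
    return [x1, x2]

print(solve_2x2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```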
Invertible Matrix Theorem — Core Ideas
For an n×n matrix A, these are equivalent: A is invertible; Ax=b has a unique solution for every b; Ax=0 has only the trivial solution; A is row equivalent to In; det(A)≠0; rank(A)=n; the columns of A span Rn; the columns of A are linearly independent.
Determinant of a 2×2 Matrix
If A=((a,b),(c,d)), then det(A)=ad−bc.
Meaning of Determinant
For a square matrix, det(A)≠0 means the matrix is invertible; det(A)=0 means it is not invertible.
Determinant Row Facts
Swapping two rows changes the sign of the determinant; multiplying a row by c multiplies the determinant by c; adding a multiple of one row to another does not change the determinant.
Determinant of a Triangular Matrix
If A is upper or lower triangular, then det(A) is the product of the diagonal entries.
Cofactor
The cofactor of entry aij is Cij=(−1)i+jMij, where Mij is the minor obtained by deleting row i and column j.
Adjoint Matrix
The adjoint matrix is adj(A)=CT, where C is the cofactor matrix.
Inverse by Adjoint
If det(A)≠0, then A−1=(1/det(A))adj(A).
Cramer's Rule
If A is invertible, the solution of Ax=b satisfies xi=det(Ai)/det(A), where Ai is obtained by replacing column i of A with b.
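A direct 2×2 sketch of Cramer's rule (helper names are illustrative):

```python
# Cramer's rule for a 2x2 system: x_i = det(A_i) / det(A), where A_i
# has column i replaced by b. Written out explicitly for illustration.

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def cramer_2x2(A, b):
    d = det2(A)
    if d == 0:
        raise ValueError("Cramer's rule needs det(A) != 0")
    A1 = [[b[0], A[0][1]], [b[1], A[1][1]]]  # column 1 replaced by b
    A2 = [[A[0][0], b[0]], [A[1][0], b[1]]]  # column 2 replaced by b
    return [det2(A1) / d, det2(A2) / d]

print(cramer_2x2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```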
Vector in Rn
A vector in Rn is an ordered n-tuple, written as ⟨x1,x2,…,xn⟩.
Vector Addition
For vectors u,v∈Rn, add componentwise: u+v=⟨u1+v1,…,un+vn⟩.
Scalar Multiple of a Vector
For scalar c and vector v, cv=⟨cv1,…,cvn⟩.
Norm of a Vector
The norm of x=(x1,…,xn) is ∣x∣=√(x1²+⋯+xn²).
Vector Space
A vector space is a set with vector addition and scalar multiplication satisfying the vector space axioms.
Subspace Test
A nonempty subset S of a vector space is a subspace if it is closed under addition and scalar multiplication.
Fast Way to Show Not a Subspace
Show either 0∉S, or closure under addition fails, or closure under scalar multiplication fails.
Linear Combination
A vector x is a linear combination of v1,…,vk if x=c1v1+⋯+ckvk for some scalars.
Span
The span of v1,…,vk is the set of all linear combinations of those vectors.
How to Test if x∈Span{v1,…,vk}
Solve c1v1+⋯+ckvk=x; if the system is consistent, then x is in the span.
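This span test can be sketched with exact row reduction over Python's Fraction type; the implementation below is a minimal illustration (not optimized), which row reduces the augmented matrix and looks for an inconsistent row [0 … 0 ∣ nonzero]:

```python
from fractions import Fraction

# x is in Span{v1,...,vk} iff the augmented system [v1 ... vk | x]
# is consistent. Row reduce with exact fractions and check for a
# row of the form [0 ... 0 | nonzero].

def in_span(vectors, x):
    rows = len(x)
    # columns of the augmented matrix: the vectors, then x
    M = [[Fraction(v[i]) for v in vectors] + [Fraction(x[i])]
         for i in range(rows)]
    pivot_row = 0
    for col in range(len(vectors)):
        pr = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pr is None:
            continue  # no pivot in this column
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col] / M[pivot_row][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    # inconsistent iff some row is all zeros except the last entry
    return not any(all(e == 0 for e in row[:-1]) and row[-1] != 0 for row in M)

print(in_span([[1, 0], [0, 1]], [3, 4]))  # True
print(in_span([[1, 2], [2, 4]], [1, 0]))  # False: (1,0) not a multiple of (1,2)
```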
Linearly Independent
Vectors v1,…,vk are linearly independent if c1v1+⋯+ckvk=0 implies c1=⋯=ck=0.
Linearly Dependent
Vectors are linearly dependent if there is a nontrivial solution to c1v1+⋯+ckvk=0.
Quick Test for Two Vectors
Two vectors are linearly dependent iff one is a scalar multiple of the other.
Dependence Theorem
A set is linearly dependent iff at least one vector in the set can be written as a linear combination of the others.
Basis
A basis for a vector space is a set of vectors that is both linearly independent and spanning.
Standard Basis of Rn
The standard basis is e1,e2,…,en, where each ei has a 1 in position i and zeros elsewhere.
Dimension
The dimension of a vector space is the number of vectors in any basis for that space.
Basis Shortcut in Rn
If a set has exactly n vectors in an n-dimensional space, then proving either independence or spanning is enough to conclude it is a basis.
Too Many Vectors Rule
Any set with more than n vectors in an n-dimensional vector space is linearly dependent.
Row Space
The row space of a matrix is the span of its row vectors.
Column Space
The column space of a matrix is the span of its column vectors.
Rank of a Matrix
The rank of a matrix is dim(colA)=dim(rowA) and equals the number of pivots.
Basis for Row Space
A basis for the row space is given by the nonzero rows of an REF or RREF of the matrix.
Basis for Column Space
Find pivot columns in an REF or RREF, then take the corresponding original columns of A.
Null Space
The null space of A is Nul(A)={x:Ax=0}.
Nullity
The nullity of A is dim(Nul(A)).
Rank-Nullity Theorem
If A is m×n, then rank(A)+nullity(A)=n.
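A quick numerical check of the theorem, using a small hand-rolled rank function (pivot counting via row reduction with exact fractions; illustrative only):

```python
from fractions import Fraction

# rank(A) = number of pivots of a row reduced copy of A;
# nullity(A) = n - rank(A) by the rank-nullity theorem.

def rank(A):
    M = [[Fraction(x) for x in row] for row in A]
    m, n = len(M), len(M[0])
    piv = 0
    for col in range(n):
        pr = next((r for r in range(piv, m) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[piv], M[pr] = M[pr], M[piv]
        for r in range(m):
            if r != piv and M[r][col] != 0:
                f = M[r][col] / M[piv][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[piv])]
        piv += 1
    return piv

A = [[1, 2, 3], [2, 4, 6], [1, 0, 1]]  # second row = 2 * first row
n = len(A[0])
r = rank(A)
print(r, n - r)  # rank 2, nullity 1, and 2 + 1 = n = 3
```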
Coordinate Vector
If B={v1,…,vn} is an ordered basis and x=c1v1+⋯+cnvn, then [x]B=⟨c1,…,cn⟩.
How to Find [x]B
Solve c1v1+⋯+cnvn=x and use the coefficients as the coordinates.
Transition Matrix
If P is the transition matrix from basis B to basis C, then P[x]B=[x]C.
How to Find a Transition Matrix
Place the new basis C and the old basis B as columns in the augmented matrix [C∣B] and row reduce to [I∣P]; the right block P is the transition matrix from B to C.
Linear Transformation
A function T:Rn→Rm is linear if T(u+v)=T(u)+T(v) and T(cu)=cT(u).
Key Properties of Linear Transformations
If T is linear, then T(0)=0, T(−u)=−T(u), T(u−v)=T(u)−T(v), and T(cu+dv)=cT(u)+dT(v).
Matrix Transformation
Every m×n matrix A defines a linear transformation T(x)=Ax from Rn to Rm.
Standard Matrix of T
The standard matrix of T:Rn→Rm is A=[T(e1) T(e2) ⋯ T(en)], whose columns are the images of the standard basis vectors.
Composition of Linear Transformations
If S has matrix AS and T has matrix AT, then T∘S has matrix ATAS.
Kernel of a Linear Transformation
The kernel is ker(T)={x:T(x)=0}.
Image of a Linear Transformation
The image is im(T)={T(x):x∈domain of T}.
Kernel and Null Space
If A is the standard matrix of T, then ker(T)=Nul(A).
Image and Column Space
If A is the standard matrix of T, then im(T)=col(A).
One-to-One Test
A linear transformation is one-to-one iff ker(T)={0}.
Onto Test
A linear transformation T:Rn→Rm is onto iff rank(T)=m.
Invertible Linear Transformation
A linear transformation T:Rn→Rn is invertible iff it is one-to-one iff it is onto iff its standard matrix is invertible.
Eigenvalue
A scalar λ is an eigenvalue of A if there exists a nonzero vector x such that Ax=λx.
Eigenvector
A nonzero vector x satisfying Ax=λx is an eigenvector corresponding to eigenvalue λ.
Characteristic Equation
Eigenvalues are found from det(A−λI)=0.
How to Find Eigenvectors
For each eigenvalue λ, solve (A−λI)x=0.
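For a 2×2 matrix with real eigenvalues, the whole recipe (characteristic equation, then solve (A−λI)x=0) can be carried out by hand; the helper below is an illustrative sketch that reads an eigenvector off a row of A−λI:

```python
import math

# For a 2x2 real matrix A with real eigenvalues: solve the
# characteristic equation lambda^2 - tr(A) lambda + det(A) = 0,
# then for each lambda pick a nonzero solution of (A - lambda I) x = 0.

def eig_2x2(A):
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        raise ValueError("complex eigenvalues; use cmath instead")
    root = math.sqrt(disc)
    pairs = []
    for lam in [(tr + root) / 2, (tr - root) / 2]:
        # row (a - lam, b) of A - lam I annihilates (b, lam - a);
        # the other row is dependent since A - lam I is singular
        if b != 0:
            v = (b, lam - a)
        elif c != 0:
            v = (lam - d, c)
        else:  # A is diagonal
            v = (1, 0) if lam == a else (0, 1)
        pairs.append((lam, v))
    return pairs

print(eig_2x2([[2, 1], [1, 2]]))  # eigenvalues 3 and 1
```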
Eigenspace
The eigenspace for λ is Eλ={x:(A−λI)x=0}.
Triangular Matrix Eigenvalues
If A is triangular, its eigenvalues are the diagonal entries.
Distinct Eigenvalues Rule
If an n×n matrix has n distinct eigenvalues, then it has n linearly independent eigenvectors.
Similar Matrices
Matrices A and B are similar if B=P−1AP for some invertible P.
Diagonalizable
A matrix A is diagonalizable if it is similar to a diagonal matrix.
Diagonalization Test
An n×n matrix is diagonalizable iff it has n linearly independent eigenvectors.
How to Build P and D for Diagonalization
Place linearly independent eigenvectors as the columns of P, and place the matching eigenvalues in the corresponding diagonal positions of D so that A=PDP−1.
Why Diagonalization Is Useful
If A=PDP−1, then Ak=PDkP−1, and powers of diagonal matrices are easy because each diagonal entry is raised to the power k.
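A concrete check, using the example matrix A=[[2,1],[1,2]], whose eigenvalues are 3 and 1 with eigenvectors (1,1) and (1,−1) (P, P−1, and D below are built from that example by hand):

```python
# A^k = P D^k P^{-1}: computing A^5 for A = [[2, 1], [1, 2]] via its
# diagonalization. D^k just raises each diagonal entry to the k-th power.

def mat_mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[1, 1], [1, -1]]              # eigenvectors (1,1) and (1,-1) as columns
P_inv = [[0.5, 0.5], [0.5, -0.5]]  # inverse of P, computed by hand
k = 5
Dk = [[3 ** k, 0], [0, 1 ** k]]    # D^k: eigenvalues raised to the k-th power
Ak = mat_mul(mat_mul(P, Dk), P_inv)
print(Ak)  # equals A^5 = [[122, 121], [121, 122]]
```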
Complex Number Standard Form
A complex number has the form a+bi, where a,b∈R and i²=−1.
Complex Conjugate
The conjugate of z=a+bi is z̄=a−bi.
Modulus of a Complex Number
The modulus is ∣z∣=√(a²+b²) for z=a+bi.
Division of Complex Numbers
To divide z/w, multiply numerator and denominator by the conjugate of the denominator.
Polar Form
A complex number can be written as z=r(cosθ+isinθ), where r=∣z∣.
Euler Form
The same polar form can be written as z=reiθ.
De Moivre's Theorem
If z=r(cosθ+isinθ), then z^n=r^n(cos(nθ)+isin(nθ)).
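The theorem can be verified numerically with the standard library's cmath and math modules:

```python
import cmath
import math

# De Moivre's theorem: z^n computed directly should match
# r^n (cos(n*theta) + i sin(n*theta)), where r = |z|, theta = arg(z).

z = 1 + 1j
r, theta = abs(z), cmath.phase(z)
n = 6
direct = z ** n  # (1+1j)^6 = -8j
via_de_moivre = (r ** n) * complex(math.cos(n * theta), math.sin(n * theta))
print(abs(direct - via_de_moivre) < 1e-9)  # True
```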
Complex Eigenvalues
Some real matrices have complex eigenvalues, often found when the characteristic polynomial has no real roots.
Inner Product in Rn
The standard inner product is u⋅v=u1v1+⋯+unvn.
Orthogonal Vectors
Two vectors are orthogonal if u⋅v=0.
Norm from Inner Product
The norm can be written as ∣v∣=√(v⋅v).
Orthogonal Complement
If W is a subspace of Rn, then W⊥={x:x⋅w=0 for all w∈W}.
Orthogonal Set
A set of nonzero vectors is orthogonal if every pair of distinct vectors has dot product zero.
Orthogonal Set Independence
Any orthogonal set of nonzero vectors is linearly independent.
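Pairwise orthogonality is easy to check with dot products (illustrative helper names):

```python
from itertools import combinations

# A set of vectors is orthogonal iff every pair of distinct vectors
# has dot product zero.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_set(vectors):
    return all(dot(u, v) == 0 for u, v in combinations(vectors, 2))

S = [[1, 1, 0], [1, -1, 0], [0, 0, 2]]
print(is_orthogonal_set(S))  # True
```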