These flashcards summarize key concepts from Linear Algebra I, focusing on definitions and explanations important for understanding matrix algebra, linear transformations, and eigenvalues.
Complex number
A number that can be expressed in the form z = a + bi, where a is the real part, b is the imaginary part, and i is the imaginary unit (i^2 = -1).
Real n-space
The set of all n-tuples of real numbers, denoted R^n, where n is a positive integer.
Scalar product
The dot product of two vectors u and v in R^n, calculated as u · v = u₁v₁ + u₂v₂ + … + uₙvₙ.
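A quick numerical sketch of this formula using NumPy (the vectors are illustrative, not from the course):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])

# Componentwise products, summed: 1*4 + 2*(-5) + 3*6
dot = np.dot(u, v)
assert dot == 12.0
```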
Norm of a vector
The length of a vector, defined as ||v|| = √(v · v).
Cauchy-Schwarz inequality
For any vectors u and v, |u · v| ≤ ||u|| ||v||.
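The inequality can be checked numerically for any concrete pair of vectors (a NumPy sketch with illustrative values):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 4.0])

lhs = abs(np.dot(u, v))                       # |u · v|
rhs = np.linalg.norm(u) * np.linalg.norm(v)   # ||u|| ||v||
assert lhs <= rhs   # Cauchy-Schwarz holds for every pair of vectors
```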
Matrix algebra
The study of matrices and the operations defined on them, such as addition, scalar multiplication, and matrix multiplication.
Row operations
Operations that can be performed on the rows of a matrix, including row scaling, row interchange, and row replacement.
Gaussian elimination
A method for solving systems of linear equations by applying row operations to bring the augmented matrix to row echelon form; continuing on to reduced row echelon form is known as Gauss–Jordan elimination.
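A minimal sketch of the reduction in Python (a toy implementation for small matrices, using the three row operations from the card; real code would call a library solver):

```python
import numpy as np

def rref(M):
    """Bring M to reduced row echelon form via row operations (sketch)."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        pivot = r + np.argmax(np.abs(A[r:, c]))  # pick the largest entry as pivot
        if np.isclose(A[pivot, c], 0.0):
            continue                             # no pivot in this column
        A[[r, pivot]] = A[[pivot, r]]            # row interchange
        A[r] /= A[r, c]                          # row scaling: make the pivot 1
        for i in range(rows):
            if i != r:
                A[i] -= A[i, c] * A[r]           # row replacement: clear the column
        r += 1
    return A

# Solve x + y = 3, x - y = 1 from the augmented matrix
aug = np.array([[1.0, 1.0, 3.0],
                [1.0, -1.0, 1.0]])
R = rref(aug)   # last column holds the solution x = 2, y = 1
```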
Determinant
A scalar value computed from the entries of a square matrix; the matrix is invertible if and only if its determinant is nonzero.
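A quick invertibility check via the determinant (a NumPy sketch with an illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
d = np.linalg.det(A)            # for a 2x2 matrix: ad - bc = 2*3 - 1*1
invertible = not np.isclose(d, 0.0)
assert invertible               # determinant 5 is nonzero, so A is invertible
```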
Eigenvalues
Scalar values λ such that for a matrix A, there exists a non-zero vector v (eigenvector) with Av = λv.
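The defining equation Av = λv can be verified numerically for each computed eigenpair (a NumPy sketch; the diagonal matrix is chosen so the eigenvalues are easy to read off):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
vals, vecs = np.linalg.eig(A)

# Check the defining property A v = λ v for each eigenpair
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
assert np.allclose(sorted(vals), [2.0, 3.0])
```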
Linear transformation
A function T: R^n → R^m that satisfies T(u + v) = T(u) + T(v) and T(λu) = λT(u) for all u, v in R^n and every scalar λ.
Subspace
A non-empty subset V of R^n that is closed under addition and closed under scalar multiplication.
Null space
For a linear transformation T: R^n → R^m (or its matrix A), the set of all vectors x in R^n with T(x) = 0, equivalently Ax = 0.
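One way to compute a null-space basis numerically is via the singular value decomposition (a sketch; the matrix and the tolerance 1e-10 are illustrative choices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])     # rank 1, so the null space is nontrivial

_, s, Vt = np.linalg.svd(A)

# Rows of Vt whose singular values are (near) zero span the null space
null_basis = Vt[s < 1e-10]
for v in null_basis:
    assert np.allclose(A @ v, 0.0)   # each basis vector maps to the zero vector
```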
Linear independence
A set of vectors is linearly independent if the only linear combination of them that equals the zero vector is the trivial one, with all coefficients zero.
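In practice this can be tested by comparing the rank of the matrix whose columns are the vectors to the number of vectors (a NumPy sketch with an intentionally dependent set):

```python
import numpy as np

# Put the vectors to test as the columns of a matrix
V = np.column_stack([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [1.0, 1.0, 0.0]])   # third vector = first + second

# Independent exactly when the rank equals the number of vectors
independent = np.linalg.matrix_rank(V) == V.shape[1]
assert not independent
```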
Column space
The set of all possible linear combinations of the column vectors of a matrix.
Characteristic polynomial
The polynomial associated with a square matrix A, defined as p_A(λ) = det(A − λI_n); its roots are the eigenvalues of A.
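For a 2×2 matrix the characteristic polynomial can be written down directly from the trace and determinant, and its roots compared with the eigenvalues (a NumPy sketch with an illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a 2x2 matrix, det(A - λI) = λ² - (trace A)·λ + det A
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)

# The roots of the characteristic polynomial are the eigenvalues
assert np.allclose(sorted(roots), sorted(np.linalg.eigvals(A)))
```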
Orthonormal set
A set of vectors that are orthogonal to each other and each have unit length.
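One common way to obtain an orthonormal set numerically is to take the Q factor of a QR factorization (a NumPy sketch; the matrix A is illustrative):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)

# Orthonormal: pairwise dot products are 0 and each column has length 1,
# which together mean Q^T Q is the identity matrix
assert np.allclose(Q.T @ Q, np.eye(2))
```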