Linear Algebra Fundamentals


97 Terms

1. Linear equation

An equation of the form a₁x₁ + a₂x₂ + ⋯ + aₙxₙ = b, where the aᵢ and b are constants and the xᵢ are variables.

2. System of linear equations representation

Represented compactly as Ax = b, using a coefficient matrix A and a constant vector b.

3. Augmented matrix

The matrix obtained by appending the constant terms of a system of equations as an extra column of the coefficient matrix.

4. Gaussian elimination

A method for solving linear systems by using elementary row operations to transform the augmented matrix into row echelon form.

5. Row echelon form

A matrix form in which all nonzero rows lie above any rows of zeros, and the leading entry of each nonzero row is strictly to the right of the leading entry of the row above.

6. Reduced row echelon form

A row echelon form in which each leading entry is 1 and is the only nonzero entry in its column.
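The definitions above can be combined into a short sketch of Gaussian elimination that reduces an augmented matrix all the way to reduced row echelon form. This is a minimal illustration, not a production solver; `rref` is a hypothetical helper name, not a standard NumPy function.

```python
import numpy as np

def rref(A, tol=1e-12):
    """Reduce a matrix to reduced row echelon form (illustrative sketch)."""
    M = A.astype(float).copy()
    rows, cols = M.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Pick the largest available pivot in this column for stability.
        pivot = pivot_row + np.argmax(np.abs(M[pivot_row:, col]))
        if abs(M[pivot, col]) < tol:
            continue  # No pivot here: this column gives a free variable.
        M[[pivot_row, pivot]] = M[[pivot, pivot_row]]  # Swap rows.
        M[pivot_row] /= M[pivot_row, col]              # Scale pivot to 1.
        for r in range(rows):                          # Clear the column.
            if r != pivot_row:
                M[r] -= M[r, col] * M[pivot_row]
        pivot_row += 1
    return M

# Augmented matrix for x + 2y = 5, 3x + 4y = 11 (solution x = 1, y = 2).
aug = np.array([[1.0, 2.0, 5.0], [3.0, 4.0, 11.0]])
R = rref(aug)
```

Reading off the last column of R gives the solution directly, since the left block has been reduced to the identity.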

7. Unique solution in linear equations

Exists when, after row reducing the augmented matrix, the system is consistent and there are no free variables.

8. Pivot position

A location in the matrix that corresponds to a leading entry in its row echelon form.

9. Free variable

A variable that is not a leading (basic) variable in the row echelon form of the matrix; it can take any value.

10. Consistent system significance

A consistent system has at least one solution, while an inconsistent system has none.

11. Matrix addition

Adding corresponding entries of two matrices of the same dimensions.

12. Scalar multiplication

Multiplying every entry of a matrix by a scalar.

13. Matrix multiplication

The (i, j) entry of the product AB is the dot product of row i of the first matrix with column j of the second.

14. Identity matrix

A square matrix with ones on the diagonal and zeros elsewhere; it acts as the multiplicative identity.
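A quick NumPy check of the two cards above — the entry-by-entry rule for matrix products, and the identity matrix leaving a matrix unchanged. The matrices here are arbitrary examples.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Entry (i, j) of A @ B is the dot product of row i of A with column j of B.
C = A @ B
entry_00 = A[0, :] @ B[:, 0]  # 1*5 + 2*7 = 19

# The identity matrix acts as the multiplicative identity: A I = A.
I = np.eye(2, dtype=int)
```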

15. Transpose of a matrix

Obtained by swapping rows with columns: the (i, j) entry of A^T is the (j, i) entry of A.

16. Zero matrix

A matrix in which every entry is zero.

17. Diagonal matrix

A square matrix in which all off-diagonal entries are zero.

18. Matrix equality

Two matrices are equal when they have the same dimensions and all corresponding entries are equal.

19. Matrix inverse

A matrix A has an inverse A^-1 such that AA^-1 = A^-1A = I, where I is the identity matrix.

20. Solution set of a linear system

The set of all solutions satisfying the system of equations.

21. Homogeneous system

A system in which all constant terms are zero, i.e., Ax = 0.

22. Particular solution

A specific solution to a non-homogeneous linear system.

23. General solution

The set of all solutions, often expressed as a particular solution plus the general solution of the associated homogeneous system.

24. Solution set in terms of free variables

Expressed by solving for the basic variables in terms of the free variables.

25. Geometric interpretation of solution set

For two linear equations in two variables, the intersection of lines in the plane; for three linear equations in three variables, the intersection of planes in three-dimensional space.

26. Infinite solutions in a system

Occur when the system is consistent and has free variables; the solution set then forms a line, plane, or higher-dimensional flat.

27. Pivot columns in solution set

Correspond to basic variables determined by the system, while non-pivot columns correspond to free variables.

28. Existence of solutions from row echelon form

The system is inconsistent exactly when the row echelon form of the augmented matrix contains a row that is zero except for a nonzero entry in the constant column (an impossible equation 0 = b with b ≠ 0).

29. Associative property of matrix addition

(A + B) + C = A + (B + C) for matrices A, B, and C of the same dimensions.

30. Distributive property of matrix operations

Matrix multiplication distributes over addition: A(B + C) = AB + AC and (A + B)C = AC + BC.

31. Commutative matrix multiplication condition

Matrices A and B commute if AB = BA; in general, matrix multiplication is not commutative.

32. Matrix inverse definition

A matrix A has an inverse A^-1 if AA^-1 = A^-1A = I, where I is the identity matrix.

33. Inverse of a 2×2 matrix

For a 2×2 matrix A = (a b; c d), the inverse is A^-1 = (1/(ad − bc)) (d −b; −c a), provided ad − bc ≠ 0.
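The 2×2 formula above can be checked numerically: swap the diagonal entries, negate the off-diagonal entries, and divide by the determinant. `inv2` is an illustrative helper, and the matrix is an arbitrary example.

```python
import numpy as np

def inv2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the adjugate formula."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular (ad - bc = 0)")
    return np.array([[d, -b], [-c, a]]) / det

A = np.array([[4.0, 7.0], [2.0, 6.0]])   # det = 4*6 - 7*2 = 10
A_inv = inv2(4.0, 7.0, 2.0, 6.0)
```

Multiplying A by A_inv on either side should recover the identity matrix.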

34. Identity matrix property in multiplication

Acts as the multiplicative identity: AI = IA = A for any matrix A of compatible dimensions.

35. Similar matrices

Matrices A and B are similar if there exists an invertible matrix P such that B = P^-1AP.

36. Determining invertibility of a matrix

A square matrix is invertible if and only if its determinant is non-zero.

37. Transpose of a matrix product

(AB)^T = B^T A^T.

38. Determinant's role in matrix inversion

A non-zero determinant is necessary and sufficient for a square matrix to have an inverse.

39. Determinant

A scalar associated with a square matrix that determines invertibility: the matrix has an inverse exactly when its determinant is non-zero.

40. Elementary matrix

A matrix obtained by performing a single elementary row operation on the identity matrix.

41. Swapping two rows

Swapping two rows of a matrix multiplies the determinant by −1.

42. Multiplying a row by a scalar

Multiplying a row by a scalar k multiplies the determinant by k.

43. Adding a multiple of one row to another row

Adding a multiple of one row to another row does not change the determinant.
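The three row-operation rules above (cards 41–43) can be verified on a small example; the matrix and scalar are arbitrary choices.

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])
d = np.linalg.det(A)  # 2*3 - 1*5 = 1

swapped = A[[1, 0], :]        # swap the two rows: determinant flips sign

scaled = A.copy()
scaled[0] *= 3.0              # scale a row by k = 3: determinant times 3

added = A.copy()
added[1] += 4.0 * added[0]    # add a multiple of one row: determinant unchanged
```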

44. Inverse of an Elementary Matrix

Another elementary matrix, corresponding to the inverse of the row operation.

45. Vector Space

A set of vectors together with two operations (vector addition and scalar multiplication) that satisfy the vector space axioms.

46. Subspace

A subset of a vector space that is itself a vector space under the same operations.

47. Span of Vectors

The set of all linear combinations of the vectors in the set.

48. Basis of a Vector Space

A set of linearly independent vectors that spans the vector space.

49. Null Space of a Matrix

The null space (or kernel) of a matrix A is the set of all vectors x such that Ax = 0.

50. Row Space

The span of the rows of the matrix.

51. Linear Combination

A sum of scalar multiples of vectors.

52. Rank of a Matrix

The dimension of the column space (equivalently, the row space) of the matrix.

53. Rank-Nullity Theorem

For an m×n matrix A, rank(A) + nullity(A) = n, where the nullity is the dimension of the null space.
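The rank-nullity theorem can be checked numerically. One common approach, sketched here, counts the near-zero singular values of the matrix to estimate the nullity; the example matrix is deliberately rank-deficient (its third row is the sum of the first two).

```python
import numpy as np

# A 3x4 matrix whose third row is the sum of the first two, so rank < 3.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)

# Nullity = n minus the number of non-negligible singular values.
s = np.linalg.svd(A, compute_uv=False)
nullity = A.shape[1] - np.count_nonzero(s > 1e-10)
```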

54. Eigenvalue

A scalar λ such that Av = λv for some non-zero vector v, called an eigenvector.

55. Characteristic Polynomial

The polynomial det(A − λI) in λ, where I is the identity matrix; its roots are the eigenvalues of A.
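For the arbitrary symmetric example below, the characteristic polynomial det(A − λI) = λ² − 4λ + 3 has roots 1 and 3, which NumPy recovers directly; each returned column then satisfies the eigenvalue equation Av = λv.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# Roots of det(A - λI) = λ² - 4λ + 3 are the eigenvalues 1 and 3.
eigvals, eigvecs = np.linalg.eig(A)

# Each column v of eigvecs satisfies A v = λ v for the matching λ.
v = eigvecs[:, 0]
lam = eigvals[0]
```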

56. Diagonalizable Matrix

A matrix that can be written as PDP^-1, where D is a diagonal matrix and P is an invertible matrix.

57. Diagonalization Process

Finding a matrix P of eigenvectors and a diagonal matrix D of eigenvalues such that A = PDP^-1.

58. Eigenspace

The eigenspace of an eigenvalue λ is the set of all eigenvectors associated with λ, together with the zero vector.

59. Orthogonal vector

A vector whose dot product with a given vector is zero.

60. Orthogonal vectors

Two vectors are orthogonal if their dot product is zero.

61. Orthogonal set of vectors

A set of vectors in which each pair of distinct vectors is orthogonal.

62. Orthonormal set of vectors

An orthogonal set in which each vector has unit length (norm 1).

63. Gram-Schmidt process

An algorithm for orthogonalizing a set of vectors in an inner product space.
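A minimal sketch of the Gram-Schmidt process: each vector has its projections onto the previously built vectors subtracted off, then is normalized. `gram_schmidt` is an illustrative helper name, and the input vectors are an arbitrary independent pair.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        # Subtract the projection of v onto each already-built unit vector.
        for q in basis:
            w -= (w @ q) * q
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)  # rows are the orthonormal vectors

Q = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
```

Because the rows of Q are orthonormal, Q Q^T equals the identity.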

64. Projection of a vector onto another vector

The projection of v onto u is proj_u v = ((v · u)/(u · u)) u.

65. Orthogonal complement of a subspace

The set of all vectors orthogonal to every vector in the subspace.

66. Significance of orthogonality in least squares problems

In least squares problems, orthogonal projection minimizes the distance between the vector and the subspace.

67. Least squares solution to an overdetermined system

The solution that minimizes the norm of the residual vector b − Ax, found from the normal equations A^T A x = A^T b.
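The normal equations above can be solved directly and compared against NumPy's built-in least-squares routine. The example fits a line y ≈ c0 + c1·t through three arbitrarily chosen points.

```python
import numpy as np

# Overdetermined system: fit y ≈ c0 + c1*t to the points (0,1), (1,2), (2,2).
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Solve the normal equations A^T A x = A^T b directly...
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
# ...and compare with the library least-squares routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Both routes give the same answer here; in practice `lstsq` (QR/SVD-based) is preferred numerically, since forming A^T A squares the condition number.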

68. Relation between orthogonal matrices and orthogonality

An orthogonal matrix Q satisfies Q^T Q = QQ^T = I, so its columns form an orthonormal set.

69. Linear transformation

A function between vector spaces that preserves vector addition and scalar multiplication.

70. Kernel (null space) of a linear transformation

The set of all vectors that map to the zero vector under the transformation.

71. Image (range) of a linear transformation

The set of all possible outputs of the transformation.

72. Rank of a linear transformation

The dimension of the image (range) of the transformation.

73. Nullity of a linear transformation

The dimension of the kernel (null space) of the transformation.

74. Rank-nullity theorem for linear transformations

For a linear transformation T: V → W, rank(T) + nullity(T) = dim(V).

75. Determining if a transformation is invertible

A transformation is invertible if and only if it is both one-to-one (injective) and onto (surjective).

76. Effect of composition of linear transformations

The composition of two linear transformations is itself a linear transformation; its matrix representation is the product of the individual transformation matrices.

77. Matrix of a linear transformation with respect to given bases

The matrix representation depends on the chosen bases for the domain and codomain; its columns record how the transformation acts on the basis vectors.

78. Diagonalization of a matrix

Expressing a matrix A as PDP^-1, where D is a diagonal matrix and P is an invertible matrix.

79. Determining if a matrix is diagonalizable

An n×n matrix is diagonalizable if and only if it has n linearly independent eigenvectors.

80. Diagonal matrix in diagonalization

Contains the eigenvalues of the matrix on its diagonal.

81. Importance of eigenvectors in diagonalization

The eigenvectors form the columns of the matrix P used in diagonalization.

82. Finding eigenvalues of a matrix

Eigenvalues are found by solving the characteristic equation det(A − λI) = 0.

83. Relationship between eigenvalues and the trace of a matrix

The trace of a matrix equals the sum of its eigenvalues.

84. Relationship between eigenvalues and the determinant of a matrix

The determinant of a matrix equals the product of its eigenvalues.
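The trace and determinant identities in the two cards above are easy to confirm numerically on an arbitrary example matrix.

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
eigvals = np.linalg.eigvals(A)  # roots of λ² - 7λ + 10: the values 5 and 2

trace = np.trace(A)        # 4 + 3 = 7, the sum of the eigenvalues
det = np.linalg.det(A)     # 4*3 - 1*2 = 10, the product of the eigenvalues
```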

85. Applying the spectral theorem to symmetric matrices

A symmetric matrix can be diagonalized by an orthogonal matrix.

86. Significance of eigenvalue decomposition

Aids in simplifying matrix functions (such as powers and exponentials) and in solving differential equations.

87. Computing powers of a diagonalizable matrix

If A = PDP^-1, then A^k = PD^kP^-1, where D^k is the diagonal matrix with each eigenvalue raised to the power k.
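The A^k = PD^kP^-1 identity above in NumPy: only the diagonal eigenvalues are raised to the k-th power, and the result matches repeated multiplication. The matrix and exponent are arbitrary choices.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors

k = 5
# A^k = P D^k P^-1: raise only the diagonal eigenvalues to the k-th power.
A_pow = P @ np.diag(eigvals**k) @ np.linalg.inv(P)
```

For large k this costs one eigendecomposition instead of k − 1 matrix multiplications.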

88. Orthogonal diagonalization

Expressing a symmetric matrix A as PDP^T, where D is a diagonal matrix and P is an orthogonal matrix.

89. Spectral theorem for symmetric matrices

Every real symmetric matrix can be diagonalized by an orthogonal matrix.
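The spectral theorem in code: for a symmetric matrix, `numpy.linalg.eigh` returns real eigenvalues and an orthogonal matrix of eigenvectors, so S = PDP^T holds exactly. The matrix is an arbitrary symmetric example.

```python
import numpy as np

S = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric: S equals S^T

# eigh is specialized for symmetric (Hermitian) matrices: it returns real
# eigenvalues and orthonormal eigenvectors, giving S = P D P^T.
eigvals, P = np.linalg.eigh(S)
D = np.diag(eigvals)
```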

90. Finding an orthogonal matrix for orthogonal diagonalization

Formed from the normalized (orthonormal) eigenvectors of the symmetric matrix.

91. Relationship between eigenvalues and orthogonal matrices in orthogonal diagonalization

The eigenvalues appear on the diagonal of D, and the columns of the orthogonal matrix are the normalized eigenvectors.

92. Properties of orthogonal matrices

Orthogonal matrices satisfy Q^T Q = QQ^T = I; their rows and columns are orthonormal.

93. Verifying if a matrix is symmetric

A matrix is symmetric if it is equal to its transpose, i.e., A = A^T.

94. Significance of orthogonal diagonalization in practical applications

Simplifies matrix computations, including solving differential equations and principal component analysis in statistics.

95. Role of the Gram-Schmidt process in orthogonal diagonalization

Used to produce an orthonormal basis (for example, within an eigenspace of dimension greater than one), aiding in constructing the orthogonal matrix P.

96. Effect of orthogonality of eigenvectors on matrix computations

Simplifies computations and improves numerical stability.

97. Computing the matrix exponential for a diagonalizable matrix

If A = PDP^-1, the matrix exponential is e^A = P e^D P^-1, where e^D exponentiates each diagonal entry of D.
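A sketch of the e^A = P e^D P^-1 formula, using a symmetric example so that P is orthogonal and P^-1 = P^T. For this matrix the eigenvalues are 1 and 3, so the closed-form entries of e^A are (e³ + e)/2 on the diagonal and (e³ − e)/2 off it.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric, hence diagonalizable
eigvals, P = np.linalg.eigh(A)           # A = P D P^T with P orthogonal

# e^A = P e^D P^T: exponentiate only the diagonal eigenvalues.
expA = P @ np.diag(np.exp(eigvals)) @ P.T
```

For general (possibly non-diagonalizable) matrices, a dedicated routine such as `scipy.linalg.expm` is the safer choice.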