Linear Algebra Flashcards

Description and Tags

Flashcards covering topics from a linear algebra course, including linear equations, matrices, vector spaces and diagonalization.


56 Terms

1

Linear Equation

An equation of the form a1x1 + a2x2 + . . . + anxn = b, where a1, a2, . . . , an ∈ R are called the coefficients and b ∈ R is the constant term.

2

Homogeneous Linear Equation

A linear equation on the variables x1, x2, . . . , xn is called homogeneous if it is of the form a1x1 + a2x2 + . . . + anxn = 0.

3

System of Linear Equations

A set of linear equations on the variables x1, x2, . . . , xn.

4

Homogeneous System of Linear Equations

A system of linear equations is called homogeneous if it consists of only homogeneous linear equations.

5

Solution to a Linear Equation

Given a linear equation a1x1 + a2x2 + . . . + anxn = b on the variables x1, x2, . . . , xn, a solution is an n-tuple of real numbers (s1, s2, . . . , sn) such that a1s1+a2s2+. . .+ansn = b.

6

Solution to a System of Linear Equations

Given a system of linear equations on the variables x1, x2, . . . , xn, a solution to the system is an n-tuple of real numbers (s1, s2, . . . , sn) that is a solution to each equation of the system.

7

Number of Solutions to a Homogeneous Linear System

A homogeneous system of linear equations admits either exactly one solution or infinitely many of them.

8

Number of Solutions to a General Linear System

A system of linear equations admits either no solutions or exactly one solution or infinitely many of them.

9

Row Echelon Matrix

A matrix is said to be row echelon if it satisfies the following conditions: i) Rows with only zeros (if any) are at the bottom of the matrix. ii) If we call the k-th pivot the first non-zero coefficient of the k-th row (from left to right), then any pivot must be strictly to the right of the pivot above it.
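The two conditions can be checked mechanically. A minimal sketch in plain Python (the function name is mine; a matrix is a list of rows):

```python
def is_row_echelon(M):
    """Check conditions i) and ii) on a matrix given as a list of rows."""
    last_pivot_col = -1
    seen_zero_row = False
    for row in M:
        # The pivot is the column of the first non-zero coefficient, if any.
        pivot_col = next((j for j, x in enumerate(row) if x != 0), None)
        if pivot_col is None:
            seen_zero_row = True           # a zero row: everything below must also be zero
        else:
            if seen_zero_row:
                return False               # condition i): non-zero row below a zero row
            if pivot_col <= last_pivot_col:
                return False               # condition ii): pivot not strictly to the right
            last_pivot_col = pivot_col
    return True
```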

10

Reduced Echelon Matrix

A matrix is said to be reduced echelon if it is row echelon and, moreover, satisfies two additional conditions: iii) All pivots are equal to 1. iv) In any column containing a pivot, all other coefficients must be equal to zero.

11

Basic and Free Variables

Consider a linear system represented by means of an echelon matrix (possibly reduced). The variables corresponding to pivot columns are called basic variables, whereas the variables corresponding to non-pivot columns are called free variables.

12

Equivalent Linear Systems

Consider two different systems (S1) and (S2) of linear equations on the variables x1, x2, . . . , xn. We say that (S1) and (S2) are equivalent, denoted by (S1) ⇐⇒ (S2), if they have the same set of solutions.

13

Elementary Row Operations on a Linear System

There are three elementary row operations that transform a linear system into an equivalent one: 1. Multiplication of a row by a non-zero scalar: Lk ← λLk where λ ∈ R∗. 2. Row permutation: Lk ←→ Lj. 3. Addition of a multiple of another row: Lk ← Lk + λLj where λ ∈ R and j ≠ k.
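The three operations can be sketched in plain Python on an augmented matrix, here solving the small system x + y = 3, 2x + y = 4 (function names are mine):

```python
def scale_row(M, k, lam):
    """Lk <- lam * Lk; lam must be non-zero to preserve equivalence."""
    assert lam != 0
    M[k] = [lam * x for x in M[k]]

def swap_rows(M, k, j):
    """Lk <-> Lj."""
    M[k], M[j] = M[j], M[k]

def add_multiple(M, k, j, lam):
    """Lk <- Lk + lam * Lj."""
    M[k] = [x + lam * y for x, y in zip(M[k], M[j])]

# Augmented matrix of the system x + y = 3, 2x + y = 4.
A = [[1, 1, 3], [2, 1, 4]]
add_multiple(A, 1, 0, -2)   # L2 <- L2 - 2 L1  gives [0, -1, -2]
scale_row(A, 1, -1)         # L2 <- -L2        gives [0, 1, 2], so y = 2
add_multiple(A, 0, 1, -1)   # L1 <- L1 - L2    gives [1, 0, 1], so x = 1
```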

14

Vector Space over R

A set E is called a real vector space if it is equipped with two operations, internal addition and external multiplication (by real scalars), that satisfy certain axioms.

15

Linear Combination

Let E be a real-vector space and let u1, u2 be two vectors of E. Then, a linear combination of u1 and u2 is a vector of the form: αu1 + βu2, where α, β ∈ R.

16

Vector Subspace

Let E be a vector space and F ⊂ E a non-empty subset of E. We say that F is a vector subspace if “it is stable under linear combinations”.

17

Linear Span of a Family of Vectors

Let E be a vector space and F = {u1, . . . , un} a finite family of vectors in E. The subset of all linear combinations of vectors of F is a vector subspace called the linear span of F (also: “the subspace generated by F”) and denoted by Span(F).

18

Sum of Vector Subspaces

Let E be a vector space and F1, F2 two vector subspaces of E. Then, the sum F1 + F2 is the vector subspace of E defined by F1 + F2 = {u ∈ E | ∃v1 ∈ F1, ∃v2 ∈ F2, u = v1 + v2}.

19

Direct Sum, Supplementary Subspaces

Let E be a vector space and F1, F2 two vector subspaces of E. We say that F1 and F2 are in direct sum (or that F1 and F2 are supplementary), denoted E = F1 ⊕ F2, if the following two conditions hold: 1. E = F1 + F2. 2. F1 ∩ F2 = {0}.

20

Spanning Set

A spanning set is a family of vectors from which one can build all other vectors in the vector space. More precisely, let E be a vector space and F be a family of vectors in E. Then F is a spanning set of E if we have Span(F) = E.

21

Linearly Independent Set

A linearly independent set is a family of vectors such that none of them can be written as linear combination of the other vectors of the family. More precisely, let E be a vector space and F = {u1, . . . , un} be a finite family of vectors. Then F is a linearly independent set if we have λ1u1 + . . . + λnun = 0 =⇒ λ1 = . . . = λn = 0.

22

Basis

Let E be a vector space. A basis of E is a family of vectors that is both a spanning set and a linearly independent set.

23

Finite-Dimensional Vector Spaces

A vector space E is said to be finite-dimensional if it has a finite spanning set.

24

Dimension of a Vector Space

Let E be a finite-dimensional vector space. The dimension of E, denoted dim(E), is the number of vectors of any basis of E.

25

Matrix Associated to a Family of Vectors

Let E be an n-dimensional vector space, B = {b1, . . . , bn} a basis of E and F = {f1, . . . , fp} a finite family of vectors in E. Then the matrix associated to the family F in the basis B, denoted MatB(F), is the matrix of p columns and n rows where the coefficients of the j-th column are the coordinates in the basis B of the vector fj.

26

Linear Map

A linear map is a map between vector spaces that respects the two fundamental operations of linear algebra (internal addition and external multiplication). More precisely, let A and B be two vector spaces. A map f : A −→ B is said to be a linear map if: ∀u1, u2 ∈ A, ∀α, β ∈ R, f(αu1 + βu2) = αf(u1) + βf(u2).

27

Endomorphism

An endomorphism is a linear map from a vector space A to itself.

28

Isomorphism

An isomorphism is a linear map that is also a bijection.

29

Automorphism

An automorphism is a linear map that is both an endomorphism and an isomorphism.

30

Image of a Linear Map

Given a linear map f : A −→ B, its image Im(f) is the subset of the codomain B defined by Im(f) = {b ∈ B | ∃a ∈ A, f(a) = b}.

31

Rank of a Linear Map

Given a linear map f : A −→ B, the rank of f, denoted rank(f), is the dimension of Im(f).

32

Kernel of a Linear Map

Given a linear map f : A −→ B, its kernel Ker(f) is the subset of the domain A defined by Ker(f) = {a ∈ A | f(a) = 0B}.

33

Matrix Associated to a Linear Map

Let A and B be two finite-dimensional vector spaces. Let f : A −→ B be a linear map. Let A = {a1, a2, . . . , an} be a basis of A and B = {b1, b2, . . . , bp} be a basis of B. Then, the matrix associated to the linear map f in the bases A and B, denoted MatBA(f), is defined as the matrix associated to the family f(A) in the basis B.

34

The Rank-Nullity Theorem

Let f : A −→ B be a linear map between two finite-dimensional vector spaces. Then, we have rank(f) + dim(Ker(f)) = dim(A).

35

Matrix

A matrix M is simply a table of p rows and n columns where each element is a real number.

36

Column Vector, Square Matrix

A matrix is called a column vector if it has only one column and a square matrix if it has as many rows as columns.

37

Diagonal and Triangular Matrices

A square matrix M is called • diagonal if all its non-diagonal elements are 0. • upper triangular if all the elements below the diagonal are 0. • lower triangular if all the elements above the diagonal are 0.

38

Multiplication by a Scalar

Let M be a matrix with p rows and n columns and let λ be a real number. Then λM is the matrix found by multiplying all elements of M by λ: (λM)ij = λMij .

39

Addition of Matrices

Let M and N be two matrices of the same size. Then one can define their sum M + N by (M + N)ij = Mij + Nij .

40

Product of a Matrix and a Vector

Let M ∈ Mpn(R) be a matrix with p rows and n columns. Denote by C1, C2, . . . , Cn the n columns of M. Moreover, let X ∈ Rn be a column vector with n rows and coordinates x1, x2, . . . , xn. Then, the product MX is the column vector with p rows defined by MX = x1C1 + x2C2 + . . . + xnCn.
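This column-combination view translates directly into plain Python (the function name is mine; a matrix is a list of rows):

```python
def mat_vec(M, X):
    """MX as the linear combination x1*C1 + ... + xn*Cn of the columns of M."""
    p, n = len(M), len(M[0])
    result = [0] * p
    for j in range(n):          # add X[j] times column j of M
        for i in range(p):
            result[i] += X[j] * M[i][j]
    return result

# [[1, 2], [3, 4]] times (5, 6) is 5*(1, 3) + 6*(2, 4) = (17, 39)
```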

41

Product of Two Matrices

Let M ∈ Mpn(R) be a matrix with n columns and N ∈ Mnk(R) a matrix with n rows. Denote by N1, N2, . . . , Nk the columns of N. Then, the product MN is the matrix with p rows and k columns whose columns are the vectors MN1, MN2, . . . , MNk.
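A sketch of this column-by-column definition in plain Python (the function name is mine):

```python
def mat_mul(M, N):
    """MN column by column: the j-th column of MN is M times the j-th column of N."""
    p, n, k = len(M), len(N), len(N[0])
    product = [[0] * k for _ in range(p)]
    for j in range(k):
        column_j = [N[i][j] for i in range(n)]   # j-th column of N
        for i in range(p):
            product[i][j] = sum(M[i][t] * column_j[t] for t in range(n))
    return product
```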

42

Identity Matrix

The square matrix of size n with 1 in all diagonal elements and 0 everywhere else is called the identity matrix and denoted In.

43

Invertible Matrix

Let M be a square matrix of size n. We say that M is invertible if there exists another square matrix N of size n such that MN = NM = In.

44

Transpose of a Matrix

Let M ∈ Mpn(R) be any matrix with p rows and n columns. Then, the transpose of M, denoted Mt ∈ Mnp(R), is the matrix with n rows and p columns whose columns are the rows of M. More precisely, we have (Mt)ij = Mji.

45

Symmetric and Skew-Symmetric Matrices

A square matrix M ∈ Mnn(R) is said to be symmetric if Mt = M and skew-symmetric if Mt = −M.
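The transpose rule (Mt)ij = Mji and both symmetry tests are easy to express in plain Python (function names are mine):

```python
def transpose(M):
    """(M^t)_ij = M_ji: the rows of M become the columns of M^t."""
    return [[M[i][j] for i in range(len(M))] for j in range(len(M[0]))]

def is_symmetric(M):
    return transpose(M) == M

def is_skew_symmetric(M):
    return transpose(M) == [[-x for x in row] for row in M]
```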

46

Trace of a Square Matrix

Let M ∈ Mnn(R) be a square matrix. Then, the trace of M, denoted Tr(M), is defined as the sum of all the diagonal elements of M: Tr(M) = M11 + M22 + . . . + Mnn.
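In plain Python the definition is a one-liner (the function name is mine):

```python
def trace(M):
    """Sum of the diagonal elements M_kk of a square matrix."""
    return sum(M[k][k] for k in range(len(M)))
```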

47

Determinant of a 2x2 Matrix

For a 2 × 2 matrix M with rows (a, b) and (c, d), its determinant is the real number det(M) = ad − bc.

48

General Definition of the Determinant

Consider a general square matrix M of size n with coefficients mij. Denote by Mi,j the square matrix of size n − 1 obtained by deleting row i and column j from the matrix M. Then, expanding along the first row, we have det(M) = m11 det(M1,1) − m12 det(M1,2) + . . . + (−1)^(n+1) m1n det(M1,n), where the k-th term carries the sign (−1)^(k+1).
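The expansion is naturally recursive, bottoming out at size 1. A sketch in plain Python (the function name is mine; this is exponential-time, for illustration only):

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for k in range(n):
        # Minor M_{1,k+1}: delete row 0 and column k.
        minor = [row[:k] + row[k + 1:] for row in M[1:]]
        total += (-1) ** k * M[0][k] * det(minor)
    return total
```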

49

Determinant of a Triangular Matrix

Let T ∈ Mnn(R) be a triangular square matrix. Then the determinant of T is the product of its diagonal terms: det(T) = T11 T22 . . . Tnn.

50

Eigenvectors, Eigenvalues, Eigenspaces

Let E be a vector space and f : E −→ E be an endomorphism on E. Whenever we find a non-zero vector v ∈ E and a value λ ∈ R such that f(v) = λv, we say that v is an eigenvector of f and λ is an eigenvalue of f. The set of all eigenvalues of f is called the spectrum of f and is denoted Spec(f). Given an eigenvalue λ ∈ Spec(f), the set Eλ of all vectors satisfying f(v) = λv is called the eigenspace of f associated to the eigenvalue λ.
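For an endomorphism represented by a matrix, the condition f(v) = λv can be verified numerically. A minimal sketch in plain Python (the function name and example matrix are mine):

```python
def is_eigenvector(M, v, lam):
    """Check M v = lam * v for a non-zero vector v (matrix as list of rows)."""
    assert any(x != 0 for x in v), "eigenvectors must be non-zero"
    Mv = [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]
    return Mv == [lam * x for x in v]

# For M = [[2, 1], [0, 3]]: v = (1, 1) gives Mv = (3, 3) = 3v, so 3 is an eigenvalue.
```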

51

Geometric Multiplicity of an Eigenvalue

Let E be a vector space, f an endomorphism on E and λ ∈ R an eigenvalue of f. Then, the dimension of Eλ is called the geometric multiplicity of λ.

52

Characteristic Polynomial of an Endomorphism or Matrix

Let f be an endomorphism on a finite-dimensional vector space E. The characteristic polynomial of f is the quantity Pf(x) = det(f − x idE), where x ∈ R.

53

Algebraic Multiplicity of an Eigenvalue

Let f be an endomorphism on a finite-dimensional vector space E and λ ∈ Spec(f) an eigenvalue of f. Then, the multiplicity of λ as a root of the characteristic polynomial Pf is called the algebraic multiplicity of λ.

54

Diagonalizable Endomorphism

Let E be a finite-dimensional vector space and f be an endomorphism on E. We say that f is diagonalizable if there exists a basis B such that MatB(f) is a diagonal matrix.

55

Diagonalizable Matrix

Let M be a square matrix of size n. We say that M is diagonalizable if there exists an invertible matrix P and a diagonal matrix D such that M = PDP⁻¹.
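A small worked check in plain Python, under my own choice of example: M = [[2, 1], [0, 3]] has eigenvalues 2 and 3 with eigenvectors (1, 0) and (1, 1), so P holds those eigenvectors as columns and D carries the eigenvalues (the helper function and the hand-computed inverse are mine):

```python
def mul(A, B):
    """Plain matrix product of two matrices given as lists of rows."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[1, 1], [0, 1]]        # columns: eigenvectors (1, 0) and (1, 1)
D = [[2, 0], [0, 3]]        # eigenvalues on the diagonal
P_inv = [[1, -1], [0, 1]]   # inverse of P, computed by hand
M = mul(mul(P, D), P_inv)   # reconstructs [[2, 1], [0, 3]]
```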

56

Transition Matrix

Let A be a finite-dimensional vector space and B, C two bases of A. The transition matrix representing the basis C in the basis B, denoted PBC, is the matrix whose columns are the coordinates of the vectors of C in the basis B.