Linear Algebra Principles and Matrix Properties

Description and Tags

This flashcard set covers core concepts from linear algebra including matrix inversion, Cramer's rule, vector space properties, eigenvalues, and inner products as detailed in the lecture notes.

Last updated 8:12 AM on 5/16/26
18 Terms

1

Adjoint Matrix (adj(A))

The transpose of the cofactor matrix C, calculated as adj(A) = C^T.

2

Inverse of a Matrix (A^{-1})

For a matrix A, it is defined as A^{-1} = \frac{1}{\det(A)}\,\text{adj}(A).
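Cards 1 and 2 can be checked numerically. A minimal NumPy sketch (NumPy and the helper name `adjugate` are illustrative additions, not part of the notes) that builds the cofactor matrix C, transposes it, and divides by the determinant:

```python
import numpy as np

def adjugate(A):
    """Adjugate: transpose of the cofactor matrix C of A."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # adj(A) = C^T

A = np.array([[2.0, 1.0], [5.0, 3.0]])       # det(A) = 1
A_inv = adjugate(A) / np.linalg.det(A)       # A^{-1} = adj(A) / det(A)
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```

The result agrees with `np.linalg.inv`; in practice the adjugate formula is used for hand computation, while libraries factorize instead.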

3

Cramer’s Rule

A technique for solving systems of linear equations using determinants, where variables are found via ratios such as x = \frac{D_x}{D}, y = \frac{D_y}{D}, and z = \frac{D_z}{D}.
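A NumPy sketch of Cramer's rule (the function name `cramer` and the example system are illustrative): each D_i is the determinant of A with column i replaced by b, and x_i = D_i / D.

```python
import numpy as np

def cramer(A, b):
    """Solve A x = b by Cramer's rule: x_i = D_i / D."""
    D = np.linalg.det(A)  # must be non-zero for a unique solution
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b                  # replace column i with b to form D_i
        x[i] = np.linalg.det(Ai) / D
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(cramer(A, b))                   # matches np.linalg.solve(A, b)
```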

4

Linearly Independent Vectors

A set of vectors \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n where the equation k_1\mathbf{v}_1 + k_2\mathbf{v}_2 + \dots + k_n\mathbf{v}_n = \mathbf{0} is satisfied only by the trivial solution k_1 = k_2 = \dots = k_n = 0.
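One way to test the trivial-solution condition in practice (a NumPy sketch; the vectors are an illustrative example, not from the notes): stack the vectors as columns and compare the rank to the number of columns.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])

M = np.column_stack([v1, v2])
print(np.linalg.matrix_rank(M) == M.shape[1])    # True: only the trivial solution

M3 = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M3) == M3.shape[1])  # False: v3 = v1 + v2
```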

5

Linearly Dependent Vectors

A set of vectors in which at least one vector can be written as a linear combination (or scalar multiple) of the others; any set containing the zero vector is linearly dependent.

6

Basis

A set of linearly independent vectors that spans a given solution space or subspace.

7

Dimension (\dim)

The number of vectors contained within a basis for a given space.

8

Rank (\text{Rank}(A))

The number of non-zero rows in the reduced row echelon form R of a matrix A.

9

Nullity (\text{Nullity}(A))

The value determined by subtracting the rank from the total number of columns in the matrix (\text{Nullity}(A) = \text{number of columns} - \text{Rank}(A)).
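Cards 8 and 9 together give the rank-nullity relation. A NumPy sketch (the matrix is an illustrative example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # second row = 2 * first row
rank = np.linalg.matrix_rank(A)      # non-zero rows in the RREF
nullity = A.shape[1] - rank          # nullity = number of columns - rank
print(rank, nullity)                 # 1 2
```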

10

Characteristic Equation

The equation defined by \det(\lambda I - A) = 0, used to determine the eigenvalues of a matrix A.
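For a 2x2 matrix the characteristic equation expands to \lambda^2 - \text{tr}(A)\lambda + \det(A) = 0, which can be checked numerically (a NumPy sketch; the matrix is an illustrative example):

```python
import numpy as np

# det(λI - A) = λ^2 - tr(A)·λ + det(A) for a 2x2 matrix A.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]   # λ^2 - 4λ + 3
roots = np.roots(coeffs)                          # eigenvalues: 3 and 1
print(np.allclose(sorted(roots), sorted(np.linalg.eigvals(A))))  # True
```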

11

Eigenvalues (\lambda)

The scalar values produced by solving the characteristic equation \det(\lambda I - A) = 0.

12

Eigenvectors

Non-zero vectors \mathbf{x} that satisfy the system (\lambda I - A)\mathbf{x} = \mathbf{0} for a given eigenvalue \lambda.
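The defining condition (\lambda I - A)\mathbf{x} = \mathbf{0} is equivalent to A\mathbf{x} = \lambda\mathbf{x}, which is easy to verify numerically (a NumPy sketch; the matrix is an illustrative example):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
vals, vecs = np.linalg.eig(A)        # columns of vecs are eigenvectors
for lam, x in zip(vals, vecs.T):
    # (λI - A)x = 0  <=>  A x = λ x, for non-zero x
    print(np.allclose(A @ x, lam * x))
```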

13

Diagonalization (A = PDP^{-1})

A process where a matrix A is decomposed into a matrix of eigenvectors P and a diagonal matrix of eigenvalues D.
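The factorization can be reconstructed directly from the eigendecomposition (a NumPy sketch; the matrix is an illustrative example and is assumed diagonalizable):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
vals, P = np.linalg.eig(A)           # P: eigenvectors as columns
D = np.diag(vals)                    # D: eigenvalues on the diagonal
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = P D P^{-1}
```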

14

Orthogonal Matrices

Two matrices U and V are orthogonal with respect to an inner product if \langle U, V \rangle = 0.

15

Standard Inner Product on M_{22}

An inner product for 2 \times 2 matrices defined as \langle U, V \rangle = \text{tr}(U^T V).
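A NumPy sketch of the trace inner product (the matrices are illustrative examples); note it equals the sum of entrywise products:

```python
import numpy as np

U = np.array([[1.0, 2.0], [3.0, 4.0]])
V = np.array([[0.0, 1.0], [1.0, 0.0]])
inner = np.trace(U.T @ V)                # <U, V> = tr(U^T V)
print(inner)                             # 5.0
print(np.isclose(inner, np.sum(U * V)))  # True: same as entrywise sum
```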

16

Norm (\|U\|)

The magnitude of a vector or matrix, often calculated as the square root of the sum of the squares of its entries.

17

Distance (d(U, V))

The magnitude of the difference between two vectors, defined as \|U - V\|.
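Cards 16 and 17 can be demonstrated with the Frobenius norm (a NumPy sketch; the matrices are illustrative examples):

```python
import numpy as np

U = np.array([[1.0, 2.0], [3.0, 4.0]])
V = np.array([[1.0, 0.0], [0.0, 1.0]])
norm_U = np.sqrt(np.sum(U ** 2))               # sqrt of the sum of squared entries
print(np.isclose(norm_U, np.linalg.norm(U)))   # True: this is the Frobenius norm
dist = np.linalg.norm(U - V)                   # d(U, V) = ||U - V||
print(dist)
```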

18

Weighted Euclidean Inner Product

An inner product in \mathbb{R}^n defined with specific weights for each component, such as \langle \mathbf{u}, \mathbf{v} \rangle = w_1 u_1 v_1 + w_2 u_2 v_2.
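A NumPy sketch of the weighted formula (the function name `weighted_inner` and the values are illustrative; the weights must be positive for this to be an inner product):

```python
import numpy as np

def weighted_inner(u, v, w):
    """<u, v> = w1*u1*v1 + w2*u2*v2 + ... with positive weights w."""
    return np.sum(w * u * v)

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
w = np.array([2.0, 0.5])
print(weighted_inner(u, v, w))       # 2*1*3 + 0.5*2*4 = 10.0
```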