Definition Study


Description: from chat


16 Terms

1

Linear Independence

A set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others.

2

Linearly Dependent

A set of vectors is linearly dependent if at least one of the vectors is a linear combination of the others.
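
A quick numerical check covering both cards above (a sketch using NumPy, not part of the original set): stack the vectors as columns and compare the matrix rank with the number of vectors.

    import numpy as np

    # Columns are the vectors under test.
    indep = np.column_stack([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
    dep = np.column_stack([[1, 2, 3], [2, 4, 6], [0, 1, 0]])

    # Full column rank <=> linearly independent.
    print(np.linalg.matrix_rank(indep) == indep.shape[1])  # True
    print(np.linalg.matrix_rank(dep) == dep.shape[1])      # False: column 2 = 2 * column 1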

3

Zero Vector in a Set

If a set of vectors includes the zero vector, it is automatically linearly dependent: 1·0 + 0·v₂ + … + 0·vₙ = 0 is a nontrivial linear combination that equals the zero vector.

4

Orthogonal Vectors

Two vectors are orthogonal if their dot product is zero.
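
For example (a NumPy sketch, not part of the original set):

    import numpy as np

    u = np.array([1, 2])
    v = np.array([-2, 1])
    w = np.array([1, 1])
    print(np.dot(u, v))  # 0, so u and v are orthogonal
    print(np.dot(u, w))  # 3, so u and w are not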

5

Orthonormal Vectors

A set of vectors is orthonormal if each vector has unit length and they are all mutually orthogonal.

6

Orthogonal Matrix

A square matrix Q is orthogonal if its columns (and rows) are orthonormal vectors, i.e., QᵀQ = I.
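
A rotation matrix is the standard example; the check below (a NumPy sketch, not part of the original set) verifies QᵀQ = I, which also confirms the columns are orthonormal in the sense of the previous card.

    import numpy as np

    theta = np.pi / 3
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    print(np.allclose(Q.T @ Q, np.eye(2)))  # True
    print(np.allclose(Q @ Q.T, np.eye(2)))  # also True: the rows are orthonormal too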

7

Jordan Canonical Form

A block-diagonal form J = P⁻¹AP into which every square matrix can be brought, consisting of Jordan blocks for each eigenvalue; the columns of P are eigenvectors and, when the matrix is not diagonalizable, generalized eigenvectors.
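
NumPy has no Jordan-form routine, but SymPy does; a minimal sketch (not part of the original set) with a defective matrix whose only eigenvalue is 2 and which has a single independent eigenvector:

    from sympy import Matrix

    A = Matrix([[1, 1],
                [-1, 3]])        # characteristic polynomial (x - 2)^2
    P, J = A.jordan_form()       # A = P * J * P**(-1)
    print(J)                     # Matrix([[2, 1], [0, 2]]): one 2x2 Jordan block
    print(A == P * J * P.inv())  # True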

8

Diagonalizable Matrix

An n×n matrix is diagonalizable if it has n linearly independent eigenvectors; then A = PDP⁻¹, where the columns of P are the eigenvectors and D is diagonal with the eigenvalues.
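
A sketch using NumPy (not part of the original set): the eigenvalues here are distinct (5 and 2), so the eigenvector matrix P is invertible and A = PDP⁻¹.

    import numpy as np

    A = np.array([[4., 1.],
                  [2., 3.]])
    vals, P = np.linalg.eig(A)  # eigenvalues 5 and 2, eigenvectors as columns of P
    D = np.diag(vals)
    print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True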

9

Similar Matrices

Two matrices A and B are similar if they represent the same linear transformation in different bases, i.e., A = PBP⁻¹ for some invertible matrix P.

10

Properties of Similar Matrices

Similar matrices have the same characteristic polynomial, eigenvalues, and determinant.
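
The two cards above can be verified numerically (a NumPy sketch, not part of the original set), with P an arbitrary invertible matrix:

    import numpy as np

    A = np.array([[4., 1.],
                  [2., 3.]])
    P = np.array([[1., 2.],
                  [3., 5.]])      # det = -1, so P is invertible
    B = np.linalg.inv(P) @ A @ P  # B is similar to A, since A = P B P⁻¹

    print(np.allclose(np.poly(A), np.poly(B)))              # same characteristic polynomial
    print(np.allclose(np.linalg.det(A), np.linalg.det(B)))  # same determinant
    print(np.allclose(np.sort(np.linalg.eigvals(A)),
                      np.sort(np.linalg.eigvals(B))))       # same eigenvalues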

11

Singular Value Decomposition (SVD)

A = UΣVᵀ, where U and V are orthogonal matrices, and Σ contains the singular values on the diagonal.
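
A sketch with NumPy (not part of the original set); note that np.linalg.svd returns Vᵀ rather than V:

    import numpy as np

    A = np.array([[3., 0.],
                  [4., 5.]])
    U, s, Vt = np.linalg.svd(A)  # s holds the singular values in descending order
    print(np.allclose(A, U @ np.diag(s) @ Vt))  # True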

12

Singular Values

The non-negative square roots of the eigenvalues of AᵀA; they measure how A stretches space.
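
Continuing the example above (a NumPy sketch, not part of the original set): for A = [[3, 0], [4, 5]], AᵀA = [[25, 20], [20, 25]] has eigenvalues 45 and 5, so the singular values are √45 and √5.

    import numpy as np

    A = np.array([[3., 0.],
                  [4., 5.]])
    s = np.linalg.svd(A, compute_uv=False)    # singular values, descending
    eigs = np.linalg.eigvalsh(A.T @ A)[::-1]  # eigenvalues of AᵀA, descending
    print(np.allclose(s, np.sqrt(eigs)))      # True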

13

U and V in SVD

The columns of U are eigenvectors of AAᵀ; the columns of V are eigenvectors of AᵀA.
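
Equivalently, (AᵀA)v = σ²v and (AAᵀ)u = σ²u, which the sketch below checks (NumPy, not part of the original set):

    import numpy as np

    A = np.array([[3., 0.],
                  [4., 5.]])
    U, s, Vt = np.linalg.svd(A)
    for sigma, v, u in zip(s, Vt, U.T):  # rows of Vt and U.T are columns of V and U
        print(np.allclose(A.T @ A @ v, sigma**2 * v),  # v is an eigenvector of AᵀA
              np.allclose(A @ A.T @ u, sigma**2 * u))  # u is an eigenvector of AAᵀ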

14

QR Factorization

A = QR, where Q has orthonormal columns (an orthogonal matrix when A is square) and R is an upper triangular matrix.
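
A sketch with NumPy (not part of the original set); np.linalg.qr returns the reduced factorization by default, so Q has orthonormal columns:

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 0.],
                  [0., 1.]])
    Q, R = np.linalg.qr(A)
    print(np.allclose(A, Q @ R))            # True
    print(np.allclose(Q.T @ Q, np.eye(2)))  # columns of Q are orthonormal
    print(np.allclose(R, np.triu(R)))       # R is upper triangular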

15

Q in QR Factorization

Q contains orthonormal columns that span the column space of A.

16

R in QR Factorization

R is an upper triangular matrix whose entries are the projection coefficients from the Gram-Schmidt process: rᵢⱼ = qᵢ · aⱼ.
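
A classical Gram-Schmidt sketch (not part of the original set, and not numerically robust; library QR routines use Householder reflections instead) that makes those projection coefficients explicit:

    import numpy as np

    def gram_schmidt_qr(A):
        # Classical Gram-Schmidt QR for a matrix A with full column rank.
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for j in range(n):
            v = A[:, j].astype(float)         # working copy of the j-th column
            for i in range(j):
                R[i, j] = Q[:, i] @ A[:, j]   # projection coefficient of a_j onto q_i
                v -= R[i, j] * Q[:, i]        # remove the component along q_i
            R[j, j] = np.linalg.norm(v)       # length of what remains after projecting
            Q[:, j] = v / R[j, j]
        return Q, R

    A = np.array([[1., 1.],
                  [1., 0.],
                  [0., 1.]])
    Q, R = gram_schmidt_qr(A)
    print(np.allclose(A, Q @ R))  # True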
