Linear Algebra

51 Terms

1

\textbf{Properties of Transpose}

For any matrices A and B, and scalar c:

1. (A^T)^T = A

2. (A + B)^T = A^T + B^T

3. (cA)^T = cA^T

4. (AB)^T = B^T A^T
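These properties can be spot-checked numerically; below is a small plain-Python sketch (the helper names `transpose` and `matmul` are ours, not from the card):

```python
def transpose(M):
    """Transpose a matrix stored as a list of rows."""
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

assert transpose(transpose(A)) == A                                   # property 1
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))  # property 4: order reverses
```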

2

Inverse of a 2×2 Matrix A=\begin{bmatrix}a & b\\ c & d\end{bmatrix}

A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}, \text{ if } ad - bc \neq 0
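A minimal Python sketch of this formula (the function name `inv2x2` is ours, for illustration only):

```python
def inv2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the ad - bc formula from the card."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("not invertible: ad - bc = 0")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A_inv = inv2x2(4, 7, 2, 6)   # det = 4*6 - 7*2 = 10
```

Multiplying the original matrix by `A_inv` recovers the 2×2 identity, which is a quick way to check the formula.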

3

\textbf{Solving } A\mathbf{x} = \mathbf{b} \textbf{ using the inverse}

\text{If }A\text{ is invertible, then the solution is: }\mathbf{x}=A^{-1}\mathbf{b}

4

\textbf{Product: } z \cdot \overline{z}

z \cdot \overline{z} = |z|^2

5

\textbf{Sum: } z + \overline{z}

z + \overline{z} = 2 \operatorname{Re}(z)

6

\textbf{Difference: } z - \overline{z}

z - \overline{z} = 2i \operatorname{Im}(z)

7

\textbf{Addition: } (a + bi) + (c + di)

= (a + c) + (b + d)i

8

\textbf{Subtraction: } (a + bi) - (c + di)

= (a - c) + (b - d)i

9

\textbf{Division: } \frac{a + bi}{c + di}

=\frac{(a+bi)(c-di)}{(c+di)(c-di)}=\frac{(a+bi)(c-di)}{c^2+d^2}

10

\textbf{Multiplication: } (a + bi)(c + di)

=ac+adi+bci+bdi^2=(ac-bd)+(ad+bc)i
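Python's built-in complex type follows these same identities, so cards 4 through 10 can be spot-checked directly:

```python
z = 3 + 4j

assert z * z.conjugate() == abs(z) ** 2    # card 4: z * z-bar = |z|^2 = 25
assert z + z.conjugate() == 2 * z.real     # card 5: 2 Re(z) = 6
assert z - z.conjugate() == 2j * z.imag    # card 6: 2i Im(z) = 8i

# card 10: (a + bi)(c + di) = (ac - bd) + (ad + bc)i with a,b,c,d = 3,4,1,-2
assert (3 + 4j) * (1 - 2j) == (3*1 - 4*(-2)) + (3*(-2) + 4*1) * 1j
```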

11

\textbf{Properties of Invertible Matrices}

\begin{align*}\text{If }A\text{ is invertible, then } & (A^{-1})^{-1}=A\\ \text{If }A\text{ and }B\text{ are invertible, then } & (AB)^{-1}=B^{-1}A^{-1}\\ \text{If }A\text{ is invertible, then } & (A^{T})^{-1}=(A^{-1})^{T}\end{align*}

12

\textbf{Matrix Multiplication: Important Warnings}

\begin{align*} & \text{1. Not commutative: }AB\neq BA\text{ in general}\\ & \text{2. }AB=AC\nRightarrow B=C\quad\text{(if }A\text{ is not invertible)}\\ & \text{3. }AB=0\nRightarrow A=0\text{ or }B=0\text{ (not necessarily)}\end{align*}
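All three warnings can be demonstrated with concrete 2×2 examples (a plain-Python sketch; the matrices chosen here are ours):

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

Z = [[0, 0], [0, 0]]
A = [[0, 1], [0, 0]]
B = [[1, 0], [0, 0]]

assert matmul(A, B) != matmul(B, A)   # warning 1: AB != BA
assert matmul(A, B) == Z              # warning 3: AB = 0 with A, B nonzero

# warning 2: AB = AC does not force B = C when A is not invertible
A2 = [[1, 0], [0, 0]]
B2 = [[1, 0], [0, 1]]
C2 = [[1, 0], [5, 7]]
assert matmul(A2, B2) == matmul(A2, C2) and B2 != C2
```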

13

\textbf{Must-Know: Invertible Matrix Theorem (for } n \times n \text{ matrix } A)

\begin{align*}1.\;\; & A\text{ has }n\text{ pivot positions}\\ 2.\;\; & A\text{ is invertible}\\ 3.\;\; & Ax=0\text{ has only the trivial solution}\\ 4.\;\; & \text{The columns of }A\text{ are linearly independent}\\ 5.\;\; & \text{The columns of }A\text{ span }\mathbb{R}^{n}\\ 6.\;\; & \text{The transformation }x\mapsto Ax\text{ is one-to-one and onto}\end{align*}

14

\textbf{Standard Rotation Matrix for }\theta\text{ }

R(\theta) = \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}
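Applying this matrix to a point is a two-line computation; for example, rotating (1, 0) by 90° should give (0, 1) up to floating-point error (the helper `rotate` is ours):

```python
import math

def rotate(theta, x, y):
    """Apply the standard rotation matrix R(theta) to the point (x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)

rx, ry = rotate(math.pi / 2, 1, 0)   # rotate (1, 0) by 90 degrees
```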

15

What does it mean for a transformation T(\vec{x})=A\vec{x} to be “one-to-one”?

  • Ax=0 has only the trivial solution (x=0)

  • A has a pivot in every column

  • The columns of A are linearly independent 

  • Different inputs give different outputs

  • Injective

16

What does it mean for a transformation T(\vec{x})=A\vec{x} to be “onto”?

  • For every \vec{b}\in\mathbb{R}^{m}, there is a solution to A\vec{x}=\vec{b}

  • A has a pivot in every row

  • The columns of A span \mathbb{R}^{m}

  • Every vector in the codomain can be reached

17

How do pivot positions relate to one-to-one and onto?

Pivot in every column ⇒ one-to-one

Pivot in every row ⇒ onto

18

What are three key properties of determinants for n \times n matrices?

1. \det(A^T) = \det(A)

2. \det(AB) = \det(A) \cdot \det(B)

3. \text{If } c \in \mathbb{R}, \text{ then } \det(cA) = c^{n} \cdot \det(A)
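These three properties are easy to verify on 2×2 examples (a plain-Python sketch with our own helper functions):

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[2, 1], [5, 3]]
B = [[1, 4], [2, 9]]

assert det2(transpose(A)) == det2(A)              # property 1
assert det2(matmul(A, B)) == det2(A) * det2(B)    # property 2
cA = [[3 * x for x in row] for row in A]
assert det2(cA) == 3 ** 2 * det2(A)               # property 3 with c = 3, n = 2
```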

19

What are some key axioms that define a vector space?

\begin{align*}1.\;\; & \vec{u}+\vec{v}\in V\quad\text{(Closure under addition)}\\ 4.\;\; & \exists\,\vec{0}\in V\text{ such that }\vec{0}+\vec{u}=\vec{u}\quad\text{(Zero vector exists)}\\ 6.\;\; & c\vec{u}\in V\quad\text{(Closure under scalar multiplication)}\end{align*}

20

\text{What is a basis of a vector space?}

A basis of a vector space V is a set of vectors that is:

  • Linearly independent

  • Spans V

21

How do you check if \vec{u}\in\operatorname{Col}A, and how do you find \operatorname{Col}A?

\vec{u}\in\operatorname{Col}(A) if A\vec{x} = \vec{u} has a solution, and \operatorname{Col}A is the span of the columns of A

22

What is the null space of a matrix? How do you check if \vec{u}\in\operatorname{Nul}A?

It is the set of all solutions to A\vec{x}=\vec{0}; if A\vec{u}=\vec{0}, then \vec{u}\in\operatorname{Nul}A

23

\text{What equation defines an eigenvalue and eigenvector of } A?

A\vec{x} = \lambda \vec{x}

Where \lambda is an eigenvalue and \vec{x} \ne \vec{0} is its corresponding eigenvector.

24

\text{How can you rewrite the eigenvalue equation } A\vec{x} = \lambda \vec{x}?

(A - \lambda I)\vec{x} = \vec{0}

This is a homogeneous system with nontrivial solutions when \lambda is an eigenvalue.

25

\text{What is the characteristic equation for eigenvalues of } A?

\det(A - \lambda I) = 0

A scalar \lambda is an eigenvalue of A if and only if it satisfies this equation.

26

How do you compute the dot product of two vectors\vec{u}\text{ and }\vec{v}?

\vec{u} \cdot \vec{v} = u_1v_1 + u_2v_2 + \cdots + u_n v_n or \vec{u}^{T}\vec{v}

27

How do you find the length (magnitude) of a vector \vec{v}?

\|\vec{v}\| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}

This is the distance from the origin to the point \vec{v} in \mathbb{R}^n

28

\text{How do you find the angle } \theta \text{ between two vectors } \vec{u} \text{ and } \vec{v}?

\cos \theta = \frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\| \|\vec{v}\|}

Then use:

\theta = \cos^{-1} \left( \frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\| \|\vec{v}\|} \right)

Where \vec{u} \cdot \vec{v} is the dot product, and \|\vec{u}\|, \|\vec{v}\| are the vector lengths.
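The dot product, length, and angle formulas above can be combined in a few lines of Python (the helpers `dot` and `norm` are ours):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

u, v = [1, 0, 0], [1, 1, 0]
theta = math.acos(dot(u, v) / (norm(u) * norm(v)))
# dot = 1, the norms are 1 and sqrt(2), so theta = pi/4 (45 degrees)
```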

29

\text{How do you find the distance between two vectors } \vec{u} \text{ and } \vec{v}?

\text{Distance} = \|\vec{u} - \vec{v}\| = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + \cdots + (u_n - v_n)^2}

This is the length of the vector \vec{u} - \vec{v}, the displacement from \vec{v} to \vec{u}.

30

\text{What does it mean for two vectors } \vec{u} \text{ and } \vec{v} \text{ to be orthogonal?}

\vec{u} \cdot \vec{v} = 0

Two vectors are orthogonal if their dot product is zero

31

\text{What is an orthogonal set of vectors?}

\text{A set }\{\vec{v}_1,\vec{v}_2,\ldots,\vec{v}_{n}\}\text{ is orthogonal if }\vec{v}_{i}\cdot\vec{v}_{j}=0\text{ for all }i\ne j.

Each pair of distinct vectors in the set is orthogonal.

32

What is true about an orthogonal set of nonzero vectors in \mathbb{R}^n?

\text{If }S=\{\vec{v}_1,\vec{v}_2,\ldots,\vec{v}_{n}\}

is an orthogonal set of nonzero vectors in \mathbb{R}^n,

then S is linearly independent and forms a basis for the subspace it spans.

33

How do you find the weights (coefficients) in a linear combination using an orthogonal basis?

\text{If } \{ \vec{u}_1, \vec{u}_2, \dots, \vec{u}_p \} \text{ is an orthogonal basis for a subspace } W \subseteq \mathbb{R}^n, \text{ then for } \vec{y} \in W:

\vec{y} = c_1\vec{u}_1 + c_2\vec{u}_2 + \dots + c_p\vec{u}_p

c_j = \frac{\vec{y} \cdot \vec{u}_j}{\vec{u}_j \cdot \vec{u}_j} \quad \text{for } j = 1, \dots, p
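The weight formula above makes decomposition a pure arithmetic exercise; here is a sketch with a hypothetical orthogonal basis of \mathbb{R}^2 (vectors chosen by us):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# A hypothetical orthogonal basis for R^2: dot(u1, u2) == 0
u1, u2 = [1, 1], [1, -1]
y = [3, 5]

# Weight formula from the card: c_j = (y . u_j) / (u_j . u_j)
c1 = dot(y, u1) / dot(u1, u1)   # 8 / 2 = 4.0
c2 = dot(y, u2) / dot(u2, u2)   # -2 / 2 = -1.0

# Reconstruct y = c1*u1 + c2*u2
recon = [c1 * a + c2 * b for a, b in zip(u1, u2)]
assert recon == [3, 5]
```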

34

What does the coordinate vector [\vec{x}]_B mean?

[\vec{x}]_{B}=\begin{bmatrix}x_1\\ x_2\\ ...\\ x_{n}\end{bmatrix}

This means that \vec{x} is expressed as a linear combination of the basis vectors in B:

\vec{x} = x_1 \vec{b}_1 + x_2 \vec{b}_2 + \dots + x_n \vec{b}_n

where B=\{\vec{b}_1,\vec{b}_2,\dots,\vec{b}_{n}\} is a basis for the space.

35

What is a change of basis matrix and how is it used?

To convert coordinates from a basis B to the standard basis:

\vec{x} = P_B [\vec{x}]_B

where P_B = \begin{bmatrix} \vec{b}_1 & \vec{b}_2 & \cdots & \vec{b}_n \end{bmatrix} is the change of basis matrix, and each \vec{b}_i is a column vector in basis B.

To convert from standard coordinates to B coordinates:

[\vec{x}]_B = P_B^{-1} \vec{x}
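In small dimensions the product \vec{x} = P_B [\vec{x}]_B is just a linear combination of the basis columns; a sketch with a hypothetical basis of \mathbb{R}^2:

```python
# B = {b1, b2}, a hypothetical basis of R^2 (chosen for illustration)
b1, b2 = [1, 1], [1, -1]
xB = [2, 3]                     # [x]_B, coordinates relative to B

# x = P_B [x]_B, i.e. x = 2*b1 + 3*b2 (the columns of P_B are b1, b2)
x = [xB[0] * b1[i] + xB[1] * b2[i] for i in range(2)]
assert x == [5, -1]
```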

36

How do you construct the change of basis matrix P_{C \leftarrow B}?

P_{C \leftarrow B} = \begin{bmatrix} [\vec{b}_1]_C & [\vec{b}_2]_C \end{bmatrix}. That is, P_{C \leftarrow B} has the coordinate vectors of the basis B vectors written in basis C.

If [\vec{b}_1]_C = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \text{ and } [\vec{b}_2]_C = \begin{bmatrix} y_1 \\ y_2 \end{bmatrix}, then:

If C=\{\vec{c}_1,\vec{c}_2\}, this means \begin{bmatrix}\vec{c}_1 & \vec{c}_2\end{bmatrix}\begin{bmatrix}x_1\\ x_2\end{bmatrix}=\vec{b}_1 and \begin{bmatrix}\vec{c}_1 & \vec{c}_2\end{bmatrix}\begin{bmatrix}y_1\\ y_2\end{bmatrix}=\vec{b}_2, so P_{C\leftarrow B} can be found by row reducing \begin{bmatrix}\vec{c}_1 & \vec{c}_2 & \vec{b}_1 & \vec{b}_2\end{bmatrix} to \begin{bmatrix}I & P_{C\leftarrow B}\end{bmatrix}

37

\dim\operatorname{Nul}A

number of free variables

38

\dim\operatorname{Col}A

number of pivot columns

39

\dim\operatorname{Row}A

number of pivot rows

40

\operatorname{rank}A

number of pivot columns in A = \dim\operatorname{Col}A = \dim\operatorname{Row}A

41

What must be true for a set of vectors to be a basis for a subspace H?

A set \{\vec{v}_1,\vec{v}_2,\ldots,\vec{v}_{n}\} is a basis for H if:

  1. The set is linearly independent

  2. The set spans H

42

How do you normalize a vector to get a unit vector?

To normalize a vector \vec{v}, divide it by its length: \hat{v}=\frac{\vec{v}}{\|\vec{v}\|}

Then \hat{v} is a unit vector: \left\Vert\hat{v}\right\Vert=1

43

What is an orthonormal set?

A set of vectors is orthonormal if:

  1. Every vector in the set is a unit vector

\left\Vert \vec{v}_i \right\Vert = 1

  2. Every pair of distinct vectors is orthogonal

44

Properties of a matrix A with orthonormal columns

A^T A = I

A^T = A^{-1} (when A is square)

45

What is the least squares solution for an inconsistent system A\vec{x}=\vec{b} ?

A^T A \hat{\vec{x}} = A^T \vec{b}

46

What are the steps to compute a least squares approximation?

1. Ensure the system A\vec{x}=\vec{b} is inconsistent (no exact solution).

2. Compute A^T A and A^T \vec{b}

3. Solve the normal equation:

A^T A \hat{\vec{x}} = A^T \vec{b}

4. Compute \hat{\vec{b}}:

\hat{\vec{b}}=A\hat{\vec{x}}

5. Calculate the error:

\|\vec{b} - \hat{\vec{b}}\|
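The steps above can be sketched in plain Python for a small example; the data points here are hypothetical (fitting y = c_0 + c_1 t at t = 0, 1, 2), and the 2×2 normal equations are solved with the inverse formula from card 2:

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Hypothetical inconsistent system: fit y = c0 + c1*t at t = 0, 1, 2
A = [[1, 0], [1, 1], [1, 2]]
b = [[0], [1], [3]]

AtA = matmul(transpose(A), A)    # A^T A = [[3, 3], [3, 5]]
Atb = matmul(transpose(A), b)    # A^T b = [[4], [7]]

# Solve A^T A xhat = A^T b via the 2x2 inverse formula
(p, q), (r, s) = AtA
det = p * s - q * r
xhat = [(s * Atb[0][0] - q * Atb[1][0]) / det,
        (-r * Atb[0][0] + p * Atb[1][0]) / det]
# xhat is approximately [-1/6, 3/2]: intercept and slope of the best-fit line
```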

47

What is a symmetric matrix?

A matrix A is symmetric if it equals its transpose:

A = A^T

48

What is the Gram-Schmidt process?

The Gram-Schmidt process converts a set of linearly independent vectors

\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}

into an orthogonal (or orthonormal) set

\{\vec{u}_1, \vec{u}_2, \dots, \vec{u}_n\}

that spans the same subspace.

The process:

  1. Let \vec{v}_1=\vec{x}_1 and W_1=\operatorname{Span}\{\vec{x}_1\}=\operatorname{Span}\{\vec{v}_1\}

  2. Let \vec{v}_2=\vec{x}_2-\operatorname{proj}_{W_1}\vec{x}_2=\vec{x}_2-\frac{\vec{x}_2\cdot\vec{v}_1}{\vec{v}_1\cdot\vec{v}_1}\vec{v}_1, then let W_2=\operatorname{Span}\{\vec{v}_1,\vec{v}_2\}

    (\vec{v}_2 is produced by subtracting from \vec{x}_2 its projection onto the subspace W_1)

  3. Let \vec{v}_3=\vec{x}_3-\operatorname{proj}_{W_2}\vec{x}_3=\vec{x}_3-\frac{\vec{x}_3\cdot\vec{v}_1}{\vec{v}_1\cdot\vec{v}_1}\vec{v}_1-\frac{\vec{x}_3\cdot\vec{v}_2}{\vec{v}_2\cdot\vec{v}_2}\vec{v}_2, then let W_3=\operatorname{Span}\{\vec{v}_1,\vec{v}_2,\vec{v}_3\}

    (\vec{v}_3 is produced by subtracting from \vec{x}_3 its projection onto the subspace W_2)
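The process above can be sketched as a short Python function (the name `gram_schmidt` and the input vectors are ours):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(xs):
    """Turn a list of linearly independent vectors into an orthogonal set."""
    vs = []
    for x in xs:
        v = list(x)
        for u in vs:                      # subtract projection onto each earlier v
            c = dot(x, u) / dot(u, u)
            v = [vi - c * ui for vi, ui in zip(v, u)]
        vs.append(v)
    return vs

v1, v2 = gram_schmidt([[3, 1], [2, 2]])
# v1 stays [3, 1]; v2 is orthogonal to v1 (dot product ~ 0)
```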

49

What is an orthogonal projection?

An orthogonal projection is a decomposition of a vector \vec{y} into the sum of two orthogonal vectors:

\vec{y} = \hat{\vec{y}} + \vec{z}

Where:

- \hat{\vec{y}} is the projection of \vec{y} onto a vector \vec{u} (in the span of \vec{u})

- \vec{z}=\vec{y}-\hat{\vec{y}} is orthogonal to \vec{u}

50

What is the formula for the orthogonal projection of \vec{y} onto \vec{u} ?

The orthogonal projection of \vec{y} onto \vec{u} is:

\hat{y}=\text{proj}_{\vec{u}}\vec{y}=\frac{\vec{y} \cdot\vec{u}}{\vec{u} \cdot\vec{u}}\vec{u}

To write \vec{y} as a sum of two orthogonal vectors:

\vec{y}=\hat{y}+(\vec{y}-\hat{y})

You can verify the decomposition is orthogonal by checking

\hat{y} \cdot (\vec{y} - \hat{y}) = 0
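A numeric sketch of this projection formula and the orthogonality check (vectors chosen by us):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

y, u = [2, 3], [4, 0]
c = dot(y, u) / dot(u, u)                     # 8 / 16 = 0.5
yhat = [c * ui for ui in u]                   # projection of y onto u: [2.0, 0.0]
z = [yi - yhi for yi, yhi in zip(y, yhat)]    # y - yhat: [0.0, 3.0]

assert dot(yhat, z) == 0                      # the decomposition is orthogonal
```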

51

What does the number of free variables and solution vectors tell us about the geometric description of a solution set?

  • 0 free variables → A point (unique solution)

  • 1 free variable → A line (span of 1 vector)

  • 2 free variables → A plane (span of 2 vectors)

  • 3 free variables → A 3D subspace (in \mathbb{R}^{4} or higher)

The vectors in the solution set describe the directions of that freedom.