Linear Algebra Final Exam Study Guide


32 Terms

1

Find a basis and state the dimension for the subspace

Convert the subspace's spanning set to a matrix and row reduce. The pivot columns tell you which of the original columns form the basis, and the number of pivot columns equals the dimension.

2

Find the dimension of the subspace spanned by

Convert the vectors to a matrix and row reduce; the dimension is the number of pivot columns.

3

Determine the dimension of Nul A, Col A, and Row A for the matrix

dim Nul A = number of columns - number of pivot columns

dim Col A = number of pivot columns

dim Row A = number of linearly independent rows (this equals dim Col A, the rank)
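The pivot counting above can be sketched with NumPy; the 3x4 matrix here is a made-up example, and `np.linalg.matrix_rank` stands in for counting pivot columns after row reduction:

```python
import numpy as np

# Made-up 3x4 example matrix (its third row is row1 + row2)
A = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 3.],
              [1., 2., 1., 4.]])

n_cols = A.shape[1]
rank = np.linalg.matrix_rank(A)   # number of pivot columns

dim_col = rank                    # dim Col A = number of pivot columns
dim_row = rank                    # dim Row A equals dim Col A (the rank)
dim_nul = n_cols - rank           # Rank Theorem: dim Nul A = columns - rank
```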

4

Show that the polynomials form a basis for IPn

Check that the number of polynomials matches dim IPn = n + 1, write each polynomial's coefficients as a column of a matrix (vertically, NOT horizontally), and check that the columns are linearly independent.

5

If the nullity of an n×m matrix A is x, what are the dimensions of the column and row space of A?

By the Rank Theorem, x (the nullity) + the dimension we are looking for = m (the number of columns), so dim Col A = dim Row A = m - x.

6

Let A = {a1, a2, a3} and B = {b1, b2, b3} be bases for a vector space V, and suppose a1 = , a2 = , and a3 =

FIND the change of coordinates matrix from A to B

Convert to matrix HORIZONTALLY

7

Let A = {a1, a2, a3} and B = {b1, b2, b3} be bases for a vector space V, and suppose a1 = , a2 = , and a3 =

Find [x](A or B) for x =

Write x's coordinate vector as a column matrix and multiply it by the appropriate change of coordinates matrix

8

Let A = {a1, a2, a3} and B = {b1, b2, b3} be bases for a vector space V, and suppose a1 = , a2 = , and a3 =

FIND the change of coordinates matrix from B to A

Convert to matrix VERTICALLY

9

Let B = {b1, b2} and C = {c1, c2} be bases for IR2. Find the change of coordinates matrix from B to C and from C to B

The matrix from B to C equals (inverse of C) * B, where B and C have the basis vectors as columns:

calculate the inverse of C

multiply it with B

The matrix from C to B is the inverse of the matrix from B to C, so just invert the first answer
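A quick NumPy check of this recipe, with made-up bases (the basis vectors sit in the columns of B and C):

```python
import numpy as np

# Hypothetical bases for IR2, basis vectors as columns
B = np.array([[1., 2.],
              [1., 3.]])
C = np.array([[1., 0.],
              [1., 1.]])

# Change of coordinates from B to C: (inverse of C) * B
P_BtoC = np.linalg.solve(C, B)   # same as inv(C) @ B, without forming inv(C)

# From C to B: the inverse of the first answer
P_CtoB = np.linalg.inv(P_BtoC)

# Sanity check: the two matrices undo each other
assert np.allclose(P_BtoC @ P_CtoB, np.eye(2))
```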

10

In P2, find the change of coordinate matrix from the basis B = {} to the standard basis. Then write t^2 as a linear combination of the polynomials in B

Write the coefficients of each polynomial in B as a column (vertically); that matrix is the change of coordinates matrix. Then solve the system whose right-hand side has a 1 in the third row (the t^2 entry) and 0 elsewhere.

11

Is λ = an eigenvalue of ? Why or why not?

Compute det(A - λI), i.e. subtract λ from each diagonal entry and take the determinant. λ is an eigenvalue if and only if the determinant is 0.
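A sketch of the determinant test with a made-up matrix and candidate λ = 3:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
lam = 3.0

# λ is an eigenvalue iff det(A - λI) = 0
d = np.linalg.det(A - lam * np.eye(2))
is_eigenvalue = np.isclose(d, 0.0)
```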

12

Is the vector A an eigenvector of the matrix B? Why or why not

Multiply B by A and check whether the result is a scalar multiple of A. If BA = λA for some scalar λ, then A is an eigenvector of B with eigenvalue λ.
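The same check in NumPy, with a made-up matrix and candidate vector: multiply, then test for a common scalar factor.

```python
import numpy as np

B = np.array([[2., 0.],
              [0., 3.]])
v = np.array([1., 0.])   # candidate eigenvector (nonzero)

Bv = B @ v
lam = Bv[0] / v[0]       # candidate factor, read off a nonzero entry of v
is_eigvec = np.allclose(Bv, lam * v)   # True iff Bv is a scalar multiple of v
```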

13

Find the basis (eigenvector(s)) corresponding to the listed eigenvalue

Subtract the eigenvalue from each diagonal entry and solve the resulting system. It should not have a unique solution: there will be at least one free variable, and the general solution written in terms of the free variable(s) gives the eigenvector(s). If there are multiple free variables, write the solution as a combination of multiple vectors, one per free variable.

14

Find the characteristic polynomial and the eigenvalues of the matrix

Compute the determinant of the matrix with λ subtracted from each diagonal entry. That determinant is the characteristic polynomial, and the eigenvalues are its roots, found by factoring or the quadratic formula.
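For a 2x2 matrix the characteristic polynomial works out to λ^2 - (trace)λ + det, so the recipe can be cross-checked like this (made-up matrix):

```python
import numpy as np

A = np.array([[4., 2.],
              [1., 3.]])

tr, det = np.trace(A), np.linalg.det(A)
# Roots of λ^2 - tr*λ + det = 0 (the quadratic-formula step)
eigs = np.roots([1.0, -tr, det])

# Cross-check against NumPy's eigenvalue routine
assert np.allclose(sorted(eigs), sorted(np.linalg.eigvals(A)))
```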

15

Let A = PDP^-1 where P = , D =

Compute A^4

Diagonalization makes A^4 = PD^4P^-1:

compute D^4 (raise each diagonal entry to the 4th power), compute P^-1, multiply P by D^4, then multiply PD^4 by P^-1
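These steps can be sketched with made-up P and D; raising the diagonal matrix D to the 4th power just raises its diagonal entries:

```python
import numpy as np

P = np.array([[1., 1.],
              [0., 1.]])
D = np.diag([2., 3.])

D4 = np.diag(np.diag(D) ** 4)     # raise each diagonal entry to the 4th
A4 = P @ D4 @ np.linalg.inv(P)    # A^4 = P D^4 P^-1

# Agrees with multiplying A = P D P^-1 by itself four times
A = P @ D @ np.linalg.inv(P)
assert np.allclose(A4, np.linalg.matrix_power(A, 4))
```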

16

Let A = PDP^-1 where P = , D =

Use the Diagonalization Theorem to find the eigenvalues of A and a basis for the eigenspace if

The eigenvalues of A are the diagonal entries of D (the second matrix after the equals sign). Match each eigenvalue with its corresponding column of P (the first matrix after the equals sign); those columns are the eigenspace bases. If an eigenvalue is repeated, its eigenspace basis contains multiple columns.

17

The eigenvalues of A = are λ = . Diagonalize the matrix A

For each eigenvalue, subtract it from each diagonal entry of A, row reduce, solve the system, and write the solution in vector form; that gives an eigenvector. Then A = PDP^-1, where P has all the eigenvectors as its columns and D is the matrix that is zero except for the eigenvalues on its diagonal (in matching order). To find P^-1, place P and the identity matrix side by side and row reduce until P becomes the identity; the identity side then becomes P^-1.
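A numerical version of the whole card, assuming a made-up symmetric matrix; `np.linalg.eig` returns the eigenvalues and unit eigenvectors (the columns of P) that the hand computation would produce up to scaling:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigvals)            # eigenvalues on the diagonal, matching order

# Diagonalization: A = P D P^-1
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```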

18

VECTORS: Let u = , v =

u*u, v*u and (v*u)/(u*u)

u * u squares each entry of u and adds the results

v * u multiplies corresponding entries and adds them. Then (v*u)/(u*u) is the second value divided by the first.

19

VECTORS: Let u = , v =

||u||

compute u * u and take its square root
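Both dot-product cards in NumPy, with made-up u and v; `u @ u` is the dot product:

```python
import numpy as np

u = np.array([3., 4.])
v = np.array([1., 2.])

uu = u @ u                 # u*u: 3^2 + 4^2 = 25
vu = v @ u                 # v*u: 1*3 + 2*4 = 11
ratio = vu / uu            # (v*u)/(u*u)

norm_u = np.sqrt(uu)       # ||u|| = sqrt(u*u)
assert np.isclose(norm_u, np.linalg.norm(u))
```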

20

Find a unit vector in the direction of the given vector

unit vector = vector divided by its norm (|| ||)

calculate the norm, then divide each entry of the original vector by the norm and simplify if needed

21

Find the distance between u = and z =

compute u - z, then calculate norm of u - z
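The unit-vector and distance recipes together, with made-up vectors:

```python
import numpy as np

u = np.array([3., 4.])
z = np.array([6., 0.])

# Unit vector: divide each entry by the norm
unit_u = u / np.linalg.norm(u)
assert np.isclose(np.linalg.norm(unit_u), 1.0)

# Distance between u and z: the norm of u - z
dist = np.linalg.norm(u - z)
```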

22

Determine which pairs of vectors are orthogonal

Two vectors are orthogonal when their dot product equals 0. The dot product multiplies each entry of one vector with the corresponding entry of the other and adds the products.

23

Determine if the set {v1, v2, v3} is orthogonal where v1 = , v2 = , v3 =

Check every pair: the set is orthogonal if the dot product of each pair is zero.
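Checking every pair, with a made-up set in IR3:

```python
import numpy as np

v1 = np.array([1., 0., 1.])
v2 = np.array([0., 1., 0.])
v3 = np.array([1., 0., -1.])

# Orthogonal set: the dot product of every pair is zero
pairs = [(v1, v2), (v1, v3), (v2, v3)]
is_orthogonal = all(np.isclose(a @ b, 0.0) for a, b in pairs)
```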

24

Show that v = { , } is an orthogonal basis for IR2 and write x = as a linear combination of the vectors in v

An orthogonal basis for IR2 is two nonzero orthogonal vectors (nonzero orthogonal vectors are automatically linearly independent). For the linear combination, compute a = v1 * v1, b = x * v1, c = v2 * v2, and d = x * v2; then x = (b/a)v1 + (d/c)v2.

25

Show that v = { , , } is an orthogonal basis for IR3 and write x = as a linear combination of the vectors in v

Check that every pair of the three vectors is orthogonal (nonzero orthogonal vectors are automatically linearly independent). For the linear combination, compute a = v1 * v1, b = x * v1, c = v2 * v2, d = x * v2, e = v3 * v3, and f = x * v3; then x = (b/a)v1 + (d/c)v2 + (f/e)v3.
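The weight formula c_i = (x * v_i)/(v_i * v_i), sketched with a made-up orthogonal basis of IR2 (the IR3 case just adds one more term):

```python
import numpy as np

v1 = np.array([1., 1.])    # made-up orthogonal basis vectors
v2 = np.array([1., -1.])
x = np.array([3., 1.])

c1 = (x @ v1) / (v1 @ v1)  # b/a in the card's notation
c2 = (x @ v2) / (v2 @ v2)  # d/c

# x is exactly c1*v1 + c2*v2
assert np.allclose(c1 * v1 + c2 * v2, x)
```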

26

Let y = and u = . Write y as the sum of two orthogonal vectors, one in span{u} and one orthogonal to u.

Compute a = y * u and b = u * u. Then y^ (the projection of y onto span{u}) = (a/b)u, z = y - y^, and finally y = y^ + z.
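The orthogonal decomposition with made-up y and u:

```python
import numpy as np

y = np.array([7., 6.])
u = np.array([4., 2.])

y_hat = ((y @ u) / (u @ u)) * u   # projection of y onto span{u}
z = y - y_hat                     # component orthogonal to u

assert np.isclose(z @ u, 0.0)     # z really is orthogonal to u
assert np.allclose(y_hat + z, y)  # and y is their sum
```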

27

Determine which sets of vectors are orthonormal

Orthonormal means the vectors are orthogonal and every vector has a norm of 1.

28

Write x = as a sum of two vectors, one in span{u1,u2,u3} and the other in span{u4} where u1 = , u2 = , u3 = , u4 = if {u1,u2,u3,u4} is an orthogonal basis for IR^4

Compute a = x * u1, b = u1 * u1, c = x * u2, d = u2 * u2, e = x * u3, f = u3 * u3, g = x * u4, and h = u4 * u4. The vector in span{u1,u2,u3} is (a/b)u1 + (c/d)u2 + (e/f)u3, computed as a single vector; the vector in span{u4} is (g/h)u4, which also equals x minus the first vector. Write x = first vector + second vector (don't add them, just write both with an addition sign in the middle).

29

Find the closest point to y = in the subspace spanned by v1 = and v2 =

y^ = ((y * v1)/(v1 * v1))v1 + ((y * v2)/(v2 * v2))v2. Convert the answer to a single vector.

30

Find the best approximation to z = by vectors of the form c1v1 + c2v2 where v1 = and v2 =

z^ = ((z * v1)/(v1 * v1))v1 + ((z * v2)/(v2 * v2))v2. Convert the answer to a single vector.
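The closest point / best approximation formula with made-up orthogonal v1, v2 spanning a plane in IR3; the error z - z^ is orthogonal to the subspace:

```python
import numpy as np

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])
z = np.array([2., 3., 5.])

# z_hat: the closest point to z in span{v1, v2}
z_hat = ((z @ v1) / (v1 @ v1)) * v1 + ((z @ v2) / (v2 @ v2)) * v2

# The leftover z - z_hat is orthogonal to both spanning vectors
assert np.isclose((z - z_hat) @ v1, 0.0)
assert np.isclose((z - z_hat) @ v2, 0.0)
```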

31

Let y = , u1 = , and w = span{u1}

Let u be the 2x1 matrix whose only column is u1. Compute u^Tu and uu^T

u^Tu is just the dot product u1 * u1 (a 1x1 matrix, i.e. a scalar). uu^T is not a dot product: multiply the 2x1 matrix u by the 1x2 matrix u^T to get a 2x2 matrix.

32

Let y = , u1 = , and w = span{u1}

Compute proj_w y and (uu^T)y

proj_w y = ((y * u1)/(u1 * u1))u1; convert the answer to a vector.

(uu^T)y: form the 2x2 matrix uu^T as in the previous card (matrix multiplication, not a dot product), then multiply it by y as a matrix product.
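Both computations side by side with made-up column vectors; note that proj_w y equals (uu^T)y divided by the scalar u^Tu, so the two answers agree:

```python
import numpy as np

y = np.array([[2.], [6.]])   # 2x1 column matrices
u = np.array([[1.], [1.]])

uTu = (u.T @ u).item()       # 1x1 matrix -> scalar, the dot product u1*u1
uuT = u @ u.T                # 2x2 outer product (matrix multiplication)

proj = ((y.T @ u).item() / uTu) * u   # proj_w y = ((y*u1)/(u1*u1)) u1
assert np.allclose(proj, (uuT @ y) / uTu)
```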