Week 2: Linear algebra


Description and Tags

The essence of linear algebra.


37 Terms

1
New cards

What is a vector?

CS: An ordered list of numbers.

Math: arrows in space that happen to have a nice numerical representation.

2
New cards

What are the two fundamental vector operations, and how do they work?

1 Vector addition

Adding two vectors produces a new vector; it is the same as walking along the first vector and then continuing from its tip along the second vector.

2 Scalar multiplication

Squishes, stretches, or flips a vector.
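A minimal NumPy sketch of both operations (the vectors are hypothetical, chosen for illustration):

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])

# Vector addition: walk along v, then continue from v's tip along w.
total = v + w          # [4.0, 1.0]

# Scalar multiplication: 2 stretches, 0.5 squishes, -1 flips.
stretched = 2 * v      # [2.0, 4.0]
flipped = -1 * v       # [-1.0, -2.0]
```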

3
New cards

What is the process of stretching, squishing or reversing the direction of a vector called?

Scaling

4
New cards

What is a scalar?

A number that scales a vector.

5
New cards

Describe a linear combination

the process of scaling and adding vectors.

6
New cards

span

the set of all possible vectors you can reach with a linear combination of a given pair of vectors.
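A sketch of this idea: if two vectors are independent, their span is the whole plane, so any target point can be reached by some linear combination (vectors and target here are made up for illustration):

```python
import numpy as np

v = np.array([1.0, 0.0])
w = np.array([1.0, 1.0])

# v and w are linearly independent, so their span is all of R^2:
# any target is a linear combination a*v + b*w.
target = np.array([3.0, 5.0])

# Solve [v w] @ [a, b] = target for the coefficients.
a, b = np.linalg.solve(np.column_stack([v, w]), target)
assert np.allclose(a * v + b * w, target)
```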

7
New cards

When is a vector linearly independent? 

If it adds another dimension to the span.

8
New cards

When is a vector linearly dependent?

If a vector does not add a new dimension to the span, meaning it can be expressed as a linear combination of already included vectors. 
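One way to check dependence numerically is the rank of the matrix whose columns are the vectors: the rank is the dimension of their span. A sketch with hypothetical vectors:

```python
import numpy as np

v = np.array([1.0, 0.0])
w = np.array([2.0, 0.0])   # w = 2v: linearly dependent on v

# Stack vectors as columns; the rank is the dimension of their span.
assert np.linalg.matrix_rank(np.column_stack([v, w])) == 1  # still a line

u = np.array([0.0, 1.0])   # adds a new dimension: independent of v
assert np.linalg.matrix_rank(np.column_stack([v, u])) == 2  # full plane
```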

9
New cards

What are linear transformations?

a way to move around space such that gridlines remain parallel and evenly spaced, and the origin remains fixed. They are like functions in the sense that for each input they spit out an output.

10
New cards

What is a matrix?

A representation of a linear transformation.
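Concretely, the columns of a matrix record where the basis vectors land under the transformation. A sketch using a 90° rotation as an example:

```python
import numpy as np

# A 90° counterclockwise rotation: its columns record where the
# basis vectors i-hat = (1,0) and j-hat = (0,1) land.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

i_hat = np.array([1.0, 0.0])
j_hat = np.array([0.0, 1.0])
assert np.allclose(A @ i_hat, A[:, 0])  # i-hat lands on the first column
assert np.allclose(A @ j_hat, A[:, 1])  # j-hat lands on the second column
```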

11
New cards

What is a composition?

The product of two distinct linear transformations (matrices). 

12
New cards

Does order matter for a composition?

Yes! A × B is generally not equal to B × A. This is because matrix multiplication involves taking the dot product of the rows of the first matrix with the columns of the second, and swapping the matrices changes which one provides the rows and which provides the columns.
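A quick numerical check, composing two example transformations (a rotation and a shear, chosen for illustration):

```python
import numpy as np

rotate = np.array([[0.0, -1.0],
                   [1.0,  0.0]])   # 90 degree rotation
shear  = np.array([[1.0, 1.0],
                   [0.0, 1.0]])    # horizontal shear

# "Shear, then rotate" is not the same transformation as
# "rotate, then shear" (the rightmost matrix is applied first).
assert not np.allclose(rotate @ shear, shear @ rotate)
```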

13
New cards

What is the determinant?

The factor by which a linear transformation scales any area in the space.

14
New cards

Can the determinant be negative and what does this mean? 

Yes, the determinant can be negative, and it means that the orientation of space is inverted (flipped).

15
New cards

What happens when the determinant is zero? 

When the determinant is zero, the accompanying linear transformation squishes all of space into a lower dimension. Every area collapses to zero, so the factor by which areas are scaled, the determinant, is zero.
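The three determinant cards above can be checked numerically (matrices here are hypothetical examples):

```python
import numpy as np

A = np.array([[3.0, 0.0], [0.0, 2.0]])  # stretches areas by 3 * 2
assert np.isclose(np.linalg.det(A), 6.0)

flip = np.array([[0.0, 1.0], [1.0, 0.0]])  # reflection: orientation inverted
assert np.isclose(np.linalg.det(flip), -1.0)

squish = np.array([[1.0, 2.0], [2.0, 4.0]])  # columns are dependent
assert np.isclose(np.linalg.det(squish), 0.0)  # space collapses to a line
```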

16
New cards

Explain how we solve the system of linear equations Ax = v by explaining what the inverse of a matrix does.

For Ax = v, x is unknown, but applying the linear transformation represented by A to x lands exactly on v. That means that if we apply the transformation in reverse, we get back to where we started. So we left-multiply both sides by the inverse: A⁻¹Ax = A⁻¹v. Because multiplying a matrix by its inverse gives the identity matrix I, we get x = A⁻¹v.
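A small worked example (A and v are made up; any invertible A works):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
v = np.array([5.0, 10.0])

x = np.linalg.inv(A) @ v        # x = A^-1 v, valid since det(A) != 0
assert np.allclose(A @ x, v)    # applying A to x lands back on v

# In practice, np.linalg.solve is preferred over forming the inverse.
assert np.allclose(np.linalg.solve(A, v), x)
```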

17
New cards

What are eigenvectors?

Vectors that stay on their span after a linear transformation and only get stretched or squished by some scalar. 

18
New cards

What are eigenvalues? 

Scalars that determine how much the corresponding eigenvector gets stretched or squished during the corresponding linear transformation. 
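Both definitions can be verified with NumPy's eigendecomposition (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[3.0, 1.0], [0.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)  # eigenvectors are the columns

# Each eigenvector only gets scaled by its eigenvalue: A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```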

19
New cards

What is a diagonal matrix? 

A matrix that has zeros everywhere except on its diagonal. The way to interpret this is that all the basis vectors are eigenvectors, with the diagonal entries of the matrix being their eigenvalues.
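A quick check of this interpretation, with a hypothetical diagonal matrix:

```python
import numpy as np

D = np.diag([3.0, 2.0])          # diagonal matrix with entries 3 and 2

# Each basis vector is an eigenvector; its eigenvalue is the diagonal entry.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
assert np.allclose(D @ e1, 3.0 * e1)
assert np.allclose(D @ e2, 2.0 * e2)
```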

20
New cards

Does every matrix have a determinant? And an inverse?

No, only square matrices have determinants and inverses. However, not every square matrix has an inverse: its determinant must be non-zero.

21
New cards

Diagonalization

A square matrix A of order n is called diagonalizable if there is an invertible matrix P such that P⁻¹AP is a diagonal matrix. If A is diagonalizable, then the columns of P are eigenvectors of A and the diagonal entries of P⁻¹AP are the eigenvalues of A.

In other words, you come up with a basis consisting of eigenvectors such that your linear transformation is a diagonal matrix with respect to that basis.
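This definition can be checked numerically; a sketch with a hypothetical diagonalizable matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0], [0.0, 2.0]])
eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors of A

D = np.linalg.inv(P) @ A @ P        # change of basis to the eigenbasis
# In the eigenbasis the transformation is diagonal, with the
# eigenvalues on the diagonal.
assert np.allclose(D, np.diag(eigenvalues))
```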

22
New cards

PCA (an application of eigenvalues/eigenvectors)

Maybe this is less practical, but still an application of eigenvalues/eigenvectors. If you have a data cloud, i.e. multivariate data, you can compute the covariance matrix of the data.

That covariance matrix is positive semidefinite, so it will only have non-negative eigenvalues. If it is full rank (i.e. positive definite), you will have as many eigenvectors as dimensions.

Since the eigenvectors are all orthogonal to each other, they form a new basis for your data cloud: one in which each basis vector points in a direction that maximises the variance along it. The variance along a basis vector is the eigenvalue associated with that particular eigenvector.
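A sketch of this with NumPy, assuming a made-up 2-D data cloud (the random seed and stretching matrix are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data cloud, stretched along the first axis.
data = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 1.0]])

cov = np.cov(data.T)                             # 2x2 covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh: symmetric matrices

# Eigenvalues are non-negative and the eigenvectors are orthonormal.
assert (eigenvalues >= 0).all()
assert np.allclose(eigenvectors.T @ eigenvectors, np.eye(2))

# The variance of the data projected onto each eigenvector is that
# eigenvector's eigenvalue (the principal component variances).
projected = data @ eigenvectors
assert np.allclose(projected.var(axis=0, ddof=1), eigenvalues)
```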

23
New cards

Describe the transpose of a matrix

the transpose of a matrix swaps its rows and columns, so that row 1 of matrix A becomes column 1 of matrix A’.
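A one-line check of this with an example matrix:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

assert (A.T[:, 0] == A[0, :]).all()  # row 1 of A is column 1 of A.T
assert A.T.shape == (3, 2)           # a 2x3 matrix transposes to 3x2
```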

24
New cards

What is a square matrix

a matrix with equal number of rows and columns

25
New cards

The trace

the sum of the diagonal entries.

26
New cards

do all matrices have a trace?

No, only square matrices have a main diagonal and therefore a trace. 

27
New cards

what is a special property of a symmetric matrix?

It is equal to its own transpose.

28
New cards

Does order matter for matrix addition?

No, matrix addition is commutative: A + B = B + A.

29
New cards

Does order matter for matrix multiplication? Why?

Yes, because you multiply the rows of the first matrix by the columns of the second (so the number of columns of the first must equal the number of rows of the second).

30
New cards

What does it mean for the trace of a matrix to be invariant under cyclic permutations?

It means that the trace of a matrix product stays the same as long as the cyclic order doesn’t change:

tr(AB) = tr(BA), and tr(ABC) = tr(BCA) = tr(CAB).

However, a reordering that is not cyclic is in general no longer equal:

tr(ABC) ≠ tr(CBA) in general.
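A numerical sanity check with random matrices (the seed and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = (rng.normal(size=(3, 3)) for _ in range(3))

# Cyclic permutations leave the trace unchanged...
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
assert np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B))

# ...but a non-cyclic reordering generally does not.
assert not np.isclose(np.trace(A @ B @ C), np.trace(C @ B @ A))
```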

31
New cards

Are all square matrices invertible?

No, only square matrices with a non-zero determinant are invertible. A transformation that squishes space into a lower dimension cannot be undone, because no function can map each point of the lower-dimensional output back to a unique input.

32
New cards

Orthogonal matrices

square matrices whose inverse is equal to their transpose, meaning they preserve lengths and angles when used for a transformation. They are useful for representing isometric transformations like rotations and reflections. Their determinant is either 1 or −1. (They are not symmetric in general: most rotation matrices, for example, are not.)
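These properties can be checked on an example rotation matrix (the angle is arbitrary):

```python
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation

# Inverse equals transpose, so Q.T @ Q is the identity.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(np.linalg.inv(Q), Q.T)

# Lengths are preserved under the transformation.
v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))

# The determinant is +1 (rotation) or -1 (reflection).
assert np.isclose(abs(np.linalg.det(Q)), 1.0)
```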

33
New cards

What does Av=λv mean?

It means that the matrix-vector product gives the same result as just scaling the eigenvector v by some (eigen)value lambda.

34
New cards

How do we find the values of Av=λv?

Rewrite the right-hand side to be matrix-vector multiplication as well: λv = (λI)v.

With both sides looking like matrix-vector multiplication, we can subtract off the rhs and factor out the v, giving (A − λI)v = 0.

Now we ask ourselves: when does this new matrix times a non-zero v give the zero vector?

For that we need det(A − λI) = 0.
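This condition can be verified numerically (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[3.0, 1.0], [0.0, 2.0]])

# Eigenvalues are exactly the lambdas with det(A - lambda*I) = 0.
for lam in np.linalg.eigvals(A):
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)

# Any other lambda leaves the determinant non-zero.
assert not np.isclose(np.linalg.det(A - 5.0 * np.eye(2)), 0.0)
```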

35
New cards

When is a matrix positive semidefinite?

A matrix is positive semidefinite if and only if all of its eigenvalues are non-negative (i.e., λi ≥ 0 for all i).
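A sketch of this check, using the fact that any matrix of the form BᵀB is positive semidefinite (B here is made up):

```python
import numpy as np

B = np.array([[2.0, -1.0], [3.0, 0.5]])
A = B.T @ B          # any B.T @ B is positive semidefinite

eigenvalues = np.linalg.eigvalsh(A)   # eigvalsh: for symmetric matrices
assert (eigenvalues >= -1e-12).all()  # all non-negative (up to round-off)

# Equivalent characterisation: x.T @ A @ x >= 0 for every x.
x = np.array([1.0, -2.0])
assert x @ A @ x >= 0
```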

37
New cards

explain orthogonal diagonalization

Orthogonal diagonalization is diagonalization with an orthogonal change-of-basis matrix: a symmetric matrix A can be written as A = QDQ’ (equivalently Q’AQ = D), where the columns of Q are orthonormal eigenvectors of A and D is a diagonal matrix with the eigenvalues on its diagonal. By the spectral theorem, every symmetric matrix can be orthogonally diagonalized.
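A sketch of orthogonal diagonalization with NumPy's `eigh`, using a hypothetical symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric

eigenvalues, Q = np.linalg.eigh(A)  # Q has orthonormal eigenvector columns
assert np.allclose(Q.T @ Q, np.eye(2))                 # Q is orthogonal
assert np.allclose(Q.T @ A @ Q, np.diag(eigenvalues))  # diagonalized
assert np.allclose(Q @ np.diag(eigenvalues) @ Q.T, A)  # A = Q D Q^T
```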