Linear Algebra Prelim 1 Study Guide

38 Terms

1

Linearly Independent Condition

i) the columns of matrix A are linearly independent

ii) see image

iii) there is a pivot in every column of A

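A quick numerical check of condition iii), assuming NumPy (the matrix below is my own example, not from the card): the columns of A are linearly independent exactly when rank(A) equals the number of columns, i.e. when every column is a pivot column.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 4.0]])  # example 3x2 matrix

# Columns are linearly independent iff rank(A) == number of columns
# (equivalently, the echelon form has a pivot in every column).
print(np.linalg.matrix_rank(A) == A.shape[1])  # True for this A
```
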
2

Linear Independence of Two Vectors

*does not generalize to 3 vectors: even if every subset of 2 vectors is linearly independent, the set of 3 vectors can still be linearly dependent

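A concrete illustration (my own example, not from the card): in R2, any two of e1, e2, and e1 + e2 are linearly independent, yet the three together are linearly dependent.

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = v1 + v2          # (1, 1)

# Every pair is linearly independent (rank 2)...
for a, b in [(v1, v2), (v1, v3), (v2, v3)]:
    print(np.linalg.matrix_rank(np.column_stack([a, b])))  # 2 each time

# ...but the set of all three is dependent: rank 2 < 3 vectors.
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))  # 2
```
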
3

Connection Between Linear Independence and Linear Combination Theorem

see image
4

T/F If S is a linearly dependent set, then each vector is a linear combination of the other vectors in S

False; linear dependence only guarantees that some vector in S is a linear combination of the others, not that each vector is.

5

T/F If a set contains fewer vectors than there are entries in the vectors, then the set is linearly independent.

False; such a set can still be linearly dependent (for example, if one vector is a scalar multiple of another).

6

(T/F) The columns of any 4 x 5 matrix are linearly dependent.

True; any set of five vectors in R4 must be linearly dependent, since a 4 x 5 matrix has at most four pivots and therefore cannot have a pivot in every column.

7

domain, codomain of transformation

see image
8

image, range of transformation

see image
9

Linear Transformation Conditions

see image
10

Linear Transformation Properties

see image
11

matrix transformations and linear transformations

Every matrix transformation is a linear transformation, and conversely every linear transformation from Rn to Rm is a matrix transformation T(x) = Ax for a unique mxn matrix A (its standard matrix).

12

superposition principle

The superposition principle in physics is a direct application of the properties of linear transformations. It states that the net response at a given place and time caused by two or more stimuli is the sum of the responses that would have been caused by each stimulus individually. It is just the linearity conditions T(u + v) = T(u) + T(v) and T(cu) = cT(u) applied.
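
A quick numerical check of these conditions for a matrix transformation, assuming NumPy (the matrix and vectors below are arbitrary choices of mine):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [0.0,  3.0]])

def T(x):
    return A @ x   # a matrix transformation T(x) = Ax

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c1, c2 = 4.0, -2.0

# Net response to a weighted sum of inputs equals the same weighted
# sum of the individual responses.
print(np.allclose(T(c1 * u + c2 * v), c1 * T(u) + c2 * T(v)))  # True
```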

13

Computing Matrix of Linear Transformation Theorem

see image
14

Onto Transformation Theorem

see image
15

One-to-One Transformation Theorem

see image
16

(T/F) The columns of the standard matrix for a linear transformation from Rn to Rm are the images of the columns of the nxn identity matrix.

True. The standard matrix for a linear transformation T from Rn to Rm is constructed by applying T to the standard basis vectors e_1, ..., e_n, which are exactly the columns of the nxn identity matrix, and using the resulting vectors as the columns of the matrix. The j-th column of the standard matrix is the vector T(e_j).
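
A short sketch of this construction, assuming NumPy; the transformation used here (rotation of R2 by 90 degrees) is just an illustration:

```python
import numpy as np

# Example linear transformation on R^2: rotate 90 degrees counterclockwise.
def T(x):
    return np.array([-x[1], x[0]])

n = 2
I = np.eye(n)
# The j-th column of the standard matrix is T(e_j).
A = np.column_stack([T(I[:, j]) for j in range(n)])
print(A)  # [[0., -1.], [1., 0.]]

x = np.array([3.0, 4.0])
print(np.allclose(A @ x, T(x)))  # True: A reproduces the transformation
```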

17

names of theorems

know em

18

diagonal matrix

a_ij = 0 for all i ≠ j

19

Properties of Matrix Arithmetic

see image
20

Properties of Matrix Multiplication

see image
21

Properties of Matrix Transpose

see image
22

(T/F) Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A. 

False. The correct rule of matrix multiplication is that each column of the product AB is a linear combination of the columns of A using weights from the corresponding column of B. The statement reverses the roles of matrices A and B.
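
A quick check of the corrected rule, assuming NumPy: column j of AB equals A times column j of B, i.e. a combination of the columns of A weighted by the entries of column j of B (the matrices below are made up).

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

j = 1
col_of_product = (A @ B)[:, j]
# Linear combination of the columns of A, weighted by column j of B.
combo = B[0, j] * A[:, 0] + B[1, j] * A[:, 1]
print(np.allclose(col_of_product, combo))  # True
```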

23

Suppose the last column of AB is all zeros, but 𝐵 itself has no column of zeros. What can you say about the columns of 𝐴?

Let the last column of B be b_k. We are given that Ab_k = 0 and b_k ≠ 0. This means that the equation Ax = 0 has a nontrivial solution, namely x = b_k, and therefore the columns of A are linearly dependent (a nontrivial solution of Ax = 0 is exactly a dependence relation among the columns of A).
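
A concrete instance of this situation, with made-up numbers (not from the card): the last column of B is a nonzero vector in the null space of A, which forces the columns of A to be dependent.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # columns are dependent: col2 = 2 * col1
B = np.array([[1.0,  2.0],
              [1.0, -1.0]])      # B has no zero column

print(A @ B)                     # last column of AB is all zeros
print(np.linalg.matrix_rank(A))  # 1 < 2, so the columns of A are dependent
```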

24

is the inverse unique

Inverse is Unique Theorem: If an inverse of a matrix exists, it is unique.
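
A one-line argument, for reference: if B and C are both inverses of A, then B = BI = B(AC) = (BA)C = IC = C, so any two inverses coincide.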

25

Elementary Matrix Relationship

Applying an elementary row operation to an mxn matrix A is equivalent to multiplying A on the left by a corresponding mxm elementary matrix E; E is obtained by applying the same row operation to the mxm identity matrix I.
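
A small check of this relationship, assuming NumPy; the row operation chosen here (add twice the first row to the second row) is arbitrary:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Build the elementary matrix by applying the row operation to I (2x2 here).
E = np.eye(2)
E[1, :] += 2 * E[0, :]        # R2 -> R2 + 2*R1

# Applying the same row operation directly to A...
A_rowop = A.copy()
A_rowop[1, :] += 2 * A_rowop[0, :]

# ...matches multiplying A by E on the left.
print(np.allclose(E @ A, A_rowop))  # True
```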

26

Invertible if and only if row equivalent to In theorem

An nxn matrix A is invertible if and only if A is row equivalent to In, and in that case any sequence of elementary row operations that reduces A to In also transforms In into A^(-1).
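
A minimal computational sketch of what this theorem licenses in practice, assuming NumPy: row reduce the augmented matrix [A | I] until the left block becomes I, and the right block is then A^(-1). This toy version does no pivoting, so it assumes every pivot entry it meets is nonzero.

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A | I] to [I | A^(-1)] (assumes A is invertible and
    that no zero pivots are encountered; no partial pivoting)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for i in range(n):
        M[i] /= M[i, i]                 # scale the pivot row
        for r in range(n):
            if r != i:
                M[r] -= M[r, i] * M[i]  # eliminate column i in the other rows
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
print(inverse_by_row_reduction(A))                               # [[ 3, -1], [-5, 2]]
print(np.allclose(inverse_by_row_reduction(A) @ A, np.eye(2)))   # True
```
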
27

Invertibility of 2×2 Matrices Theorem

see image
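
For reference, the standard 2x2 criterion this card presumably covers: A = [[a, b], [c, d]] is invertible exactly when ad - bc ≠ 0, with inverse (1/(ad - bc)) [[d, -b], [-c, a]]. A small NumPy check (the test matrix is my own):

```python
import numpy as np

def inv_2x2(A):
    """Inverse of a 2x2 matrix via the ad - bc formula."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible (ad - bc = 0)")
    return (1.0 / det) * np.array([[d, -b],
                                   [-c, a]])

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
print(np.allclose(inv_2x2(A) @ A, np.eye(2)))  # True
```
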
28

Invertible Unique Solution Theorem

see image
29

Properties of Invertible Matrices

see image
30

Elementary Matrices Invertible?

see image
31

Invertible Matrices Theorem

h) the columns of A span Rn
i) the linear transformation T(x) = Ax is onto Rn
j) there is an nxn matrix C such that CA = I
k) there is an nxn matrix D such that AD = I
l) A^T is an invertible matrix

32
33
x2 = x3 + m
34
x3 is free
35
A(c(vec_x)) = c(A(vec_x))
36
t1...tk are parameters
37
Connection between Homogeneous and Non-Homogeneous Systems Equation
Suppose the equation A(vec_x) = vec_b is consistent for some given vec_b, and let vec_p be a solution. Then the solution set of A(vec_x) = vec_b is the set of all vectors of the form vec_w = vec_p + vec_vh, where vec_vh is any solution to the homogeneous equation A(vec_x) = vec_0.
Geometrically, the solution set of A(vec_x) = vec_b is the translation of the solution set of A(vec_x) = vec_0 by the vector vec_p.
38
Parametric Form of a Solution Set
vec_p + t1(vec_v1) + ... + tk(vec_vk)
vec_p is a vector such that A(vec_p) = vec_b
vec_v1..., vec_vk are solutions to the associated homogeneous system A(vec_x) = 0.
t1...tk are parameters
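
A numerical illustration of this form, assuming NumPy; the system, particular solution, and homogeneous solution below are made up for the example, and every choice of the parameter gives a solution of A(vec_x) = vec_b.

```python
import numpy as np

# A made-up consistent system A x = b with a one-parameter solution set.
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([4.0, 1.0])

p  = np.array([2.0, 1.0, 0.0])    # a particular solution: A p = b
v1 = np.array([1.0, -1.0, 1.0])   # solves the homogeneous system: A v1 = 0

print(np.allclose(A @ p, b))      # True
print(np.allclose(A @ v1, 0))     # True
for t1 in (-2.0, 0.0, 3.5):       # every p + t1*v1 also solves A x = b
    print(np.allclose(A @ (p + t1 * v1), b))
```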