If at least two of the data points have different x-values, then the two columns of the design matrix X are linearly independent, which means the least-squares solution is ?
unique
To find the equation of the least-squares line that best fits given data points, create a design matrix X whose rows hold the coefficients obtained by plugging each data point into the line equation. Then create a vector y holding the y-coordinates of the data points. Finally, solve the normal equations:
b̂ = (XᵀX)⁻¹Xᵀy
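The steps above can be sketched in NumPy; the data points here are made up for illustration:

```python
import numpy as np

# Hypothetical data points (x_i, y_i); any set with at least two
# distinct x-values works.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 8.0])

# Design matrix X: a column of ones (for the intercept) and the x-values.
X = np.column_stack([np.ones_like(x), x])

# Normal equations: b_hat = (X^T X)^(-1) X^T y.
# Solving the system is preferred over forming the inverse explicitly.
b_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(b_hat)  # [intercept, slope]
```

The result agrees with `np.linalg.lstsq(X, y)`, which solves the same problem more robustly.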
Let X be the design matrix used to find the least-squares line to fit data (x1,y1), …, (xn,yn). Use a theorem in Section 6.5 to show that the normal equations have a unique solution if and only if the data include at least two data points with different x-coordinates.
If two data points have different x-coordinates, then the two columns of the design matrix X cannot
be multiples of each other and hence are linearly independent. By Theorem 14 in Section 6.5, the
normal equations have a unique solution.
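The rank condition can be checked numerically; this is a small sketch with made-up x-coordinates:

```python
import numpy as np

def design_matrix(xs):
    """Design matrix for a least-squares line: rows [1, x_i]."""
    xs = np.asarray(xs, dtype=float)
    return np.column_stack([np.ones_like(xs), xs])

# At least two distinct x-coordinates: the columns are independent, rank 2,
# so the normal equations have a unique solution.
print(np.linalg.matrix_rank(design_matrix([1, 2, 2, 3])))  # 2

# All x-coordinates equal: the second column is a multiple of the first,
# rank drops to 1, and the solution is not unique.
print(np.linalg.matrix_rank(design_matrix([2, 2, 2])))     # 1
```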
parallelogram law
||u+v||² + ||u−v||² = (||u||² + 2u·v + ||v||²) + (||u||² − 2u·v + ||v||²) = 2||u||² + 2||v||²
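A quick numerical check of the identity, using arbitrary random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(3)
v = rng.standard_normal(3)

# Parallelogram law: ||u+v||^2 + ||u-v||^2 = 2||u||^2 + 2||v||^2.
lhs = np.dot(u + v, u + v) + np.dot(u - v, u - v)
rhs = 2 * np.dot(u, u) + 2 * np.dot(v, v)
print(np.isclose(lhs, rhs))  # True
```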
Suppose the distance from u to v equals the distance from u to −v. Does it follow
that u is orthogonal to v?
Yes. Squaring both sides of dist(u, v) = dist(u, −v) gives ||u − v||² = ||u + v||², so ||u||² − 2u·v + ||v||² = ||u||² + 2u·v + ||v||². This forces 4u·v = 0, i.e. u · v = 0, so u is orthogonal to v.
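This can be illustrated with concrete vectors (chosen here for the example):

```python
import numpy as np

# Orthogonal vectors: the distances to v and to -v agree.
u = np.array([1.0, 2.0])
v = np.array([-2.0, 1.0])  # u . v = 0
print(np.linalg.norm(u - v), np.linalg.norm(u + v))  # equal

# Non-orthogonal vectors: the distances differ.
w = np.array([1.0, 1.0])   # u . w = 3
print(np.linalg.norm(u - w), np.linalg.norm(u + w))  # not equal
```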
True or False?
(a) For an m × n matrix A, vectors in the null space of A are orthogonal to vectors
in the row space of A.
(b) A square matrix with orthogonal columns is an orthogonal matrix.
(c) If U and V are orthogonal matrices, then U V is an orthogonal matrix.
(d) If the columns of an n × n matrix U are orthonormal, then the linear transformation x ↦ Ux preserves lengths.
a) True
b) False; an orthogonal matrix must have orthonormal columns, not merely orthogonal ones
c) True; since UᵀU = I and VᵀV = I, we get (UV)ᵀ(UV) = VᵀUᵀUV = VᵀV = I
d) True
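Parts (a) and (c) can be checked numerically; the matrices below are arbitrary examples:

```python
import numpy as np

# (a) Null-space vectors are orthogonal to row-space vectors.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1
# A basis for Nul A from the SVD: right singular vectors beyond the rank.
_, s, Vt = np.linalg.svd(A)
rank = np.sum(s > 1e-10)
N = Vt[rank:].T                       # columns span Nul A
print(np.allclose(A @ N, 0))          # True: every row of A is orthogonal to Nul A

# (c) A product of orthogonal matrices is orthogonal.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation
V = np.array([[0.0, 1.0],
              [1.0, 0.0]])                        # permutation
UV = U @ V
print(np.allclose(UV.T @ UV, np.eye(2)))          # True
```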
When asked to verify that y − projL(y) is a vector orthogonal to the line L, just:
compute the dot product of y − projL(y) with a vector u that spans L and show that it equals 0
Choose any vector w ∈ L different from the projection of y, and verify that
∥y − projL(y)∥ < ∥y − w∥
Choose a vector in L such as w = [0 0]ᵀ and compare ∥y − projL(y)∥ with ∥y − w∥
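Both checks together in NumPy, with an example y and line direction chosen for illustration:

```python
import numpy as np

y = np.array([4.0, 3.0])
u = np.array([2.0, 1.0])          # direction vector spanning the line L

proj = (y @ u) / (u @ u) * u      # proj_L(y)
resid = y - proj                  # y - proj_L(y)

# (1) y - proj_L(y) is orthogonal to L: its dot product with u is 0.
print(np.isclose(resid @ u, 0.0))                     # True

# (2) proj_L(y) is the closest point of L to y; compare with w = 0 in L.
w = np.zeros(2)
print(np.linalg.norm(resid) < np.linalg.norm(y - w))  # True
```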
Let y ∈ W, where W is a subspace of R³. Find projW(y) and projW⊥(y). [This is a conceptual question that you can answer without calculations!]
projW(y) = y
projW⊥(y) = 0
The formula for projecting a vector y onto a unit vector u is (y · u)u. Show that this can be written as (uuᵀ)y. [Remark: The matrix uuᵀ is a rank-1 projection matrix that projects y onto u.]
(y · u)u = (uᵀy)u = u(uᵀy) = (uuᵀ)y, since y · u = uᵀy is a scalar and a scalar commutes with a vector
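The identity, and the fact that uuᵀ behaves like a projection, can be checked with a concrete unit vector (values chosen for the example):

```python
import numpy as np

u = np.array([0.6, 0.8])          # unit vector: ||u|| = 1
y = np.array([3.0, 1.0])

P = np.outer(u, u)                # u u^T, the rank-1 projection matrix

# (y . u) u equals (u u^T) y:
print(np.allclose((y @ u) * u, P @ y))  # True

# P is idempotent, as a projection matrix should be: P P = P.
print(np.allclose(P @ P, P))            # True
```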