
Lecture 11

Projection Onto a Line

  • Notation and Definition:

    • The notation written "x bar" in the lecture (below simply ( x )) denotes the scalar coefficient of a projection onto a 1-dimensional subspace.

    • Given vectors ( a, b \in \mathbb{R}^m ), we want to find the point on the line through a (a scalar multiple ( xa )) that is closest to b.

    • This is referred to as the projection of b onto span(a).

  • Projection Formula:

    • The projection can be expressed as: [ p = \text{proj}_a b = x a ]

    • Here ( x ) is chosen so that the distance ( \|b - x a\| ) is minimized.

    • We denote ( b - x a ) as the error vector, ( e = b - p ).

  • Geometric Observation:

    • For p to be the point on the line through a closest to b,

      • The error vector must be orthogonal to the vector a: [ a^T e = 0, \quad \text{i.e.,} \quad a^T (b - x a) = 0 ]

Projection Formula Derivation

  • From the projection equation:

    • Solving ( a^T (b - x a) = 0 ) for x gives ( x = \frac{a^T b}{a^T a} ), so the projection is ( p = x a = \frac{a^T b}{a^T a}\, a ); the algebra is written out below.
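
A sketch of the algebra behind this step, using only the orthogonality condition and the notation above:

```latex
% Orthogonality of the error to a determines the scalar x:
\[
a^{T} e = a^{T} (b - x a) = 0
\;\Longrightarrow\;
a^{T} b - x\, a^{T} a = 0
\;\Longrightarrow\;
x = \frac{a^{T} b}{a^{T} a},
\qquad
p = x a = \frac{a^{T} b}{a^{T} a}\, a .
\]
```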

Example of 1D Projection

  • Given:

    • ( b = (2) ), ( a = (3) ), treated here as vectors with a single component.

    • To find the scalar ( x ) that determines the point p closest to b (a numerical check follows this example):

      • ( x = \frac{a^T b}{a^T a} = \frac{(3)(2)}{3^2} = \frac{6}{9} = \frac{2}{3} )

      • Projection: ( p = x a = \frac{2}{3}(3) = 2 )

      • Error vector: ( e = b - p = 2 - 2 = 0 ); in this one-dimensional case b already lies on span(a), so the scalar ( x = \frac{2}{3} ) drives the error to zero.
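
A minimal NumPy sketch checking the formula numerically; the helper name proj_onto_line and the extra R^2 data are our own illustration, not from the lecture:

```python
import numpy as np

def proj_onto_line(a: np.ndarray, b: np.ndarray):
    """Project b onto span(a): return the scalar x and the projection p = x * a."""
    x = (a @ b) / (a @ a)      # x = a^T b / a^T a
    p = x * a                  # closest point on the line through a
    return x, p

# The one-component example above: a = (3), b = (2)
a = np.array([3.0])
b = np.array([2.0])
x, p = proj_onto_line(a, b)
print(x, p, b - p)             # 0.666..., [2.], [0.]

# A less trivial case in R^2 (made-up data)
a2 = np.array([1.0, 1.0])
b2 = np.array([2.0, 0.0])
x2, p2 = proj_onto_line(a2, b2)
print(x2, p2, a2 @ (b2 - p2))  # 1.0, [1. 1.], 0.0: error is orthogonal to a
```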

Projection onto Higher Dimensions

  • Now consider projecting onto a 2-dimensional subspace:

    • Given two vectors ( a_1, a_2 \in \mathbb{R}^m ), we need to find the projection of b onto the span of ( a_1 ) and ( a_2 ).

      • The projection is expressed through the system: [ p = A x ] where ( A = [a_1 \; a_2] ) is the matrix whose columns are ( a_1 ) and ( a_2 ).

      • The vector x is chosen to minimize the error ( \left\|b - Ax\right\| ) (the normal-equation steps are sketched after this list).
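
The same orthogonality argument as in the one-dimensional case leads to the formula in the next section; a sketch of the steps, consistent with the notation above:

```latex
% The error e = b - Ax must be orthogonal to every column of A,
% which gives the normal equations and the projection formula:
\[
A^{T}(b - A x) = 0
\;\Longrightarrow\;
A^{T} A\, x = A^{T} b
\;\Longrightarrow\;
x = (A^{T} A)^{-1} A^{T} b,
\qquad
p = A x = A (A^{T} A)^{-1} A^{T} b .
\]
```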

General Projection Formula

  • Using the least squares formula for projection: [ p = A(A^TA)^{-1}A^T b ], where ( P = A(A^TA)^{-1}A^T ) is the projection matrix applied to b.

    • Note that the invertibility of ( A^TA ) is assured when the columns of A are linearly independent (a NumPy sketch of the formula follows).
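
A minimal NumPy sketch of the formula, with made-up example data and A assumed to have independent columns; the normal equations are solved with np.linalg.solve rather than forming an explicit inverse:

```python
import numpy as np

# Two independent columns a1, a2 in R^3 (made-up example data)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Solve the normal equations A^T A x = A^T b instead of forming an inverse
x = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x                     # projection of b onto the column space of A
e = b - p                     # error vector

print(x)          # coefficients: [5., -3.]
print(p)          # closest point in C(A) to b: [5., 2., -1.]
print(A.T @ e)    # ~[0., 0.]: error is orthogonal to every column of A
```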

Summary of Least Squares Problem

  • When the system ( Ax = b ) has no exact solution,

    • The least squares problem finds a best fit by minimizing the length of the error, ( \|b - Ax\| ).

    • Objective: project b onto the column space of A to find: [ p = Ax ] where p is the best approximation of b.

  • Least Squares Solution:

    • To derive the solution for cases where no choice of x makes ( Ax ) exactly equal to b:

    • Error becomes: [ e = b - Ax ]

    • The error is minimized by solving the normal equations ( A^TA x = A^T b ), which gives: [ x = (A^TA)^{-1}A^T b ]

  • Final Note:

    • This solution minimizes the error ( \|b - Ax\| ) whenever ( A^TA ) is invertible, i.e., whenever the columns of A are independent (a NumPy check appears below).
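
As a closing check, a minimal NumPy sketch comparing the normal-equation solution with np.linalg.lstsq; the data is made up for illustration:

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns, no exact solution
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least-squares solution via the normal equations A^T A x = A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Same solution via NumPy's built-in least-squares routine
x_lstsq, residual, rank, sv = np.linalg.lstsq(A, b, rcond=None)

p = A @ x_normal              # projection of b onto the column space of A
e = b - p                     # minimized error, orthogonal to C(A)

print(x_normal, x_lstsq)      # both approximately [0.6667, 0.5]
print(np.linalg.norm(e)**2)   # minimized squared error, 1/6 here
```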