Ch 1.8 - Introduction to Linear Transformations
Linear Dependence
Definition of Linear Dependence Refresher: If a set of vectors {v1, v2, v3} is linearly dependent, it means there exist scalars c1, c2, and c3, not all zero, such that c1v1 + c2v2 + c3v3 = 0.
Extension to a Larger Set: If c1v1 + c2v2 + c3v3 = 0 holds, then adding 0·v4 to both sides results in c1v1 + c2v2 + c3v3 + 0·v4 = 0. Since not all scalars are zero (certainly not all of c1, c2, c3, and the coefficient of v4 is simply 0), the set {v1, v2, v3, v4} is also linearly dependent.
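This fact can be sanity-checked numerically. The sketch below uses a hypothetical dependent set in R^3 (illustrative values, not from the text) and NumPy's matrix rank: a rank smaller than the number of vectors means the set is dependent, and appending any fourth vector keeps it that way.

```python
import numpy as np

# A hypothetical linearly dependent set in R^3: v3 = v1 + v2.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2

V = np.column_stack([v1, v2, v3])
# Rank below the number of vectors means the columns are dependent.
print(np.linalg.matrix_rank(V))   # 2, so {v1, v2, v3} is dependent

# Append any fourth vector: the enlarged set is still dependent,
# because the old relation 1*v1 + 1*v2 - 1*v3 + 0*v4 = 0 still holds.
v4 = np.array([0.0, 0.0, 1.0])
W = np.column_stack([v1, v2, v3, v4])
print(np.linalg.matrix_rank(W))   # 3, which is < 4 vectors: still dependent
```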
Introduction to Linear Transformations
Dynamic View of Matrix Multiplication
Notation Equivalence: The difference between a matrix equation and its associated vector equation is primarily notational.
Transformational Perspective: In various applications (e.g., computer graphics, signal processing), a matrix equation Ax = b can be viewed dynamically. The matrix A is considered an entity that "acts" on a vector x through multiplication, resulting in a new vector called Ax.
Example: The equations Au = b and Av = 0 show that multiplication by A transforms u into b and transforms v into the zero vector.
Solving from this Viewpoint: Solving Ax = b amounts to finding all vectors x in R^n that are transformed into the vector b in R^m under the action of multiplication by A. For instance, in Figure 1, a vector x from R^n is transformed to a vector b in R^m.
Definition of a Transformation (Function or Mapping)
A transformation (or function or mapping) T from R^n to R^m is a rule that assigns to each vector x in R^n a unique vector T(x) in R^m.
Domain: The set R^n is called the domain of T.
Codomain: The set R^m is called the codomain of T.
Notation: T: R^n → R^m indicates that R^n is the domain and R^m is the codomain of T.
Image: For a vector x in R^n, the vector T(x) in R^m is called the image of x (under the action of T).
Range: The set of all images T(x) is called the range of T. The range is a subset of the codomain. (See Figure 2 for an illustration of domain, codomain, and range).
Importance of Dynamic View: This dynamic perspective of matrix-vector multiplication is crucial for understanding linear algebra concepts and building mathematical models of physical systems that evolve over time (e.g., in Sections 1.10, 4.8, and Chapter 5).
Matrix Transformations
Definition: A matrix transformation is a mapping T: R^n → R^m where for each x in R^n, T(x) is computed as Ax, where A is an m×n matrix.
Simplified Notation: Sometimes denoted by x ↦ Ax.
Domain and Codomain of Matrix Transformations:
The domain of T is R^n when the matrix A has n columns.
The codomain of T is R^m when each column of A has m entries.
Range of a Matrix Transformation: The range of T is precisely the set of all linear combinations of the columns of A, because every image T(x) is of the form Ax = x1a1 + x2a2 + ... + xnan, which is by definition a linear combination of the columns a1, ..., an of A.
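This identity can be checked numerically. The sketch below uses an arbitrary 3×2 matrix and sample vector (illustrative values only) to confirm that the image Ax equals the corresponding linear combination of the columns of A:

```python
import numpy as np

# An arbitrary 3x2 matrix: T(x) = Ax maps R^2 into R^3.
A = np.array([[1.0,  2.0],
              [0.0,  1.0],
              [3.0, -1.0]])
x = np.array([2.0, 3.0])

# The image T(x) = Ax ...
image = A @ x

# ... equals the linear combination x1*a1 + x2*a2 of the columns of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(image)                      # [8. 3. 3.]
print(np.allclose(image, combo))  # True
```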
Example 1: Image, Pre-image, Uniqueness, and Range Membership
Let
A = [  1  -3 ]
    [  3   5 ]
    [ -1   7 ],   u = (2, -1),   b = (3, 2, -5),   c = (3, 2, 5).
Define a transformation T: R^2 → R^3 by T(x) = Ax, so that T(x) = Ax = (x1 - 3x2, 3x1 + 5x2, -x1 + 7x2).
a. Finding the image of u under T (i.e., T(u)): T(u) = Au = (1·2 - 3·(-1), 3·2 + 5·(-1), -1·2 + 7·(-1)) = (5, 1, -9).
b. Finding an x in R^2 whose image under T is b (i.e., solving T(x) = b):
This requires solving the matrix equation Ax = b.
We form the augmented matrix [A b] and row reduce it:
[  1  -3   3 ]     [ 1  -3   3 ]     [ 1  0  3/2 ]
[  3   5   2 ]  ~  [ 0  14  -7 ]  ~  [ 0  1 -1/2 ]
[ -1   7  -5 ]     [ 0   4  -2 ]     [ 0  0   0  ]
From the reduced echelon form: x1 = 3/2 and x2 = -1/2.
Thus, the vector is x = (3/2, -1/2). The image of this x under T is the given vector b.
c. Uniqueness of x: From the row reduction in part (b), the system has a unique solution (no free variables). Therefore, there is exactly one x whose image under T is b.
d. Determining whether c is in the range of T:
This asks whether c is the image of some x in R^2, i.e., whether T(x) = c for some x. This means checking whether the system Ax = c is consistent. We row reduce the augmented matrix [A c]:
[  1  -3   3 ]     [ 1  -3   3 ]
[  3   5   2 ]  ~  [ 0  14  -7 ]
[ -1   7   5 ]     [ 0   0  10 ]
The third row corresponds to the equation 0·x1 + 0·x2 = 10, which simplifies to 0 = 10. This is a contradiction, so the system is inconsistent. Therefore, c is not in the range of T.
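All four parts of Example 1 can be verified numerically. This is a minimal sketch assuming the data of the example (A 3×2, u = (2, -1), b = (3, 2, -5), c = (3, 2, 5)); `lstsq` is used as a consistency test for the overdetermined systems:

```python
import numpy as np

A = np.array([[ 1.0, -3.0],
              [ 3.0,  5.0],
              [-1.0,  7.0]])
u = np.array([2.0, -1.0])
b = np.array([3.0, 2.0, -5.0])
c = np.array([3.0, 2.0,  5.0])

# (a) image of u under T
print(A @ u)                       # [ 5.  1. -9.]

# (b) solve A x = b; lstsq handles the overdetermined 3x2 system
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)                           # [ 1.5 -0.5]
print(np.allclose(A @ x, b))       # True: b is in the range of T

# (c) uniqueness: A has full column rank, so a solution, if any, is unique
print(np.linalg.matrix_rank(A))    # 2

# (d) existence: the best least-squares fit fails to reach c exactly,
#     so c is not in the range of T
y, *_ = np.linalg.lstsq(A, c, rcond=None)
print(np.allclose(A @ y, c))       # False
```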
Summary of Example 1: Example 1c is a uniqueness problem (is b the image of a unique x?), and Example 1d is an existence problem (does there exist an x whose image is c?).
Geometric Matrix Transformations
These examples illustrate the dynamic view of matrices as operators that transform vectors.
Example 2: Projection Transformation
Let
A = [ 1  0  0 ]
    [ 0  1  0 ]
    [ 0  0  0 ].
The transformation T(x) = Ax projects points in R^3 onto the x1x2-plane:
T(x) = T((x1, x2, x3)) = (x1, x2, 0).
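The projection can be traced on a sample point (the point (4, -2, 7) below is an arbitrary illustration); the check that A·A = A reflects the fact that projecting twice changes nothing:

```python
import numpy as np

# Projection onto the x1x2-plane: the third coordinate is zeroed out.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
x = np.array([4.0, -2.0, 7.0])   # an arbitrary sample point
print(A @ x)                     # [ 4. -2.  0.]

# Projecting twice changes nothing: A @ A equals A.
print(np.allclose(A @ A, A))     # True
```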
Example 3: Shear Transformation
Let
A = [ 1  3 ]
    [ 0  1 ].
The transformation T: R^2 → R^2 defined by T(x) = Ax is called a shear transformation.
It deforms a shape (e.g., a square) into a sheared parallelogram (See Figure 4).
Key Idea: T maps line segments onto line segments. By checking the images of the corners of the square, one can see the effect of the transformation.
Image of u = (0, 2) is T(u) = (0 + 3·2, 2) = (6, 2).
Image of (2, 2) is (2 + 3·2, 2) = (8, 2).
Geometric Effect: The transformation deforms the square as if its top were pushed to the right while the base is held fixed. Shear transformations are observed in physics, geology, and crystallography.
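The corner-by-corner mapping can be traced numerically. This sketch assumes the square has corners (0, 0), (2, 0), (2, 2), (0, 2), consistent with the corner images quoted above:

```python
import numpy as np

# Shear matrix from the example.
A = np.array([[1.0, 3.0],
              [0.0, 1.0]])

# Corners of the square, one per row.
corners = np.array([[0.0, 0.0],
                    [2.0, 0.0],
                    [2.0, 2.0],
                    [0.0, 2.0]])

# Apply T(x) = Ax to each corner (corners are rows, hence the transpose).
images = corners @ A.T
print(images)
# [[0. 0.]
#  [2. 0.]
#  [8. 2.]
#  [6. 2.]]
# The base (x2 = 0) is fixed; the top edge slides right by 3*x2 = 6.
```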
Linear Transformations
Formal Definition
Theorem 5 in Section 1.4 established properties for matrix transformations: A(u + v) = Au + Av and A(cu) = c(Au). These properties define the most important class of transformations in linear algebra.
A transformation (or mapping) T is linear if:
(i) Additivity: T(u + v) = T(u) + T(v) for all vectors u, v in the domain of T.
(ii) Homogeneity of Degree 1: T(cu) = cT(u) for all scalars c and all vectors u in the domain of T.
Important Notes on Linear Transformations
Matrix Transformations are Linear: Every matrix transformation (x ↦ Ax) is a linear transformation. (However, not all linear transformations are matrix transformations; this will be explored in Chapters 4 and 5).
Preservation of Operations: Linear transformations preserve the operations of vector addition and scalar multiplication. Property (i) means that applying T after adding vectors u and v is the same as applying T to u and to v separately and then adding their images.
Useful Facts derived from Linearity
If is a linear transformation, then:
Image of the Zero Vector: T(0) = 0.
Proof: From condition (ii), T(0) = T(0u) = 0T(u) = 0 (where the 0 multiplying u is the scalar zero, and the final 0 is the zero vector in the codomain).
Linear Combination Property: T(cu + dv) = cT(u) + dT(v) for all vectors u, v in the domain of T and all scalars c, d.
Proof: This property requires both (i) and (ii):
T(cu + dv) = T(cu) + T(dv) = cT(u) + dT(v).
Implication: If a transformation satisfies T(cu + dv) = cT(u) + dT(v) for all u, v and all c, d, it must be linear. (Setting c = d = 1 gives additivity, and setting d = 0 gives the scalar multiplication property).
Superposition Principle (Generalization): Repeated application of the linear combination property leads to a useful generalization:
T(c1v1 + c2v2 + ... + cpvp) = c1T(v1) + c2T(v2) + ... + cpT(vp).
Significance in Science/Engineering: This is widely known as the superposition principle in engineering and physics. If v1, ..., vp are inputs (signals) to a system, and T(v1), ..., T(vp) are the corresponding responses, then the system satisfies the superposition principle if its response to a linear combination of inputs is the same linear combination of the responses to the individual inputs. This concept is fundamental in many fields.
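For a matrix transformation, the superposition principle is just the linearity of matrix-vector multiplication. A minimal numerical check with randomly generated matrix, inputs, and coefficients (all illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Any matrix transformation T(x) = Ax obeys superposition.
A = rng.standard_normal((3, 4))
signals = [rng.standard_normal(4) for _ in range(3)]   # inputs v1, v2, v3
coeffs = [2.0, -1.0, 0.5]                              # scalars c1, c2, c3

# Response to the combined input ...
combined = A @ sum(c * v for c, v in zip(coeffs, signals))

# ... equals the same combination of the individual responses.
separate = sum(c * (A @ v) for c, v in zip(coeffs, signals))
print(np.allclose(combined, separate))   # True
```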
Examples of Linear Transformations
Example 4: Contraction and Dilation Transformation
Given a scalar r, define T: R^2 → R^2 by T(x) = rx.
This is a contraction when 0 < r < 1 (shrinks vectors).
This is a dilation when r > 1 (stretches vectors).
Proof of Linearity (for T(x) = rx):
Let u, v be vectors in R^2 and c, d be scalars. Then
T(cu + dv) = r(cu + dv) = c(ru) + d(rv) = cT(u) + dT(v).
Since T(cu + dv) = cT(u) + dT(v), the transformation is linear. (See Figure 5 for a dilation example).
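The proof above can be mirrored numerically; the values of r, u, v, c, d below are arbitrary samples chosen for illustration:

```python
import numpy as np

r = 3.0                          # r > 1, so this T is a dilation
u = np.array([1.0, 2.0])
v = np.array([-4.0, 0.5])
c, d = 2.0, -1.5

T = lambda x: r * x              # T(x) = rx

# Linearity: T(cu + dv) equals c*T(u) + d*T(v).
print(np.allclose(T(c*u + d*v), c*T(u) + d*T(v)))   # True
```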
Example 5: Rotation Transformation
Define a linear transformation T: R^2 → R^2 by
T(x) = [ 0  -1 ] [ x1 ]  =  [ -x2 ]
       [ 1   0 ] [ x2 ]     [  x1 ].
Let u = (4, 1), v = (2, 3), and u + v = (6, 4).
Verification of Additivity: Notice that T(u + v) = (-4, 6), which is indeed equal to T(u) + T(v) = (-1, 4) + (-3, 2).
Geometric Interpretation: This transformation rotates vectors (e.g., u, v, and u + v) counterclockwise about the origin through 90° (See Figure 6). It transforms the entire parallelogram determined by u and v into the parallelogram determined by T(u) and T(v).
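The additivity check in Example 5 can be reproduced directly, assuming the vectors u = (4, 1) and v = (2, 3) from the example; a length check at the end illustrates that rotation preserves norms:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])      # 90-degree counterclockwise rotation
u = np.array([4.0, 1.0])
v = np.array([2.0, 3.0])

print(A @ u)             # [-1.  4.]
print(A @ v)             # [-3.  2.]
print(A @ (u + v))       # [-4.  6.], which equals T(u) + T(v)

# Rotating any vector preserves its length.
x = np.array([3.0, -7.0])
print(np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x)))   # True
```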
Example 6: Cost Transformation for Manufacturing Production
Scenario: A company manufactures two products, B and C.
Unit Cost Matrix: From Section 1.3, we construct a