Describing Linear Transformations
- A linear transformation is a function that maps vectors from one vector space $V$ to another vector space $W$, while preserving the operations of vector addition and scalar multiplication
- In general, linear transformations have the following properties:
- The origin remains fixed once the vector space is transformed
- The underlying grid lines remain parallel
- The underlying grid lines remain evenly spaced
- A vector $\vec{v}$ can also be thought of as a linear combination of the standard basis vectors (e.g. $\vec{v} = x\hat{i} + y\hat{j}$ in 2D)
- In other words, if we linearly transform a vector, then the transformed vector will be that same linear combination, just taken over the transformed standard basis vectors
- Therefore, the only difference between the two vectors is where the standard basis vectors themselves end up
- In other words, the standard basis vectors in the transformed space are scaled differently than the standard basis vectors in the original space (see the sketch below)
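- As a minimal sketch of this idea (assuming NumPy and a hypothetical example matrix), the transformed vector is the same linear combination, just taken over the transformed basis vectors:

```python
import numpy as np

# A hypothetical 2x2 transformation matrix: its columns are
# where the standard basis vectors i-hat and j-hat land.
A = np.array([[1.0, -1.0],
              [2.0,  0.5]])

i_hat_landed = A[:, 0]  # where i-hat = (1, 0) lands
j_hat_landed = A[:, 1]  # where j-hat = (0, 1) lands

v = np.array([3.0, 2.0])  # v = 3*i-hat + 2*j-hat

# The transformed vector is the same linear combination (3 and 2),
# just taken over the landed basis vectors.
same_combination = 3.0 * i_hat_landed + 2.0 * j_hat_landed

print(np.allclose(A @ v, same_combination))  # True
```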
Defining a Linear Transformation
- A transformation matrix $A$ represents a basis of our original vector space
- Generally speaking, the purpose of a transformation matrix is to map the standard basis vectors of a new space to vectors in our original space
- More specifically, a transformation matrix is a set of vectors from our original vector space that represent the unit vectors of the new vector space
- Essentially, if we linearly scale the columns of a transformation matrix by the entries of a vector from the new space, then that scaled combination gives us a vector in our original space
- In other words, as long as we know which vectors from our original space correspond to the standard basis vectors of the new space, then we are able to translate any vector from that new space back to a vector in our original space
- We typically define a linear transformation as the following: $A\vec{x} = \vec{b}$
- Where $A$ is our transformation matrix
- Where $\vec{x}$ is our scaled vector from the vector space defined by our transformation matrix
- Where $\vec{b}$ is our vector from the original vector space
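- As a worked form of this definition (using a generic $2 \times 2$ matrix for illustration), $\vec{b}$ is just a weighted sum of the columns of $A$, with the entries of $\vec{x}$ as the weights:

$$
A\vec{x}
= \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}
  \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}
= x_1 \begin{bmatrix} a_{11} \\ a_{21} \end{bmatrix}
+ x_2 \begin{bmatrix} a_{12} \\ a_{22} \end{bmatrix}
= \vec{b}
$$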
Abstract Analogy of Linear Transformation
- Let's assume that we only understand English
- The transformation matrix $A$ can be thought of as a translator who only knows how to translate French words into English words
- Therefore, if we give the translator any French word, then they will be able to give us the English translation
- For this analogy to have any meaning, we should assume the following:
- English words and French words have exact translations that are identical in meaning
- The translator has no idea how to translate English to French, Spanish to English, or perform any translation other than French to English
- In this analogy, the translator represents the transformation matrix $A$, a French word given to the translator represents the vector $\vec{x}$, and the English word produced by the translator represents the vector $\vec{b}$
Notation of a Linear Transformation
- The following are equivalent ways we can write the same linear transformation: $A\vec{x} = \vec{b}$, $T(\vec{x}) = \vec{b}$, or $T(\vec{x}) = A\vec{x}$
Example of a Linear Transformation
- Let's say we have the following linear transformation: $A\vec{x} = \vec{b}$, where $A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}$
- Then, the vector $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ would be linearly transformed to $\begin{bmatrix} 2 \\ 0 \end{bmatrix}$
- Then, the vector $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$ would be linearly transformed to $\begin{bmatrix} 0 \\ 3 \end{bmatrix}$
- Then, the vector $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ would be linearly transformed to $\begin{bmatrix} 2 \\ 3 \end{bmatrix}$
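- As a quick check of this example (a sketch assuming NumPy), we can apply the matrix above to each vector:

```python
import numpy as np

# The example matrix from above
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

for v in [np.array([1.0, 0.0]),
          np.array([0.0, 1.0]),
          np.array([1.0, 1.0])]:
    print(v, "->", A @ v)

# [1. 0.] -> [2. 0.]
# [0. 1.] -> [0. 3.]
# [1. 1.] -> [2. 3.]
```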
Assumptions of Linear Transformations
- A linear transformation needs to ensure the same requirements as a vector space $V$:
- The zero vector is an element of $V$ (i.e. $\vec{0} \in V$)
- If $\vec{u} \in V$ and $\vec{v} \in V$, then $\vec{u} + \vec{v} \in V$
- If $\vec{u} \in V$ and $c$ is a scalar, then $c\vec{u} \in V$
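- As a minimal numerical check (assuming NumPy and an arbitrary random matrix), any matrix transformation preserves the operations of vector addition and scalar multiplication:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2))                 # an arbitrary transformation matrix
u, v = rng.normal(size=2), rng.normal(size=2)
c = 2.5                                     # an arbitrary scalar

# Addition is preserved: T(u + v) == T(u) + T(v)
print(np.allclose(A @ (u + v), A @ u + A @ v))    # True

# Scalar multiplication is preserved: T(c*u) == c * T(u)
print(np.allclose(A @ (c * u), c * (A @ u)))      # True

# The zero vector maps to the zero vector
print(np.allclose(A @ np.zeros(2), np.zeros(2)))  # True
```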
Linear Dependence
- A set of vectors is linearly dependent when one of the vectors from the set can be defined as a linear combination of the others
- Conversely, a set of vectors is linearly independent when no vector from the set can be defined as a linear combination of the others
- The following are some rules related to linear dependence in an $n$-dimensional vector space $\mathbb{R}^n$, where $n$ is the number of dimensions and $m$ is the number of vectors:
- If $m > n$, then the set of vectors can span $\mathbb{R}^n$, but can't be linearly independent
- If $m = n$, then the set of vectors can span $\mathbb{R}^n$, and can be linearly independent
- If $m < n$, then the set of vectors can't span $\mathbb{R}^n$, but can be linearly independent
- If our transformation matrix has linearly dependent columns, then at least one of those columns is a linear combination (e.g. a scaled version) of the others
- In other words, if our transformation matrix has linearly dependent columns, then the linear transformation squishes all of the original vector space onto a lower-dimensional space
- For example, if our transformation matrix is a $2 \times 2$ matrix with linearly dependent columns, then the linear transformation squishes all of 2D space onto the line where those two column vectors sit, which is just the one-dimensional span of those two linearly dependent vectors
- In data analysis, linear dependence between two data columns shows up as multicollinearity
- We typically want to detect linear dependence, both for greater interpretability and for data compression reasons (see the sketch below)
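- As a sketch of this squishing effect (assuming NumPy and a hypothetical matrix with dependent columns), the rank of the matrix reveals the lower dimension:

```python
import numpy as np

# A hypothetical 2x2 matrix whose second column is a scaled copy
# of the first, so its columns are linearly dependent.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Rank 1: the transformation squishes all of 2D space onto a line.
print(np.linalg.matrix_rank(A))  # 1

# Every transformed vector lands on the span of the first column:
# the determinant below is 0 exactly when b is parallel to that column.
v = np.array([3.0, -1.0])
b = A @ v
print(np.isclose(np.linalg.det(np.column_stack([b, A[:, 0]])), 0.0))  # True
```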
Transforming Nonsquare Matrices
- If our transformation matrix is a $3 \times 2$ matrix, then we are mapping two-dimensional vectors to three-dimensional vectors
- In other words, we are mapping vectors from a two-dimensional vector space to a three-dimensional vector space if our transformation matrix is a $3 \times 2$ matrix
- Similarly, if our transformation matrix is a $2 \times 3$ matrix, then we are mapping three-dimensional vectors to two-dimensional vectors
- In a general sense, if our transformation matrix is an $m \times n$ matrix, then we are mapping $n$-dimensional vectors to $m$-dimensional vectors
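- As a minimal sketch (assuming NumPy and hypothetical matrices), a $3 \times 2$ matrix takes 2D vectors to 3D vectors, and a $2 \times 3$ matrix does the reverse:

```python
import numpy as np

# A hypothetical 3x2 matrix: maps 2-dimensional vectors into 3D space.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
v = np.array([2.0, 3.0])       # a 2D input
print(A @ v)                   # [2. 3. 5.]: a 3D output

# A hypothetical 2x3 matrix: maps 3-dimensional vectors into 2D space.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
w = np.array([2.0, 3.0, 1.0])  # a 3D input
print(B @ w)                   # [3. 4.]: a 2D output
```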