Inverse transformations are linear

 
 
 
 
 

Proving that inverse transformations are always linear transformations

Inverse transformations are linear transformations

The inverse of an invertible linear transformation T is itself also a linear transformation.


 

This means that the inverse transformation T^{-1} is closed under addition and closed under scalar multiplication.

T^{-1}(\vec{u}+\vec{v})=T^{-1}(\vec{u})+T^{-1}(\vec{v})

T^{-1}(c\vec{u})=cT^{-1}(\vec{u})

In other words, as long as the original transformation T

  1.  is a linear transformation itself, and

  2.  is invertible (its inverse is defined, so you can find it),

then the inverse of T, T^{-1}, is also a linear transformation.
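To make this concrete, here's a small numerical sanity check (a sketch using numpy, with a hypothetical invertible matrix standing in for T): the inverse transformation really does respect addition and scalar multiplication.

```python
import numpy as np

# Hypothetical invertible 2x2 matrix representing T (any invertible A works).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

def T_inv(x):
    """Apply the inverse transformation T^{-1}(x) = A^{-1} x."""
    return A_inv @ x

u = np.array([3.0, -1.0])
v = np.array([0.5, 4.0])
c = 2.5

# Closure under addition: T^{-1}(u + v) == T^{-1}(u) + T^{-1}(v)
print(np.allclose(T_inv(u + v), T_inv(u) + T_inv(v)))  # True

# Closure under scalar multiplication: T^{-1}(c u) == c T^{-1}(u)
print(np.allclose(T_inv(c * u), c * T_inv(u)))  # True
```

Swapping in any other invertible matrix gives the same result, since the checks only rely on the linearity of the matrix-vector product.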

Inverse transformations as matrix-vector products

Remember that any linear transformation can be represented as a matrix-vector product. Normally, we rewrite the linear transformation T: \mathbb{R}^n\to\mathbb{R}^n as

T(\vec{x})=A\vec{x}, where A is a square n\times n matrix

But because the inverse transformation T^{-1} is a linear transformation as well, we can also write the inverse as a matrix-vector product.

T^{-1}(\vec{x})=A^{-1}\vec{x}

The matrix A^{-1} is the inverse of the matrix A that we used to define the transformation T.

In other words, given a linear transformation T and its inverse T^{-1}, if we want to express both of them as matrix-vector products, we know that the matrices we use to do so will be inverses of one another.

The reason this is true is that composing the inverse transformation T^{-1} with T always gives us the identity transformation, represented by the identity matrix I_n, where n is the dimension of the domain and codomain, \mathbb{R}^n.

(T^{-1}\circ T)(\vec{x})=I_n\vec{x}

Going the other way (composing T with the inverse transformation) gives the same result:

(T\circ T^{-1})(\vec{x})=I_n\vec{x}

The only way these can be true is if the matrices that are part of the matrix-vector products are inverses of one another.

(T^{-1}\circ T)(\vec{x})=I_n\vec{x}\quad\quad\to\quad\quad(T^{-1}\circ T)(\vec{x})=A^{-1}A\vec{x}

(T\circ T^{-1})(\vec{x})=I_n\vec{x}\quad\quad\to\quad\quad(T\circ T^{-1})(\vec{x})=AA^{-1}\vec{x}

Remember that matrix multiplication is not commutative, which means that, given two matrices A and B, AB is not in general equal to BA. We can’t change the order of the matrix multiplication and still expect the same answer. But based on these compositions of T with T^{-1} and vice versa, we’re saying A^{-1}A and AA^{-1} must both equal I_n, which means they must equal each other. And the only way that A^{-1}A and AA^{-1} can both equal I_n is if A and A^{-1} are inverses of one another.

This is why, when we represent the inverse transformation T^{-1} as a matrix-vector product, we know that the matrix we use must be the inverse of the matrix we used to represent T. So if we represent T as T(\vec{x})=A\vec{x} with the matrix A, that means T^{-1} can only be represented as T^{-1}(\vec{x})=A^{-1}\vec{x} with the inverse of A, A^{-1}.
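As a quick numerical check of this argument (a sketch using numpy, with a hypothetical invertible matrix standing in for A), both orders of multiplication produce the identity, and composing the two transformations in either order returns the original vector:

```python
import numpy as np

# Hypothetical invertible matrix standing in for A (det = -1, so invertible).
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
A_inv = np.linalg.inv(A)

# Both orders of multiplication give the identity matrix I_2.
print(np.allclose(A_inv @ A, np.eye(2)))  # True
print(np.allclose(A @ A_inv, np.eye(2)))  # True

# Composing T and T^{-1} as matrix-vector products returns x unchanged.
x = np.array([4.0, -2.0])
print(np.allclose(A_inv @ (A @ x), x))  # True: (T^{-1} o T)(x) = x
print(np.allclose(A @ (A_inv @ x), x))  # True: (T o T^{-1})(x) = x
```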

Finding the matrix inverse

We’ve said that the inverse transformation T^{-1} can be represented as the matrix-vector product T^{-1}(\vec{x})=A^{-1}\vec{x}. Here’s how we can find A^{-1}.

If we start with A, but augment it with the identity matrix, then all we have to do to find A^{-1} is work on the augmented matrix until A is in reduced row-echelon form.

In other words, given the matrix A, we’ll start with the augmented matrix

\begin{bmatrix}A\ |\ I\end{bmatrix}

Through the process of putting A in reduced row-echelon form, I will be transformed into A^{-1}, and we’ll end up with

\begin{bmatrix}I\ |\ A^{-1}\end{bmatrix}

This seems like a magical process, but there’s a very simple reason why it works. Remember earlier in the course that we learned how one row operation could be expressed as an elimination matrix E, and that if we performed lots of row operations, through matrix multiplication of E_1, E_2, E_3, etc., we could find one consolidated elimination matrix.

That’s exactly what we’re doing here. We’re performing row operations on A to change it into I. All those row operations can be expressed as the elimination matrix E, and we’re saying that if we multiply E by A, we’ll get the identity matrix, so EA=I. But as you know, A^{-1}A=I, which means E=A^{-1}.
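The augmented-matrix procedure can be sketched in code. This is a minimal numpy implementation of Gauss-Jordan elimination on [A | I] (the function name and the partial-pivoting detail are our own additions, not from the course text):

```python
import numpy as np

def inverse_via_rref(A):
    """Invert A by row-reducing the augmented matrix [A | I]."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])  # augmented matrix [A | I]
    for col in range(n):
        # Partial pivoting: bring the row with the largest candidate pivot up.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular, so A^{-1} does not exist")
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                   # scale the pivot entry to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # zero out the rest of the column
    return M[:, n:]  # left half is now I, right half is A^{-1}

A = [[2.0, 1.0],
     [1.0, 1.0]]
print(np.allclose(inverse_via_rref(A), np.linalg.inv(A)))  # True
```

In practice you would call `np.linalg.inv` directly; the hand-rolled version is only meant to show why row-reducing [A | I] yields [I | A^{-1}].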

 
 


Finding the inverse transformation

Example

Find A^{-1}.

[Image: the matrix A]

Because A is a 3\times 3 matrix, its associated identity matrix is I_3. So we’ll augment A with I_3.

[Image: A augmented with I_3]


Now we need to put A into reduced row-echelon form. We start by switching R_2 and R_1 to get a 1 in the first entry of the first row.

[Image: the augmented matrix after switching R_1 and R_2]

Now we’ll zero out the rest of the first column.

[Image: the augmented matrix with the rest of the first column zeroed out]

Find the pivot entry in the second row.

[Image: the augmented matrix with a pivot entry in the second row]

Zero out the rest of the second column.

[Image: the augmented matrix with the rest of the second column zeroed out]

Find the pivot entry in the third row.

[Image: the augmented matrix with a pivot entry in the third row]

Zero out the rest of the third column.

[Image: the augmented matrix with the rest of the third column zeroed out]

Now that A has been put into reduced row-echelon form on the left side of the augmented matrix, the identity matrix on the right side has been turned into the inverse matrix A^{-1}. So we can say

[Image: the inverse matrix A^{-1}]
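Since the matrix in this example appears only in screenshots, here is the same sequence of steps sketched in numpy on a hypothetical 3×3 invertible matrix (our own choice, not the one from the images): switch rows to get the first pivot, zero out each column, and scale each pivot to 1.

```python
import numpy as np

# Hypothetical 3x3 invertible matrix used to mirror the example's steps.
A = np.array([[0.0,  1.0, 2.0],
              [1.0,  0.0, 3.0],
              [4.0, -3.0, 8.0]])
M = np.hstack([A, np.eye(3)])   # augment A with I_3

M[[0, 1]] = M[[1, 0]]           # switch R_1 and R_2 to get a 1 in the first pivot
M[2] -= 4 * M[0]                # zero out the rest of the first column
M[2] += 3 * M[1]                # zero out the rest of the second column
M[2] /= 2.0                     # scale the third pivot entry to 1
M[0] -= 3 * M[2]                # zero out the rest of the third column
M[1] -= 2 * M[2]

A_inv = M[:, 3:]                # right side of [I | A^{-1}]
print(np.allclose(A @ A_inv, np.eye(3)))  # True
```

The exact row operations depend on the matrix you start with, but the pattern is the same as in the example: work column by column until the left side becomes I_3, and read A^{-1} off the right side.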
 
 
