How to find eigenvalues, eigenvectors, and eigenspaces
What are eigenvectors and eigenvalues?
Any vector $v$ that satisfies $T(v)=\lambda v$ is an eigenvector for the transformation $T$, and $\lambda$ is the eigenvalue that’s associated with the eigenvector $v$. The transformation $T$ is a linear transformation that can also be represented as $T(v)=Av$.
The first thing you want to notice about $T(v)=\lambda v$ is that, because $\lambda$ is a constant that acts like a scalar on $v$, we’re saying that the transformation of $v$, $T(v)$, is really just a scaled version of $v$.
We could also say that the eigenvectors are the vectors that don’t change direction when we apply the transformation matrix $A$. So if we apply $A$ to a vector $v$, and the result $Av$ is parallel to the original $v$, then $v$ is an eigenvector.
Identifying eigenvectors
In other words, if we define a specific transformation $T$ that maps vectors from $\mathbb{R}^2$ to $\mathbb{R}^2$, then there may be certain vectors in the domain that change direction under the transformation $T$. For instance, maybe the transformation rotates vectors by $90^\circ$. Vectors that rotate by $90^\circ$ will never satisfy $T(v)=\lambda v$, because a rotated vector no longer lies along its original line.
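To see that numerically, here’s a minimal numpy sketch (the specific $90^\circ$ rotation matrix is just an illustration) showing that such a rotation has no real eigenvalues, and therefore no real eigenvectors:

```python
import numpy as np

# Matrix that rotates every vector in R^2 by 90 degrees counterclockwise.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

print(np.linalg.eigvals(R))  # [0.+1.j 0.-1.j]: purely imaginary,
                             # so no real vector satisfies R v = lambda v
```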
But there may be other vectors in the domain that stay along the same line under the transformation, and might just get scaled up or scaled down by $\lambda$. Those are the vectors that will satisfy $T(v)=\lambda v$, which means that those are the eigenvectors for $T$. And this makes sense, because $T(v)=\lambda v$ literally reads “the transformed version of $v$ is the same as the original $v$, but just scaled up or down by $\lambda$.”
The way to really identify an eigenvector is to compare the span of $v$ with the span of $T(v)$. The span of any single vector will always be a line. If, under the transformation $T$, the span remains the same, such that $T(v)$ has the same span as $v$, then you know $v$ is an eigenvector. The vectors $v$ and $T(v)$ might be different lengths, but their spans are the same because they lie along the same line.
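To make that “same span” test concrete, here’s a minimal numpy sketch (the helper name `is_eigenvector` and the matrix are illustrative assumptions, not part of the lesson) that checks whether $Av$ lies along the same line as $v$:

```python
import numpy as np

def is_eigenvector(A, v, tol=1e-10):
    """Check whether A @ v lies on the same line through the origin as v."""
    v = np.asarray(v, dtype=float)
    if np.allclose(v, 0):
        return False  # the zero vector is excluded by convention
    Av = A @ v
    # Two 2D vectors are parallel exactly when their cross term vanishes.
    return abs(v[0] * Av[1] - v[1] * Av[0]) < tol

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])  # illustrative matrix (the one used in the example below)

print(is_eigenvector(A, [1, 2]))  # True:  A maps (1, 2) to (5, 10)
print(is_eigenvector(A, [1, 0]))  # False: A maps (1, 0) to (1, 4)
```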
The reason we care about identifying eigenvectors is because they often make good basis vectors for the subspace, and we’re always interested in finding a simple, easy-to-work-with basis.
Finding eigenvalues
Because we’ve said that $T(v)=Av$ and $T(v)=\lambda v$, it has to be true that $Av=\lambda v$. Which means eigenvectors are any vectors $v$ that satisfy $Av=\lambda v$.
We also know that there will be at most $2$ eigenvalues when $A$ is $2\times2$, at most $3$ eigenvalues when $A$ is $3\times3$, and at most $n$ eigenvalues when $A$ is $n\times n$.
While $v=\vec{O}$ would satisfy $Av=\lambda v$, we don’t really include the zero vector as an eigenvector. The reason is first, because it doesn’t really give us any interesting information, and second, because $v=\vec{O}$ doesn’t allow us to determine the associated eigenvalue $\lambda$, since $A\vec{O}=\lambda\vec{O}$ holds for every value of $\lambda$.
So we’re really only interested in the vectors $v$ that are nonzero. If we rework $Av=\lambda v$, we could write it as

$$Av-\lambda v=\vec{O}$$

$$Av-\lambda I_nv=\vec{O}$$

$$(A-\lambda I_n)v=\vec{O}$$
Realize that this is just a matrix-vector product, set equal to the zero vector, because $A-\lambda I_n$ is just a matrix. The eigenvalue $\lambda$ acts as a scalar on the identity matrix $I_n$, which means $\lambda I_n$ will be an $n\times n$ matrix. If, from the $n\times n$ matrix $A$, we subtract the $n\times n$ matrix $\lambda I_n$, we’ll still just get another $n\times n$ matrix, which is why $A-\lambda I_n$ is an $n\times n$ matrix. So let’s make a substitution $B=A-\lambda I_n$, which turns $(A-\lambda I_n)v=\vec{O}$ into

$$Bv=\vec{O}$$
Written this way, we can see that any vector $v$ that satisfies $Bv=\vec{O}$ will be in the null space of $B$, $N(B)$. But we already said that $v$ was going to be nonzero, which tells us right away that there must be at least one vector in the null space that’s not the zero vector. Whenever we know that there’s a vector in the null space other than the zero vector, we conclude that the matrix $B$ (the matrix $A-\lambda I_n$) has linearly dependent columns, that $B$ is not invertible, and that the determinant of $B$ is $0$, $\det(B)=0$.
Which means we could come up with these rules:
$(A-\lambda I_n)v=\vec{O}$ for nonzero vectors $v$ if and only if $\det(A-\lambda I_n)=0$.
$\lambda$ is an eigenvalue of $A$ if and only if $\det(A-\lambda I_n)=0$.
With these rules in mind, we have everything we need to find the eigenvalues for a particular matrix.
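As a quick sanity check on that second rule, here’s a small numpy sketch (the matrix and the helper name `char_det` are assumed for illustration; the matrix is the one we’ll use in the example below) showing that $\det(A-\lambda I)$ is zero precisely when $\lambda$ is an eigenvalue:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])  # assumed example matrix

def char_det(lam):
    """Evaluate det(A - lambda * I) at a candidate eigenvalue lambda."""
    return np.linalg.det(A - lam * np.eye(2))

for lam in [5.0, -1.0, 2.0]:
    print(lam, char_det(lam))
# 5.0 and -1.0 give (approximately) 0, so they're eigenvalues;
# 2.0 gives a nonzero determinant, so it isn't one.
```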
Finding the eigenvalues of the transformation
Example
Find the eigenvalues of the transformation matrix.

$$A=\begin{bmatrix}1 & 2\\ 4 & 3\end{bmatrix}$$
We need to find the determinant $|A-\lambda I_2|$, so first we’ll build the matrix $A-\lambda I_2$.

$$A-\lambda I_2=\begin{bmatrix}1 & 2\\ 4 & 3\end{bmatrix}-\lambda\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix}=\begin{bmatrix}1-\lambda & 2\\ 4 & 3-\lambda\end{bmatrix}$$
Then the determinant of this resulting matrix is

$$|A-\lambda I_2|=(1-\lambda)(3-\lambda)-(2)(4)$$

$$|A-\lambda I_2|=\lambda^2-4\lambda-5$$
This polynomial is called the characteristic polynomial. Remember that we’re trying to satisfy $|A-\lambda I_n|=0$, so we can set this characteristic polynomial equal to $0$, and get the characteristic equation:

$$\lambda^2-4\lambda-5=0$$
To solve for $\lambda$, we’ll always try factoring, but if the polynomial can’t be factored, we can either complete the square or use the quadratic formula. This one can be factored.
$$(\lambda-5)(\lambda+1)=0$$

$\lambda=5$ or $\lambda=-1$
So assuming non-zero eigenvectors, we’re saying that $Av=\lambda v$ can be solved for $\lambda=5$ and $\lambda=-1$.
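If you want to double-check the factoring, here’s a short numpy sketch (assuming the same example matrix as above) that finds the roots of the characteristic polynomial and compares them against numpy’s built-in eigenvalue routine:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

# Roots of the characteristic polynomial lambda^2 - 4*lambda - 5,
# given by its coefficients [1, -4, -5].
print(np.roots([1, -4, -5]))   # [ 5. -1.]

# numpy computes the same eigenvalues directly from A.
print(np.linalg.eigvals(A))    # [-1.  5.] (order may differ)
```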
We want to make a couple of important points, which are both illustrated by this last example.
First, the sum of the eigenvalues will always equal the sum of the matrix entries that run down its diagonal. In the matrix $A$ from the example, the values down the diagonal were $1$ and $3$. Their sum is $4$, which means the sum of the eigenvalues will be $4$ as well, and indeed $5+(-1)=4$. The sum of the entries along the diagonal is called the trace of the matrix, so we can say that the trace will always be equal to the sum of the eigenvalues.
Realize that this also means that, for an $n\times n$ matrix $A$, once we find $n-1$ of the eigenvalues, we’ll already have the value of the $n$th eigenvalue.
Second, the determinant of $A$, $\det(A)$, will always be equal to the product of the eigenvalues. In the last example, $\det(A)=(1)(3)-(2)(4)=-5$, and the product of the eigenvalues was $(5)(-1)=-5$.
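Both facts are easy to verify numerically. A minimal sketch, using the same assumed example matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])
eigenvalues = np.linalg.eigvals(A)

# trace(A) equals the sum of the eigenvalues: 1 + 3 = 5 + (-1) = 4
print(np.trace(A), np.sum(eigenvalues))

# det(A) equals the product of the eigenvalues: -5 = (5)(-1)
print(np.linalg.det(A), np.prod(eigenvalues))
```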
Finding eigenvectors
Once we’ve found the eigenvalues for the transformation matrix, we need to find their associated eigenvectors. To do that, we’ll start by defining an eigenspace for each eigenvalue of the matrix.
The eigenspace $E_\lambda$ for a specific eigenvalue $\lambda$ is the set of all the eigenvectors $v$ that satisfy $Av=\lambda v$ for that particular eigenvalue $\lambda$ (together with the zero vector, so that the eigenspace is a subspace).
As we know, we were able to rewrite $Av=\lambda v$ as $(A-\lambda I_n)v=\vec{O}$, and we recognized that $A-\lambda I_n$ is just a matrix. So the eigenspace $E_\lambda$ is simply the null space of the matrix $A-\lambda I_n$:

$$E_\lambda=N(A-\lambda I_n)$$
To find the matrix $A-\lambda I_n$, we can simply plug each eigenvalue $\lambda$ into the expression we found earlier for $A-\lambda I_2$. Let’s continue on with the previous example and find the eigenvectors associated with $\lambda=5$ and $\lambda=-1$.
Example
For the transformation matrix $A=\begin{bmatrix}1 & 2\\ 4 & 3\end{bmatrix}$, we found eigenvalues $\lambda=5$ and $\lambda=-1$. Find the eigenvectors associated with each eigenvalue.
With $\lambda=5$ and $\lambda=-1$, we’ll have two eigenspaces, given by $E_\lambda=N(A-\lambda I_2)$. With

$$A-\lambda I_2=\begin{bmatrix}1-\lambda & 2\\ 4 & 3-\lambda\end{bmatrix}$$

we get

$$E_5=N\left(\begin{bmatrix}1-5 & 2\\ 4 & 3-5\end{bmatrix}\right)=N\left(\begin{bmatrix}-4 & 2\\ 4 & -2\end{bmatrix}\right)$$

and

$$E_{-1}=N\left(\begin{bmatrix}1-(-1) & 2\\ 4 & 3-(-1)\end{bmatrix}\right)=N\left(\begin{bmatrix}2 & 2\\ 4 & 4\end{bmatrix}\right)$$
Therefore, the eigenvectors in the eigenspace $E_5$ will satisfy

$$\begin{bmatrix}-4 & 2\\ 4 & -2\end{bmatrix}\begin{bmatrix}v_1\\ v_2\end{bmatrix}=\begin{bmatrix}0\\ 0\end{bmatrix}$$
The first row gives $-4v_1+2v_2=0$ (the second row is just a multiple of the first). So with $v_2=2v_1$, we’ll substitute, and say that

$$\begin{bmatrix}v_1\\ v_2\end{bmatrix}=\begin{bmatrix}v_1\\ 2v_1\end{bmatrix}=v_1\begin{bmatrix}1\\ 2\end{bmatrix}$$
Which means that $E_5$ is defined by

$$E_5=\text{Span}\left(\begin{bmatrix}1\\ 2\end{bmatrix}\right)$$
And the eigenvectors in the eigenspace $E_{-1}$ will satisfy

$$\begin{bmatrix}2 & 2\\ 4 & 4\end{bmatrix}\begin{bmatrix}v_1\\ v_2\end{bmatrix}=\begin{bmatrix}0\\ 0\end{bmatrix}$$
The first row gives $2v_1+2v_2=0$. And with $v_2=-v_1$, we’ll substitute, and say that

$$\begin{bmatrix}v_1\\ v_2\end{bmatrix}=\begin{bmatrix}v_1\\ -v_1\end{bmatrix}=v_1\begin{bmatrix}1\\ -1\end{bmatrix}$$
Which means that $E_{-1}$ is defined by

$$E_{-1}=\text{Span}\left(\begin{bmatrix}1\\ -1\end{bmatrix}\right)$$
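Both spanning vectors can be double-checked by computing the null spaces directly. Here’s a short sketch using sympy (an assumed tool choice, not part of the lesson):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2],
            [4, 3]])

# Plug each eigenvalue into A - lambda*I and take the null space.
for lam in [5, -1]:
    B = A - lam * eye(2)
    print(lam, B.nullspace())
# lambda = 5  gives span of (1/2, 1), i.e. the line through (1, 2)
# lambda = -1 gives span of (-1, 1), i.e. the line through (1, -1)
```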
If we put these last two examples together (the first one where we found the eigenvalues, and this second one where we found the associated eigenvectors), we can sketch a picture of the solution. For the eigenvalue $\lambda=5$, we got

$$E_5=\text{Span}\left(\begin{bmatrix}1\\ 2\end{bmatrix}\right)$$
We can sketch the spanning eigenvector $(1,2)$,
and then say that the eigenspace $E_5$ for $\lambda=5$ is the set of all the vectors that lie along the line created by $(1,2)$.
Then for the eigenvalue $\lambda=-1$, we got

$$E_{-1}=\text{Span}\left(\begin{bmatrix}1\\ -1\end{bmatrix}\right)$$
We can add to our sketch the spanning eigenvector $(1,-1)$,
and then say that the eigenspace $E_{-1}$ for $\lambda=-1$ is the set of all the vectors that lie along the line created by $(1,-1)$.
In other words, we know that, for any vector $v$ along either of these lines, when you apply the transformation $A$ to the vector $v$, the result $Av$ will be a vector along the same line; it might just be scaled up or scaled down.
Specifically,
since $\lambda=5$ in the eigenspace $E_5$, any vector $v$ in $E_5$, under the transformation $A$, will be scaled by a factor of $5$, meaning that $Av=5v$, and
since $\lambda=-1$ in the eigenspace $E_{-1}$, any vector $v$ in $E_{-1}$, under the transformation $A$, will be scaled by a factor of $-1$ (flipped to point in the opposite direction along the same line), meaning that $Av=-v$.
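As a closing check, here’s a minimal numpy sketch (same assumed example matrix) confirming both scalings directly:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

v5 = np.array([1.0, 2.0])    # spanning eigenvector of E_5
vm1 = np.array([1.0, -1.0])  # spanning eigenvector of E_{-1}

print(A @ v5, 5 * v5)        # [ 5. 10.] [ 5. 10.]  -> Av = 5v
print(A @ vm1, -1 * vm1)     # [-1.  1.] [-1.  1.]  -> Av = -v
```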