Eigen Values and Eigen Vectors
Definition: A nonzero vector $\vec{x}$ is an eigenvector (or characteristic vector) of a square matrix $A$ if there exists a scalar $\lambda$ such that $A\vec{x} = \lambda\vec{x}$. Then $\lambda$ is the eigenvalue (or characteristic value) of $A$.
What exactly does the above mean? If a square matrix is multiplied by a vector, the result is a vector: $A\vec{x} = \vec{y}$, where $A$ is a square matrix, and $\vec{x}$ and $\vec{y}$ are vectors. Sometimes, it is possible to find a vector $\vec{x}$ and a scalar $\lambda$ so that the following equality is true: $A\vec{x} = \lambda\vec{x}$. The scalar $\lambda$ is the eigenvalue and the vector $\vec{x}$ is the eigen vector (which is never a zero vector).
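As a quick numerical check of that definition, here is a minimal sketch in Python with NumPy. The matrix and vector are purely illustrative (they are not the chapter's example):

```python
import numpy as np

# An illustrative 2x2 matrix (not the chapter's example matrix)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# x = [1, 1] happens to be an eigenvector of this A:
# multiplying by A just scales it
x = np.array([1.0, 1.0])
y = A @ x                       # y = [3, 3] = 3 * x

lam = 3.0                       # the corresponding eigenvalue
print(np.allclose(y, lam * x))  # True: A x = lambda x holds
```

Multiplying by $A$ did not rotate $\vec{x}$ at all; it only stretched it by the factor $\lambda = 3$.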
Why would we want to do this? In spatial analysis, we might be interested in assessing the strength of [spatial autocorrelation](https://en.wikipedia.org/wiki/Spatial_analysis#Spatial_autocorrelation), and it has been shown that a relationship exists between measures of spatial autocorrelation (for example, Moran's I) and the eigenvalues computed from a spatial weights object. The rest of this chapter explores both of those ideas; the takeaway here is simply that eigen values and vectors can be viewed as summarizations of more complex matrix representations.
Computing Eigen Values and Eigen Vectors
The computation of eigen values and vectors will utilize many of the linear algebra skills practiced in the previous two chapters, including the use of an identity matrix, matrix multiplication, and elementary row operations.
I like to look at computing eigen values using an example. Starting from the definition, $A\vec{x} = \lambda\vec{x}$, we want to compute the eigen values first:

$$A\vec{x} - \lambda I\vec{x} = \vec{0} \quad \text{or} \quad (A - \lambda I)\vec{x} = \vec{0}$$

then the determinant,

$$\det(A - \lambda I) = 0$$

so, expanding the determinant gives the characteristic polynomial (for a $2 \times 2$ matrix, a quadratic in $\lambda$), and the roots of that polynomial are the eigen values $\lambda_1$ and $\lambda_2$.

Reminder: For a $2 \times 2$ matrix, these are computed using the quadratic formula!
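The determinant step can be sketched numerically. Assuming an illustrative $2 \times 2$ matrix (not the chapter's example), the characteristic polynomial $\lambda^2 - \mathrm{tr}(A)\lambda + \det(A) = 0$ is solved with the quadratic formula and then cross-checked against `numpy.linalg.eigvals`:

```python
import numpy as np

# Illustrative 2x2 matrix (the same procedure works for any square matrix)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a 2x2 matrix: det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A)
b = -np.trace(A)                # coefficient on lambda
c = np.linalg.det(A)            # constant term

# Quadratic formula: lambda = (-b +/- sqrt(b^2 - 4c)) / 2
disc = np.sqrt(b**2 - 4 * c)
lam1 = (-b + disc) / 2          # approximately 3.0
lam2 = (-b - disc) / 2          # approximately 1.0

# Cross-check with NumPy's built-in eigenvalue routine
print(sorted([lam1, lam2]))               # approximately [1.0, 3.0]
print(sorted(np.linalg.eigvals(A).real))  # approximately [1.0, 3.0]
```

The two approaches agree up to floating-point error, which is a useful sanity check when working examples by hand.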
Once we have computed the eigen values, it is time to compute the eigen vectors. From above, we know that the following equality must hold: $A\vec{x} = \lambda\vec{x}$, or $(A - \lambda I)\vec{x} = \vec{0}$.
So this is now a 'plug and chug' operation for a system of equations. Even though we officially look at systems of linear equations next week, you have already been solving them using elementary row operations! To keep rolling with the example, plug an eigen value into $(A - \lambda I)\vec{x} = \vec{0}$ and, with a little algebraic manipulation:
From the top row, we can solve for $x_1$ in terms of $x_2$; in this example, $x_1 = -x_2$. The second row gives the same relationship, since the rows of $A - \lambda I$ are linearly dependent. So the values in the eigen vector must have equal and opposite signs. Notice that the actual scalar values in the vector do not matter; in other words, the magnitude does not matter. What does matter is the direction.
The process described above is then repeated for each of the other eigen values: plug the eigen value into $(A - \lambda I)\vec{x} = \vec{0}$ and solve.
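That plug-and-solve step can be sketched numerically as well. Using an illustrative matrix whose eigen vector also has equal and opposite signs (again, an assumption, not the chapter's lost example), we plug an eigen value into $(A - \lambda I)\vec{x} = \vec{0}$, read the direction off the top row, and check against NumPy:

```python
import numpy as np

# Illustrative matrix with eigen values 1 and 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam = 1.0                      # plug in one of the eigen values
M = A - lam * np.eye(2)        # (A - lambda*I) = [[1, 1], [1, 1]]

# Top row reads x1 + x2 = 0, so x1 = -x2: equal and opposite signs.
# Any nonzero scaling of [1, -1] works; magnitude does not matter.
x = np.array([1.0, -1.0])
print(np.allclose(M @ x, 0))        # True: x is in the null space
print(np.allclose(A @ x, lam * x))  # True: the defining equality holds

# NumPy returns unit-length eigen vectors, but the direction
# (up to sign) matches ours.
vals, vecs = np.linalg.eig(A)
print(vecs[:, np.isclose(vals, lam)].ravel())  # proportional to [1, -1]
```

Note that NumPy normalizes its eigen vectors to length one and may flip the sign; both are fine, because only the direction is meaningful.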
Additional Examples
Once again, Paul's Online Math Notes has a number of wonderful examples that step through computing the eigen values and then solving the resulting system of linear equations.
Supplemental Video
Properties
Here are a few interesting properties of eigenvalues and vectors:
- the sum of the eigen values is equal to the trace of the matrix
- a matrix is singular if and only if it has a zero eigenvalue
- if $\lambda$ is an eigen value of $A$ and $A$ is invertible, then $1/\lambda$ is an eigen value of $A^{-1}$
- if $\lambda$ is an eigenvalue of $A$, then it is also an eigenvalue of $A^{T}$
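Each of these properties can be checked numerically. A minimal sketch in Python (the matrix is illustrative; `numpy.linalg` does the heavy lifting):

```python
import numpy as np

# Illustrative invertible (non-singular) matrix; eigen values are 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals = np.linalg.eigvals(A)

# 1. The sum of the eigen values equals the trace of the matrix.
print(np.isclose(vals.sum(), np.trace(A)))              # True

# 2. A matrix is singular iff it has a zero eigenvalue;
#    here det(A) != 0, so no eigenvalue is (near) zero.
print(np.any(np.isclose(vals, 0)))                      # False

# 3. If lambda is an eigen value of invertible A,
#    then 1/lambda is an eigen value of A^-1.
inv_vals = np.linalg.eigvals(np.linalg.inv(A))
print(np.allclose(sorted(1 / vals), sorted(inv_vals)))  # True

# 4. A and its transpose share the same eigen values.
print(np.allclose(sorted(vals), sorted(np.linalg.eigvals(A.T))))  # True
```

These checks are not proofs, of course, but they are a quick way to convince yourself the properties hold on any concrete matrix you care about.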
Who cares?
I like the following two webpages:
- this math.stackexchange post where the top answer is quite good
- this blog-style post looking at what eigen vectors are, visually


