10: Eigenvalues and Eigenvectors
Most vectors change direction when you apply a matrix to them. But eigenvectors don’t: they stay on their own line, just getting stretched or compressed. These special directions reveal the “skeleton” of a transformation. The stretch factor is the eigenvalue. Together, eigenvalues and eigenvectors tell you what a matrix really does, stripped of coordinate-system noise.
Finding eigenvectors means asking: “What directions does this transformation leave invariant?” The answer unlocks everything: stability analysis, matrix powers, differential equations, Google’s PageRank. Eigenvectors are the natural coordinates where complicated transformations become simple.
The Definition
(Eigenvector and Eigenvalue)
Let $A$ be an $n \times n$ matrix. A nonzero vector $v$ is an eigenvector of $A$ if there exists a scalar $\lambda$ such that:
$$Av = \lambda v$$
The scalar $\lambda$ is the eigenvalue corresponding to $v$.
What this means: Applying $A$ to $v$ just scales $v$: no rotation, no shear, just stretching (or compressing, or flipping) by factor $\lambda$.
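The defining property is easy to check numerically. A minimal sketch (NumPy assumed; the matrix and eigenpair below are illustrative values, not from a specific dataset):

```python
import numpy as np

# Illustrative matrix with a known eigenpair (assumed example values).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
v = np.array([1.0, 1.0])   # candidate eigenvector
lam = 5.0                  # candidate eigenvalue

# The defining property: applying A only scales v by lam.
scaled = A @ v
```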
(Why “Eigen”?)
The German word eigen means “characteristic” or “own.” Eigenvectors are the matrix’s “own vectors”: the directions intrinsic to the transformation itself, independent of any choice of coordinates.
Geometric Intuition
(Invariant Directions)
Think of a transformation as a physical process: stretching rubber, flowing water, rotating machinery. Most directions get mixed: a vector pointing northeast might end up pointing south after transformation.
But eigenvectors are special: they point along axes that the transformation respects. Apply the matrix, and the vector stays on its own line.
Example: Consider a transformation that:
- Stretches horizontally by factor 3
- Stretches vertically by factor 2
The eigenvectors are $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ (horizontal, $\lambda = 3$) and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ (vertical, $\lambda = 2$).
(Eigenvalues as Stretch Factors)
The eigenvalue $\lambda$ tells you how much the eigenvector gets scaled:
- $\lambda > 1$: Stretch (expansion)
- $0 < \lambda < 1$: Compression
- $\lambda = 1$: No change
- $\lambda = 0$: Collapse to zero
- $\lambda < 0$: Flip direction, then scale by $|\lambda|$
Example: $\lambda = -2$ means the eigenvector flips and doubles in length.
(Visualization in 2D)
For a $2 \times 2$ matrix, eigenvectors define two special axes. The transformation:
- Stretches along the first eigenvector by $\lambda_1$
- Stretches along the second eigenvector by $\lambda_2$
Every other vector is a combination of these eigenvectors, so it gets stretched in a complicated way. But the eigenvectors themselves just scale.
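This picture can be sketched numerically using the stretch-by-3-horizontally, stretch-by-2-vertically transformation from above (NumPy assumed):

```python
import numpy as np

# The transformation that stretches x by 3 and y by 2.
A = np.diag([3.0, 2.0])
e1 = np.array([1.0, 0.0])   # horizontal eigenvector, eigenvalue 3
e2 = np.array([0.0, 1.0])   # vertical eigenvector, eigenvalue 2

# A generic vector mixes both eigenvectors, so its direction changes:
w = e1 + e2            # points at 45 degrees
Aw = A @ w             # lands at (3, 2): no longer on the line through w
cross = Aw[0] * w[1] - Aw[1] * w[0]   # nonzero => direction changed
```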
Finding Eigenvalues
(The Characteristic Equation)
Rewrite $Av = \lambda v$ as:
$$(A - \lambda I)v = 0$$
For a nontrivial solution $v \neq 0$ to exist, the matrix $A - \lambda I$ must be singular:
$$\det(A - \lambda I) = 0$$
This is the characteristic equation. It’s a polynomial equation in $\lambda$.
(The Characteristic Polynomial)
Expanding $\det(A - \lambda I)$ gives a polynomial of degree $n$ (for an $n \times n$ matrix):
$$p(\lambda) = (-1)^n \lambda^n + c_{n-1} \lambda^{n-1} + \cdots + c_1 \lambda + c_0$$
The roots of this polynomial are the eigenvalues.
Key fact: An $n \times n$ matrix has exactly $n$ eigenvalues (counting multiplicity), though some may be complex.
(Example: 2×2 Matrix)
Find the eigenvalues of:
$$A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$$
Step 1: Compute $A - \lambda I$:
$$A - \lambda I = \begin{pmatrix} 4 - \lambda & 1 \\ 2 & 3 - \lambda \end{pmatrix}$$
Step 2: Find the determinant:
$$\det(A - \lambda I) = (4 - \lambda)(3 - \lambda) - 2 = \lambda^2 - 7\lambda + 10$$
Step 3: Solve $\lambda^2 - 7\lambda + 10 = 0$:
$$(\lambda - 5)(\lambda - 2) = 0$$
Eigenvalues: $\lambda_1 = 5$, $\lambda_2 = 2$.
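The hand computation can be spot-checked numerically; the matrix below is an illustrative 2×2 with eigenvalues 5 and 2 (any matrix with trace 7 and determinant 10 behaves the same):

```python
import numpy as np

# Illustrative 2x2 matrix with eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# For a 2x2, the characteristic polynomial is lam^2 - tr(A) lam + det(A):
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = sorted(np.roots(coeffs))          # roots of the polynomial

# Compare against the library eigenvalue routine:
eigs = sorted(np.linalg.eigvals(A))
```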
(Example: 3×3 Matrix)
Find the eigenvalues of:
$$B = \begin{pmatrix} 2 & 1 & 3 \\ 0 & 5 & 4 \\ 0 & 0 & 7 \end{pmatrix}$$
This is upper triangular, so the eigenvalues are the diagonal entries: $\lambda = 2, 5, 7$.
Key shortcut: For triangular (or diagonal) matrices, eigenvalues are just the diagonal entries.
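A quick check of the shortcut on a hypothetical upper-triangular matrix (NumPy assumed; entries above the diagonal are arbitrary):

```python
import numpy as np

# Hypothetical upper-triangular matrix; entries above the diagonal
# do not affect the eigenvalues.
T = np.array([[2.0, 1.0, 3.0],
              [0.0, 5.0, 4.0],
              [0.0, 0.0, 7.0]])

eigs = sorted(np.linalg.eigvals(T).real)
diag = sorted(np.diag(T))
```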
Finding Eigenvectors
Once you have an eigenvalue $\lambda$, find its eigenvectors by solving:
$$(A - \lambda I)v = 0$$
This is a homogeneous system: row reduce and find the null space of $A - \lambda I$.
(Example: Eigenvectors for $\lambda = 5$)
From before, $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$ has $\lambda_1 = 5$.
Row reduce $A - 5I$:
$$A - 5I = \begin{pmatrix} -1 & 1 \\ 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix}$$
From $x_1 - x_2 = 0$, we get $x_1 = x_2$.
Eigenvector: $v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ (or any nonzero scalar multiple).
Verify: $Av_1 = \begin{pmatrix} 5 \\ 5 \end{pmatrix} = 5v_1$.
(Example: Eigenvectors for $\lambda = 2$)
Row reduce $A - 2I$:
$$A - 2I = \begin{pmatrix} 2 & 1 \\ 2 & 1 \end{pmatrix} \to \begin{pmatrix} 2 & 1 \\ 0 & 0 \end{pmatrix}$$
From $2x_1 + x_2 = 0$, we get $x_2 = -2x_1$.
Eigenvector: $v_2 = \begin{pmatrix} 1 \\ -2 \end{pmatrix}$.
Verify: $Av_2 = \begin{pmatrix} 2 \\ -6 \end{pmatrix} = 2v_2$.
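Both eigenpairs can be verified in one go (NumPy assumed; the matrix is the illustrative 2×2 with eigenvalues 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

v5 = np.array([1.0, 1.0])    # eigenvector for lam = 5
v2 = np.array([1.0, -2.0])   # eigenvector for lam = 2

ok5 = np.allclose(A @ v5, 5 * v5)
ok2 = np.allclose(A @ v2, 2 * v2)
```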
Eigenspaces
(Definition)
For each eigenvalue $\lambda$, the eigenspace $E_\lambda$ is the set of all eigenvectors corresponding to $\lambda$, plus the zero vector:
$$E_\lambda = N(A - \lambda I) = \{v : Av = \lambda v\}$$
The eigenspace is a subspace of $\mathbb{R}^n$.
(Geometric Multiplicity)
The geometric multiplicity of $\lambda$ is:
$$\dim E_\lambda = \dim N(A - \lambda I)$$
This is the number of linearly independent eigenvectors for $\lambda$.
In the example above:
- $E_5$ has dimension 1
- $E_2$ has dimension 1
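Since the geometric multiplicity is a null-space dimension, rank-nullity lets us compute it as $n - \operatorname{rank}(A - \lambda I)$. A sketch (NumPy assumed; `geometric_multiplicity` is a hypothetical helper name):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

def geometric_multiplicity(A, lam):
    """dim N(A - lam*I) = n - rank(A - lam*I), by rank-nullity."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

g5 = geometric_multiplicity(A, 5.0)   # dim E_5
g2 = geometric_multiplicity(A, 2.0)   # dim E_2
```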
Algebraic vs Geometric Multiplicity
(Algebraic Multiplicity)
The algebraic multiplicity of $\lambda$ is how many times $\lambda$ appears as a root of the characteristic polynomial.
Example: If $p(\lambda) = (\lambda - 3)^2 (\lambda - 1)$, then:
- $\lambda = 3$ has algebraic multiplicity 2
- $\lambda = 1$ has algebraic multiplicity 1
(Key Inequality)
For any eigenvalue $\lambda$:
$$1 \leq \text{geometric multiplicity of } \lambda \leq \text{algebraic multiplicity of } \lambda$$
When they’re equal: The matrix behaves “nicely” for that eigenvalue; there are as many independent eigenvectors as the root’s multiplicity allows.
When they differ: The matrix is defective for that eigenvalue; there aren’t enough independent eigenvectors.
(Example: Defective Matrix)
Consider:
$$A = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$$
Characteristic polynomial:
$$\det(A - \lambda I) = (2 - \lambda)^2$$
$\lambda = 2$ has algebraic multiplicity 2.
Find eigenvectors:
$$A - 2I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$$
From $x_2 = 0$, the eigenspace is:
$$E_2 = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}$$
Geometric multiplicity: 1 (only one independent eigenvector).
Since geometric < algebraic, this matrix is defective and cannot be diagonalized.
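The defect shows up numerically: the eigenvalue is a double root, but the eigenspace is only one-dimensional (NumPy assumed; the matrix is the classic defective example $\begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$):

```python
import numpy as np

# Defective matrix: lam = 2 is a double root of (2 - lam)^2.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

alg_mult = 2   # from the characteristic polynomial
geo_mult = A.shape[0] - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
```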
Properties of Eigenvalues and Eigenvectors
(Linearity of Eigenspaces)
If $v_1$ and $v_2$ are eigenvectors with the same eigenvalue $\lambda$, then any nonzero linear combination $c_1 v_1 + c_2 v_2$ is also an eigenvector with eigenvalue $\lambda$.
Proof:
$$A(c_1 v_1 + c_2 v_2) = c_1 A v_1 + c_2 A v_2 = c_1 \lambda v_1 + c_2 \lambda v_2 = \lambda (c_1 v_1 + c_2 v_2)$$
This is why eigenspaces are subspaces.
(Independence Across Eigenvalues)
Theorem: Eigenvectors corresponding to distinct eigenvalues are linearly independent.
Why this matters: If you have $n$ distinct eigenvalues for an $n \times n$ matrix, you automatically have $n$ independent eigenvectors: the matrix is diagonalizable.
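One way to see the independence numerically: stack the eigenvectors as columns and check that the resulting matrix has full rank (NumPy assumed; the illustrative matrix has distinct eigenvalues 5 and 2):

```python
import numpy as np

# Distinct eigenvalues (5 and 2) => independent eigenvectors.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

vals, vecs = np.linalg.eig(A)        # eigenvectors are the columns of vecs
rank = np.linalg.matrix_rank(vecs)   # full rank => columns independent
```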
(Eigenvalues of Special Matrices)
1. Triangular matrices: Eigenvalues are the diagonal entries.
2. Identity matrix: $\lambda = 1$ with multiplicity $n$ (every vector is an eigenvector).
3. Zero matrix: $\lambda = 0$ with multiplicity $n$.
4. Projection matrix: Eigenvalues are 0 and 1 only.
5. Rotation matrix (by angle $\theta$): Complex eigenvalues $e^{i\theta}$ and $e^{-i\theta}$. For $\theta$ not a multiple of $\pi$, there are no real eigenvectors: rotations have no invariant directions in 2D.
The Trace and Determinant
(Sum of Eigenvalues = Trace)
The trace of $A$ (sum of diagonal entries) equals the sum of eigenvalues:
$$\operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$$
Example: $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$ has eigenvalues 5 and 2, and $\operatorname{tr}(A) = 4 + 3 = 7 = 5 + 2$.
(Product of Eigenvalues = Determinant)
The determinant of $A$ equals the product of eigenvalues:
$$\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$$
Example: $\det(A) = 4 \cdot 3 - 1 \cdot 2 = 10 = 5 \cdot 2$.
Why this matters: If $\det(A) = 0$, then at least one eigenvalue is zero, meaning $A$ is singular.
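Both identities are easy to spot-check (NumPy assumed; the matrix is the illustrative example with eigenvalues 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])    # eigenvalues 5 and 2
vals = np.linalg.eigvals(A)

trace_matches = np.isclose(np.trace(A), vals.sum())       # 7 = 5 + 2
det_matches = np.isclose(np.linalg.det(A), vals.prod())   # 10 = 5 * 2
```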
Complex Eigenvalues
(Real Matrices Can Have Complex Eigenvalues)
Even if $A$ has all real entries, its eigenvalues might be complex.
Example: Rotation by $90^\circ$:
$$R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$
Characteristic polynomial:
$$\det(R - \lambda I) = \lambda^2 + 1 = 0$$
Eigenvalues: $\lambda = \pm i$ (purely imaginary).
Interpretation: Rotations in 2D have no real invariant directions: every vector gets rotated off its line.
(Complex Eigenvalues Come in Conjugate Pairs)
For real matrices, complex eigenvalues come in conjugate pairs: if $\lambda = a + bi$ is an eigenvalue, so is $\bar{\lambda} = a - bi$.
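The rotation example shows both facts at once: the eigenvalues are purely imaginary and form a conjugate pair (NumPy assumed):

```python
import numpy as np

# Rotation by 90 degrees.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals = np.linalg.eigvals(R)                 # complex eigenvalues of a real matrix
pair = sorted(vals, key=lambda z: z.imag)   # order as [-i, +i]
```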
Applications
(1. Stability Analysis)
For the differential equation $\frac{dx}{dt} = Ax$:
- If all eigenvalues have $\operatorname{Re}(\lambda) < 0$: System is stable (solutions decay to zero)
- If any eigenvalue has $\operatorname{Re}(\lambda) > 0$: System is unstable (solutions explode)
- If eigenvalues have $\operatorname{Re}(\lambda) = 0$: Neutral stability
The eigenvectors give the “modes” of the system: independent directions of behavior.
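A stability check then reduces to inspecting real parts. A minimal sketch with two illustrative matrices (NumPy assumed; `is_stable` is a hypothetical helper name):

```python
import numpy as np

def is_stable(A):
    """All eigenvalues strictly in the left half-plane => stable."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

stable = np.array([[-1.0, 0.5],
                   [0.0, -2.0]])    # triangular: eigenvalues -1, -2
unstable = np.array([[1.0, 0.0],
                     [0.0, -3.0]])  # eigenvalue +1 => solutions explode
```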
(2. Matrix Powers)
If $v$ is an eigenvector with eigenvalue $\lambda$, then:
$$A^k v = \lambda^k v$$
Why: $A^2 v = A(Av) = A(\lambda v) = \lambda A v = \lambda^2 v$, and so on.
The eigenvector with the largest $|\lambda|$ dominates long-term behavior; this is why PageRank works.
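A small check of $A^k v = \lambda^k v$ (NumPy assumed; the matrix and eigenpair are the illustrative ones used earlier):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
v = np.array([1.0, 1.0])   # eigenvector with eigenvalue 5

A4_v = np.linalg.matrix_power(A, 4) @ v   # apply A four times
```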
(3. Google PageRank)
Google’s PageRank treats the web as a giant matrix whose $(i, j)$ entry encodes a link from page $j$ to page $i$. The “importance” of pages is the eigenvector corresponding to the largest eigenvalue (which is 1 for a properly normalized link matrix).
(4. Vibrations and Oscillations)
In mechanical systems, eigenvalues give frequencies of vibration, and eigenvectors give the vibrational modes (standing wave patterns).
(5. Principal Component Analysis (PCA))
In statistics, PCA finds the eigenvectors of the covariance matrix. These eigenvectors point in directions of maximum variance: they’re the “principal components” that capture the most information in the data.
Computing Eigenvalues: The Algorithm
For small matrices (2×2, 3×3), you can:
- Compute $\det(A - \lambda I)$ by hand
- Solve the characteristic polynomial for $\lambda$
- For each $\lambda$, solve $(A - \lambda I)v = 0$
For larger matrices, this is computationally impractical (solving degree-$n$ polynomials is hard). Instead, numerical methods are used:
- QR algorithm: Iteratively factor $A_k = Q_k R_k$, then form $A_{k+1} = R_k Q_k$, and repeat
- Power iteration: Find the dominant eigenvalue by repeatedly multiplying a vector by $A$
- Inverse iteration: Find specific eigenvalues by iterating with $(A - \mu I)^{-1}$ for a shift $\mu$ near the target
These methods are what numerical libraries use; they avoid computing the characteristic polynomial directly.
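A minimal power-iteration sketch (NumPy assumed; the function name, iteration count, and test matrix are illustrative choices, not a library API):

```python
import numpy as np

def power_iteration(A, num_iters=100):
    """Repeatedly apply A and renormalize; converges to the dominant
    eigenvector when one eigenvalue dominates in magnitude."""
    v = np.ones(A.shape[0])
    for _ in range(num_iters):
        v = A @ v
        v = v / np.linalg.norm(v)   # renormalize to avoid overflow
    lam = v @ A @ v                 # Rayleigh quotient estimate (v is unit)
    return lam, v

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, v = power_iteration(A)   # dominant eigenvalue is 5
```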
Why Eigenvectors Are Fundamental
Eigenvectors are the natural coordinates for a matrix. In the eigenvector basis, the matrix is diagonal: just scaling, no mixing. This is why:
- Diagonalization works: If you have enough eigenvectors, you can change to a basis where $A$ is diagonal.
- Long-term behavior is simple: The eigenvector with the largest eigenvalue dominates as $k \to \infty$ in $A^k$.
- Systems decouple: Differential equations split into independent modes along eigenvectors.
- Structure is revealed: Eigenvalues tell you if a matrix is invertible (no zero eigenvalues), symmetric (real eigenvalues), orthogonal ($|\lambda| = 1$), etc.
Every matrix is “trying” to be diagonal. Eigenvectors show you the coordinate system where this is true.
Next, we’ll use eigenvalues and eigenvectors to diagonalize matrices: writing $A = PDP^{-1}$ where $D$ is diagonal. This unlocks matrix powers, matrix exponentials, and the ability to solve $\frac{dx}{dt} = Ax$ or compute $A^k$ effortlessly.
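As a preview, the factorization can be sketched numerically: `eig` supplies the eigenvector matrix $P$ and the eigenvalues for $D$, and $PDP^{-1}$ rebuilds $A$ (NumPy assumed; the matrix is the illustrative example used throughout):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

vals, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(vals)            # eigenvalues on the diagonal

A_rebuilt = P @ D @ np.linalg.inv(P)   # should recover A
```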