Eigenvalues and Eigenvectors

linear-algebra | Published January 9, 2025

Definition

For a square matrix \(A \in \mathbb{R}^{n \times n}\), a scalar \(\lambda\) is an eigenvalue and a non-zero vector \(\mathbf{v}\) is a corresponding eigenvector if:

\[A\mathbf{v} = \lambda\mathbf{v}\]

The transformation \(A\) only scales \(\mathbf{v}\) — it does not change its direction.
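The scaling property can be checked numerically. A minimal sketch, assuming NumPy and using an illustrative diagonal matrix (not from the text) whose eigenpairs are obvious:

```python
import numpy as np

# Illustrative matrix: a diagonal matrix, whose eigenvectors are the axes.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])   # eigenvector for eigenvalue 2
lam = 2.0

# A only scales v: A @ v equals lam * v, so the direction is unchanged.
print(np.allclose(A @ v, lam * v))  # True
```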

Finding Eigenvalues

Rearranging: \((A - \lambda I)\mathbf{v} = \mathbf{0}\). For a non-trivial solution to exist, the matrix \(A - \lambda I\) must be singular:

\[\det(A - \lambda I) = 0\]

This is the characteristic equation. Its roots are the eigenvalues.
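This root-finding view can be sketched with NumPy, which can extract the characteristic polynomial's coefficients directly; the 2×2 matrix below is an illustrative choice:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.poly returns the coefficients of det(lambda*I - A);
# its roots are the eigenvalues of A.
coeffs = np.poly(A)         # [1, -7, 10] for this matrix
eigvals = np.roots(coeffs)
print(np.sort(eigvals))     # close to [2, 5]
```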

Finding Eigenvectors

For each eigenvalue \(\lambda_i\), solve \((A - \lambda_i I)\mathbf{v} = \mathbf{0}\) for \(\mathbf{v}\).

Example

\[A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}\]

Characteristic equation:

\[\det\begin{pmatrix} 4-\lambda & 1 \\ 2 & 3-\lambda \end{pmatrix} = (4-\lambda)(3-\lambda) - 2 = \lambda^2 - 7\lambda + 10 = 0\]

\[(\lambda - 5)(\lambda - 2) = 0 \Rightarrow \lambda_1 = 5,\; \lambda_2 = 2\]

For \(\lambda_1 = 5\): \(A - 5I = \begin{pmatrix}-1 & 1 \\ 2 & -2\end{pmatrix}\), so \(-x + y = 0\), giving \(\mathbf{v}_1 = \begin{pmatrix}1\\1\end{pmatrix}\)

For \(\lambda_2 = 2\): \(A - 2I = \begin{pmatrix}2 & 1 \\ 2 & 1\end{pmatrix}\), so \(2x + y = 0\), giving \(\mathbf{v}_2 = \begin{pmatrix}1\\-2\end{pmatrix}\)
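The worked example can be verified numerically, assuming NumPy:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# The eigenpairs computed above: (5, (1, 1)) and (2, (1, -2)).
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -2.0])

print(np.allclose(A @ v1, 5 * v1))  # True
print(np.allclose(A @ v2, 2 * v2))  # True
```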

Eigendecomposition

If \(A\) has \(n\) linearly independent eigenvectors, it can be diagonalized:

\[A = P \Lambda P^{-1}\]

where \(P\) is the matrix of eigenvectors (columns) and \(\Lambda = \text{diag}(\lambda_1, \ldots, \lambda_n)\).
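The factorization can be checked with `np.linalg.eig`, which returns the eigenvalues and the eigenvector matrix \(P\) in one call; this sketch reuses the example matrix from above:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eig returns eigenvalues and a matrix whose columns are eigenvectors.
eigvals, P = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Reconstruct A = P @ Lam @ P^{-1}.
print(np.allclose(P @ Lam @ np.linalg.inv(P), A))  # True
```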

Applications in ML

  • PCA — eigenvectors of the covariance matrix define principal components
  • Spectral clustering — eigenvectors of the graph Laplacian
  • PageRank — dominant eigenvector of the web graph transition matrix
  • Stability analysis — eigenvalues determine convergence of iterative algorithms
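The PageRank entry above rests on power iteration: repeatedly multiplying a vector by \(A\) and renormalizing converges to the dominant eigenvector, assuming a unique largest-magnitude eigenvalue. A minimal sketch with NumPy, again using the example matrix from this note:

```python
import numpy as np

def power_iteration(A, iters=100):
    """Return an approximation of the dominant eigenvector of A."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v                  # components along the dominant
        v /= np.linalg.norm(v)     # eigenvector grow fastest
    return v

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
v = power_iteration(A)
print(v / v[0])  # proportional to (1, 1), the eigenvector for lambda = 5
```

Each iteration shrinks the contribution of the smaller eigenvalue by a factor \(|\lambda_2 / \lambda_1| = 2/5\), so convergence here is fast.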

Next: Programming Assignment — Eigenvalues and Eigenvectors