Programming Assignment: Eigenvalues and Eigenvectors

Tags: linear-algebra, python
Published: January 10, 2025

Objective

Explore eigendecomposition using NumPy and verify properties analytically.

import numpy as np

A = np.array([[4, 1],
              [2, 3]], dtype=float)

eigenvalues, eigenvectors = np.linalg.eig(A)
print("Eigenvalues:", eigenvalues)
print("Eigenvectors (columns):\n", eigenvectors)
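Since the objective asks for analytic verification: for a 2×2 matrix the eigenvalues are the roots of the characteristic polynomial \(\lambda^2 - \mathrm{tr}(A)\lambda + \det(A) = 0\). A small sketch of that check (here `np.roots` is just a convenient quadratic solver):

```python
import numpy as np

A = np.array([[4, 1],
              [2, 3]], dtype=float)

# Characteristic polynomial of a 2x2 matrix: lambda^2 - tr(A)*lambda + det(A)
tr, det = np.trace(A), np.linalg.det(A)
analytic = np.roots([1.0, -tr, det])

# Here tr = 7 and det = 10, so the roots are 5 and 2
print("Analytic eigenvalues:", np.sort(analytic))
```

These should agree (up to ordering) with the values `np.linalg.eig` returned above.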

Verify \(A\mathbf{v} = \lambda\mathbf{v}\)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    lhs = A @ v
    rhs = lam * v
    print(f"λ={lam:.1f}: Av={np.round(lhs,6)}, λv={np.round(rhs,6)}, match={np.allclose(lhs, rhs)}")
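Two classic invariants tie the spectrum back to the matrix entries: the eigenvalues sum to the trace and multiply to the determinant. A quick sanity check along those lines (a sketch, not part of the assignment spec):

```python
import numpy as np

A = np.array([[4, 1],
              [2, 3]], dtype=float)
eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues = trace; product of eigenvalues = determinant
print("sum :", eigenvalues.sum(), " trace:", np.trace(A))
print("prod:", eigenvalues.prod(), " det  :", np.linalg.det(A))
print("match:", np.isclose(eigenvalues.sum(), np.trace(A))
               and np.isclose(eigenvalues.prod(), np.linalg.det(A)))
```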

Eigendecomposition \(A = P\Lambda P^{-1}\)

P = eigenvectors
Lambda = np.diag(eigenvalues)
A_reconstructed = P @ Lambda @ np.linalg.inv(P)

print("Original A:\n", A)
print("Reconstructed A:\n", np.round(A_reconstructed, 10))
print("Match:", np.allclose(A, A_reconstructed))
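One payoff of the decomposition: powers of \(A\) reduce to powers of the diagonal, \(A^k = P\Lambda^k P^{-1}\), since the \(P^{-1}P\) pairs cancel in the middle. A short sketch (everything redefined so the snippet runs on its own):

```python
import numpy as np

A = np.array([[4, 1],
              [2, 3]], dtype=float)
eigenvalues, P = np.linalg.eig(A)

# A^5 via the decomposition: only the diagonal entries get exponentiated
A5_eig = P @ np.diag(eigenvalues**5) @ np.linalg.inv(P)
A5_direct = np.linalg.matrix_power(A, 5)

print("A^5 match:", np.allclose(A5_eig, A5_direct))
```

The same trick gives matrix exponentials and other functions of \(A\) by applying the function to the eigenvalues.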

Application: PCA on Synthetic Data

PCA finds the directions of maximum variance in the data: these directions are the eigenvectors of the covariance matrix, and each eigenvalue gives the variance along its direction.

np.random.seed(0)
X = np.random.multivariate_normal(mean=[0, 0], cov=[[3, 2], [2, 2]], size=200)
X -= X.mean(axis=0)  # center

cov = np.cov(X.T)  # 2x2 sample covariance (ddof=1 by default)
evals, evecs = np.linalg.eigh(cov)  # eigh: for symmetric matrices, eigenvalues in ascending order

# Sort by descending eigenvalue
idx = np.argsort(evals)[::-1]
evals, evecs = evals[idx], evecs[:, idx]

print("Covariance matrix:\n", np.round(cov, 4))
print("Principal components (columns):\n", np.round(evecs, 4))
print("Explained variance ratio:", np.round(evals / evals.sum(), 4))

X_pca = X @ evecs[:, :1]  # project onto first PC
print(f"\nData reduced from {X.shape[1]}D to {X_pca.shape[1]}D")
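In practice PCA is often computed from an SVD of the centered data rather than an explicit covariance matrix: the right singular vectors give the same principal directions (up to sign), and the squared singular values divided by \(n-1\) give the same variances. A sketch of that cross-check on the same synthetic data:

```python
import numpy as np

np.random.seed(0)
X = np.random.multivariate_normal(mean=[0, 0], cov=[[3, 2], [2, 2]], size=200)
X -= X.mean(axis=0)  # center

# Covariance route (as above), sorted by descending eigenvalue
evals, evecs = np.linalg.eigh(np.cov(X.T))
idx = np.argsort(evals)[::-1]
evals, evecs = evals[idx], evecs[:, idx]

# SVD route: rows of Vt are the principal directions
U, s, Vt = np.linalg.svd(X, full_matrices=False)
var_svd = s**2 / (X.shape[0] - 1)

print("variances match :", np.allclose(evals, var_svd))
# Eigenvectors are only defined up to sign, so compare absolute values
print("directions match:", np.allclose(np.abs(evecs), np.abs(Vt.T)))
```

The SVD route is generally preferred numerically because it avoids forming \(X^\top X\), which squares the condition number.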