Eigenvalues and Eigenvectors #
Eigenvalues and eigenvectors are fundamental concepts in linear algebra that describe how a matrix transforms vectors.
They help us understand:
- Directional scaling
- Matrix behaviour
- Dimensionality reduction (very important for ML)
Key Idea: A matrix transformation stretches or compresses vectors. Eigenvectors are the directions that remain unchanged (only scaled), and eigenvalues tell us how much scaling happens.
Definition #
For a square matrix ( A \in \mathbb{R}^{n \times n} ):
[ A v = \lambda v ]
Where:
- ( v \neq 0 ) → eigenvector
- ( \lambda ) → eigenvalue
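The defining equation can be verified directly with NumPy. A minimal sketch (the matrix and eigenpair here are an illustrative example, not from the lecture):

```python
import numpy as np

# v = [2, 1] is an eigenvector of A with eigenvalue 5,
# because A @ v = [10, 5] = 5 * v.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
v = np.array([2.0, 1.0])
lam = 5.0

# Verify the defining equation A v = lambda v
print(np.allclose(A @ v, lam * v))  # → True
```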
Intuition (from Lecture + Webinar) #
From lecture discussions:
- Matrix multiplication = transformation
- Most vectors change direction + magnitude
- But some special vectors:
  - Only scale
  - Do NOT change direction
These are eigenvectors.
Geometric Interpretation #
- Eigenvector → direction stays same
- Eigenvalue → scaling factor
| Eigenvalue | Meaning |
|---|---|
| ( \lambda > 1 ) | Stretch |
| ( 0 < \lambda < 1 ) | Shrink |
| ( \lambda = 1 ) | No change |
| ( \lambda = 0 ) | Collapse to zero |
| ( \lambda < 0 ) | Flip direction (and scale by ( \lvert\lambda\rvert )) |
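The table above can be demonstrated with diagonal matrices, whose eigenvalues sit on the diagonal (a small sketch; the specific matrices are illustrative):

```python
import numpy as np

e1 = np.array([1.0, 0.0])  # an eigenvector of every diagonal matrix

# For diag(lam, 1), e1 is an eigenvector with eigenvalue lam:
# the image A @ e1 stays on the e1 line, scaled by lam.
for lam in [2.0, 0.5, 1.0, 0.0, -1.0]:
    A = np.diag([lam, 1.0])
    assert np.allclose(A @ e1, lam * e1)
    print(lam, A @ e1)
```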
How to Find Eigenvalues #
Step 1: Start from the definition: ( Av = \lambda v )
Step 2: Rearrange: ( (A - \lambda I)v = 0 )
Step 3: A non-trivial solution ( v \neq 0 ) exists only if:
[ \det(A - \lambda I) = 0 ]
This is called the characteristic equation.
Example #
Let:
[ A = \begin{bmatrix} 4 & 2 \\ 1 & 3 \end{bmatrix} ]
Characteristic equation:
[ \det(A - \lambda I) = \begin{vmatrix} 4-\lambda & 2 \\ 1 & 3-\lambda \end{vmatrix} = (4-\lambda)(3-\lambda) - 2 = \lambda^2 - 7\lambda + 10 = 0 ]
Solving ( (\lambda - 5)(\lambda - 2) = 0 ) gives ( \lambda_1 = 5 ), ( \lambda_2 = 2 ).
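The eigenvalues can be cross-checked numerically, both from the characteristic polynomial and directly (a sketch with NumPy; not part of the original notes):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Roots of the characteristic polynomial lambda^2 - 7 lambda + 10
roots = np.roots([1.0, -7.0, 10.0])

# Eigenvalues straight from NumPy (order is not guaranteed)
eigvals = np.linalg.eigvals(A)

print(np.allclose(sorted(roots), [2.0, 5.0]))    # → True
print(np.allclose(sorted(eigvals), [2.0, 5.0]))  # → True
```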
Finding Eigenvectors #
For each eigenvalue ( \lambda ):
Solve:
[ (A - \lambda I)v = 0 ]
This is a homogeneous system.
👉 From lecture:
- This relates to null space
- Eigenvectors lie in null space of ( A - \lambda I )
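For the matrix from the example above with ( \lambda = 5 ), the null-space connection can be checked directly: ( A - 5I ) is singular, and ( v = (2, 1) ) lies in its null space (a sketch, not a general solver):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam = 5.0

# A - lambda I is singular for an eigenvalue, so (A - lam I) v = 0
# has non-zero solutions: exactly the eigenvectors for lam.
M = A - lam * np.eye(2)
v = np.array([2.0, 1.0])  # lies in the null space of M

print(np.allclose(M @ v, 0))            # → True: v is an eigenvector
print(np.isclose(np.linalg.det(M), 0))  # → True: M is singular
```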
Important Connections (Exam Insight) #
From lectures:
- Solving ( Ax = b ) uses the pivot columns
- Solutions of ( Ax = 0 ) form the null space
👉 Eigenvectors are exactly:
- the non-zero solutions of the homogeneous system ( (A - \lambda I)v = 0 )
Properties #
- An ( n \times n ) matrix has ( n ) eigenvalues, counted with multiplicity (possibly repeated or complex)
- Eigenvectors corresponding to distinct eigenvalues are linearly independent
- A matrix is diagonalizable exactly when it has ( n ) linearly independent eigenvectors
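One way to test the last point: compare the geometric multiplicity (dimension of the null space of ( A - \lambda I )) with the algebraic multiplicity. A sketch using the classic defective matrix (an illustrative example):

```python
import numpy as np

# [[1, 1], [0, 1]] has eigenvalue 1 with algebraic multiplicity 2,
# but only ONE independent eigenvector -> NOT diagonalizable.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

# geometric multiplicity = n - rank(A - lam I)
geom_mult = 2 - np.linalg.matrix_rank(A - lam * np.eye(2))
print(geom_mult)  # → 1, fewer than the algebraic multiplicity 2
```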
Diagonalisation Link #
If a matrix has ( n ) linearly independent eigenvectors, it can be diagonalised:
[ A = PDP^{-1} ]
Where:
- ( D ) = diagonal matrix of eigenvalues
- ( P ) = matrix of eigenvectors
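The factorisation ( A = PDP^{-1} ) can be verified numerically (a sketch; `np.linalg.eig` returns the eigenvalues and a matrix whose columns are eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Columns of P are eigenvectors; D holds the eigenvalues
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Reconstruct A from its eigendecomposition
A_rebuilt = P @ D @ np.linalg.inv(P)
print(np.allclose(A, A_rebuilt))  # → True
```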
Why Important (ML Perspective) #
Eigenvalues & eigenvectors are used in:
- PCA (Principal Component Analysis)
- Dimensionality reduction
- Covariance matrix analysis
- SVD (VERY IMPORTANT for exam)
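In PCA, the eigenvectors of the covariance matrix give the principal directions and the eigenvalues the variance along them. A minimal sketch on synthetic data (illustrative only, not an exam-ready implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data stretched along the x-axis
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
X = X - X.mean(axis=0)  # centre the data

# Eigendecomposition of the covariance matrix (eigh: symmetric case)
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort by decreasing eigenvalue; top eigenvector = first principal component
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the top component: dimensionality reduction 2 -> 1
X_reduced = X @ eigvecs[:, :1]
print(X_reduced.shape)          # → (200, 1)
print(eigvals[0] > eigvals[1])  # → True: most variance on component 1
```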
Common Exam Questions #
Based on past papers + webinar patterns:
- Find eigenvalues of a matrix
- Find eigenvectors
- Check if matrix is diagonalizable
- Relation with null space
- Interpret geometrically
Hidden Exam Pattern (Important) #
From course discussion:
- Questions are NOT direct formula-based
- Expect:
  - Concept + computation combined
  - Link with rank/null space
  - Application-based questions
Mistakes to Avoid #
- Forgetting the determinant condition ( \det(A - \lambda I) = 0 )
- Taking the zero vector as an eigenvector ❌
- Not solving for the full null space
- Missing repeated-eigenvalue (multiplicity) cases
Strategy to Prepare #
- Practice determinant + solving systems
- Understand null space deeply
- Solve at least 10–15 matrices
- Connect with:
  - Rank
  - Diagonalisation
  - SVD
References #
- T1 Sections 4.1, 4.2
- Lecture slides
- Webinar problem discussions