Orthonormal Basis #
A basis is orthonormal if its vectors are:
- orthogonal to each other
- of unit length
Key Idea: Orthonormal basis = perfectly independent + perfectly scaled. Why it matters: projections, coordinates, and other computations become extremely simple and numerically stable.
Intuition (From Lectures) #
From analytic geometry and vector space lectures:
- Basis gives coordinate system
- Orthonormal basis gives perfect coordinate system
Why?
- No overlap between directions (orthogonal)
- No scaling distortion (unit length)
Properties of Orthonormal Basis #
For vectors:
\[ \{e_1, e_2, \dots, e_n\} \]
They satisfy:
Orthogonality #
\[ \langle e_i, e_j \rangle = 0 \quad (i \ne j) \]
Unit Norm #
\[ \|e_i\| = 1 \]
Matrix Form (Important) #
If we form a matrix:
\[ Q = [e_1 \; e_2 \; \cdots \; e_n] \]
Then:
\[ Q^T Q = I \]
This means:
- columns are orthonormal
- Q is an orthogonal matrix
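A quick numerical check of this property (a minimal NumPy sketch; the rotation matrix `Q` is just an illustrative choice):

```python
import numpy as np

# Columns of Q: two orthonormal vectors in R^2 (a rotation by 45 degrees).
Q = np.array([[np.cos(np.pi / 4), -np.sin(np.pi / 4)],
              [np.sin(np.pi / 4),  np.cos(np.pi / 4)]])

# Q^T Q equals the identity exactly when the columns are orthonormal.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```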
Why Orthonormal Basis Matters #
From lectures:
1. Easy Coordinates #
Any vector can be written as:
\[ x = \sum_{i=1}^{n} \langle x, e_i \rangle e_i \]
No system of equations needs to be solved; the coordinates are just inner products (see the sketch below).
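A minimal NumPy sketch of this, assuming the 2D rotation basis from the previous snippet and an arbitrary example vector `x`:

```python
import numpy as np

# An orthonormal basis of R^2 (rotation by 45 degrees).
e1 = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4)])
e2 = np.array([-np.sin(np.pi / 4), np.cos(np.pi / 4)])

x = np.array([3.0, 1.0])

# Coordinates are plain inner products; no np.linalg.solve needed.
c1, c2 = x @ e1, x @ e2

# Reconstruct x from its coordinates.
print(np.allclose(c1 * e1 + c2 * e2, x))  # True
```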
2. Simple Projection #
Projection onto a basis vector:
\[ \text{proj}_{e_i}(x) = \langle x, e_i \rangle e_i \]
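A one-function sketch of this formula in NumPy (`project_onto_unit` is a hypothetical helper name, not from the lectures):

```python
import numpy as np

def project_onto_unit(x, e):
    """Project x onto the unit vector e, i.e. <x, e> e."""
    return (x @ e) * e  # hypothetical helper, assumes ||e|| = 1

x = np.array([2.0, 5.0])
e1 = np.array([1.0, 0.0])
print(project_onto_unit(x, e1))  # [2. 0.]
```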
3. Numerical Stability #
- No redundancy
- No scaling distortion
- Stable computations
Connection to Gram-Schmidt (From Lecture) #
From lecture discussions:
- Any linearly independent set can be converted into an orthonormal basis
- This is done via the Gram-Schmidt process
Steps (see the sketch after this list):
- Start with independent vectors
- Make them orthogonal
- Normalize them
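A minimal NumPy sketch following these steps (the function name `gram_schmidt` and the tolerance are my own choices, not from the lectures):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Step 2: make w orthogonal to the basis vectors built so far.
        for e in basis:
            w = w - (w @ e) * e
        # Step 3: normalize; the norm is nonzero if the inputs are independent.
        norm = np.linalg.norm(w)
        if norm > 1e-12:
            basis.append(w / norm)
    return basis
```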
Geometric Interpretation #
- Orthonormal basis = perpendicular axes
- Like standard coordinate axes
Example:
\[ (1,0), (0,1) \]
These are orthonormal in 2D.
Connection to Matrix Decomposition #
From later lectures:
- Eigendecomposition of a symmetric matrix uses orthogonal eigenvectors
- SVD factors any matrix as
\[ A = U \Sigma V^T \]
where:
- U and V are orthogonal matrices (their columns are orthonormal)
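A small sketch verifying this with NumPy's SVD (the random matrix is only a placeholder):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # placeholder matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(np.allclose(U.T @ U, np.eye(3)))    # True: columns of U are orthonormal
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True: rows of Vt are orthonormal
```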
Orthonormal Basis vs General Basis #
| Type | Property |
|---|---|
| Basis | Independent + spanning |
| Orthonormal basis | Independent + spanning + orthogonal + unit norm |
Example #
Given vectors:
\[ v_1 = (1,0), \quad v_2 = (1,1) \]
These are not orthogonal.
After Gram-Schmidt:
\[ e_1 = (1,0), \quad e_2 = (0,1) \]
Now orthonormal.
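To check the worked example numerically, `np.linalg.qr` produces an orthonormal basis for the same columns (its columns match e_1, e_2 here, up to sign):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # columns are v1 and v2

Q, R = np.linalg.qr(A)
print(Q)                                # columns match e1, e2 up to sign
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns are orthonormal
```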
Applications in Machine Learning #
- PCA gives orthonormal directions (see the sketch after this list)
- SVD produces orthonormal matrices
- Feature decorrelation
- Dimensionality reduction
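As an illustrative sketch (not from the lectures): PCA directions can be read off the SVD of centered data, and they come out orthonormal:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))  # toy data: 100 samples, 3 features
Xc = X - X.mean(axis=0)            # center the data first

# Rows of Vt are the principal directions found by PCA.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True: directions are orthonormal
```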
Hidden Exam Pattern #
From lecture flow:
- Orthonormal basis appears together with:
  - orthogonality
  - inner product
  - projections
  - Gram-Schmidt
👉 rarely asked alone
Common Mistakes #
- Forgetting normalization
- Assuming orthogonal = orthonormal
- Not checking unit length
- Mixing up row and column orthogonality
Strategy to Prepare #
- Practice Gram-Schmidt
- Verify orthogonality using dot product
- Normalize vectors correctly
- Understand projection formula
Quick Summary Table #
| Concept | Formula | Meaning |
|---|---|---|
| Orthogonal | <e_i, e_j> = 0 | Independent directions |
| Unit norm | ||
| Orthonormal | both above | Perfect basis |
| Matrix form | Q^T Q = I | Orthogonal matrix |
References #
- Lecture slides (Analytic Geometry, Orthogonality)
- Course handout
- Webinar discussions