Angles and Orthogonality

Once we define an inner product, we can define the angle between two vectors.


Angle Formula #

For nonzero vectors
\( \mathbf{a}, \mathbf{b} \in \mathbb{R}^n \),
the angle \( \alpha \in [0, \pi] \) between them is defined by

\[ \cos \alpha = \frac{\langle \mathbf{a}, \mathbf{b} \rangle} {\|\mathbf{a}\|\,\|\mathbf{b}\|} \]

The angle is:

\[ \alpha = \cos^{-1} \left( \frac{\langle \mathbf{a}, \mathbf{b} \rangle} {\|\mathbf{a}\|\,\|\mathbf{b}\|} \right) \]

The Cauchy–Schwarz inequality,
\( |\langle \mathbf{a}, \mathbf{b} \rangle| \le \|\mathbf{a}\|\,\|\mathbf{b}\| \),
guarantees that this fraction always lies in \([-1, 1]\), so the arccosine is well defined.
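The angle formula above can be sketched directly in NumPy; the `clip` guards against floating-point round-off pushing the ratio slightly outside \([-1, 1]\) (the function name `angle` is our own choice here):

```python
import numpy as np

def angle(a, b):
    """Angle between two nonzero vectors, in radians."""
    cos_alpha = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Cauchy–Schwarz guarantees cos_alpha is in [-1, 1] in exact arithmetic;
    # clip absorbs tiny floating-point overshoot.
    return np.arccos(np.clip(cos_alpha, -1.0, 1.0))

a = np.array([1.0, 1.0])
b = np.array([1.0, 0.0])
print(angle(a, b))  # π/4 ≈ 0.7854
```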


Interpretation #

  • Cosine ≈ 1 → vectors align
  • Cosine ≈ 0 → vectors are perpendicular
  • Cosine < 0 → vectors oppose
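The three cases above can be checked with a small cosine helper (a minimal sketch; `cosine` is our own name, not a library function):

```python
import numpy as np

def cosine(a, b):
    """Cosine of the angle between two nonzero vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

u = np.array([1.0, 0.0])
print(cosine(u, np.array([2.0, 0.0])))   # 1.0  -> aligned
print(cosine(u, np.array([0.0, 3.0])))   # 0.0  -> perpendicular
print(cosine(u, np.array([-1.0, 0.0])))  # -1.0 -> opposing
```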

Orthogonality #

Two vectors \( \mathbf{a} \) and \( \mathbf{b} \) are orthogonal if:

\[ \langle \mathbf{a}, \mathbf{b} \rangle = 0 \]

For nonzero vectors, this means the angle between them is:

\[ \alpha = \frac{\pi}{2} \]

(By this definition, the zero vector is orthogonal to every vector.)

Example #

\[ \mathbf{a} = \begin{bmatrix} 2\\ 2 \end{bmatrix}, \quad \mathbf{b} = \begin{bmatrix} 2\\ -2 \end{bmatrix} \]

Dot product:

\[ (2)(2) + (2)(-2) = 0 \]

So \( \mathbf{a} \) and \( \mathbf{b} \) are orthogonal.
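The example can be verified numerically; the dot product vanishes and the recovered angle is \( \pi/2 \):

```python
import numpy as np

a = np.array([2.0, 2.0])
b = np.array([2.0, -2.0])

dot = np.dot(a, b)  # (2)(2) + (2)(-2) = 0
print(dot)  # 0.0

alpha = np.arccos(dot / (np.linalg.norm(a) * np.linalg.norm(b)))
print(np.isclose(alpha, np.pi / 2))  # True
```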


Why It Matters in Machine Learning #

  • PCA produces orthogonal components
  • Orthogonal features reduce redundancy
  • Gradient directions depend on angle
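The first bullet can be illustrated with a quick sanity check: the principal directions recovered by PCA (here computed via NumPy's SVD on centered data; the random data is purely illustrative) form an orthonormal set.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X -= X.mean(axis=0)  # center the data

# The rows of Vt are the principal directions; as right-singular
# vectors of X they are mutually orthogonal unit vectors.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True
```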
