Angles and Orthogonality #

The angle between vectors is defined using the inner product:

\[ \cos\alpha = \frac{\langle \mathbf{a}, \mathbf{b} \rangle}{\lVert \mathbf{a} \rVert \,\lVert \mathbf{b} \rVert} \]

Vectors are orthogonal if their inner product is zero:

\[ \langle \mathbf{a}, \mathbf{b} \rangle = 0 \]

To understand the angle between two vectors, we use the inner product (dot product).

For any two nonzero vectors \( \mathbf{a}, \mathbf{b} \in \mathbb{R}^n \), the Cauchy–Schwarz inequality guarantees that:

\[ -1 \le \frac{\langle \mathbf{a}, \mathbf{b} \rangle}{\|\mathbf{a}\| \, \|\mathbf{b}\|} \le 1 \]

This allows us to define the angle between the vectors.
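
As a quick numerical check, the sketch below (assuming NumPy; the vector pairs are arbitrary illustrations) computes this ratio for a few pairs and shows that it always lands in the interval [-1, 1]:

```python
import numpy as np

# A few arbitrary vector pairs, chosen only for illustration
pairs = [
    (np.array([1.0, 2.0, 3.0]), np.array([-4.0, 0.5, 2.0])),
    (np.array([2.0, 2.0]), np.array([2.0, -2.0])),
    (np.array([3.0, 0.0]), np.array([5.0, 0.0])),
]

for a, b in pairs:
    # Normalized inner product: <a, b> / (||a|| ||b||)
    ratio = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    print(f"{ratio:+.4f}")  # always lies in [-1, 1]
```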


Angle Between Two Vectors #

Let \( \alpha \) be the angle between vectors \( \mathbf{a} \) and \( \mathbf{b} \).

\[ \alpha = \cos^{-1}\left(\frac{\langle \mathbf{a}, \mathbf{b} \rangle}{\|\mathbf{a}\| \, \|\mathbf{b}\|}\right) \]
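
A minimal sketch of this definition in code, assuming NumPy (the function name `angle_between` is just an illustrative choice):

```python
import numpy as np

def angle_between(a: np.ndarray, b: np.ndarray) -> float:
    """Angle (in radians) between two nonzero vectors a and b."""
    cos_alpha = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip guards against tiny floating-point overshoots just outside [-1, 1]
    return float(np.arccos(np.clip(cos_alpha, -1.0, 1.0)))

print(angle_between(np.array([1.0, 0.0]), np.array([1.0, 1.0])))  # ~0.7854, i.e. pi/4
```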

Intuition #

  • If the dot product is positive, the angle is less than \( \pi/2 \): the vectors point in broadly similar directions
  • If the dot product is zero (or close to it), the angle is (close to) \( \pi/2 \): the vectors are (nearly) perpendicular
  • If the dot product is negative, the angle is greater than \( \pi/2 \): the vectors point in broadly opposite directions

Orthogonality #

Two vectors are orthogonal if their dot product is zero:

\[ \langle \mathbf{a}, \mathbf{b} \rangle = 0 \]

In this case, the angle between them is:

\[ \alpha = \frac{\pi}{2} \]

This means the vectors are perpendicular.

The converse also holds: if the angle between two vectors is \( \pi/2 \), their dot product is zero. In short, for nonzero vectors, \( \langle \mathbf{a}, \mathbf{b} \rangle = 0 \) if and only if the vectors are perpendicular (orthogonal).
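
In floating-point code, a dot product is rarely exactly zero, so orthogonality is usually checked against a small tolerance. A minimal sketch of such a check, assuming NumPy (the name `is_orthogonal` and the tolerance value are illustrative choices):

```python
import numpy as np

def is_orthogonal(a: np.ndarray, b: np.ndarray, tol: float = 1e-10) -> bool:
    """True if the dot product of a and b is zero up to a small tolerance."""
    return abs(float(np.dot(a, b))) <= tol

print(is_orthogonal(np.array([2.0, 2.0]), np.array([2.0, -2.0])))  # True
print(is_orthogonal(np.array([1.0, 0.0]), np.array([1.0, 1.0])))   # False
```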


Example #

Consider the vectors:

\[ \mathbf{a} = \begin{bmatrix} 2 \\ 2 \end{bmatrix}, \quad \mathbf{b} = \begin{bmatrix} 2 \\ -2 \end{bmatrix} \]

Their dot product is:

\[ \langle \mathbf{a}, \mathbf{b} \rangle = (2)(2) + (2)(-2) = 4 - 4 = 0 \]

Since the dot product is zero, the vectors are orthogonal.
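
The same example can be checked numerically; the sketch below (assuming NumPy) reproduces the dot product and confirms that the resulting angle is \( \pi/2 \):

```python
import numpy as np

a = np.array([2.0, 2.0])
b = np.array([2.0, -2.0])

dot = np.dot(a, b)  # (2)(2) + (2)(-2) = 0
alpha = np.arccos(dot / (np.linalg.norm(a) * np.linalg.norm(b)))

print(dot)                            # 0.0
print(np.isclose(alpha, np.pi / 2))   # True
```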


Key Takeaways #

  • The angle between vectors is defined using the dot product
  • Orthogonal vectors have zero dot product
  • Orthogonality means the vectors are perpendicular: neither has any component in the other's direction

Why it matters #

  • In machine learning, orthogonal features often represent independent information, which can make models easier to train and interpret
