
Cauchy–Schwarz Inequality #

The Cauchy–Schwarz Inequality is one of the most important results in linear algebra.

It places a fundamental bound on the dot product of two vectors.

If you see angles, cosines, similarity measures, or inner product bounds
→ think Cauchy–Schwarz Inequality


Statement of the Inequality #

For any vectors \( \mathbf{a}, \mathbf{b} \in \mathbb{R}^n \):

\[ |\mathbf{a}\cdot\mathbf{b}| \;\le\; \|\mathbf{a}\|\,\|\mathbf{b}\| \]

In particular, it guarantees that the cosine formula for angles between vectors is always valid.
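
As a quick numerical sanity check (a sketch added to these notes, assuming NumPy is available), the inequality can be verified on random vectors:

```python
# Numerically check |a · b| <= ||a|| ||b|| on random vectors.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1_000):
    a = rng.normal(size=5)
    b = rng.normal(size=5)
    lhs = abs(np.dot(a, b))                        # |a · b|
    rhs = np.linalg.norm(a) * np.linalg.norm(b)    # ||a|| ||b||
    assert lhs <= rhs + 1e-12                      # tolerance for floating point
```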


Equality Condition #

Equality holds if and only if the vectors are linearly dependent, i.e. one vector is a scalar multiple of the other:

\( \mathbf{a} = \lambda \mathbf{b} \) for some scalar \( \lambda \) (or \( \mathbf{b} = \mathbf{0} \))

Geometrically, the vectors point in the same or opposite direction.
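
For a concrete example (added here for illustration), take \( \mathbf{a} = (1, 2) \) and \( \mathbf{b} = (2, 4) = 2\mathbf{a} \):

\[ |\mathbf{a}\cdot\mathbf{b}| = |1\cdot 2 + 2\cdot 4| = 10, \qquad \|\mathbf{a}\|\,\|\mathbf{b}\| = \sqrt{5}\cdot\sqrt{20} = 10 \]

The bound is attained exactly because the vectors are collinear.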


Why This Inequality Matters #

For nonzero vectors, Cauchy–Schwarz guarantees that:

\( -1 \le \frac{\mathbf{a}\cdot\mathbf{b}}{\|\mathbf{a}\|\,\|\mathbf{b}\|} \le 1 \)

Because of this:

  • The angle between vectors is always well-defined
  • The cosine formula never breaks
  • Inner products behave consistently
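
A minimal sketch of the angle computation (assuming NumPy; angle_between is a name invented here for illustration). Cauchy–Schwarz is what guarantees the argument of arccos lies in \( [-1, 1] \); the clip only guards against floating-point overshoot:

```python
import numpy as np

def angle_between(a: np.ndarray, b: np.ndarray) -> float:
    # Cauchy–Schwarz guarantees cos_theta is in [-1, 1] for nonzero a, b;
    # np.clip only protects against rounding slightly past the bound.
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

print(angle_between(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # ~1.5708 (pi/2)
```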

Geometric Interpretation #

  • If \( |\mathbf{a}\cdot\mathbf{b}| \) is close to \( \|\mathbf{a}\|\,\|\mathbf{b}\| \), the vectors are nearly aligned
  • If the dot product is zero, the vectors are orthogonal
  • If the bound is attained exactly, the vectors are collinear

Cauchy–Schwarz tells us:

“The magnitude of the dot product can never exceed the product of the lengths.”


Machine Learning Connection #

Cauchy–Schwarz appears implicitly in:

  • Cosine similarity
  • SVM kernels
  • Projection formulas (sketched below)
  • Gradient bounds
  • Proofs of convergence

Without Cauchy–Schwarz:

  • cosine similarity could fall outside \( [-1, 1] \)
  • angle-based similarity measures would not be well-defined
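
One place the bound shows up concretely is projection: the length of the orthogonal projection of \( \mathbf{a} \) onto \( \mathbf{b} \) is \( |\mathbf{a}\cdot\mathbf{b}| / \|\mathbf{b}\| \), which Cauchy–Schwarz bounds by \( \|\mathbf{a}\| \). A small sketch (assuming NumPy; the helper project is invented for illustration):

```python
import numpy as np

def project(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # Orthogonal projection of a onto the line spanned by b (b must be nonzero).
    return (np.dot(a, b) / np.dot(b, b)) * b

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])
p = project(a, b)
print(np.linalg.norm(p), "<=", np.linalg.norm(a))  # 3.0 <= 5.0, as Cauchy–Schwarz predicts
```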
