Cauchy–Schwarz Inequality #
The Cauchy–Schwarz Inequality is one of the most important results in linear algebra.
It places a fundamental upper bound on the magnitude of the inner product of two vectors.
If you see angles, cosines, similarity, or inner product bounds
→ think Cauchy–Schwarz Inequality

Key Idea: The magnitude of the inner product (dot product) can never exceed the product of the vectors' magnitudes. This is what keeps all geometric interpretations (angles, cosines) valid.
Statement of the Inequality #
For any vectors:
\[ \mathbf{a}, \mathbf{b} \in \mathbb{R}^n \] \[ |\mathbf{a}\cdot\mathbf{b}| \le \|\mathbf{a}\|\,\|\mathbf{b}\| \]
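As a quick sanity check, here is a minimal NumPy sketch (the vectors, dimension, and tolerance are illustrative choices, not from the lecture) that tests the bound on random pairs:

```python
import numpy as np

# Minimal sanity check of |a·b| <= ||a|| ||b|| on random vectors.
rng = np.random.default_rng(0)
for _ in range(1000):
    a = rng.normal(size=5)
    b = rng.normal(size=5)
    lhs = abs(np.dot(a, b))                      # |a·b|
    rhs = np.linalg.norm(a) * np.linalg.norm(b)  # ||a|| ||b||
    assert lhs <= rhs + 1e-12  # tiny tolerance for floating point
print("Cauchy–Schwarz held on 1000 random pairs")
```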
Why This Inequality Matters #
From the lectures on inner products and angles, Cauchy–Schwarz guarantees that:
\[ -1 \le \frac{\mathbf{a}\cdot\mathbf{b}} {\|\mathbf{a}\|\,\|\mathbf{b}\|} \le 1 \]

This ensures:
- cosine values are always valid
- angle between vectors is well-defined
- geometric interpretation is consistent
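A small vectorised sketch (illustrative, assuming NumPy; random nonzero vectors) confirming the normalised ratio never leaves \([-1, 1]\):

```python
import numpy as np

# The ratio (a·b) / (||a|| ||b||) is a valid cosine: it stays in [-1, 1].
rng = np.random.default_rng(1)
a = rng.normal(size=(10000, 3))
b = rng.normal(size=(10000, 3))
cos = np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
print(cos.min(), cos.max())  # both within [-1, 1]
```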
Equality Condition #
Equality holds if and only if one vector is a scalar multiple of the other:
\[ \mathbf{a} = \lambda \mathbf{b} \quad \text{for some scalar } \lambda \]

This means:
- vectors are linearly dependent
- vectors lie on the same line
- direction is same or opposite
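A minimal sketch of the equality case, assuming NumPy; the scalar \(\lambda = -3\) is an arbitrary illustrative choice:

```python
import numpy as np

# Equality case: b is a scalar multiple of a (here λ = -3).
a = np.array([1.0, 2.0, -1.0])
b = -3.0 * a
lhs = abs(np.dot(a, b))
rhs = np.linalg.norm(a) * np.linalg.norm(b)
print(np.isclose(lhs, rhs))  # True: equality for linearly dependent vectors
```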
Connection to Angles #
From analytic geometry:
\[ \cos \alpha = \frac{\langle a, b \rangle} {\|a\| \|b\|} \]

Cauchy–Schwarz ensures this expression always lies in the valid cosine range \([-1, 1]\).
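A short sketch of how the angle is computed in practice (the helper `angle_between` is illustrative, not from the lecture); `np.clip` only guards against tiny floating-point overshoot, since Cauchy–Schwarz already guarantees the ratio is a valid cosine:

```python
import numpy as np

def angle_between(a, b):
    """Angle (radians) between two nonzero vectors."""
    c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(c, -1.0, 1.0))  # clip: floating-point safety only

print(np.degrees(angle_between(np.array([1.0, 0.0]),
                               np.array([1.0, 1.0]))))  # ≈ 45.0
```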
Geometric Interpretation #
Reading the dot product geometrically:
- Large dot product → vectors nearly aligned
- Zero dot product → orthogonal vectors
- Maximum value \(\|\mathbf{a}\|\,\|\mathbf{b}\|\) → collinear vectors
Interpretation: “The projection of one vector onto another can never be longer than the vector itself.”
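A minimal numeric illustration (vectors chosen for convenience): the projection length \(|\mathbf{a}\cdot\mathbf{b}| / \|\mathbf{b}\|\) never exceeds \(\|\mathbf{a}\|\):

```python
import numpy as np

# Length of the projection of a onto b: |a·b| / ||b||.
# Cauchy–Schwarz says this can never exceed ||a||.
a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])
proj_len = abs(np.dot(a, b)) / np.linalg.norm(b)
print(proj_len, np.linalg.norm(a))  # 3.0 <= 5.0
```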
Connection to Norm #
From the lecture:
\[ \|x\| = \sqrt{x^T x} \]

Cauchy–Schwarz ensures consistency between:
- norm
- inner product
- distance
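A short sketch (illustrative values) showing that the norm and the distance both reduce to the same inner product:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([0.0, 2.0, 0.0])
norm_x = np.sqrt(x @ x)              # ||x|| = sqrt(x^T x)
dist = np.sqrt((x - y) @ (x - y))    # distance = ||x - y||
print(norm_x, np.linalg.norm(x))     # both 3.0
print(dist, np.linalg.norm(x - y))   # agree: sqrt(5)
```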
Important Consequence #
The triangle inequality is derived using Cauchy–Schwarz:
\[ \|x + y\| \le \|x\| + \|y\| \]
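The standard one-line derivation expands \(\|x+y\|^2\) and bounds the cross term with Cauchy–Schwarz:

\[ \|x + y\|^2 = \|x\|^2 + 2\,(x \cdot y) + \|y\|^2 \le \|x\|^2 + 2\|x\|\|y\| + \|y\|^2 = \left(\|x\| + \|y\|\right)^2 \]

Taking square roots on both sides gives the triangle inequality.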
Example #
\[ a = (1,2), \quad b = (3,4) \] \[ a \cdot b = 1 \cdot 3 + 2 \cdot 4 = 11 \] \[ \|a\| = \sqrt{5}, \quad \|b\| = 5 \] \[ |11| \le \sqrt{5} \cdot 5 \approx 11.18 \]

The inequality holds.
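The same example, checked numerically (a minimal NumPy sketch):

```python
import numpy as np

# The worked example above, verified in code.
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
print(abs(np.dot(a, b)))                      # 11.0
print(np.linalg.norm(a) * np.linalg.norm(b))  # sqrt(5) * 5 ≈ 11.18
```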
Machine Learning Connection #
Cauchy–Schwarz appears in:
- cosine similarity
- projection formulas
- optimisation bounds
- gradient analysis
- kernel methods
Without this inequality:
- cosine similarity would not be guaranteed to lie in \([-1, 1]\)
- angle-based ML models would break down
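A minimal cosine-similarity helper of the kind used in ML code (the function name, `eps` guard, and feature vectors are illustrative, not from the course):

```python
import numpy as np

def cosine_similarity(u, v, eps=1e-12):
    """Cosine similarity; Cauchy–Schwarz guarantees the result is in [-1, 1]."""
    return np.dot(u, v) / max(np.linalg.norm(u) * np.linalg.norm(v), eps)

doc_a = np.array([0.9, 0.1, 0.0])  # illustrative feature vectors
doc_b = np.array([0.8, 0.2, 0.1])
print(cosine_similarity(doc_a, doc_b))
```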
Hidden Exam Pattern #
From lectures, it is used in:
- angle proofs
- norm inequalities
- optimisation derivations
👉 often appears indirectly
Common Mistakes #
- Forgetting the absolute value on the dot product
- Mixing up the dot product and the norm
- Ignoring equality condition
- Not recognising hidden usage
Strategy to Prepare #
- Memorise the inequality
- Understand geometric meaning
- Practice applying in proofs
- Link with norm and angle
Quick Summary #
| Concept | Meaning |
|---|---|
| Bound | absolute value of dot product ≤ product of norms |
| Equality | vectors are linearly dependent |
| Use | angles, projections, ML |
Reference #
Lecture slides (Inner Product, Geometry)
Course handout