Norm #
A norm measures the length (magnitude) of a vector.
- The norm of a vector x measures the distance from the origin to the point x.
Common example: Euclidean norm.
\[ \lVert \mathbf{x} \rVert_2 = \sqrt{x_1^2 + \cdots + x_n^2} \]

Key Idea: Norm = measure of the size or length of a vector. It generalises the idea of distance in geometry to higher dimensions.
Common Norms #
- L1
- L2
- Infinity norm
Why it matters #
- Norms quantify the size of a vector
- Norms are used in distance measures and regularisation
Intuition (From Lectures) #
From lecture discussions on analytic geometry:
- Vectors represent points or directions in space
- The norm tells “how far” a vector is from the origin
- It is essentially a distance measure
So:
- Small norm → close to origin
- Large norm → far from origin
Formal Definition #
A norm is a function:
\[ \lVert \cdot \rVert : \mathbb{R}^n \rightarrow \mathbb{R} \]

which satisfies the following properties:
Properties of Norm #
1. Non-negativity #
\[ \lVert x \rVert \geq 0 \]

and

\[ \lVert x \rVert = 0 \iff x = 0 \]

2. Homogeneity #

\[ \lVert \alpha x \rVert = |\alpha| \lVert x \rVert \]

3. Triangle Inequality #

\[ \lVert x + y \rVert \leq \lVert x \rVert + \lVert y \rVert \]

These three properties define any valid norm.
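The three defining properties can be checked numerically. A minimal sketch using NumPy (NumPy itself is an assumption; the notes name no library) that verifies non-negativity, homogeneity, and the triangle inequality for the L2 norm on sample vectors:

```python
import numpy as np

# Illustration (not a proof): check the three norm properties
# for the L2 norm on randomly chosen vectors.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
y = rng.standard_normal(4)
alpha = -2.5

# 1. Non-negativity: ||x|| >= 0, and ||0|| = 0
assert np.linalg.norm(x) >= 0
assert np.isclose(np.linalg.norm(np.zeros(4)), 0.0)

# 2. Homogeneity: ||alpha x|| = |alpha| ||x||
assert np.isclose(np.linalg.norm(alpha * x), abs(alpha) * np.linalg.norm(x))

# 3. Triangle inequality: ||x + y|| <= ||x|| + ||y||
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)

print("All three properties hold for these sample vectors.")
```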
Common Norms #
L1 Norm (Manhattan Norm) #
\[ \lVert x \rVert_1 = \sum_{i=1}^{n} |x_i| \]

- Measures the path length along the coordinate axes
- Used in sparse models
L2 Norm (Euclidean Norm) #
\[ \lVert x \rVert_2 = \sqrt{\sum_{i=1}^{n} x_i^2} \]

- The most commonly used norm
- Corresponds to the standard (straight-line) distance
Infinity Norm #
\[ \lVert x \rVert_\infty = \max_i |x_i| \]

- Takes the maximum absolute component
- Useful in worst-case analysis
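All three norms can be computed with NumPy's `ord` parameter (NumPy is an assumption here; the notes do not specify a library):

```python
import numpy as np

# Compute the three common norms of the same vector.
x = np.array([3.0, -4.0])

l1 = np.linalg.norm(x, ord=1)         # |3| + |-4| = 7
l2 = np.linalg.norm(x, ord=2)         # sqrt(9 + 16) = 5
linf = np.linalg.norm(x, ord=np.inf)  # max(|3|, |-4|) = 4

print(l1, l2, linf)  # 7.0 5.0 4.0
```

Note how the same vector has a different “length” under each norm: L1 ≥ L2 ≥ L∞ always holds componentwise for these three.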
Distance Using Norm #
Distance between two vectors:
\[ d(x, y) = \lVert x - y \rVert \]

From lecture insight:
- Norm + subtraction → distance
- Used in clustering and similarity
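The “norm + subtraction” recipe above translates directly to code. A short sketch (NumPy assumed, as elsewhere in these notes):

```python
import numpy as np

# Distance between two points as the norm of their difference:
# d(x, y) = ||x - y||.
x = np.array([1.0, 2.0])
y = np.array([4.0, 6.0])

d = np.linalg.norm(x - y)  # sqrt(3^2 + 4^2) = 5
print(d)  # 5.0
```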
Connection to Inner Product #
From analytic geometry lectures:
The Euclidean norm is induced by the standard inner product:

\[ \lVert x \rVert = \sqrt{x^T x} \]

This links:
- Norm
- Inner product
- Geometry
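The induced-norm formula can be verified directly: taking the inner product of a vector with itself and then the square root reproduces the L2 norm (a quick NumPy check, assuming the standard dot product):

```python
import numpy as np

# The L2 norm induced by the standard inner product: ||x|| = sqrt(x^T x).
x = np.array([3.0, 4.0])

norm_from_inner = np.sqrt(x @ x)  # inner product of x with itself, then sqrt
assert np.isclose(norm_from_inner, np.linalg.norm(x))
print(norm_from_inner)  # 5.0
```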
Length and Angle #
Using norm and inner product:
\[ \cos \theta = \frac{x^T y}{\lVert x \rVert \lVert y \rVert} \]

So the norm is essential for:
- measuring angles
- checking orthogonality
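Both uses can be sketched in a few lines (NumPy assumed; the example vectors are chosen for illustration):

```python
import numpy as np

# Angle between vectors via cos(theta) = x^T y / (||x|| ||y||),
# plus an orthogonality check (inner product equals zero).
x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.degrees(np.arccos(cos_theta))
print(round(theta))  # 45

z = np.array([0.0, 1.0])
print(np.isclose(x @ z, 0.0))  # True: x and z are orthogonal
```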
Geometric Interpretation #
- L2 norm → circular contours (the unit ball is a disc)
- L1 norm → diamond-shaped contours
- L∞ norm → square contours
This affects optimisation behaviour.
Norm in Machine Learning #
Norms are used extensively in ML:
Regularisation #
- L2 norm → Ridge regression
- L1 norm → Lasso regression
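The two penalty terms differ only in which norm they apply to the weights. A hedged sketch (the function names and the strength `lam` are hypothetical, not from the notes):

```python
import numpy as np

# Ridge adds the squared L2 norm of the weights to the loss;
# Lasso adds the L1 norm. `lam` controls the penalty strength.
def ridge_penalty(w, lam=0.1):
    return lam * np.sum(w ** 2)     # lam * ||w||_2^2

def lasso_penalty(w, lam=0.1):
    return lam * np.sum(np.abs(w))  # lam * ||w||_1

w = np.array([1.0, -2.0, 0.0])
print(round(ridge_penalty(w), 2), round(lasso_penalty(w), 2))  # 0.5 0.3
```

The L1 penalty's diamond-shaped contours are what push individual weights exactly to zero, which is why Lasso produces sparse models.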
Distance Metrics #
- k-NN uses norms
- clustering uses norms
Optimisation #
- Gradient descent uses norm of gradients
Norm and Optimization #
From later lectures:
- Norm helps measure error size
- Used in loss functions
- Helps determine convergence
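The convergence role can be sketched with a toy objective (f(x) = ‖x‖², a hypothetical example not taken from the lectures): gradient descent stops once the norm of the gradient falls below a tolerance.

```python
import numpy as np

# Sketch: gradient descent on f(x) = ||x||^2, whose gradient is 2x.
# The norm of the gradient serves as the convergence test.
def gradient_descent(x0, lr=0.1, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        grad = 2 * x                    # gradient of ||x||^2
        if np.linalg.norm(grad) < tol:  # small gradient norm => converged
            break
        x = x - lr * grad
    return x

x_min = gradient_descent([3.0, 4.0])
print(np.linalg.norm(x_min) < 1e-6)  # True: converged near the origin
```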
Important Inequalities #
Cauchy-Schwarz Inequality #
\[ |x^T y| \leq \lVert x \rVert \lVert y \rVert \]

This inequality is fundamental in many ML proofs.
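The inequality can be sanity-checked numerically on random vectors (an illustration only, not a proof; NumPy assumed):

```python
import numpy as np

# Check |x^T y| <= ||x|| ||y|| for many random vector pairs.
rng = np.random.default_rng(42)
for _ in range(100):
    x = rng.standard_normal(5)
    y = rng.standard_normal(5)
    # small slack for floating-point rounding
    assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12

print("Cauchy-Schwarz holds for all sampled pairs.")
```

Equality holds exactly when x and y are linearly dependent, which is also when |cos θ| = 1 in the angle formula above.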
Example #
Given:
\[ x = (3, 4) \]

\[ \lVert x \rVert_2 = \sqrt{3^2 + 4^2} = \sqrt{25} = 5 \]

Common Exam Questions #
- Compute L1, L2, L∞ norms
- Prove triangle inequality
- Use norm to compute distance
- Relate norm with inner product
- Interpret geometrically
Hidden Exam Pattern #
From lectures:
- Norm appears together with:
  - inner product
  - orthogonality
  - distance
👉 rarely asked alone
Common Mistakes #
- Mixing L1 and L2 formulas
- Forgetting absolute values in L1
- Not applying square root in L2
- Confusing norm with squared norm
Strategy to Prepare #
- Memorise formulas of all norms
- Practice geometric interpretation
- Solve distance-based problems
- Link with inner product
Quick Summary Table #
| Norm | Formula | Meaning |
|---|---|---|
| L1 | sum of absolute values | Manhattan distance |
| L2 | square root of sum of squares | Euclidean distance |
| L∞ | max absolute value | Maximum deviation |
References #
- Lecture slides (Analytic Geometry, Inner Product)
- Course handout