Vector Spaces


Basis and Rank #

A basis is a minimal set of linearly independent vectors that spans a space.

The dimension of a space is the number of vectors in a basis.

Key Idea: Basis = independence + spanning. The rank of a matrix is the number of linearly independent columns (equivalently, rows) — how many independent directions the matrix contains.

A basis must satisfy two conditions ⭐

  1. Vectors must be linearly independent
  2. Vectors must span the space

This means:

  • No redundancy (independence)
  • Full coverage (spanning)

\[ \text{Span}(v_1, v_2, \dots, v_k) = V \]

\[ c_1 v_1 + \cdots + c_k v_k = 0 \;\Rightarrow\; c_1 = \cdots = c_k = 0 \]
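Both conditions can be checked numerically: a set of k vectors is linearly independent exactly when the matrix whose rows are those vectors has rank k. A minimal sketch in plain Python (the `rank` helper is hypothetical, implementing Gaussian elimination):

```python
def rank(rows, tol=1e-12):
    """Row-reduce a list-of-lists matrix and count the pivot rows (the rank)."""
    m = [row[:] for row in rows]          # work on a copy
    r = 0                                  # index of the next pivot row
    n_rows, n_cols = len(m), len(m[0])
    for c in range(n_cols):
        # find a row at or below r with a nonzero entry in column c
        pivot = next((i for i in range(r, n_rows) if abs(m[i][c]) > tol), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        # eliminate column c from every row below the pivot
        for i in range(r + 1, n_rows):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Three vectors in R^3: the first two are independent, the third is their sum.
v1, v2, v3 = [1, 0, 0], [0, 1, 0], [1, 1, 0]
print(rank([v1, v2, v3]))   # 2: only two independent directions, so no basis of R^3
print(rank([v1, v2]))       # 2: {v1, v2} is linearly independent
```

Since the rank here is 2, the three vectors span only a plane inside R^3; dropping the redundant v3 leaves a basis of that plane.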

Why Basis Matters #

  • Represents space efficiently
  • Removes redundancy
  • Helps define coordinates
  • Used in ML for feature representation

Dimension #

Dimension is the number of vectors in a basis. Every basis of a given space contains the same number of vectors, so the dimension is well defined.


Norm #

A norm measures the length (magnitude) of a vector.

  • The norm of a vector x measures the distance from the origin to the point x.

Common example: Euclidean norm.

\[ \lVert \mathbf{x} \rVert_2 = \sqrt{x_1^2 + \cdots + x_n^2} \]

Key Idea: Norm = measure of size or length of a vector. It generalises the idea of distance in geometry to higher dimensions.
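The Euclidean norm formula above translates directly into code. A minimal sketch in plain Python (the `norm2` name is an assumption, not a library function):

```python
import math

def norm2(x):
    """Euclidean (L2) norm: square root of the sum of squared components."""
    return math.sqrt(sum(xi * xi for xi in x))

print(norm2([3, 4]))       # 5.0: the classic 3-4-5 right triangle
print(norm2([1, 1, 1]))    # sqrt(3), the diagonal of a unit cube
```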


Lengths and Distances #

The length of a vector is given by its norm.

The distance between two points (vectors) is the norm of their difference.

Distance quantifies how far two vectors (data points) are from each other.

\[ d(\mathbf{x},\mathbf{y}) = \lVert \mathbf{x} - \mathbf{y} \rVert \]

Key Idea: Length measures size of a single vector. Distance measures separation between two vectors. Distance = norm applied to difference.
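The key idea "distance = norm applied to difference" can be sketched directly in plain Python (the `distance` helper is hypothetical):

```python
import math

def distance(x, y):
    """Distance between two points = Euclidean norm of their difference."""
    diff = [a - b for a, b in zip(x, y)]
    return math.sqrt(sum(d * d for d in diff))

print(distance([1, 2], [4, 6]))   # 5.0: difference is (-3, -4), with norm 5
print(distance([2, 3], [2, 3]))   # 0.0: a point is at distance zero from itself
```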


Angles and Orthogonality #

Once we define an inner product, we can define the angle between two vectors.

Angles allow us to measure how aligned or different two vectors are in space.

Key Idea: Angle measures similarity between vectors. Orthogonality (zero inner product) means the vectors share no common direction; orthogonal nonzero vectors are always linearly independent.

Why It Matters in Machine Learning #

  • PCA produces orthogonal components
  • Orthogonal features reduce redundancy
  • Gradient directions depend on angle

Angle Formula #

For vectors in n-dimensional space:

\[ \cos\theta = \frac{\langle \mathbf{x}, \mathbf{y} \rangle}{\lVert \mathbf{x} \rVert \, \lVert \mathbf{y} \rVert}, \qquad \theta \in [0, \pi] \]
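Using cos θ = ⟨x, y⟩ / (‖x‖ ‖y‖), the angle can be computed in a few lines of plain Python (the `angle` helper is hypothetical; it assumes both vectors are nonzero):

```python
import math

def angle(x, y):
    """Angle between two nonzero vectors via cos(theta) = <x, y> / (||x|| ||y||)."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return math.acos(dot / (nx * ny))

print(angle([1, 0], [0, 1]))           # pi/2 (about 1.5708): orthogonal vectors
print(angle([1, 1], [2, 2]) < 1e-6)    # True: parallel vectors have angle 0
```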


Orthonormal Basis #

A basis is orthonormal if its vectors are:

  • orthogonal to each other
  • each has unit length

\[ \langle \mathbf{e}_i, \mathbf{e}_j \rangle = \begin{cases} 1 & i=j \\ 0 & i\ne j \end{cases} \]

Key Idea: Orthonormal basis = perfectly independent + perfectly scaled. This makes computations extremely simple and stable.
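The defining condition ⟨e_i, e_j⟩ = 1 if i = j, else 0 can be verified by checking every pair of basis vectors. A minimal sketch in plain Python (the `is_orthonormal` helper is hypothetical):

```python
import math

def is_orthonormal(vectors, tol=1e-9):
    """Check <e_i, e_j> = 1 when i = j and 0 otherwise, for all pairs."""
    for i, u in enumerate(vectors):
        for j, v in enumerate(vectors):
            dot = sum(a * b for a, b in zip(u, v))
            expected = 1.0 if i == j else 0.0
            if abs(dot - expected) > tol:
                return False
    return True

s = 1 / math.sqrt(2)
print(is_orthonormal([[1, 0], [0, 1]]))    # True: the standard basis of R^2
print(is_orthonormal([[s, s], [s, -s]]))   # True: the standard basis rotated 45 degrees
print(is_orthonormal([[1, 1], [0, 1]]))    # False: not orthogonal, first is not unit length
```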