Vector Spaces #

A vector space is the mathematical “home” where vectors live and where addition and scaling are valid operations.

  • A vector space is a set closed under vector addition and scalar multiplication.

  • Machine learning operates in vector spaces.

  • This page covers linear independence, bases, and rank, along with geometric tools such as norms and inner products, which measure length, distance, and angles.

A vector space is a set of vectors that satisfies ten axioms, defined with respect to two operations:

  • Vector addition
  • Scalar multiplication

These axioms ensure consistent linear behaviour.

Every vector space also:

  • Contains a zero vector
  • Contains an additive inverse for every vector
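
As a quick sketch of the closure requirement (assuming NumPy, with R^3 as the vector space): adding two vectors or scaling one always produces another vector in the same space.

```python
import numpy as np

# Two vectors in R^3 and a scalar.
u = np.array([1.0, 2.0, 3.0])
v = np.array([-4.0, 0.5, 2.0])
c = 2.5

# Both operations return another element of R^3,
# which is exactly what closure requires.
print(u + v)   # [-3.   2.5  5. ]
print(c * u)   # [2.5 5.  7.5]
```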

Geometric Intuition #

A vector space is the entire space where vectors live.

Examples:

  • A line through the origin
  • A plane through the origin
  • Higher-dimensional spaces

In Machine Learning #

Feature spaces and embedding spaces are vector spaces.



```mermaid
flowchart TD
  VS[Vector Spaces] --> LI[Linear Independence]
  VS --> BR[Basis & Rank]
  VS --> N[Norms]
  VS --> IP[Inner Products]
  VS --> LD[Lengths & Distances]
  VS --> AO[Angles & Orthogonality]
  VS --> ONB[Orthonormal Basis]
```

Key components #

  • Zero vector
  • Additive inverse
  • Closure under operations

Vector Spaces and Feature Spaces #

Machine learning operates in vector spaces: feature vectors, embeddings, and model parameters are all vectors manipulated with vector-space operations.
Understanding these spaces is essential for reasoning about dimensionality, structure, and representations.
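
As an illustrative sketch (the feature names and NumPy usage are assumptions, not from the original): a dataset is a stack of feature vectors, and common preprocessing steps are just vector-space operations.

```python
import numpy as np

# Each row is one sample with 3 hypothetical features:
# [height_cm, weight_kg, age_years] -- a point in R^3.
X = np.array([
    [170.0, 65.0, 30.0],
    [160.0, 55.0, 25.0],
    [180.0, 80.0, 40.0],
])

# Centering uses only vector addition and scalar multiplication:
mean = X.mean(axis=0)   # average of the sample vectors
X_centered = X - mean   # shift so the mean sits at the zero vector
print(X_centered)
```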


Vector Subspace #

Definition #

A vector subspace is a subset of a vector space that is itself a vector space under the same addition and scalar multiplication.

A subspace must:

  • Contain the zero vector
  • Be closed under addition
  • Be closed under scalar multiplication
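
A minimal numerical sketch of these three conditions (assuming NumPy, with the subspace taken to be the line through the origin spanned by a direction vector d):

```python
import numpy as np

d = np.array([1.0, 2.0])  # the subspace is span{d}, a line through the origin

def in_span(x, d, tol=1e-9):
    """True if x lies on the line through the origin spanned by d."""
    # In 2D, x is a multiple of d iff x[0]*d[1] - x[1]*d[0] vanishes.
    return abs(x[0] * d[1] - x[1] * d[0]) < tol

u = 3.0 * d
v = -0.5 * d
print(in_span(np.zeros(2), d))  # True: contains the zero vector
print(in_span(u + v, d))        # True: closed under addition
print(in_span(4.0 * u, d))      # True: closed under scalar multiplication
```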

Geometric Intuition #

A subspace is a smaller space inside a larger space that still behaves like a full space.

Examples:

  • A line inside a plane
  • A plane inside 3D space

In Machine Learning #

Subspaces capture lower-dimensional structure in data, such as the space spanned by principal components.
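
A sketch of that idea (assuming NumPy and synthetic data; keeping two components is an arbitrary choice): the top principal directions span a low-dimensional subspace, and projection keeps the data inside it.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 samples in R^5
Xc = X - X.mean(axis=0)         # center so the subspace passes through the origin

# SVD of the centered data gives the principal directions.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:2].T                    # orthonormal basis of the top-2 subspace (5 x 2)

Z = Xc @ W                      # coordinates of each sample within the subspace
X_proj = Z @ W.T                # projection of the data back into R^5
print(Z.shape, X_proj.shape)    # (100, 2) (100, 5)
```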


Axioms of a Vector Space #

Properties of Vector Addition #

  • Closure
    \[ \mathbf{u}, \mathbf{v} \in V \Rightarrow \mathbf{u} + \mathbf{v} \in V \]
  • Commutativity
    \[ \mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u} \]
  • Associativity
    \[ (\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w}) \]
  • Additive Identity
    \[ \mathbf{u} + \mathbf{0} = \mathbf{u} \]
  • Additive Inverse
    \[ \mathbf{u} + (-\mathbf{u}) = \mathbf{0} \]

Properties of Scalar Multiplication #

  • Closure
    \[ \mathbf{u} \in V,\ c \in \mathbb{R} \Rightarrow c\mathbf{u} \in V \]
  • Distributivity over Vector Addition
    \[ c(\mathbf{u} + \mathbf{v}) = c\mathbf{u} + c\mathbf{v} \]
  • Distributivity over Scalar Addition
    \[ (c + d)\mathbf{u} = c\mathbf{u} + d\mathbf{u} \]
  • Associativity of Scalars
    \[ c(d\mathbf{u}) = (cd)\mathbf{u} \]
  • Multiplicative Identity
    \[ 1\mathbf{u} = \mathbf{u} \]

The zero properties
\[ 0\mathbf{u} = \mathbf{0}, \quad c\mathbf{0} = \mathbf{0} \]
are not extra axioms; they follow from the ten above.
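
All of these identities can be spot-checked numerically in R^n (a sketch assuming NumPy; np.allclose handles floating-point tolerance):

```python
import numpy as np

rng = np.random.default_rng(42)
u, v, w = rng.normal(size=(3, 4))  # three random vectors in R^4
c, d = 2.0, -3.5

assert np.allclose(u + v, v + u)                 # commutativity
assert np.allclose((u + v) + w, u + (v + w))     # associativity
assert np.allclose(u + np.zeros(4), u)           # additive identity
assert np.allclose(u + (-u), np.zeros(4))        # additive inverse
assert np.allclose(c * (u + v), c * u + c * v)   # distributivity over vectors
assert np.allclose((c + d) * u, c * u + d * u)   # distributivity over scalars
assert np.allclose(c * (d * u), (c * d) * u)     # associativity of scalars
assert np.allclose(1.0 * u, u)                   # multiplicative identity
print("All axioms hold numerically in R^4")
```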

Why This Matters in ML #

  • Data lives in vector spaces
  • Models manipulate these spaces
  • Learning often discovers meaningful subspaces
