Inner Products and Dot Product #
An inner product maps two vectors to a single scalar.
It allows us to measure:
- similarity
- vector length
- projections
- orthogonality
```mermaid
flowchart TD
    T["Inner<br/>products<br/>(types)"] --> DOT["Euclidean<br/>Dot product"]
    T --> WIP["Weighted<br/>inner product"]
    T --> FN["Function-space<br/>(integral)"]
    T --> HERM["Complex<br/>Hermitian"]
    T --> MAT["Matrix<br/>inner product<br/>(Frobenius)"]
    DOT --> Rn["Vectors in<br/>R^n"]
    WIP --> SPD["SPD matrix<br/>W"]
    FN --> L2["L2 space<br/>functions"]
    HERM --> Cn["Vectors in<br/>C^n"]
    MAT --> Mnm["Matrices<br/>R^{m×n}"]
    style T fill:#90CAF9,stroke:#1E88E5,color:#000
    style DOT fill:#C8E6C9,stroke:#2E7D32,color:#000
    style WIP fill:#C8E6C9,stroke:#2E7D32,color:#000
    style FN fill:#C8E6C9,stroke:#2E7D32,color:#000
    style HERM fill:#C8E6C9,stroke:#2E7D32,color:#000
    style MAT fill:#C8E6C9,stroke:#2E7D32,color:#000
    style Rn fill:#CE93D8,stroke:#8E24AA,color:#000
    style SPD fill:#CE93D8,stroke:#8E24AA,color:#000
    style L2 fill:#CE93D8,stroke:#8E24AA,color:#000
    style Cn fill:#CE93D8,stroke:#8E24AA,color:#000
    style Mnm fill:#CE93D8,stroke:#8E24AA,color:#000
```
Definition #
For vectors
\( \mathbf{a}, \mathbf{b} \in \mathbb{R}^n \)
The inner product is written as:
\( \langle \mathbf{a}, \mathbf{b} \rangle \)
In \( \mathbb{R}^n \), this is the dot product.
Dot Product Formula #
Let
\( \mathbf{a} = (a_1, \dots, a_n) \)
\( \mathbf{b} = (b_1, \dots, b_n) \)
\[ \mathbf{a}\cdot\mathbf{b} = \sum_{i=1}^{n} a_i b_i \]
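The sum-of-products formula can be checked with a short NumPy sketch (the vectors here are illustrative):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, -5.0, 6.0])

# Sum-of-products definition, term by term
manual = sum(a_i * b_i for a_i, b_i in zip(a, b))

# NumPy's built-in dot product
dot = np.dot(a, b)

print(manual, dot)  # both give 1*4 + 2*(-5) + 3*6 = 12.0
```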
Key Properties #
Let
\( \mathbf{a},\mathbf{b},\mathbf{c}\in\mathbb{R}^n \)
and
\( \lambda\in\mathbb{R} \)
.
1) Symmetry #
\[ \mathbf{a}\cdot\mathbf{b} = \mathbf{b}\cdot\mathbf{a} \]
2) Linearity #
\[ (\mathbf{a}+\mathbf{b})\cdot\mathbf{c} = \mathbf{a}\cdot\mathbf{c} + \mathbf{b}\cdot\mathbf{c} \]
\[ (\lambda\mathbf{a})\cdot\mathbf{b} = \lambda(\mathbf{a}\cdot\mathbf{b}) \]
3) Positivity #
\[ \mathbf{a}\cdot\mathbf{a} \ge 0 \]
Equality holds if and only if
\( \mathbf{a}=\mathbf{0} \)
.
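The three properties above can be sanity-checked numerically; a minimal sketch using arbitrary random test vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = rng.normal(size=(3, 4))  # three random vectors in R^4
lam = 2.5

# Symmetry
assert np.isclose(a @ b, b @ a)

# Linearity
assert np.isclose((a + b) @ c, a @ c + b @ c)
assert np.isclose((lam * a) @ b, lam * (a @ b))

# Positivity
assert a @ a >= 0
```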
Norm (Length of a Vector) #
\[ \|\mathbf{a}\| = \sqrt{\mathbf{a}\cdot\mathbf{a}} \]
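The norm-from-dot-product definition agrees with NumPy's built-in norm; a quick check with a 3-4-5 triangle:

```python
import numpy as np

a = np.array([3.0, 4.0])

# Length via the dot product, sqrt(a . a)
norm_from_dot = np.sqrt(a @ a)

print(norm_from_dot, np.linalg.norm(a))  # both 5.0
```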
Important Theoretical Result #
The Cauchy–Schwarz inequality bounds the dot product:
\[ |\mathbf{a}\cdot\mathbf{b}| \le \|\mathbf{a}\|\,\|\mathbf{b}\| \]
This guarantees that the angle formula \( \cos\theta = \frac{\mathbf{a}\cdot\mathbf{b}}{\|\mathbf{a}\|\,\|\mathbf{b}\|} \) always yields a value in \( [-1, 1] \), so the angle is well defined.
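As a quick numerical check of Cauchy–Schwarz (the test vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = rng.normal(size=(2, 5))  # two random vectors in R^5

lhs = abs(a @ b)
rhs = np.linalg.norm(a) * np.linalg.norm(b)
assert lhs <= rhs  # Cauchy–Schwarz bound

# Consequently cos(theta) is guaranteed to lie in [-1, 1]
cos_theta = (a @ b) / rhs
assert -1.0 <= cos_theta <= 1.0
```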
Machine Learning Connection #
The dot product appears in:
- Linear regression
\[ \hat{y} = \mathbf{w}\cdot\mathbf{x} + b \]
- Neural networks
- SVM linear kernel
- Cosine similarity
- Gradient-based optimisation
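Two of these uses can be sketched in a few lines; the weights and inputs below are hypothetical, chosen only to illustrate the computation:

```python
import numpy as np

# Linear regression prediction: y_hat = w . x + b (illustrative values)
w = np.array([0.5, -1.0, 2.0])
x = np.array([2.0, 1.0, 0.5])
b = 0.1
y_hat = w @ x + b
print(y_hat)  # 0.5*2 - 1*1 + 2*0.5 + 0.1 = 1.1

# Cosine similarity: the dot product of the vectors, normalised by their lengths
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])
cos_sim = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
print(cos_sim)  # 1/sqrt(2) ~ 0.7071
```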
Summary #
- Inner product maps two vectors to a scalar
- In \( \mathbb{R}^n \), it is the dot product
- Defines vector length
- Foundation for geometry and similarity