Calculus #
Calculus is the mathematical framework for understanding and controlling how quantities change.
- How fast is something changing right now?
- What happens to a system when inputs change slightly?
- Where is something maximum or minimum?
Differential Calculus (Rates of Change) #
Studies how things change.
- How steep is a curve at a point?
- Is something increasing or decreasing?
- Where are the maxima and minima?
The key idea is the derivative.
A derivative measures how a small change in input affects the output.
Example intuition:
- Slope of a curve
- Instantaneous speed
- Gradient of a loss function
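A rough numerical sketch of this idea in Python (the example curve f(x) = x² and the step size are arbitrary choices for illustration): nudge the input slightly and measure how the output responds.

```python
# Minimal sketch: estimate a derivative with a finite difference.
# The function f(x) = x**2 and the step size h are arbitrary choices.

def f(x):
    return x ** 2  # example curve

def derivative(func, x, h=1e-6):
    """Approximate f'(x): small change in output over small change in input."""
    return (func(x + h) - func(x)) / h

print(derivative(f, 3.0))  # ~6.0, matching the analytic derivative 2x at x = 3
```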
Integral Calculus (Accumulation) #
Studies how things add up.
- What is the total effect over time?
- How much area lies under a curve?
- How do small changes accumulate?
The key idea is the integral.
Example intuition:
- Total distance from speed
- Area under a curve
- Summing many tiny contributions
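A small Python sketch of accumulation (the speed curve and time range are invented for illustration): total distance is the sum of many tiny speed × time contributions.

```python
# Minimal sketch: accumulate speed over time to get total distance.
# The speed curve and time range are invented for illustration.

def speed(t):
    return 3.0 * t ** 2  # example speed curve

dt = 0.001              # one tiny time step
total_distance = 0.0
t = 0.0
while t < 2.0:
    total_distance += speed(t) * dt  # add one tiny contribution
    t += dt

print(total_distance)  # ~8.0, the exact integral of 3t^2 from t = 0 to 2
```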
Multivariate Calculus #
Multivariate calculus deals with functions of more than one variable.
Univariate (single variable):
\[ y = f(x) \]
Multivariate (many variables):
\[ z = f(x, y) \]
\[ L(w_1, w_2, \dots, w_n) \]
In Machine Learning, almost every function is multivariate.
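A quick Python sketch of a multivariate function (the tiny dataset and the model y = w1·x + w2 are invented for illustration): the loss below depends on several inputs at once.

```python
# Minimal sketch: a multivariate function in code. The tiny dataset and the
# model y = w1 * x + w2 are invented; the point is that the loss depends on
# several inputs at once.

def loss(w1, w2):
    data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
    return sum((w1 * x + w2 - y) ** 2 for x, y in data) / len(data)

print(loss(2.0, 1.0))  # 0.0 -- these weights fit the invented data exactly
print(loss(1.5, 0.5))  # > 0 -- changing either input changes the output
```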
Why Multivariate Calculus Matters in Machine Learning #
Machine learning models do not learn one parameter at a time.
They optimise many parameters simultaneously.
Example:
\[ \text{Loss}(w_1, w_2, \dots, w_n) \]
Multivariate calculus tells us #
- How changing each parameter affects the output
- Which direction reduces the error fastest
- Whether a solution is a minimum, maximum, or saddle point
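A minimal Python sketch of the first two points (the loss function, starting point, and learning rate are arbitrary choices): estimate each partial derivative numerically, then step against the gradient and watch the loss drop.

```python
# Minimal sketch: the negative gradient is the direction that reduces the
# loss fastest. The loss, starting point, and learning rate are arbitrary.

def loss(w1, w2):
    return (w1 - 2.0) ** 2 + (w2 + 1.0) ** 2  # minimum at (2, -1)

def partial(func, w1, w2, which, h=1e-6):
    """Finite-difference partial derivative with respect to one parameter."""
    if which == 0:
        return (func(w1 + h, w2) - func(w1, w2)) / h
    return (func(w1, w2 + h) - func(w1, w2)) / h

w1, w2 = 0.0, 0.0
g = (partial(loss, w1, w2, 0), partial(loss, w1, w2, 1))
lr = 0.1
w1, w2 = w1 - lr * g[0], w2 - lr * g[1]      # step against the gradient
print(loss(0.0, 0.0), "->", loss(w1, w2))    # loss drops: 5.0 -> ~3.2
```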
Key Topics (for ML) #
- Univariate differentiation (revision)
- Partial derivatives
- Gradients
- Jacobian and Hessian
- Gradients of vectors and matrices
- Useful gradient identities
- Backpropagation (conceptual)
- Automatic differentiation
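As a taste of the Hessian topic, a minimal Python sketch (the test functions and the critical point (0, 0) are invented for illustration) that builds a 2×2 finite-difference Hessian and uses it to tell a minimum from a saddle point.

```python
# Minimal sketch: a 2x2 finite-difference Hessian used to classify a
# critical point. The test functions and the point (0, 0) are invented.

def hessian(f, x, y, h=1e-4):
    """Matrix of second partial derivatives, by central differences."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h ** 2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h ** 2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h ** 2)
    return [[fxx, fxy], [fxy, fyy]]

def classify(H):
    """2x2 case: positive-definite Hessian -> local minimum,
    negative determinant -> saddle point."""
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    if det > 0 and H[0][0] > 0:
        return "local minimum"
    if det < 0:
        return "saddle point"
    return "inconclusive"

def bowl(x, y):
    return x ** 2 + y ** 2    # (0, 0) is a minimum

def saddle(x, y):
    return x ** 2 - y ** 2    # (0, 0) is a saddle

print(classify(hessian(bowl, 0.0, 0.0)))    # local minimum
print(classify(hessian(saddle, 0.0, 0.0)))  # saddle point
```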
Focus #
- Compute gradients correctly
- Use Hessian intuition for minima/maxima
- Understand Taylor series (multivariate)
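For reference, the multivariate second-order Taylor expansion ties these pieces together (a standard identity; here w is the parameter vector, δ a small step, ∇L the gradient, and H the Hessian):
\[ L(w + \delta) \approx L(w) + \nabla L(w)^\top \delta + \tfrac{1}{2} \delta^\top H(w) \delta \]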
ML Connection #
- Training neural networks (gradient-based learning)
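A minimal gradient-descent training loop in Python (the data, initial weight, learning rate, and step count are all invented for illustration) that fits a one-parameter model y = w·x:

```python
# Minimal sketch of gradient-based learning: fit y = w * x to invented data
# by repeatedly stepping against the gradient of the mean squared error.
# Data, initial weight, learning rate, and step count are all arbitrary.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # underlying rule: y = 2x

w = 0.0      # initial parameter
lr = 0.05    # learning rate
for _ in range(100):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # move in the direction that reduces the loss fastest

print(w)  # close to 2.0
```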