Constrained Optimisation
Optimisation subject to equality and inequality constraints.
Lagrange Multipliers
Transforms constrained problems into unconstrained ones using Lagrangians.
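As a minimal sketch (the toy problem and all names are illustrative, not from the source): minimising f(x, y) = x² + y² subject to x + y = 1 via the Lagrangian L = f − λ(x + y − 1) reduces to a linear stationarity system:

```python
import numpy as np

# Minimise f(x, y) = x^2 + y^2 subject to x + y = 1.
# Setting the gradient of L(x, y, lam) = f - lam * (x + y - 1) to zero gives:
#   2x - lam = 0,   2y - lam = 0,   x + y = 1
A = np.array([[2.0, 0.0, -1.0],
              [0.0, 2.0, -1.0],
              [1.0, 1.0,  0.0]])
b = np.array([0.0, 0.0, 1.0])
x, y, lam = np.linalg.solve(A, b)
print(x, y, lam)  # x = y = 0.5, lam = 1.0
```

Because the objective is quadratic and the constraint linear, the stationarity conditions are exactly solvable as a linear system; for general nonlinear problems a numerical root-finder would replace `np.linalg.solve`.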
Convex Optimisation
Convex objectives make optimisation reliable: every local minimum is a global minimum.
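A small sketch of why convexity makes optimisation reliable (the function and helper are illustrative assumptions): gradient descent on a convex quadratic reaches the same minimiser from any starting point:

```python
def gradient_descent(x0, grad, lr=0.1, steps=200):
    """Plain gradient descent on a scalar function."""
    x = float(x0)
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x - 3)^2 is convex: every start converges to the single minimiser x = 3
grad = lambda x: 2.0 * (x - 3.0)
minima = [gradient_descent(x0, grad) for x0 in (-10.0, 0.0, 25.0)]
print(minima)  # all close to 3.0
```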
Nonlinear Optimisation in Machine Learning
Practical training challenges and modern optimisers used in ML.
Challenges in Gradient-Based Optimisation
- Local optima and flat regions
- Differential curvature
- Difficult topologies (cliffs and valleys)
Stochastic Gradient Descent (SGD)
SGD uses mini-batches to trade exact gradients for speed and generalisation.
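A hedged sketch of mini-batch SGD on least-squares linear regression (the synthetic data, batch size, and learning rate are illustrative choices, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w + 0.01 * rng.normal(size=1000)

w = np.zeros(2)
lr, batch = 0.05, 32
for epoch in range(50):
    idx = rng.permutation(len(X))          # reshuffle each epoch
    for start in range(0, len(X), batch):
        rows = idx[start:start + batch]
        # noisy mini-batch estimate of the full mean-squared-error gradient
        g = 2.0 * X[rows].T @ (X[rows] @ w - y[rows]) / len(rows)
        w -= lr * g
print(w)  # close to true_w = [2, -1]
```

Each step sees only 32 of the 1000 rows, so the gradient is noisy, yet the iterates still drift to the least-squares solution at a fraction of the full-batch cost.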
Momentum-Based Learning
Momentum smooths updates and helps traverse valleys efficiently.
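A minimal heavy-ball sketch (the valley-shaped quadratic and the hyperparameters are assumptions for illustration): the velocity term accumulates past gradients, damping oscillation across a narrow valley:

```python
import numpy as np

def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    """One heavy-ball update: velocity accumulates past gradients."""
    v = beta * v - lr * grad(w)
    return w + v, v

# Narrow valley f(w) = 0.5 * (w1^2 + 50 * w2^2): plain gradient descent
# zig-zags across the steep axis; momentum damps that oscillation.
grad = lambda w: np.array([1.0, 50.0]) * w
w, v = np.array([5.0, 5.0]), np.zeros(2)
for _ in range(500):
    w, v = momentum_step(w, v, grad)
print(w)  # close to the minimum at the origin
```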
Adaptive Methods: AdaGrad, RMSProp, Adam
Adaptive methods adjust the learning rate per parameter.
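A sketch of one Adam update, assuming the standard formulation with bias-corrected moment estimates (the badly scaled quadratic is an illustrative test function, not from the source):

```python
import numpy as np

def adam_step(w, m, v, grad, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: step sizes adapt per parameter via moment estimates."""
    g = grad(w)
    m = b1 * m + (1 - b1) * g        # running mean of gradients
    v = b2 * v + (1 - b2) * g**2     # running mean of squared gradients
    m_hat = m / (1 - b1**t)          # bias corrections for the
    v_hat = v / (1 - b2**t)          # zero-initialised moment estimates
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

grad = lambda w: np.array([1.0, 100.0]) * w   # badly scaled quadratic bowl
w, m, v = np.array([3.0, 3.0]), np.zeros(2), np.zeros(2)
for t in range(1, 2001):
    w, m, v = adam_step(w, m, v, grad, t)
print(w)  # both coordinates shrink despite the 100x curvature gap
```

Dividing by the square root of the second-moment estimate gives the steep coordinate a proportionally smaller step, which is the per-parameter adjustment AdaGrad, RMSProp, and Adam share.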
Tuning Hyperparameters and Preprocessing
- Learning rate schedules
- Initialisation
- Tuning hyperparameters
- Importance of feature preprocessing
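As an illustration of why feature preprocessing matters, a standardisation sketch (the `standardise` helper and the synthetic data are assumptions): features on wildly different scales are mapped to zero mean and unit variance so gradient steps are comparable across parameters:

```python
import numpy as np

def standardise(X, eps=1e-12):
    """Zero-mean, unit-variance columns; eps guards against constant features."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / (sigma + eps)

rng = np.random.default_rng(0)
# Two features whose scales differ by five orders of magnitude
X = rng.normal(loc=[100.0, 0.001], scale=[50.0, 0.0005], size=(500, 2))
Z = standardise(X)
print(Z.mean(axis=0), Z.std(axis=0))  # means ~0, standard deviations ~1
```

The statistics (`mu`, `sigma`) would be computed on the training split only and reused for validation and test data.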
Dimensionality Reduction and PCA
PCA and SVD connect linear algebra, geometry, and optimisation.
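A minimal PCA-via-SVD sketch (the synthetic data and all names are illustrative): the right singular vectors of the centred data matrix are the principal directions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D cloud stretched along the 45-degree direction (1, 1)/sqrt(2)
base = rng.normal(size=(500, 2)) * np.array([3.0, 0.3])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = base @ R.T

Xc = X - X.mean(axis=0)                   # centre the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                               # first principal direction
print(pc1)  # roughly +/-(0.707, 0.707)
```

Projecting onto the top-k rows of `Vt` gives the k-dimensional reduction; the singular values `S` show how much variance each direction captures, which is the geometric face of the underlying variance-maximisation problem.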