# Nonlinear Optimisation in Machine Learning
An overview of practical training challenges and the modern optimisers used in machine learning.
- Challenges in Gradient-Based Optimisation
- Stochastic Gradient Descent (SGD)
- Momentum-Based Learning
- Adaptive Methods: AdaGrad, RMSProp, Adam
- Tuning Hyperparameters and Preprocessing
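To preview the optimisers listed above, here is a minimal sketch of the SGD, momentum, and Adam update rules applied to a simple one-dimensional quadratic. The objective f(w) = (w - 3)^2 and all hyperparameter values are illustrative assumptions, not prescriptions from this text.

```python
import math

def grad(w):
    # Gradient of the toy objective f(w) = (w - 3)^2 (illustrative choice).
    return 2.0 * (w - 3.0)

def sgd(w, lr=0.1, steps=100):
    # Plain gradient descent: step against the gradient.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum(w, lr=0.1, beta=0.9, steps=300):
    # Momentum: a velocity term accumulates an exponential average
    # of past gradients, smoothing the trajectory.
    v = 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(w)
        w += v
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=500):
    # Adam: per-parameter step sizes from bias-corrected estimates of
    # the first moment (m) and second moment (v) of the gradient.
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)   # bias correction for m
        v_hat = v / (1 - b2 ** t)   # bias correction for v
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

if __name__ == "__main__":
    for name, result in [("SGD", sgd(0.0)),
                         ("Momentum", momentum(0.0)),
                         ("Adam", adam(0.0))]:
        print(f"{name}: w = {result:.4f}")
```

All three variants drive w toward the minimiser w = 3; the later sections cover when and why their behaviour differs on real, high-dimensional loss surfaces.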