Optimization of Deep Models #
- Goal of Optimization
- Optimization Challenges in Deep Learning
- Gradient Descent
- Stochastic Gradient Descent
- Minibatch Stochastic Gradient Descent
- Momentum
- AdaGrad and its algorithm
- RMSProp and its algorithm
- Adadelta and its algorithm
- Adam and its algorithm
- Code implementation and comparison of the algorithms (webinar)
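As a preview of the topics above, the update rules of the listed optimizers can be sketched in a few lines of NumPy. This is a minimal illustration on a scalar quadratic, not the book's implementation; the hyperparameter values (`lr`, `beta`, `eps`, etc.) are illustrative assumptions.

```python
import numpy as np

# Each update takes (parameter w, gradient g, optimizer state) and
# returns (new w, new state). Objective: f(w) = 0.5 * w**2, so g = w.

def sgd(w, g, state, lr=0.1):
    return w - lr * g, state

def momentum(w, g, state, lr=0.1, beta=0.9):
    v = beta * state.get("v", 0.0) + g           # velocity accumulates past grads
    return w - lr * v, {"v": v}

def adagrad(w, g, state, lr=0.5, eps=1e-8):
    s = state.get("s", 0.0) + g ** 2             # running sum of squared grads
    return w - lr * g / (np.sqrt(s) + eps), {"s": s}

def rmsprop(w, g, state, lr=0.1, gamma=0.9, eps=1e-8):
    s = gamma * state.get("s", 0.0) + (1 - gamma) * g ** 2   # EMA of g^2
    return w - lr * g / (np.sqrt(s) + eps), {"s": s}

def adam(w, g, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    t = state.get("t", 0) + 1
    m = b1 * state.get("m", 0.0) + (1 - b1) * g  # first-moment EMA
    v = b2 * state.get("v", 0.0) + (1 - b2) * g ** 2  # second-moment EMA
    m_hat = m / (1 - b1 ** t)                    # bias correction
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), {"t": t, "m": m, "v": v}

def run(update, w0=5.0, steps=100):
    """Minimize 0.5 * w**2 starting from w0 with the given update rule."""
    w, state = w0, {}
    for _ in range(steps):
        w, state = update(w, w, state)           # gradient of 0.5*w^2 is w
    return w

for name, upd in [("sgd", sgd), ("momentum", momentum), ("adagrad", adagrad),
                  ("rmsprop", rmsprop), ("adam", adam)]:
    print(f"{name:8s} final w = {run(upd): .6f}")
```

Running all five on the same problem makes the qualitative differences visible: plain SGD decays geometrically, AdaGrad's effective step shrinks over time, and Adam takes nearly constant-size steps until it reaches the minimum.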
Reference #
- Zhang, A., Lipton, Z. C., Li, M., & Smola, A. J. *Dive into Deep Learning*. Cambridge University Press. (Ch. 12)