Adaptive Methods: AdaGrad, RMSProp, Adam
Adaptive methods maintain a separate, automatically adjusted learning rate for each parameter.
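As a quick illustration of the idea, here is a minimal NumPy sketch of one Adam-style update; the hyperparameter values and the toy quadratic objective are illustrative assumptions, not taken from the notes:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: each parameter gets its own effective step size."""
    m = b1 * m + (1 - b1) * grad        # running mean of gradients
    v = b2 * v + (1 - b2) * grad ** 2   # running mean of squared gradients
    m_hat = m / (1 - b1 ** t)           # bias-corrected estimates
    v_hat = v / (1 - b2 ** t)
    # dividing by sqrt(v_hat) scales the step per parameter
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy problem: minimise f(x) = x^2 starting from x = 5 (illustrative only).
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```

AdaGrad and RMSProp follow the same pattern but differ in how the squared-gradient statistic `v` is accumulated.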
Tuning Hyperparameters and Preprocessing
- Learning rate schedules
- Initialisation
- Tuning hyperparameters
- Importance of feature preprocessing
Dimensionality reduction and PCA
PCA and SVM connect linear algebra, geometry, and optimisation.
Principal Component Analysis (PCA)
- A dimensionality reduction technique: it reduces the number of features in a dataset while keeping the most important information.
- Transforms correlated features into a smaller set of uncorrelated features, called principal components, using linear algebra.
- Finds these components by computing the eigenvectors (directions) and eigenvalues (importance) of the covariance matrix.
- Selects the top components with the highest eigenvalues and projects the data onto them to simplify the dataset.
- Prioritises the directions in which the data varies most, because more variation means more useful information.
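The steps above can be sketched in a few lines of NumPy; the synthetic 2-D dataset and the choice of k = 1 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 200 x 2 dataset with correlated features (illustrative only).
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0], [1.2, 0.4]])

Xc = X - X.mean(axis=0)               # 1. centre the data
C = np.cov(Xc, rowvar=False)          # 2. covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # 3. eigenvectors = directions,
                                      #    eigenvalues = variance captured
order = np.argsort(eigvals)[::-1]     # 4. rank components by eigenvalue
k = 1                                 #    keep the top k components
W = eigvecs[:, order[:k]]
Z = Xc @ W                            # 5. project the data onto them
```

The variance of the projected data `Z` equals the largest eigenvalue, which is exactly the "maximum variance" property the theory section develops.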
PCA Theory
- Problem setting
- Maximum variance perspective
- Projection perspective
- Eigenvector and low-rank approximations
PCA in Practice
Key steps of PCA in practice, including considerations in high dimensions.
Latent Variable Perspective
PCA can be interpreted as modelling data using a smaller number of latent variables.
Mathematical Preliminaries of SVM
- Primal and dual perspectives
- Geometry of margins
Nonlinear SVM and Kernels
Kernels allow inner products in high-dimensional feature spaces without explicit mapping.
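A minimal sketch of this idea, assuming a degree-2 polynomial kernel and its explicit feature map for 2-D inputs (both chosen here purely for illustration):

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for a 2-D input (illustrative)."""
    x1, x2 = x
    return np.array([x1 ** 2, x2 ** 2, np.sqrt(2) * x1 * x2])

def poly_kernel(x, y):
    """k(x, y) = (x . y)^2 gives the same inner product without mapping."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])
# The kernel evaluates <phi(x), phi(y)> while only ever touching 2-D vectors.
assert np.isclose(np.dot(phi(x), phi(y)), poly_kernel(x, y))
```

For kernels such as the RBF kernel the implicit feature space is infinite-dimensional, so computing `phi` explicitly would be impossible; the kernel evaluation is the only tractable route.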
January 3, 2026
AI Learning Resources
A curated list of high-quality online courses to learn Artificial Intelligence, Machine Learning, and Deep Learning from reputable universities and organisations.
Recommended Books & References
Deep Neural Networks (DNN)
Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.
Charniak, E. (2019). Introduction to Deep Learning. MIT Press.
Chollet, F. (2021). Deep Learning with Python. Simon & Schuster.