Support Vector Machine (SVM)
A Support Vector Machine (SVM) is a supervised machine learning algorithm used for:
- Classification (most common)
- Regression (SVR – Support Vector Regression)
Its goal is to find the decision boundary that separates classes with the maximum margin.
In short: an SVM finds an optimal hyperplane by maximising the margin between classes, relying on the support vectors that lie closest to the boundary, and it handles non-linear data through kernel functions.
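As a rough illustration of the margin idea, here is a minimal sketch of a linear SVM trained with sub-gradient descent on the hinge loss. The toy data, learning rate, and number of epochs are all made up for the example; real use would typically rely on a library solver:

```python
import numpy as np

# Toy linearly separable data: two Gaussian blobs, labels in {-1, +1}
rng = np.random.default_rng(42)
X_pos = rng.normal(loc=+2.0, size=(20, 2))
X_neg = rng.normal(loc=-2.0, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1.0] * 20 + [-1.0] * 20)

# Linear SVM objective: minimise 0.5*||w||^2 + C * sum(max(0, 1 - y*(w.x + b)))
w = np.zeros(2)
b = 0.0
C, lr = 1.0, 0.01
for epoch in range(200):
    for i in range(len(X)):
        margin = y[i] * (X[i] @ w + b)
        if margin < 1:
            # Point is inside the margin (a support-vector candidate):
            # both the regulariser and the hinge loss contribute.
            w = w - lr * (w - C * y[i] * X[i])
            b = b + lr * C * y[i]
        else:
            # Point is correctly classified with margin: only shrink w.
            w = w - lr * w

pred = np.sign(X @ w + b)
print((pred == y).mean())  # training accuracy on the toy set
```

The `margin < 1` branch is where the "maximum margin" intuition lives: only points on or inside the margin pull the boundary around, which is why they are called support vectors.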
April 19, 2026

Deep Recurrent Neural Networks
Vanilla RNNs introduce the hidden-state idea, but they struggle on longer and more complex sequences because gradients can vanish across time. Deep recurrent models extend the RNN idea in two important ways:
- make the recurrent architecture richer, for example by stacking multiple recurrent layers or using information from both directions,
- use gates and memory cells to control what should be remembered, forgotten, updated, and exposed.
This is why practical recurrent modelling usually moves from a simple RNN to stacked RNNs, bidirectional RNNs, GRUs, or LSTMs.
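The gating idea can be sketched with a single LSTM time step in NumPy. The shapes, initialisation, and sequence are made up for illustration; the point is how the gates control what is forgotten, written, and exposed:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM time step over concatenated [h_prev; x]."""
    Wf, Wi, Wc, Wo = params
    z = np.concatenate([h_prev, x])
    f = sigmoid(Wf @ z)            # forget gate: what to keep from c_prev
    i = sigmoid(Wi @ z)            # input gate: how much new content to write
    c_tilde = np.tanh(Wc @ z)      # candidate memory content
    c = f * c_prev + i * c_tilde   # updated cell state (the "memory")
    o = sigmoid(Wo @ z)            # output gate: what to expose
    h = o * np.tanh(c)             # new hidden state
    return h, c

rng = np.random.default_rng(0)
hidden, inp = 4, 3
params = [rng.standard_normal((hidden, hidden + inp)) * 0.1 for _ in range(4)]
h = np.zeros(hidden)
c = np.zeros(hidden)
for t in range(5):                 # run the cell over a short random sequence
    x = rng.standard_normal(inp)
    h, c = lstm_step(x, h, c, params)
print(h.shape)  # (4,)
```

Because the cell state `c` is updated additively (`f * c_prev + i * c_tilde`) rather than squashed through a nonlinearity at every step, gradients flow across time more easily than in a vanilla RNN. Stacked and bidirectional variants reuse this same cell as a building block.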
March 18, 2026

Linear Algebra
Linear Algebra is the study of vectors and matrices.
It provides the mathematical language used to represent data, transformations, and structure in ML.
Why Linear Algebra Matters in ML
- All data in ML is represented using vectors and matrices
- Neural networks are pipelines of matrix operations
- Models transform data by applying matrix operations
- Optimisation relies on linear algebra operations
What to Learn
- Scalars, vectors, and matrices
- Vector operations (addition, dot product)
- Matrix multiplication (critical)
- Identity matrices and transpose
- Eigenvalues and eigenvectors (conceptual understanding)
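The items above can all be tried in a few lines of NumPy. The particular vectors and matrix here are arbitrary examples:

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])
print(v + w)           # vector addition -> [4. 6.]
print(v @ w)           # dot product -> 11.0

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])
print(A @ x)           # a matrix transforms a vector -> [2. 3.]
print(A.T)             # transpose
print(A @ np.eye(2))   # multiplying by the identity leaves A unchanged

# Eigenvalues/eigenvectors: directions A only stretches, never rotates.
vals, vecs = np.linalg.eig(A)
print(vals)            # for a diagonal matrix, the diagonal entries
```

For the diagonal matrix `A`, the standard basis vectors are eigenvectors and the diagonal entries 2 and 3 are the eigenvalues, which makes the "matrix as a space transformer" idea easy to see: `A` stretches the x-axis by 2 and the y-axis by 3.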
Key intuitions:
- Scalar → a number
- Vector → a directed point
- Matrix → a space transformer
- Linear transformation → structured mapping
- Feature → one axis
- Feature space → where data lives
- Vector space → where vectors live