MFML Lecture to Course Content Map

This file maps the uploaded Maths lecture PDFs and webinar PDFs to the official course handout and contact-session plan. It serves as an exam-preparation index and as a source map for future Hugo Markdown notes.

Course identity

  • Course: Mathematical Foundations for Machine Learning
  • Course code: AIML ZC416
  • Main areas: linear algebra, vector spaces, matrix decompositions, vector calculus, optimisation, PCA, and SVM.

Official module structure

| Module | Course handout area | Main ideas | Uploaded lecture coverage |
|---|---|---|---|
| M1 | Solution of linear systems | Systems of equations, matrices, solving Ax = b | Lecture 1, Webinar 1 |
| M2 | Vector spaces and analytic geometry | Vector spaces, linear independence, basis, rank, norms, inner products, angles, orthogonality, orthonormal basis | Lecture 2, Lecture 3, Webinar 1 |
| M3 | Matrix decomposition methods | Determinant, trace, eigenvalues, eigenvectors, Cholesky, eigendecomposition, diagonalisation, SVD, matrix approximation | Lecture 4, Lecture 5, Webinar 1, Webinar 2 |
| M4 | Vector calculus | Univariate differentiation, partial derivatives, gradients, matrix gradients, Taylor/Maclaurin series, Hessian, backpropagation, automatic differentiation | Lecture 6, Lecture 7, Lecture 8, Webinar 2 |
| M5 | Continuous optimisation | Gradient descent, constrained optimisation, Lagrange multipliers, convex optimisation | Lecture 9, Lecture 14, Webinar 2, Webinar 3, Webinar 4 |
| M6 | Nonlinear optimisation | Learning rate, initialisation, SGD, feature preprocessing, local optima, cliffs/valleys, momentum, AdaGrad, RMSProp, Adam | Lecture 10, Lecture 11, Webinar 3 |
| M7 | Dimensionality reduction, PCA, SVM | PCA perspectives, low-rank approximation, high-dimensional PCA, practical PCA, SVM preliminaries, primal/dual SVM, kernels | Lecture 12, Lecture 13, Lecture 14, Lecture 15, Webinar 4 |
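The M1 idea of classifying Ax = b into the three solution types can be sketched by comparing the rank of A with the rank of the augmented matrix [A | b]. The matrices below are made-up examples, not taken from the course materials:

```python
import numpy as np

def classify(A, b):
    """Classify the linear system A x = b by comparing ranks."""
    r_A = np.linalg.matrix_rank(A)
    r_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    if r_A < r_Ab:
        return "no solution"            # b is outside the column space of A
    if r_A == A.shape[1]:
        return "unique solution"        # rank equals the number of unknowns
    return "infinite solutions"         # consistent, with free variables

print(classify(np.array([[1.0, 2.0], [3.0, 4.0]]), np.array([1.0, 1.0])))
print(classify(np.array([[1.0, 2.0], [2.0, 4.0]]), np.array([1.0, 2.0])))
print(classify(np.array([[1.0, 2.0], [2.0, 4.0]]), np.array([1.0, 3.0])))
```

The three calls print `unique solution`, `infinite solutions`, and `no solution` respectively, matching the solution types listed under Session 1.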

Contact session by lecture

| Session | Course handout topic | Uploaded file | What the lecture appears to cover | Exam relevance |
|---|---|---|---|---|
| 1 | Solution of linear systems | Lecture_1.pdf | Linear algebra introduction, closure, systems of linear equations, matrix representation, solution types (no solution, unique solution, infinite solutions), pivot/free variables, matrix operations, inverse, transpose, compact Ax = b form | Very high for Mid-Sem and Comprehensive |
| 2 | Vector spaces, linear independence, basis, rank | Lecture_2.pdf | Groups, Abelian groups, vector spaces, vector subspaces, closure tests, linear combinations, span, linear independence, basis, rank, nullspace/column space ideas | Very high for Mid-Sem and Comprehensive |
| 3 | Analytic geometry | Lecture_3.pdf | Norms, dot product, inner products, bilinear mappings, symmetric positive-definite matrices, lengths, distances, angles, orthogonality, orthonormal basis, Gram-Schmidt ideas | Very high for Mid-Sem and Comprehensive |
| 4 | Matrix Decomposition I | lecture_4.pdf | Determinant, cofactor formula, determinant behaviour under row operations, rank-determinant relation, eigenvalues/eigenvectors, Cholesky-related positive-definite ideas | Very high for Mid-Sem and Comprehensive |
| 5 | Matrix Decomposition II | lecture_5.pdf | Diagonal matrices, diagonalisation, eigendecomposition, spectral theorem for symmetric matrices, SVD, matrix approximation | Very high for Mid-Sem and Comprehensive |
| 6 | Vector Calculus I | lecture_6.pdf | Differentiation of univariate functions, polynomial derivatives, Taylor polynomial/series, partial derivatives, gradients, vector-valued gradients | Very high for Mid-Sem and Comprehensive |
| 7 | Vector Calculus II | lecture_7_edited.pdf | Matrix gradients, useful gradient identities, backpropagation, automatic differentiation, chain rule through neural-network layers | High for Mid-Sem and Comprehensive |
| 8 | Vector Calculus III | lecture_8.pdf | Taylor/Maclaurin series theory, remainder term, two-variable Taylor series, Hessian matrix, maxima/minima, unconstrained optimisation preliminaries | Very high for Mid-Sem and Comprehensive |
| 9 | Continuous Optimisation | Lecture_9.pdf | Gradient descent, negative gradient direction, local minima, step size, line search, convergence intuition, quadratic examples | Very high for Comprehensive; likely useful for quizzes/problems |
| 10 | Nonlinear Optimisation I | Lecture_10.pdf | Initialisation, objective functions in ML, overfitting, feature processing/preprocessing, SGD and practical optimisation behaviour | High for Comprehensive |
| 11 | Nonlinear Optimisation II | Lecture_11.pdf | Difficult topologies (cliffs, valleys, flat regions, curvature); momentum, AdaGrad, RMSProp, Adam | High for Comprehensive |
| 12 | PCA I | Lecture_12.pdf | Dimensionality reduction, PCA problem setting, centred data, covariance, maximum variance perspective, projection perspective | Very high for Comprehensive |
| 13 | PCA II | Lecture_13.pdf | Practical PCA, eigenvector computation, SVD relationship, low-rank approximation, high-dimensional PCA, key PCA steps | Very high for Comprehensive |
| 14 | Mathematical preliminaries for SVM | Lecture 14.pdf | Constrained optimisation, Lagrangian, quadratic programming, primal/dual, weak/strong duality, Slater condition, KKT conditions, kernels, linear classifiers | Very high for Comprehensive |
| 15 | Primal/dual linear SVM | Lecture_15.pdf | SVM primal problem, dual formulation, KKT conditions, support vectors, hinge loss, linear SVM numerical problem, hard/soft-margin direction | Very high for Comprehensive |
| 16 | Nonlinear SVM / kernels | Not clearly uploaded as a separate Lecture 16 PDF | Kernel functions, nonlinear SVM examples; likely partly covered in Lectures 14/15 and webinars | Very high for Comprehensive; gap to fill if Lecture 16 exists |

Webinar mapping

| Webinar file | Main role | Best linked lectures | Exam use |
|---|---|---|---|
| Webinar_1.pdf | Problem sheet on linear systems, REF/RREF, column space, nullspace, row independence, subspaces, inner products, Cauchy-Schwarz, Cholesky, eigenvalues | Lectures 1-5 | Excellent for Mid-Sem problem practice |
| Webinar_2.pdf | Worked problems on maxima/minima, eigenvalues/spectral decomposition, gradient-related calculations, and PCA-style examples | Lectures 4-9, 12-13 | Excellent for Mid-Sem revision and Comprehensive practice |
| Webinar_3.pdf | Gradient descent algorithm, step-size derivation for quadratic functions, worked gradient descent examples | Lectures 8-11 | Excellent for optimisation exam problems |
| webinar_4.pdf | Appears linked to optimisation/SVM/PCA practice based on the uploaded set; use as a problem-solving supplement from Lecture 12 onwards | Lectures 12-15 | Comprehensive exam practice |
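Webinar 3's step-size derivation for quadratic functions can be sanity-checked numerically. For f(x) = ½xᵀAx − bᵀx with symmetric positive-definite A, exact line search along the negative gradient g gives the step α = (gᵀg)/(gᵀAg); the particular A and b below are made-up illustrations, not taken from the webinar:

```python
import numpy as np

# Made-up symmetric positive-definite quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimiser solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = np.zeros(2)

for _ in range(50):
    g = A @ x - b                       # gradient of the quadratic
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = (g @ g) / (g @ (A @ g))     # exact line-search step along -g
    x = x - alpha * g

print(x, np.linalg.solve(A, b))         # the iterate approaches the solver's answer
```

This is steepest descent with exact line search; on a well-conditioned 2x2 system it converges to the solution of Ax = b within a handful of iterations.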

Mid-Sem focus

The course handout states that the Mid-Semester Test covers Weeks 1-8, so for the Mid-Sem focus on:

  1. Lecture 1: Linear systems and matrices
  2. Lecture 2: Vector spaces, subspaces, linear independence, basis, rank
  3. Lecture 3: Norms, inner products, orthogonality, Gram-Schmidt
  4. Lecture 4: Determinant, trace, eigenvalues, eigenvectors, Cholesky
  5. Lecture 5: Diagonalisation, eigendecomposition, SVD, matrix approximation
  6. Lecture 6: Differentiation, partial derivatives, gradients, Taylor series
  7. Lecture 7: Matrix gradients, gradient identities, backpropagation, automatic differentiation
  8. Lecture 8: Taylor theorem, Hessian, maxima/minima, unconstrained optimisation
  9. Webinar 1 and Webinar 2 for worked problem practice
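The eigendecomposition and SVD material from Lectures 4-5 is easy to verify numerically. The symmetric matrix below is a made-up example: the spectral theorem gives A = Q diag(λ) Qᵀ with orthonormal Q, and truncating the SVD yields the best low-rank approximation:

```python
import numpy as np

# Made-up symmetric matrix for illustration.
A = np.array([[2.0, 1.0], [1.0, 2.0]])

# Spectral theorem: eigh returns ascending eigenvalues and orthonormal Q.
lam, Q = np.linalg.eigh(A)
assert np.allclose(Q @ np.diag(lam) @ Q.T, A)

# Best rank-1 approximation keeps only the largest singular value.
U, s, Vt = np.linalg.svd(A)
A1 = s[0] * np.outer(U[:, 0], Vt[0])

print(lam)                       # [1. 3.]
print(np.linalg.norm(A - A1))    # Frobenius error equals the dropped singular value, 1.0
```

The same check generalises: the Frobenius error of a rank-k truncation is the root sum of squares of the discarded singular values, which is the matrix-approximation result covered in Lecture 5.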

Comprehensive exam focus

The course handout states that the Comprehensive Exam covers Sessions 1-16, so for the Comprehensive add:

  1. Lecture 9: Gradient descent and step size
  2. Lecture 10: Nonlinear optimisation, initialisation, feature preprocessing, overfitting, SGD
  3. Lecture 11: Cliffs, valleys, momentum, AdaGrad, RMSProp, Adam
  4. Lecture 12: PCA I
  5. Lecture 13: PCA II
  6. Lecture 14: SVM mathematical preliminaries, Lagrangian, KKT, duality
  7. Lecture 15: Linear SVM, support vectors, hinge loss, primal/dual solution
  8. Missing/gap area: nonlinear SVM and kernels if a separate Lecture 16 exists
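The key PCA steps from Lectures 12-13 (centre the data, form the covariance, take the top eigenvector, project) can be sketched on toy data. Everything below is illustrative and assumed, not drawn from the course materials:

```python
import numpy as np

# Toy data: 100 points, stretched along the first axis so PCA has a clear answer.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

Xc = X - X.mean(axis=0)              # step 1: centre the data
C = Xc.T @ Xc / (len(Xc) - 1)        # step 2: sample covariance matrix
lam, V = np.linalg.eigh(C)           # step 3: eigendecomposition (ascending order)
w = V[:, -1]                         # principal direction = top eigenvector
Z = Xc @ w                           # step 4: project to one dimension

# Lecture 13's SVD relationship: the top right singular vector of the
# centred data matrix is the same direction, up to sign.
_, _, Vt = np.linalg.svd(Xc)
assert np.isclose(abs(w @ Vt[0]), 1.0)
```

The final assertion is the eigendecomposition-vs-SVD link: eigenvectors of the covariance of the centred data coincide (up to sign) with the right singular vectors of the centred data matrix.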

Suggested future Hugo Markdown pages

These are good page boundaries for your website and revision notes:

  1. 01-linear-systems-and-matrices.md
  2. 02-vector-spaces-subspaces-basis-rank.md
  3. 03-analytic-geometry-norms-inner-products.md
  4. 04-determinants-trace-eigenvalues.md
  5. 05-eigendecomposition-svd-matrix-approximation.md
  6. 06-vector-calculus-gradients.md
  7. 07-backpropagation-automatic-differentiation.md
  8. 08-taylor-series-hessian-maxima-minima.md
  9. 09-gradient-descent-continuous-optimisation.md
  10. 10-nonlinear-optimisation-sgd-feature-preprocessing.md
  11. 11-momentum-adagrad-rmsprop-adam.md
  12. 12-pca-foundations.md
  13. 13-pca-practical-computation-svd.md
  14. 14-lagrangian-duality-kkt.md
  15. 15-support-vector-machines.md
  16. 16-nonlinear-svm-kernels.md
  17. 99-mfml-exam-formula-sheet.md
  18. 99-mfml-webinar-problem-bank.md

Important exam-preparation note

For exam preparation, do not only read the lecture slides. Use this sequence:

  1. Read the lecture slides for concept flow.
  2. Create a one-page formula sheet for each lecture.
  3. Work through the matching webinar problems.
  4. Mark every problem by topic: definition, proof, calculation, interpretation, or algorithm.
  5. For the open-book Comprehensive Exam, prepare indexed printed notes rather than loose sheets.