COMP 7070 Advanced Topics in Artificial Intelligence and Machine Learning
Overall, this course is an invitation to core machine learning for AI application research. It aims to familiarize students with useful machine learning concepts that can benefit their own research.
In this statement, machine learning research is implicitly divided into three genres: learning theory, core machine learning, and AI applications.
Detailed course topics
- "Linearity is all you need"
  - Matrix derivatives
  - Basic concepts of numerical analysis
  - Logistic regression
  - Basic concepts of convex optimization (convergence rates of GD and SGD)
- Harness the power of randomness
  - Sub-Gaussian random variables and concentration inequalities
  - Sketching methods
  - Error decomposition (approximation, optimization, and generalization) and concentration bounds for the generalization error
- Advanced topics
  - Kernel tricks, RKHS, and neural tangent kernels
  - Generative modeling (MLE - EM algorithm - VI - VAE - GAN/WGAN - flow models - diffusion models)
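As a minimal illustration of two of the topics above (logistic regression and gradient descent), the sketch below fits a softmax/logistic classifier on toy data with full-batch GD; the data, learning rate, and iteration count are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs in R^2, labels in {0, 1}.
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)       # no bias term; the blobs are separable through the origin
lr = 0.1
for _ in range(500):
    # Gradient of the average logistic loss; GD on this smooth convex
    # objective converges at the rates discussed in the course.
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)
    w -= lr * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
```

Replacing the full-batch gradient with the gradient on a random mini-batch turns this into SGD, whose convergence rate is one of the listed topics.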
Note that the detailed lecture schedule may change as the semester progresses, based on student interest.
Scribed lecture notes
- Lecture 1: Preliminaries
- Lecture 2: Numerical Analysis (matrix derivatives and basics of numerical analysis, such as condition number, operator norm, SVD, etc.)
- Lecture 3: Regularization and Logistic Regression (a thorough revisit of the softmax classifier)
- Lecture 4: Convex Optimization and the Convergence Rate of GD/SGD
- Lecture 5: Concentration Inequalities and Sub-Gaussian Random Variables
- Lecture 6: JL Lemma and Sketching Methods
- Lecture 7: Generalization Error
- Lecture 8: Kernel Tricks and RKHS
- Lecture 9: VAE
- Lecture 10: GAN, WGAN, and Duality
- Lecture 11: Flow Models and Neural ODE
- Lecture 12: From VAE to Diffusion Models
- Lecture 13: Variants and Advances of Diffusion Models
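As a small taste of the sketching material (Lecture 6), the snippet below applies a Johnson-Lindenstrauss-style random projection and checks that a pairwise distance is approximately preserved; the dimensions and the scaled-Gaussian sketch are illustrative choices, not the specific construction used in the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)

n, d, k = 20, 1000, 300          # n points in R^d, projected down to R^k
X = rng.normal(size=(n, d))

# JL-style sketch: a Gaussian random matrix scaled by 1/sqrt(k), so that
# squared distances are preserved in expectation.
S = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ S

# Compare one pairwise distance before and after the projection.
before = np.linalg.norm(X[0] - X[1])
after = np.linalg.norm(Y[0] - Y[1])
ratio = after / before           # concentrates near 1 for moderate k
```

The JL lemma makes this quantitative: k = O(log n / eps^2) dimensions suffice to preserve all pairwise distances up to a (1 ± eps) factor.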