COMP 7070 Advanced Topics in Artificial Intelligence and Machine Learning

Overall, this course is an invitation to core machine learning for AI application research. It aims to familiarize students with useful machine learning concepts that can benefit their own research.

In this statement, studies related to machine learning are implicitly divided into three genres: learning theory, core machine learning, and AI applications.

Detailed course topics

  • β€œLinearity is all you need”
    • Matrix derivative
    • Basic concepts of numerical analysis
    • Logistic regression
    • Basic concepts of convex optimization (convergence rate of GD and SGD)
  • Harness the power of randomness
    • Sub-Gaussian random variables and concentration inequalities
    • Sketching methods
    • Error decomposition (approximation, optimization, and generalization) and concentration bound for generalization error
  • Advanced topics
    • Kernel tricks, RKHS, and neural tangent kernels
    • Generative modeling (MLE - EM algorithm - VI - VAE - GAN/WGAN - Flow - Diffusion)
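As a taste of the "linearity is all you need" toolkit, the sketch below (illustrative only; the data and step size are assumed, not taken from the course materials) trains binary logistic regression with plain gradient descent. The matrix-derivative form of the gradient, Xα΅€(Οƒ(Xw) βˆ’ y)/n, and the monotone decrease of the convex loss are exactly the ingredients covered in the first block of topics.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    # average cross-entropy; the small epsilon guards against log(0)
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

w = np.zeros(d)
lr = 0.5  # below 2/L for this smooth convex objective, so GD descends
losses = [loss(w)]
for _ in range(200):
    grad = X.T @ (sigmoid(X @ w) - y) / n  # matrix-derivative form of the gradient
    w -= lr * grad
    losses.append(loss(w))

print(f"initial loss {losses[0]:.4f}, final loss {losses[-1]:.4f}")
```

Plotting `losses` against the iteration count is a quick way to see the O(1/t) convergence behavior of GD that the course analyzes formally.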

Note that the detailed lecture schedule may change as the semester progresses, based on student interest.

Scribed lecture notes

  • Lecture 1: Preliminaries
  • Lecture 2: Numerical Analysis (matrix derivatives and basics of numerical analysis, such as condition number, operator norm, SVD, etc.)
  • Lecture 3: Regularization and Logistic Regression (a thorough revisit of the softmax classifier)
  • Lecture 4: Convex Optimization and the Convergence Rate of GD/SGD
  • Lecture 5: Concentration Inequalities and Sub-Gaussian Random Variables
  • Lecture 6: The JL Lemma and Sketching Methods
  • Lecture 7: Generalization Error
  • Lecture 8: Kernel Tricks and RKHS
  • Lecture 9: VAE
  • Lecture 10: GAN, WGAN, and Duality
  • Lecture 11: Flow Models and Neural ODEs
  • Lecture 12: From VAE to Diffusion Models
  • Lecture 13: Variants and Advances of Diffusion Models
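To illustrate the "power of randomness" theme of Lecture 6, the sketch below (an assumed toy setup, not the course's notation) applies the Johnson-Lindenstrauss idea: a Gaussian sketching matrix scaled by 1/√k maps 1000-dimensional points down to k = 400 dimensions while approximately preserving all pairwise distances with high probability.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 50, 1000, 400
X = rng.normal(size=(n, d))  # n points in high dimension

S = rng.normal(size=(d, k)) / np.sqrt(k)  # Gaussian sketching matrix
Y = X @ S                                  # sketched points in R^k

# compare one pairwise distance before and after sketching
orig = np.linalg.norm(X[0] - X[1])
sketched = np.linalg.norm(Y[0] - Y[1])
print(f"distance ratio after sketching: {sketched / orig:.3f}")
```

Checking the ratio over all pairs shows the (1 Β± Ξ΅) distortion the JL lemma promises, with Ξ΅ shrinking as k grows; this is the same mechanism that makes sketching methods useful for speeding up large-scale linear algebra.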