Generative modelling
In the wake of models like DALL-E and ChatGPT, generative models have had a massive impact on text and image applications. The goal of this class is to present their mathematical and algorithmic foundations.
Teachers:
- Quentin Bertrand (CR Inria, MALICE team)
- Rémi Emonet (IUF, Associate professor, Jean Monnet University, MALICE Team)
- Mathurin Massias (CR Inria, OCKHAM team)
Tentative program
- 10/09 Class cancelled
- 11/09 Introduction to GenAI, MLE, Bayes + base models, Gaussian mixtures, EM (notes by Y. Dziki)
- 17/09 Maximum a posteriori vs. maximum likelihood, PCA, PPCA, VAE (notes by M. Ottavy)
- at home: Lab 1: simple generative models (PCA, Gaussian mixtures, pretrained models, VAE)
- 18/09 [postponed due to ENS being closed] GAN/WGAN
- 24/09 Flow matching (notes by H. Martel, blog post on Flow Matching)
- 25/09 Lab on Flow Matching
- 01/10 GANs (material 1, material 2, material 3)
- 02/10 TBD
- 08/10 Diffusion + link with flow matching
- 09/10 Project progress evaluation
- 15/10 Conditional Generative Models
- 16/10 Intro to sequence modeling: tokenizers, base models, autoregressive models, bigrams
- 22/10 Attention, transformers
- 23/10 Evaluation metrics (optimal transport + FID)
- 12/11 Project defense
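As a taste of the material covered in Lab 1 (simple generative models), here is a minimal, illustrative sketch of sampling from a 1-D Gaussian mixture with NumPy; the function name and parameters are ours, not the lab's actual code.

```python
import numpy as np

def sample_gmm(n, weights, means, stds, rng=None):
    """Draw n samples from a 1-D Gaussian mixture (illustrative sketch).

    weights, means, stds: per-component mixture weights, means and
    standard deviations (weights must sum to 1).
    """
    rng = np.random.default_rng(rng)
    # Pick a mixture component for each sample according to the weights
    comps = rng.choice(len(weights), size=n, p=weights)
    # Sample from the Gaussian of the chosen component
    return rng.normal(loc=np.asarray(means)[comps],
                      scale=np.asarray(stds)[comps])

samples = sample_gmm(1000, weights=[0.3, 0.7],
                     means=[-2.0, 3.0], stds=[0.5, 1.0], rng=0)
```

Fitting such a mixture from data (rather than sampling a known one) is exactly what the EM algorithm from the 11/09 class does.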
Material
- Work in progress: site for the class
Validation
- weekly homework + quizzes + 3 labs in Python
- presentation and extension of a selected research article, with the associated code applied to real data
Prerequisites
- probability (densities, change-of-variables formula)
- linear algebra (PSD matrices, eigenvalue decomposition, spectral theorem)
- calculus (gradient, Hessian, Jacobian, chain rule, ordinary differential equations)
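For reference, the change-of-variables formula listed above is the one underlying flows and flow matching: if $z \sim p_Z$ and $x = f(z)$ for a diffeomorphism $f$, the density of $x$ is

```latex
p_X(x) = p_Z\big(f^{-1}(x)\big)\,\left|\det J_{f^{-1}}(x)\right|,
```

where $J_{f^{-1}}$ denotes the Jacobian of the inverse map.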