Generative modelling
In the wake of models like DALL-E and ChatGPT, generative models have had a massive impact on text and image applications. The goal of this class is to present their mathematical and algorithmic foundations.
Teachers:
- Quentin Bertrand (CR Inria, MALICE team)
- RĂ©mi Emonet (IUF, Associate professor, Jean Monnet University, MALICE Team)
- Mathurin Massias (CR Inria, OCKHAM team)
Syllabus
Generative modelling for images:
- Normalizing flows, Continuous normalizing flows
- Flow matching
- Diffusion models (DDPM, DDIM, Stochastic differential equations), latent diffusion
- GANs
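To give a flavor of the first topic, here is a minimal, illustrative sketch (not course material) of the idea behind normalizing flows: a simple invertible map applied to a Gaussian base distribution, with the log-density of the transformed variable obtained via the change-of-variable formula. The affine map and its parameters are chosen arbitrarily for illustration.

```python
import numpy as np

# Illustrative 1-D "normalizing flow": x = a * z + b, with z ~ N(0, 1).
# Real flows stack many learnable invertible maps; an affine map is the
# simplest possible example.
a, b = 2.0, 1.0

def log_prob_x(x):
    # Change-of-variable formula:
    # log p_X(x) = log p_Z(f^{-1}(x)) - log |det J_f(f^{-1}(x))|,
    # which for x = a*z + b reduces to log p_Z((x-b)/a) - log|a|.
    z = (x - b) / a
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))  # standard normal log-density
    return log_pz - np.log(abs(a))
```

Here `log_prob_x` coincides with the log-density of a Gaussian with mean `b` and standard deviation `|a|`, which is exactly what the flow produces.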
Models for text:
- Autoregressive models, text-to-text models
- Transformers (the architecture behind ChatGPT)
- Mamba and state-space models
- Text-to-image, conditional generation
Additional topics:
- Variational inference
- Sampling as optimization/energy-based model
- Evaluation of Generative Models
Validation
- 50%: weekly homework + 3 labs in Python
- 50%: presentation and extension of a selected research article, with the associated code applied to real data.
Prerequisites
- probability (densities, change-of-variable formula)
- linear algebra (PSD matrices, eigenvalue decomposition, spectral theorem)
- calculus (gradient, Hessian, Jacobian, chain rule, ordinary differential equations)
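As a pointer to the probability prerequisite, the change-of-variable formula for densities (central to normalizing flows) can be stated as follows: if $X = f(Z)$ with $f$ a diffeomorphism and $Z$ has density $p_Z$, then

```latex
p_X(x) = p_Z\big(f^{-1}(x)\big)\,\left|\det J_{f^{-1}}(x)\right|
```

where $J_{f^{-1}}$ denotes the Jacobian of the inverse map.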