Neural ODE and Generative Modelling (Master Seminar)

Description of the seminar.

Score-based generative models have demonstrated state-of-the-art performance in numerous applications in recent years. The central idea behind these models is to gradually inject noise into the training data and then learn to reverse this process in order to generate new samples. The training and sampling procedures can be carried out independently: the learning phase relies on noise-conditional score networks, while sampling can be accomplished through various methods, including Langevin Monte Carlo approaches, stochastic differential equations, ordinary differential equations, and combinations thereof.
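
To make the sampling side concrete, the following sketch runs unadjusted Langevin dynamics on a target whose score is known in closed form (a standard Gaussian, with score -x). In an actual score-based model, this analytic score would be replaced by the output of a trained noise-conditional score network; the function names, step size, and chain length here are illustrative assumptions, not a reference implementation.

import numpy as np

# Hypothetical illustration: Langevin Monte Carlo targeting a standard
# Gaussian, whose score (gradient of the log-density) is simply -x.
def score(x):
    return -x

def langevin_sample(x0, step_size=1e-2, n_steps=1000, rng=None):
    # Unadjusted Langevin update: x <- x + (eps/2) * score(x) + sqrt(eps) * z
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        z = rng.standard_normal(x.shape)
        x = x + 0.5 * step_size * score(x) + np.sqrt(step_size) * z
    return x

# Even starting far from the mode, the chains drift toward high-density
# regions; the resulting samples are approximately N(0, I).
samples = np.stack([langevin_sample(np.full(2, 5.0), rng=np.random.default_rng(s))
                    for s in range(500)])
print(samples.mean(axis=0), samples.std(axis=0))  # roughly 0 mean, unit std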

In this seminar, we will begin by reviewing generative models and examining the most common architectures used in current research. Special attention will be given to comparing the different objective functions needed for training, as well as the different sampling procedures. We will explore invertible neural networks, with a particular focus on normalizing flows and continuous normalizing flows. Subsequently, we will address score matching and Langevin dynamics for score-based generative models. This includes an explanation of how Langevin dynamics can generate samples using only estimated scores, a derivation of noise-conditional score networks, and a detailed account of the training process. Additionally, we will study a generalization to infinitely many noise levels, in which the perturbation process is described by stochastic differential equations. This formulation, known as score SDEs, leverages SDEs both for noise perturbation and for sample generation. The seminar will conclude with a comparison to other diffusion models and a discussion of further enhancements to sample generation.
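
As a first taste of the training process discussed above, the sketch below trains a tiny score network with denoising score matching at a single, fixed noise level sigma. It is a minimal illustration assuming 2-d toy data; the noise-conditional score networks covered in the seminar additionally condition on sigma and train across a whole sequence of noise levels. The architecture and hyperparameters are arbitrary assumptions.

import torch
import torch.nn as nn

# Minimal denoising score matching sketch (single noise level; assumed
# hyperparameters). A real noise-conditional score network would also
# take sigma as an input and be trained over many noise levels.
sigma = 0.5
net = nn.Sequential(nn.Linear(2, 64), nn.SiLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def dsm_loss(x):
    # Perturb the data with Gaussian noise and regress the network onto
    # the score of the perturbation kernel: -(x_tilde - x) / sigma^2,
    # which simplifies to -noise / sigma.
    noise = torch.randn_like(x)
    x_tilde = x + sigma * noise
    target = -noise / sigma
    return ((net(x_tilde) - target) ** 2).sum(dim=1).mean()

for step in range(2000):
    # Toy data: a correlated 2-d Gaussian standing in for the data set.
    x = torch.randn(128, 2) @ torch.tensor([[1.0, 0.8], [0.0, 0.6]])
    loss = dsm_loss(x)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final DSM loss: {loss.item():.4f}")

After training, net(x) approximates the score of the sigma-smoothed data distribution and could, in principle, be plugged into the Langevin sampler sketched earlier in place of the analytic score.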

The seminar is scheduled for the second half of the winter term.

Organization

  • Prerequisites: Basic knowledge in probability theory and statistics
  • Registration: Via email to Jonathan Schwarz
  • First (organizational) meeting: Friday, 26 April at 14:00 c.t.
  • Time and Location: Friday 14:00 c.t. (The location will be announced soon)

Further information on the seminar will be announced in the first organizational meeting. For specific questions, you can contact Jonathan Schwarz.

Literature

  • An introduction to deep generative modeling, Ruthotto, Lars and Haber, Eldad, GAMM-Mitteilungen, Wiley Online Library (2021)
  • A conceptual introduction to Markov chain Monte Carlo methods, Speagle, Joshua S., arXiv preprint (2019)
  • Neural ordinary differential equations, Chen, Ricky TQ and Rubanova, Yulia and Bettencourt, Jesse and Duvenaud, David K, NeurIPS (2018)
  • FFJORD: Free-form continuous dynamics for scalable reversible generative models, Grathwohl, Will and Chen, Ricky TQ and Bettencourt, Jesse and Sutskever, Ilya and Duvenaud, David, arXiv preprint (2018)
  • Diffusion models: A comprehensive survey of methods and applications, Yang, Ling and Zhang, Zhilong and Song, Yang and Hong, Shenda and Xu, Runsheng and Zhao, Yue and Shao, Yingxia and Zhang, Wentao and Cui, Bin and Yang, Ming-Hsuan, arXiv preprint (2022)
  • Applied stochastic differential equations, Särkkä, Simo and Solin, Arno, Cambridge University Press (2019)
  • Generative modeling by estimating gradients of the data distribution, Song, Yang and Ermon, Stefano, NeurIPS (2019)
  • Improved techniques for training score-based generative models, Song, Yang and Ermon, Stefano, NeurIPS (2020)
  • Score-based generative modeling through stochastic differential equations, Song, Yang and Sohl-Dickstein, Jascha and Kingma, Diederik P and Kumar, Abhishek and Ermon, Stefano and Poole, Ben, ICLR (2021)
  • Gotta go fast when generating data with score-based models, Jolicoeur-Martineau, Alexia and Li, Ke and Piché-Taillefer, Rémi and Kachman, Tal and Mitliagkas, Ioannis, arXiv preprint (2021)