Workshop on Invertible Neural Nets and Normalizing Flows

Schedule

  9:30 - 10:30 Eric Jang Normalizing Flows: A Tutorial

10:30 - 10:50 Poster Spotlights

10:50 - 11:30 Coffee break and poster session I

11:30 - 11:50 Laurent Dinh Invited Talk: Building a Tractable Generator Network

11:50 - 12:10 Diederik Kingma & Prafulla Dhariwal Invited Talk: Glow: Generative Flow with Invertible 1x1 Convolutions

12:10 - 12:30 Conor Durkan Contributed Talk: Cubic-Spline Flows

12:30 - 14:00 Lunch

14:00 - 14:20 Jakub Tomczak Invited Talk: Householder meets Sylvester: Normalizing Flows for Variational Inference

14:20 - 14:40 Jesse Bettencourt Invited Talk: Neural Ordinary Differential Equations for Continuous Normalizing Flows

14:40 - 15:00 Ricky T. Q. Chen Contributed Talk: Residual Flows: Unbiased Generative Modeling with Norm-Learned i-ResNets

15:00 - 16:00 Coffee break and poster session II

16:00 - 16:20 Matt Hoffman Invited Talk: The Bijector API: An Invertible Function Library for TensorFlow

16:20 - 16:40 Jörn-Henrik Jacobsen Invited Talk: Invertible Neural Networks for Understanding and Controlling Learned Representations

16:40 - 17:00 Manoj Kumar Contributed Talk: VideoFlow: A Flow-Based Generative Model for Video

17:00 - 18:00 Panel Session
Panelists:
Moderator: David Krueger