Workshop on Invertible Neural Nets and Normalizing Flows

Submission Details

We invite researchers to submit their recent work on the development, analysis, or application of invertible neural networks and normalizing flows. Submissions should take the form of an extended abstract of 4 pages in PDF format using the ICML style. Submissions longer than 4 pages are allowed, but reviewers are not expected to read beyond the fourth page. Author names do not need to be anonymized. Submissions may include a supplement/appendix, but reviewers are not responsible for reading any supplementary material. Submissions that are currently under review or that have recently been accepted for publication at another venue are permitted. Potential topics include but are not limited to:

  • Proposing new invertible transformations to improve expressiveness and tractability.
  • Introducing different training criteria for invertible functions.
  • Studying the information regularization of neural networks.
  • Theoretical work on the optimization and/or expressivity of invertible networks.
  • Regularizing for invertibility and solving inverse problems.
  • Generalizing and understanding the change of variables theorem.
  • Applying normalizing flows for exact or approximate inference.
  • Normalizing flows with discrete distributions.
  • Improving scalability of continuous normalizing flows.
  • Hierarchical reinforcement learning.
  • Exploration via randomized value functions in RL.
  • Continuous relaxations of discrete latent variables.
  • Probabilistic programming.

Please submit here. The submission deadline has been extended to May 1st, 23:59 Anywhere on Earth (AoE).
Submissions will be accepted as poster presentations. Selected submissions will also be considered for contributed talks.

If you would like a complimentary registration for ICML, please inform the organizers by April 30th. These are reserved for attendees with financial need; please request one only if you are unable to fund your conference expenses by other means.

Questions? Contact us at