Workshop on Invertible Neural Nets and Normalizing Flows

INNF 2019 Accepted Papers

  1. Block Neural Autoregressive Flow
  2. Sum-of-Squares Polynomial Flows
    Priyank Jaini, Kira Selby and Yaoliang Yu.
  3. Inverting Deep Generative Models, One Layer at a Time
    Qi Lei, Ajil Jalal, Inderjit Dhillon and Alexandros Dimakis.
  4. Normalizing flows for novelty detection in industrial time series data
  5. Embarrassingly Parallel MCMC using Real NVP
    Diego Mesquita, Paul Blomstedt and Samuel Kaski.
  6. Benchmarking Invertible Architectures on Inverse Problems
    Jakob Kruse, Lynton Ardizzone and Ullrich Köthe.
  7. Residual Flows: Unbiased Generative Modeling with Norm-Learned i-ResNets
  8. PRECOG: PREdictions Conditioned On Goals in Visual Multi-Agent Scenarios
    Nicholas Rhinehart, Rowan McAllister, Kris Kitani and Sergey Levine.
  9. Improving Exploration in Soft-Actor-Critic with Normalizing Flows Policies
    Patrick Nadeem Ward, Ariella Smofsky and Avishek Joey Bose.
  10. JacNet: Learning Functions with Structured Jacobian
    Safwan Hossain and Jonathan Lorraine.
  11. Boosting Trust Region Policy Optimization with Normalizing Flows Policy
    Yunhao Tang and Shipra Agrawal.
  12. Optimal Domain Translation
    Emmanuel de Bézenac, Ibrahim Ayed and Patrick Gallinari.
  13. Structured Output Learning with Conditional Generative Flows
  14. Cubic-Spline Flows
  15. Information Theory in Density Destructors
  16. Adversarial training of partially invertible variational autoencoders
    Thomas Lucas, Konstantin Shmelkov, Kartheek Alahari, Cordelia Schmid and Jakob Verbeek.
  17. VideoFlow: A Flow-Based Generative Model for Video
    Manoj Kumar, Mohammad Babaeizadeh, Dumitru Erhan, Chelsea Finn, Sergey Levine, Laurent Dinh and Durk Kingma.
  18. Neural Importance Sampling
  19. Semi-Conditional Normalizing Flows for Semi-Supervised Learning
    Andrei Atanov, Alexandra Volokhova, Arsenii Ashukha, Ivan Sosnovik and Dmitry Vetrov.
  20. Symmetric Convolutional Flow
  21. Approximating exponential family models (not single distributions) with a two-network architecture
    Sean Bittner and John Cunningham.
  22. Covering up bias with Markov blankets: A post-hoc cure for attribute prior blindness
    Vinay Prabhu, Dian Ang Yap and Alexandar Wang.
  23. Neural Networks with Cheap Differential Operators
  24. AlignFlow: Learning from multiple domains via normalizing flows
    Aditya Grover, Christopher Chute, Rui Shu, Zhangjie Cao and Stefano Ermon.
  25. Invertible ConvNets
    Marc Finzi, Pavel Izmailov, Wesley Maddox, Polina Kirichenko and Andrew Wilson.
  26. On Mixed Conditional FFJORD with Large-Batch Training
    Tan Nguyen, Animesh Garg, Anjul Patney, Richard Baraniuk and Anima Anandkumar.
  27. Semi-Supervised Learning with Normalizing Flows
    Pavel Izmailov, Polina Kirichenko, Marc Finzi and Andrew Wilson.
  28. Investigating the Impact of Normalizing Flows on Latent Variable Machine Translation
    Michael Przystupa, Mark Schmidt and Muhammad Abdul-Mageed.
  29. Improving Normalizing Flows via Better Orthogonal Parameterizations
    Adam Golinski, Mario Lezcano-Casado and Tom Rainforth.
  30. MintNet: Building Invertible Neural Networks with Masked Convolutions
    Yang Song, Chenlin Meng and Stefano Ermon.
  31. Learning Generative Samplers using Relaxed Injective Flow
    Abhishek Kumar, Ben Poole and Kevin Murphy.