Glow: Generative Flow with Invertible 1x1 Convolutions.

Flow-based generative models (Dinh et al., 2014) are conceptually attractive due to the tractability of exact log-likelihood evaluation, the tractability of exact latent-variable inference, and the parallelizability of both training and synthesis. In this paper we propose Glow, a simple type of generative flow using an invertible 1x1 convolution. Using our method we demonstrate a significant improvement in log-likelihood on standard benchmarks. Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient, realistic-looking synthesis and manipulation of large images. The code for our model is available at this https URL
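The invertible 1x1 convolution the abstract refers to can be read as a learned, per-pixel linear map over channels whose log-determinant enters the flow's log-likelihood. The following is a minimal NumPy sketch under that reading; the function names and the orthogonal (rotation) initialization of the weight matrix are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

def invertible_1x1_conv(x, W):
    """Forward pass: a 1x1 convolution is a shared linear map over channels.

    x: array of shape (H, W, C); W: invertible (C, C) weight matrix.
    Returns the transformed tensor and the log-det contribution to the
    flow's log-likelihood, which is H * W * log|det W|.
    """
    h, w, c = x.shape
    y = x @ W.T
    logdet = h * w * np.log(abs(np.linalg.det(W)))
    return y, logdet

def inverse_1x1_conv(y, W):
    """Exact inverse: apply the inverse weight matrix channel-wise."""
    return y @ np.linalg.inv(W).T

# Illustrative usage: initialize W as a random rotation matrix, so the
# initial log-det contribution is zero (|det W| = 1 for orthogonal W).
rng = np.random.default_rng(0)
c = 4
W, _ = np.linalg.qr(rng.normal(size=(c, c)))
x = rng.normal(size=(8, 8, c))
y, logdet = invertible_1x1_conv(x, W)
x_rec = inverse_1x1_conv(y, W)
```

Because the map is exactly invertible and its Jacobian log-determinant is cheap to evaluate, both synthesis (inverse pass) and likelihood computation (forward pass) stay tractable, which is the property the abstract highlights.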

References in zbMATH (referenced in 11 articles)


  1. Hagemann, Paul; Neumayer, Sebastian: Stabilizing invertible neural networks using mixture models (2021)
  2. Huang, Wen; Hand, Paul; Heckel, Reinhard; Voroninski, Vladislav: A provably convergent scheme for compressive sensing under random generative priors (2021)
  3. Papamakarios, George; Nalisnick, Eric; Rezende, Danilo Jimenez; Mohamed, Shakir; Lakshminarayanan, Balaji: Normalizing flows for probabilistic modeling and inference (2021)
  4. Paul, William; Wang, I-Jeng; Alajaji, Fady; Burlina, Philippe: Unsupervised discovery, control, and disentanglement of semantic attributes with applications to anomaly detection (2021)
  5. Arenz, Oleg; Zhong, Mingjun; Neumann, Gerhard: Trust-region variational inference with Gaussian mixture models (2020)
  6. Brehmer, Johann; Louppe, Gilles; Pavez, Juan; Cranmer, Kyle: Mining gold from implicit models to improve likelihood-free inference (2020)
  7. Drori, Iddo: Deep variational inference (2020)
  8. Mosser, Lukas; Dubrule, Olivier; Blunt, Martin J.: Stochastic seismic waveform inversion using generative adversarial networks as a geological prior (2020)
  9. Durkan, Conor; Bekasov, Artur; Murray, Iain; Papamakarios, George: Neural Spline Flows (2019) arXiv
  10. Yang, Yibo; Perdikaris, Paris: Adversarial uncertainty quantification in physics-informed neural networks (2019)
  11. Zhu, Yinhao; Zabaras, Nicholas; Koutsourelakis, Phaedon-Stelios; Perdikaris, Paris: Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data (2019)