Glow

Glow: Generative Flow with Invertible 1×1 Convolutions. Flow-based generative models (Dinh et al., 2014) are conceptually attractive due to the tractability of exact log-likelihood evaluation, the tractability of exact latent-variable inference, and the parallelizability of both training and synthesis. In this paper we propose Glow, a simple type of generative flow using an invertible 1×1 convolution. Using our method we demonstrate a significant improvement in log-likelihood on standard benchmarks. Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient, realistic-looking synthesis and manipulation of large images. The code for our model is available at this https URL.
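The core building block is easy to state concretely: for an activation tensor of shape h × w × c, a learned c × c weight matrix W applied identically at every spatial position acts as a 1×1 convolution, and by the change-of-variables formula it contributes h · w · log|det W| to the log-likelihood. Below is a minimal NumPy sketch of this idea; the function names (invertible_1x1_conv, invert) and the toy shapes are illustrative choices, not taken from the paper's released code.

```python
import numpy as np

def invertible_1x1_conv(x, W):
    """Apply a 1x1 convolution with weight matrix W (c x c) to x (h, w, c).

    Returns the transformed tensor and the log-determinant term that the
    change-of-variables formula adds to the log-likelihood:
        h * w * log|det W|
    """
    h, w, c = x.shape
    z = x @ W.T  # per-pixel linear map across channels: z[i, j] = W @ x[i, j]
    logdet = h * w * np.linalg.slogdet(W)[1]
    return z, logdet

def invert(z, W):
    """Exact inverse: apply W^{-1} per pixel."""
    return z @ np.linalg.inv(W).T

# As in the paper, W is initialized as a random rotation (orthogonal) matrix,
# so it starts out invertible with log|det W| = 0.
rng = np.random.default_rng(0)
c = 8
W = np.linalg.qr(rng.normal(size=(c, c)))[0]

x = rng.normal(size=(16, 16, c))
z, logdet = invertible_1x1_conv(x, W)
x_rec = invert(z, W)
assert np.allclose(x, x_rec)  # the layer is exactly invertible
```

In the full model, the logdet terms of all layers are summed with the log-density of the latent prior to give the exact log-likelihood being optimized.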


References in zbMATH (referenced in 22 articles)

Showing results 1 to 20 of 22, sorted by year (citations).

  1. Grementieri, Luca; Fioresi, Rita: Model-centric data manifold: the data through the eyes of the model (2022)
  2. Guo, Ling; Wu, Hao; Zhou, Tao: Normalizing field flows: solving forward and inverse stochastic differential equations using physics-informed flow models (2022)
  3. Tang, Kejun; Wan, Xiaoliang; Liao, Qifeng: Adaptive deep density approximation for Fokker-Planck equations (2022)
  4. Tekalp, A. Murat: Deep learning for image/video restoration and super-resolution (2022)
  5. Wang, Yu; Liu, Fang; Schiavazzi, Daniele E.: Variational inference with NoFAS: normalizing flow with adaptive surrogate for computationally expensive models (2022)
  6. Wan, Xiaoliang; Wei, Shuangqing: VAE-KRnet and its applications to variational Bayes (2022)
  7. Celledoni, E.; Ehrhardt, M. J.; Etmann, C.; McLachlan, R. I.; Owren, B.; Schönlieb, C.-B.; Sherry, F.: Structure-preserving deep learning (2021)
  8. Hagemann, Paul; Neumayer, Sebastian: Stabilizing invertible neural networks using mixture models (2021)
  9. Huang, Wen; Hand, Paul; Heckel, Reinhard; Voroninski, Vladislav: A provably convergent scheme for compressive sensing under random generative priors (2021)
  10. Padmanabha, Govinda Anantha; Zabaras, Nicholas: Solving inverse problems using conditional invertible neural networks (2021)
  11. Papamakarios, George; Nalisnick, Eric; Rezende, Danilo Jimenez; Mohamed, Shakir; Lakshminarayanan, Balaji: Normalizing flows for probabilistic modeling and inference (2021)
  12. Paul, William; Wang, I-Jeng; Alajaji, Fady; Burlina, Philippe: Unsupervised discovery, control, and disentanglement of semantic attributes with applications to anomaly detection (2021)
  13. Wang, Ruhua; An, Senjian; Liu, Wanquan; Li, Ling: Fixed-point algorithms for inverse of residual rectifier neural networks (2021)
  14. Karnewar, Animesh; Wang, Oliver: MSG-GAN: Multi-Scale Gradients for Generative Adversarial Networks (2020) arXiv
  15. Arenz, Oleg; Zhong, Mingjun; Neumann, Gerhard: Trust-region variational inference with Gaussian mixture models (2020)
  16. Brehmer, Johann; Louppe, Gilles; Pavez, Juan; Cranmer, Kyle: Mining gold from implicit models to improve likelihood-free inference (2020)
  17. Drori, Iddo: Deep variational inference (2020)
  18. Mosser, Lukas; Dubrule, Olivier; Blunt, Martin J.: Stochastic seismic waveform inversion using generative adversarial networks as a geological prior (2020)
  19. Wan, Xiaoliang; Wei, Shuangqing: Coupling the reduced-order model and the generative model for an importance sampling estimator (2020)
  20. Durkan, Conor; Bekasov, Artur; Murray, Iain; Papamakarios, George: Neural Spline Flows (2019) arXiv
