NICE: Non-linear Independent Components Estimation

We propose a deep learning framework for modeling complex high-dimensional densities called Non-linear Independent Components Estimation (NICE). It is based on the idea that a good representation is one in which the data has a distribution that is easy to model. For this purpose, a non-linear deterministic transformation of the data is learned that maps it to a latent space so as to make the transformed data conform to a factorized distribution, i.e., resulting in independent latent variables. We parametrize this transformation so that computing the Jacobian determinant and inverse transform is trivial, yet we maintain the ability to learn complex non-linear transformations via a composition of simple building blocks, each based on a deep neural network. The training criterion is simply the exact log-likelihood, which is tractable. Unbiased ancestral sampling is also easy. We show that this approach yields good generative models on four image datasets and can be used for inpainting.
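
As a concrete illustration of the building block described above, here is a minimal NumPy sketch of an additive coupling layer, the kind of trivially invertible transformation NICE composes (the dimensions, weights, and the toy coupling network m below are illustrative assumptions, not taken from the paper). One half of the input passes through unchanged while the other half is shifted by a function of the first; the Jacobian is triangular with unit diagonal, so its determinant is 1 and inversion only requires subtracting the same shift:

    import numpy as np

    rng = np.random.default_rng(0)

    d, h = 4, 16  # toy data dimension (split in half) and hidden width

    # Illustrative one-hidden-layer coupling network m(.). It is never
    # inverted itself; the inverse pass simply subtracts its output.
    W1, b1 = rng.normal(size=(d // 2, h)), np.zeros(h)
    W2, b2 = rng.normal(size=(h, d // 2)), np.zeros(d // 2)

    def m(x1):
        return np.tanh(x1 @ W1 + b1) @ W2 + b2

    def coupling_forward(x):
        # Additive coupling: y1 = x1, y2 = x2 + m(x1).
        x1, x2 = x[:, : d // 2], x[:, d // 2 :]
        return np.concatenate([x1, x2 + m(x1)], axis=1)

    def coupling_inverse(y):
        # Exact inverse: x1 = y1, x2 = y2 - m(y1); the Jacobian
        # determinant of the layer is 1 (volume-preserving).
        y1, y2 = y[:, : d // 2], y[:, d // 2 :]
        return np.concatenate([y1, y2 - m(y1)], axis=1)

    x = rng.normal(size=(3, d))
    assert np.allclose(coupling_inverse(coupling_forward(x)), x)

In the full model, several such layers are stacked with the roles of the two halves alternating, followed by a diagonal scaling layer, so that the exact log-likelihood under the change-of-variables formula, log p_X(x) = sum_i log p_H(f(x)_i) + log |det(df(x)/dx)|, stays cheap to evaluate.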


References in zbMATH (referenced in 24 articles)

Showing results 1 to 20 of 24, sorted by year (citations).


  1. Guo, Ling; Wu, Hao; Zhou, Tao: Normalizing field flows: solving forward and inverse stochastic differential equations using physics-informed flow models (2022)
  2. Liang, Zhangyong; Gao, Huanhuan; Li, Tingting: SEM: a shallow energy method for finite deformation hyperelasticity problems (2022)
  3. Marino, Joseph: Predictive coding, variational autoencoders, and biological connections (2022)
  4. Marino, Joseph; Chen, Lei; He, Jiawei; Mandt, Stephan: Improving sequential latent variable models with autoregressive flows (2022)
  5. Wan, Xiaoliang; Wei, Shuangqing: VAE-KRnet and its applications to variational Bayes (2022)
  6. Xia, Yingzhi; Zabaras, Nicholas: Bayesian multiscale deep generative model for the solution of high-dimensional inverse problems (2022)
  7. Zhu, Aiqing; Zhu, Beibei; Zhang, Jiawei; Tang, Yifa; Liu, Jian: VPNets: volume-preserving neural networks for learning source-free dynamics (2022)
  8. Celledoni, E.; Ehrhardt, M. J.; Etmann, C.; McLachlan, R. I.; Owren, B.; Schönlieb, C.-B.; Sherry, F.: Structure-preserving deep learning (2021)
  9. Isomura, Takuya; Toyoizumi, Taro: On the achievability of blind source separation for high-dimensional nonlinear source mixtures (2021)
  10. Kumar, Manoj; Weissenborn, Dirk; Kalchbrenner, Nal: Colorization Transformer (2021) arXiv
  11. Padmanabha, Govinda Anantha; Zabaras, Nicholas: Solving inverse problems using conditional invertible neural networks (2021)
  12. Wang, Ruhua; An, Senjian; Liu, Wanquan; Li, Ling: Fixed-point algorithms for inverse of residual rectifier neural networks (2021)
  13. Abbasnejad, M. Ehsan; Shi, Javen; van den Hengel, Anton; Liu, Lingqiao: GADE: a generative adversarial approach to density estimation and its applications (2020)
  14. Karnewar, Animesh; Wang, Oliver: MSG-GAN: Multi-Scale Gradients for Generative Adversarial Networks (2020) arXiv
  15. Arenz, Oleg; Zhong, Mingjun; Neumann, Gerhard: Trust-region variational inference with Gaussian mixture models (2020)
  16. Brehmer, Johann; Louppe, Gilles; Pavez, Juan; Cranmer, Kyle: Mining gold from implicit models to improve likelihood-free inference (2020)
  17. Gao, Chen; Chen, Yunpeng; Liu, Si; Tan, Zhenxiong; Yan, Shuicheng: AdversarialNAS: Adversarial Neural Architecture Search for GANs (2020) arXiv
  18. Jin, Pengzhan; Zhang, Zhen; Zhu, Aiqing; Tang, Yifa; Karniadakis, George Em: SympNets: intrinsic structure-preserving symplectic networks for identifying Hamiltonian systems (2020)
  19. Badger, Simon; Bullock, Joseph: Using neural networks for efficient evaluation of high multiplicity scattering amplitudes (2020) arXiv
  20. Wan, Xiaoliang; Wei, Shuangqing: Coupling the reduced-order model and the generative model for an importance sampling estimator (2020)
