tmg

Exact Hamiltonian Monte Carlo for Truncated Multivariate Gaussians. We present a Hamiltonian Monte Carlo algorithm to sample from multivariate Gaussian distributions in which the target space is constrained by linear and quadratic inequalities or products thereof. The Hamiltonian equations of motion can be integrated exactly and there are no parameters to tune. The algorithm mixes faster and is more efficient than Gibbs sampling. The runtime depends on the number and shape of the constraints but the algorithm is highly parallelizable. In many cases, we can exploit special structure in the covariance matrices of the untruncated Gaussian to further speed up the runtime. A simple extension of the algorithm permits sampling from distributions whose log-density is piecewise quadratic, as in the "Bayesian Lasso" model.
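To illustrate the core idea in the linear-constraint case: after whitening, the Hamiltonian trajectory of a standard Gaussian is a harmonic oscillation that can be followed analytically between constraint walls, with the velocity reflected specularly whenever a wall is hit. The sketch below is a minimal illustration under stated assumptions; the function name exact_hmc_tmg, the parameterization F x + g >= 0, and the travel time T are hypothetical choices for this example, not the tmg package's API, and quadratic constraints and general covariances (which the paper also handles) are omitted.

```python
import numpy as np

def exact_hmc_tmg(x0, F, g, n_samples, T=np.pi / 2, seed=None):
    """Sketch of exact HMC for a standard Gaussian N(0, I) truncated to
    {x : F @ x + g >= 0} (linear constraints only).  A general N(mu, Sigma)
    target would be whitened first; quadratic constraints would need the
    additional trigonometric root-finding described in the paper."""
    rng = np.random.default_rng(seed)
    x, d, samples, eps = x0.astype(float).copy(), x0.size, [], 1e-8
    for _ in range(n_samples):
        v = rng.standard_normal(d)          # fresh Gaussian momentum
        t_left = T                          # travel time of one trajectory
        while t_left > eps:
            # Along the exact (harmonic) trajectory x(t) = x cos t + v sin t,
            # constraint j evolves as u_j * cos(t + phi_j) + g_j.
            fx, fv = F @ x, F @ v
            u = np.sqrt(fx ** 2 + fv ** 2)
            phi = np.arctan2(-fv, fx)
            hit_t = np.full(F.shape[0], np.inf)
            reachable = u > np.abs(g)       # walls the trajectory can touch
            for j in np.where(reachable)[0]:
                acos = np.arccos(-g[j] / u[j])
                cands = np.array([acos - phi[j], -acos - phi[j]]) % (2 * np.pi)
                cands = cands[cands > eps]
                if cands.size:
                    hit_t[j] = cands.min()  # earliest positive hitting time
            j_hit = int(np.argmin(hit_t))
            dt = min(hit_t[j_hit], t_left)
            # advance the exact dynamics by dt (no discretization error)
            x, v = x * np.cos(dt) + v * np.sin(dt), v * np.cos(dt) - x * np.sin(dt)
            t_left -= dt
            if dt == hit_t[j_hit]:          # wall reached: reflect velocity
                f = F[j_hit]
                v = v - 2.0 * (f @ v) / (f @ f) * f
        samples.append(x.copy())
    return np.array(samples)

# Example: standard bivariate Gaussian restricted to the positive orthant.
if __name__ == "__main__":
    F = np.eye(2)                 # constraints x1 >= 0, x2 >= 0
    g = np.zeros(2)
    draws = exact_hmc_tmg(np.array([1.0, 1.0]), F, g, n_samples=1000, seed=0)
    print(draws.mean(axis=0))     # roughly sqrt(2/pi) ~ 0.80 per coordinate
```

Because the dynamics are integrated exactly, there is no step size or number of leapfrog steps to tune; the only free choice is the total travel time per trajectory, taken here as pi/2 following the paper's recommendation.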


References in zbMATH (referenced in 18 articles)

  1. Huang, Jingfang; Cao, Jian; Fang, Fuhui; Genton, Marc G.; Keyes, David E.; Turkiyyah, George: An O(N) algorithm for computing expectation of N-dimensional truncated multi-variate normal distribution. I: Fundamentals (2021)
  2. Schultheiss, Christoph; Renaux, Claude; Bühlmann, Peter: Multicarving for high-dimensional post-selection inference (2021)
  3. Bachoc, François; Helbert, Céline; Picheny, Victor: Gaussian process optimization with failures: classification and convergence proof (2020)
  4. López-Lopera, Andrés F.; Bachoc, François; Durrande, Nicolas; Rohmer, Jérémy; Idier, Déborah; Roustant, Olivier: Approximating Gaussian process emulators with linear inequality constraints and noisy observations via MC and MCMC (2020)
  5. Mulgrave, Jami J.; Ghosal, Subhashis: Bayesian inference in nonparanormal graphical models (2020)
  6. Nishimura, Akihiko; Dunson, David: Recycling intermediate steps to improve Hamiltonian Monte Carlo (2020)
  7. Ray, Pallavi; Pati, Debdeep; Bhattacharya, Anirban: Efficient Bayesian shape-restricted function estimation with constrained Gaussian process priors (2020)
  8. Bachoc, François; Lagnoux, Agnès; López-Lopera, Andrés F.: Maximum likelihood estimation for Gaussian processes under inequality constraints (2019)
  9. Benjamini, Yuval; Taylor, Jonathan; Irizarry, Rafael A.: Selection-corrected statistical inference for region detection with high-throughput assays (2019)
  10. Gunawan, D.; Tran, M.-N.; Suzuki, K.; Dick, J.; Kohn, R.: Computationally efficient Bayesian estimation of high-dimensional Archimedean copulas with discrete and mixed margins (2019)
  11. Bierkens, Joris; Bouchard-Côté, Alexandre; Doucet, Arnaud; Duncan, Andrew B.; Fearnhead, Paul; Lienart, Thibaut; Roberts, Gareth; Vollmer, Sebastian J.: Piecewise deterministic Markov processes for scalable Monte Carlo on restricted domains (2018)
  12. Jacobovic, Royi: On the relation between the true and sample correlations under Bayesian modelling of gene expression datasets (2018)
  13. López-Lopera, Andrés F.; Bachoc, François; Durrande, Nicolas; Roustant, Olivier: Finite-dimensional Gaussian approximation with linear inequality constraints (2018)
  14. Tian, Xiaoying; Taylor, Jonathan: Selective inference with a randomized response (2018)
  15. Canale, Antonio; Pagui, Euloge Clovis Kenne; Scarpa, Bruno: Bayesian modeling of university first-year students’ grades after placement test (2016)
  16. Ridgway, James: Computation of Gaussian orthant probabilities in high dimension (2016)
  17. Burda, Martin: Constrained Hamiltonian Monte Carlo in BEKK GARCH with targeting (2015)
  18. Pakman, Ari; Huggins, Jonathan; Smith, Carl; Paninski, Liam: Fast state-space methods for inferring dendritic synaptic connectivity (2014)