selectiveInference

R package selectiveInference: Tools for Post-Selection Inference. New tools for post-selection inference, for use with forward stepwise regression, least angle regression, the lasso, and the many-means problem. The lasso functionality supports Gaussian, logistic, and Cox survival models.
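For orientation, a minimal sketch of how the package's main entry points are typically called. The simulated data, the lambda value, and the glmnet rescaling shown here are illustrative assumptions, not part of this entry:

```r
# Minimal sketch: forward stepwise and fixed-lambda lasso inference
# with the selectiveInference package (simulated data for illustration).
library(selectiveInference)
library(glmnet)

set.seed(1)
n <- 50; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- 2 * x[, 1] + rnorm(n)

# Forward stepwise selection, then post-selection inference
# for the variables entered along the path.
fsfit  <- fs(x, y)
out_fs <- fsInf(fsfit)
print(out_fs)

# Lasso at a fixed lambda: fit with glmnet, then test the selected
# coefficients with fixedLassoInf. glmnet penalizes by lambda/n, so the
# solution is extracted at s = lambda/n (a scaling assumption to check
# against the package documentation for your glmnet version).
lambda <- 2
gfit   <- glmnet(x, y, standardize = FALSE)
beta_hat  <- coef(gfit, x = x, y = y, s = lambda / n, exact = TRUE)[-1]
out_lasso <- fixedLassoInf(x, y, beta_hat, lambda)
print(out_lasso)
```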


References in zbMATH (referenced in 21 articles)

Showing results 1 to 20 of 21, sorted by year (citations).


  1. Law, Michael; Ritov, Ya’acov: Inference without compatibility: using exponential weighting for inference on a parameter of a linear model (2021)
  2. Panigrahi, Snigdha; Taylor, Jonathan; Weinstein, Asaf: Integrative methods for post-selection inference under convex constraints (2021)
  3. Kennedy, Christopher; Ward, Rachel: Greedy variance estimation for the LASSO (2020)
  4. Martin, Ryan; Tang, Yiqi: Empirical priors for prediction in sparse high-dimensional linear regression (2020)
  5. Piironen, Juho; Paasiniemi, Markus; Vehtari, Aki: Projective inference in high-dimensional problems: prediction and feature selection (2020)
  6. Tian, Xiaoying: Prediction error after model search (2020)
  7. Atchadé, Yves F.: Quasi-Bayesian estimation of large Gaussian graphical models (2019)
  8. De Micheaux, Pierre Lafaye; Liquet, Benoît; Sutton, Matthew: PLS for Big Data: a unified parallel algorithm for regularised group PLS (2019)
  9. Lee, Kyoungjae; Lee, Jaeyong; Lin, Lizhen: Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors (2019)
  10. Gueuning, Thomas; Claeskens, Gerda: A high-dimensional focused information criterion (2018)
  11. Homrighausen, Darren; McDonald, Daniel J.: A study on tuning parameter selection for the high-dimensional Lasso (2018)
  12. Javanmard, Adel; Montanari, Andrea: Debiasing the Lasso: optimal sample size for Gaussian designs (2018)
  13. Mikkelsen, Frederik Riis; Hansen, Niels Richard: Degrees of freedom for piecewise Lipschitz estimators (2018)
  14. Shah, Rajen D.; Bühlmann, Peter: Goodness-of-fit tests for high dimensional linear models (2018)
  15. Takahashi, Takashi; Kabashima, Yoshiyuki: A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements (2018)
  16. Wilms, I.; Croux, C.: An algorithm for the multivariate group Lasso with covariance estimation (2018)
  17. Boyer, Claire; De Castro, Yohann; Salmon, Joseph: Adapting to unknown noise level in sparse deconvolution (2017)
  18. Dezeure, Ruben; Bühlmann, Peter; Zhang, Cun-Hui: High-dimensional simultaneous inference with the bootstrap (2017)
  19. Reid, Stephen; Tibshirani, Robert; Friedman, Jerome: A study of error variance estimation in Lasso regression (2016)
  20. Sabourin, Jeremy A.; Valdar, William; Nobel, Andrew B.: A permutation approach for selecting the penalty parameter in penalized model selection (2015)
