ranger

ranger: A Fast Implementation of Random Forests for High Dimensional Data in C++ and R.

We introduce the C++ application and R package ranger. The software is a fast implementation of random forests for high-dimensional data. Ensembles of classification, regression, and survival trees are supported. We describe the implementation, provide examples, validate the package against a reference implementation, and compare runtime and memory usage with other implementations. The new software scales best with the number of features, samples, trees, and features tried for splitting. Finally, we show that ranger is the fastest and most memory-efficient implementation of random forests for analyzing data on the scale of a genome-wide association study.
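
To make the R interface concrete, the following minimal sketch (our own illustration, not an example taken from the paper) fits a classification forest on the built-in iris data, reads off the out-of-bag error, and computes impurity-based variable importance; the data set and parameter values are purely illustrative, while ranger(), predict() and importance() are the package's standard entry points.

    # Minimal ranger usage sketch (illustrative; not taken from the paper).
    library(ranger)

    set.seed(42)                          # reproducible tree growing

    rf <- ranger(
      Species ~ .,                        # predict Species from all remaining columns
      data       = iris,
      num.trees  = 500,                   # size of the ensemble
      mtry       = 2,                     # features tried at each split
      importance = "impurity"             # Gini impurity variable importance
    )

    rf$prediction.error                   # out-of-bag misclassification error
    importance(rf)                        # per-feature importance scores

    # Predictions on (new) data reuse the fitted forest.
    pred <- predict(rf, data = iris)
    table(observed = iris$Species, predicted = pred$predictions)

Regression and survival forests follow the same call pattern, with a numeric response or a survival::Surv() object in place of the factor.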


References in zbMATH (referenced in 24 articles)

Showing results 1 to 20 of 24, sorted by year (citations).


  1. Berk, Richard A.: Statistical learning from a regression perspective (2020)
  2. Boehmke, Brad; Greenwell, Brandon M.: Hands-on machine learning with R (2020)
  3. Bommert, Andrea; Sun, Xudong; Bischl, Bernd; Rahnenführer, Jörg; Lang, Michel: Benchmark for filter methods for feature selection in high-dimensional classification data (2020)
  4. Cerqueira, Vitor; Torgo, Luis; Mozetič, Igor: Evaluating time series forecasting models: an empirical study on performance estimation methods (2020)
  5. Genuer, Robin; Poggi, Jean-Michel: Random forests with R (2020)
  6. Hornung, Roman: Ordinal forests (2020)
  7. Ribeiro, Rita P.; Moniz, Nuno: Imbalanced regression and extreme value prediction (2020)
  8. Sage, Andrew J.; Genschel, Ulrike; Nettleton, Dan: Tree aggregation for random forest class probability estimation (2020)
  9. Putatunda, Sayan; Ubrangala, Dayananda; Rama, Kiran; Kondapalli, Ravi: DriveML: An R Package for Driverless Machine Learning (2020) arXiv
  10. Schmid, Matthias; Welchowski, Thomas; Wright, Marvin N.; Berger, Moritz: Discrete-time survival forests with Hellinger distance decision trees (2020)
  11. Tomita, Tyler M.; Browne, James; Shen, Cencheng; Chung, Jaewon; Patsolic, Jesse L.; Falk, Benjamin; Priebe, Carey E.; Yim, Jason; Burns, Randal; Maggioni, Mauro; Vogelstein, Joshua T.: Sparse projection oblique randomer forests (2020)
  12. Athey, Susan; Tibshirani, Julie; Wager, Stefan: Generalized random forests (2019)
  13. Cerqueira, Vitor; Torgo, Luís; Pinto, Fábio; Soares, Carlos: Arbitrage of forecasting experts (2019)
  14. Franzin, Alberto; Stützle, Thomas: Revisiting simulated annealing: a component-based analysis (2019)
  15. Lyubchich, Vyacheslav; Woodland, Ryan J.: Using isotope composition and other node attributes to predict edges in fish trophic networks (2019)
  16. Sellereite, Nikolai; Jullum, Martin: shapr: An R-package for explaining machine learning models with dependence-aware Shapley values (2019) not zbMATH
  17. Hediger, Simon; Michel, Loris; Näf, Jeffrey: On the Use of Random Forest for Two-Sample Testing (2019) arXiv
  18. Gosiewska, Alicja; Biecek, Przemyslaw: auditor: an R Package for Model-Agnostic Visual Validation and Diagnostic (2018) arXiv
  19. Janitza, Silke; Celik, Ender; Boulesteix, Anne-Laure: A computationally fast variable importance test for random forests for high-dimensional data (2018)
  20. Probst, Philipp; Boulesteix, Anne-Laure: To tune or not to tune the number of trees in random forest (2018)
