Tibshirani is a prolific author of scientific works on various topics in applied statistics, including statistical learning, data mining, statistical computing, and bioinformatics. He and his collaborators have authored about 250 scientific articles.
{{Hastie Tibshirani Friedman The Elements of Statistical Learning 2009}} will display: Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome H. (February 2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction (PDF). Graduate Texts in Statistics (2nd ed.). New York: Springer-Verlag. ISBN 978-0-387-84857-0.
T. Hastie, R. Tibshirani, M. Wainwright, Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press, 2015 [11] (available for free from the authors' website). Bradley Efron; Trevor Hastie (2016). Computer Age Statistical Inference. Cambridge University Press. ISBN 9781107149892.
In statistics, least-angle regression (LARS) is an algorithm for fitting linear regression models to high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani. [1] Suppose we expect a response variable to be determined by a linear combination of a subset of potential covariates.
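The plain LARS procedure of Efron, Hastie, Johnstone and Tibshirani can be sketched in a few dozen lines of numpy. This is an illustrative sketch only (names like `lars_fit` are my own, ties and degenerate designs are not handled; production code such as scikit-learn's `Lars` does much more); after all p covariates have entered, the final step lands exactly on the ordinary least-squares fit.

```python
import numpy as np

def lars_fit(X, y, steps=None):
    """Plain least-angle regression (no lasso modification) -- a sketch.

    Assumes centered y and centered, linearly independent columns of X.
    Returns the fitted vector mu and the order in which covariates entered.
    """
    n, p = X.shape
    steps = p if steps is None else steps
    mu = np.zeros(n)
    active = []
    for _ in range(steps):
        c = X.T @ (y - mu)                      # current "correlations"
        C = np.max(np.abs(c))
        # enter the most-correlated inactive covariate
        j_new = max((j for j in range(p) if j not in active),
                    key=lambda j: abs(c[j]))
        active.append(j_new)
        s = np.sign(c[active])
        XA = X[:, active] * s                   # sign-adjusted active columns
        G = XA.T @ XA
        Ginv1 = np.linalg.solve(G, np.ones(len(active)))
        AA = 1.0 / np.sqrt(Ginv1.sum())
        u = XA @ (AA * Ginv1)                   # equiangular direction
        if len(active) == p:
            gamma = C / AA                      # last step: land on the OLS fit
        else:
            a = X.T @ u
            inact = [j for j in range(p) if j not in active]
            cand = np.concatenate([(C - c[inact]) / (AA - a[inact]),
                                   (C + c[inact]) / (AA + a[inact])])
            gamma = cand[cand > 1e-12].min()    # smallest positive step
        mu = mu + gamma * u
    return mu, active
```

Each step moves the fit along the direction equiangular to the active covariates, exactly until an inactive covariate becomes equally correlated with the residual.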
In statistics and machine learning, lasso (least absolute shrinkage and selection operator; also Lasso, LASSO or L1 regularization) [1] is a regression analysis method that performs both variable selection and regularization in order to enhance the prediction accuracy and interpretability of the resulting statistical model.
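The simultaneous selection-and-shrinkage behavior is easy to see in a toy solver. Below is a didactic cyclic coordinate-descent sketch (my own minimal version, not any library's implementation) for the objective (1/2n)||y − Xb||² + λ||b||₁; each coordinate update is a soft-thresholding step, which is what zeroes out coefficients.

```python
import numpy as np

def soft_threshold(z, t):
    """The scalar lasso proximal step: shrink toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, iters=200):
    """Lasso via cyclic coordinate descent on
    (1/2n)||y - Xb||^2 + lam * ||b||_1 -- a didactic sketch."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ b
    for _ in range(iters):
        for j in range(p):
            r = r + X[:, j] * b[j]            # add back j's contribution
            rho = X[:, j] @ r / n             # univariate least-squares signal
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r = r - X[:, j] * b[j]            # remove the updated contribution
    return b
```

For λ at or above max|Xᵀy|/n every coefficient is thresholded to exactly zero, which is how the lasso performs variable selection rather than mere shrinkage.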
GAMs were originally developed by Trevor Hastie and Robert Tibshirani [1] to blend properties of generalized linear models with additive models. They can be interpreted as the discriminative generalization of the naive Bayes generative model. [2] The model relates a univariate response variable, Y, to some predictor variables, x_i.
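The classical way to fit such an additive model is Hastie and Tibshirani's backfitting algorithm: cycle over predictors, smoothing the partial residual against each one in turn. A toy numpy sketch follows, using a cubic-polynomial fit as a stand-in for a real scatterplot smoother (the polynomial smoother and the function names are illustrative assumptions, not the original implementation).

```python
import numpy as np

def poly_smoother(x, r, degree=3):
    """Stand-in smoother: fit r ~ polynomial in x, return fitted values."""
    return np.polyval(np.polyfit(x, r, degree), x)

def backfit_gam(X, y, iters=50):
    """Fit y ~ alpha + sum_j f_j(x_j) by backfitting -- a sketch."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))                 # f[j] holds f_j evaluated at the data
    for _ in range(iters):
        for j in range(p):
            # partial residual: remove every component except f_j
            partial = y - alpha - f.sum(axis=0) + f[j]
            f[j] = poly_smoother(X[:, j], partial)
            f[j] -= f[j].mean()          # center each f_j for identifiability
    return alpha, f
```

Centering each fitted component keeps the decomposition identifiable (the constant goes into alpha); with linear smoothers the cycle converges to the additive fit.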
She was elected as a Fellow of the American Statistical Association in 2020. [19] She was named to the 2022 class of Fellows of the Institute of Mathematical Statistics, for "substantial contributions to the field of statistical machine learning, with applications to biology; and for communicating the fundamental ideas in the field to a broad audience".
In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The original paper casts the AdaBoost algorithm into a statistical framework. [1]
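In that statistical framework, each boosting round is a Newton step on the binomial log-likelihood: compute working responses and weights from the current probabilities, then fit a weighted least-squares weak learner. The sketch below uses weighted linear regression as the weak learner (the paper allows any regression fit; trees are more common) and clips the working response as the paper suggests. Names and data here are illustrative assumptions.

```python
import numpy as np

def logitboost(X, y, rounds=30):
    """LogitBoost sketch: Newton steps on the binomial log-likelihood,
    with weighted least-squares linear fits as weak learners.
    y must take values in {0, 1}."""
    n, _ = X.shape
    Xb = np.column_stack([np.ones(n), X])     # add an intercept column
    F = np.zeros(n)                           # half log-odds, as in the paper
    learners = []
    for _ in range(rounds):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))    # current class probabilities
        w = np.clip(p * (1 - p), 1e-5, None)  # Newton weights
        z = np.clip((y - p) / w, -4.0, 4.0)   # clipped working response
        sw = np.sqrt(w)
        # weighted least-squares fit of z on the features
        coef, *_ = np.linalg.lstsq(Xb * sw[:, None], z * sw, rcond=None)
        learners.append(coef)
        F = F + 0.5 * (Xb @ coef)             # half-step update, per the paper
    return F, learners
```

Classification uses the sign of F; on linearly separable data the accumulated Newton steps quickly drive the training error down.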