Statistical Learning with Sparsity: The Lasso and Generalizations (Chapman & Hall/CRC Monographs on Statistics and Applied Probability)


  • English
  • Paperback
  • 9780367738334
  • 18 December 2020
  • 367 pages

Trevor Hastie

"Trevor John Hastie (born 27 June 1953) is a South African and American statistician and computer scientist. He is currently serving as the John A. Overdeck Professor of Mathematical Sciences and Professor of Statistics at the Stanford University. Hastie is known for his contributions to applied statistics, especially in the field of machine learning, data mining, and bioinformatics. He has authored several popular books in statistical learning, including The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Hastie has been listed as an ISI Highly Cited Author in Mathematics by the ISI Web of Knowledge.

(Bron: Wikipedia. Beschikbaar onder de licentie Creative Commons Naamsvermelding/Gelijk delen.)"

Summary

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.

In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
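
The lasso and the coordinate descent algorithm mentioned above are developed in full in the book itself; purely as a rough sketch (not code from the book), the Python snippet below applies cyclic coordinate descent with soft-thresholding to the standard lasso criterion (1/(2n))·||y - Xb||² + λ·||b||₁, assuming the columns of X are standardized; the function names and the toy data are made up for illustration.

    import numpy as np

    def soft_threshold(z, gamma):
        # Soft-thresholding operator: S(z, gamma) = sign(z) * max(|z| - gamma, 0).
        return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

    def lasso_cd(X, y, lam, n_iter=100):
        # Cyclic coordinate descent for (1/(2n)) * ||y - X b||^2 + lam * ||b||_1.
        # Assumes X has standardized columns and y is centered (no intercept).
        n, p = X.shape
        b = np.zeros(p)
        col_sq = (X ** 2).sum(axis=0) / n   # equals 1 for standardized columns
        r = y - X @ b                       # current residual
        for _ in range(n_iter):
            for j in range(p):
                r = r + X[:, j] * b[j]      # partial residual: add back x_j * b_j
                rho = X[:, j] @ r / n
                b[j] = soft_threshold(rho, lam) / col_sq[j]
                r = r - X[:, j] * b[j]      # restore residual with the updated b_j
        return b

    # Toy usage: a sparse ground truth; most estimated coefficients end up exactly zero.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    beta_true = np.array([3.0, -2.0] + [0.0] * 8)
    y = X @ beta_true + 0.5 * rng.standard_normal(200)
    y = y - y.mean()
    print(np.round(lasso_cd(X, y, lam=0.1), 2))

The soft-thresholding step is what sets small coefficients exactly to zero, which is the sparsity property the description above refers to.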

Product specifications

Contents

Language: English
Binding: Paperback
Original release date: 18 December 2020
Number of pages: 367
Illustrations: No

Contributors

Main author: Trevor Hastie
Second author: Robert Tibshirani
Co-author: Martin Wainwright
Publisher: Chapman & Hall/CRC

Other characteristics

Product width: 156 mm
Product length: 234 mm
Textbook: No
Packaging width: 156 mm
Packaging height: 234 mm
Packaging length: 234 mm
Packaging weight: 640 g

EAN

9780367738334