


Hastie, Trevor; Tibshirani, Robert; Wainwright, Martin - Statistical Learning with Sparsity

Statistical Learning with Sparsity: The Lasso and Generalizations





Availability: normally available in 20 days.
Due to Brexit-related supply issues, delivery delays are possible.


PRICE: 129.98 €
NICEPRICE: 123.48 €
DISCOUNT: 5%



This product qualifies for FREE SHIPPING
when you select the Corriere Veloce option at checkout.


Also payable with Carta della cultura giovani e del merito, 18App Bonus Cultura, and Carta del Docente.





Details

Genre: Book
Language: English
Publication: 08/2015
Edition: 1st edition





Publisher's Note

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of l1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.

In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
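The coordinate descent algorithm mentioned above can be sketched in a few lines: each coordinate of the lasso objective (1/2n)||y - Xb||² + λ||b||₁ has a closed-form minimizer given by soft-thresholding, and cycling through the coordinates converges to the lasso solution. This is a minimal illustrative sketch, not the book's own code; the function names and the choice of a fixed iteration count are assumptions.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the solution of the univariate lasso problem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta          # full residual, updated incrementally
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # correlation of x_j with the partial residual (coordinate j removed)
            rho = X[:, j] @ resid / n + col_sq[j] * beta[j]
            new_bj = soft_threshold(rho, lam) / col_sq[j]
            resid += X[:, j] * (beta[j] - new_bj)  # keep residual in sync
            beta[j] = new_bj
    return beta
```

On data generated from a sparse linear model, the fit typically sets the coefficients of irrelevant features exactly to zero while shrinking the active ones slightly toward zero, which is the behavior the description refers to.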




Contents

Introduction

The Lasso for Linear Models: Introduction; The Lasso Estimator; Cross-Validation and Inference; Computation of the Lasso Solution; Degrees of Freedom; Uniqueness of the Lasso Solutions; A Glimpse at the Theory; The Nonnegative Garrote; lq Penalties and Bayes Estimates; Some Perspective

Generalized Linear Models: Introduction; Logistic Regression; Multiclass Logistic Regression; Log-Linear Models and the Poisson GLM; Cox Proportional Hazards Models; Support Vector Machines; Computational Details and glmnet

Generalizations of the Lasso Penalty: Introduction; The Elastic Net; The Group Lasso; Sparse Additive Models and the Group Lasso; The Fused Lasso; Nonconvex Penalties

Optimization Methods: Introduction; Convex Optimality Conditions; Gradient Descent; Coordinate Descent; A Simulation Study; Least Angle Regression; Alternating Direction Method of Multipliers; Minorization-Maximization Algorithms; Biconvexity and Alternating Minimization; Screening Rules

Statistical Inference: The Bayesian Lasso; The Bootstrap; Post-Selection Inference for the Lasso; Inference via a Debiased Lasso; Other Proposals for Post-Selection Inference

Matrix Decompositions, Approximations, and Completion: Introduction; The Singular Value Decomposition; Missing Data and Matrix Completion; Reduced-Rank Regression; A General Matrix Regression Framework; Penalized Matrix Decomposition; Additive Matrix Decomposition

Sparse Multivariate Methods: Introduction; Sparse Principal Components Analysis; Sparse Canonical Correlation Analysis; Sparse Linear Discriminant Analysis; Sparse Clustering

Graphs and Model Selection: Introduction; Basics of Graphical Models; Graph Selection via Penalized Likelihood; Graph Selection via Conditional Inference; Graphical Models with Hidden Variables

Signal Approximation and Compressed Sensing: Introduction; Signals and Sparse Representations; Random Projection and Approximation; Equivalence between l0 and l1 Recovery

Theoretical Results for the Lasso: Introduction; Bounds on Lasso l2-error; Bounds on Prediction Error; Support Recovery in Linear Regression; Beyond the Basic Lasso

Bibliography; Author Index; Index

Bibliographic Notes and Exercises appear at the end of each chapter.




Authors

Trevor Hastie is the John A. Overdeck Professor of Statistics at Stanford University. Prior to joining Stanford University, Professor Hastie worked at AT&T Bell Laboratories, where he helped develop the statistical modeling environment popular in the R computing system. Professor Hastie is known for his research in applied statistics, particularly in the fields of data mining, bioinformatics, and machine learning. He has published five books and over 180 research articles in these areas. In 2014, he received the Emanuel and Carol Parzen Prize for Statistical Innovation. He earned a PhD from Stanford University.

Robert Tibshirani is a professor in the Departments of Statistics and Health Research and Policy at Stanford University. He has authored five books, co-authored three books, and published over 200 research articles. He has made important contributions to the analysis of complex datasets, including the lasso and significance analysis of microarrays (SAM). He also co-authored the first study that linked cell phone usage with car accidents, a widely cited article that has played a role in the introduction of legislation restricting the use of phones while driving. Professor Tibshirani received the prestigious COPSS Presidents' Award in 1996 and was elected to the National Academy of Sciences in 2012.

Martin Wainwright is a professor in the Department of Statistics and the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. Professor Wainwright is known for theoretical and methodological research at the interface between statistics and computation, with particular emphasis on high-dimensional statistics, machine learning, graphical models, and information theory. He has published over 80 papers and one book in these areas, received the COPSS Presidents' Award in 2014, and was a section lecturer at the International Congress of Mathematicians in 2014. He received his PhD in EECS from the Massachusetts Institute of Technology (MIT).










Other Information

ISBN: 9781498712163
Condition: New
Series: Chapman & Hall/CRC Monographs on Statistics and Applied Probability
Dimensions: 9.25 x 6.25 in, 1.80 lb
Format: Hardcover
Illustration Notes: 99 color images and 11 color tables
Pages: 367






