



An Elementary Introduction to Statistical Learning Theory




Availability: normally available in 20 days
Due to Brexit-related supply problems, delivery delays are possible.


PRICE
121.95 €
NICEPRICE
115.85 €
DISCOUNT
5%



This product qualifies for FREE SHIPPING
when the Corriere Veloce (express courier) option is selected at checkout.


Also payable with the Carta della cultura giovani e del merito, Carta della Cultura, and Carta del Docente.





Details

Genre: Book
Language: English
Published: 07/2011





Synopsis

A thought-provoking look at statistical learning theory and its role in understanding human learning and inductive reasoning

A joint endeavor from leading researchers in the fields of philosophy and electrical engineering, An Elementary Introduction to Statistical Learning Theory is a comprehensive and accessible primer on the rapidly evolving fields of statistical pattern recognition and statistical learning theory. Explaining these areas at a level and in a way that is not often found in other books on the topic, the authors present the basic theory behind contemporary machine learning and uniquely utilize its foundations as a framework for philosophical thinking about inductive inference.

Promoting the fundamental goal of statistical learning, knowing what is achievable and what is not, this book demonstrates the value of a systematic methodology when used along with the needed techniques for evaluating the performance of a learning system. First, an introduction to machine learning is presented that includes brief discussions of applications such as image recognition, speech recognition, medical diagnostics, and statistical arbitrage. To enhance accessibility, two chapters on relevant aspects of probability theory are provided. Subsequent chapters feature coverage of topics such as the pattern recognition problem, optimal Bayes decision rule, the nearest neighbor rule, kernel rules, neural networks, support vector machines, and boosting.

Appendices throughout the book explore the relationship between the discussed material and related topics from mathematics, philosophy, psychology, and statistics, drawing insightful connections between problems in these areas and statistical learning theory. All chapters conclude with a summary section, a set of practice questions, and a reference section that supplies historical notes and additional resources for further study.

An Elementary Introduction to Statistical Learning Theory is an excellent book for courses on statistical learning theory, pattern recognition, and machine learning at the upper-undergraduate and graduate levels. It also serves as an introductory reference for researchers and practitioners in the fields of engineering, computer science, philosophy, and cognitive science who would like to further their knowledge of the topic.




Publisher's Note

This book offers a broad and accessible introduction to the rapidly evolving field of statistical learning theory. Harman and Kulkarni, based in Princeton University's philosophy and engineering departments respectively, collaborate to present the basic theory behind contemporary machine learning and uniquely suggest that it serves as an excellent framework for philosophical thinking about inductive inference. The authors focus on the basic rules for classifying objects and estimating values (given certain observations or measurements), and then go on to explore methods for learning rules for classification and estimation.

The book begins with an introduction to machine learning and its various applications, including image recognition, speech recognition, medical diagnostics, and statistical arbitrage, along with a chapter on the relevance of probability theory to statistical learning. Subsequent chapters cover topics such as the pattern recognition problem, the optimal Bayes decision rule, the nearest neighbor rule, kernel rules, neural networks, and multilayer networks. The book concludes with chapters devoted to topics not typically treated in introductory-level statistical learning books, including PAC learning, VC dimension, infinite VC dimension, and simplicity.

While serving as an introduction to statistical learning, the book also discusses the topic from a philosophical perspective. The authors recognize that a fundamental goal of statistical learning, knowing what is achievable and what is not, is also related to philosophical questions that arise in epistemology (e.g., What can we learn, and how can we learn it? What can we learn about other minds and the external world? What can we learn through induction?).

As a result, each chapter is accompanied by an appendix that draws connections between inductive reasoning and statistical learning (e.g., the chapter on the nearest neighbor rule is followed by an appendix that asks when the rule should be used and discusses how individuals apply its principles in everyday scenarios). Along with these appendices, each chapter also features a summary section, a set of practice questions, and a reference section that supplies readers with additional resources on the presented topic.
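To give a flavor of the classification rules the book analyzes, the nearest neighbor rule mentioned above can be sketched in a few lines of Python. This is a minimal illustrative sketch, not code from the book; the training points and labels are made up for the example.

```python
# Minimal sketch of the 1-nearest-neighbor rule: classify a new point
# with the label of its closest training example (squared Euclidean
# distance, so no square root is needed for comparison).

def nearest_neighbor(train, query):
    """train: list of ((x, y), label) pairs; query: an (x, y) point."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    # Pick the training example whose point is closest to the query.
    _, label = min(train, key=lambda ex: dist2(ex[0], query))
    return label

# Hypothetical labeled data: two "A" points near the origin, one "B" point.
train = [((0.0, 0.0), "A"), ((1.0, 1.0), "A"), ((4.0, 4.0), "B")]
print(nearest_neighbor(train, (0.5, 0.2)))  # closest to (0, 0) -> "A"
print(nearest_neighbor(train, (3.5, 3.9)))  # closest to (4, 4) -> "B"
```

The everyday-reasoning question the appendix raises maps directly onto this sketch: the rule is only as good as the assumption that nearby examples tend to share a label.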














Other Information

ISBN: 9780470641835
Condition: New
Series: Wiley Series in Probability and Statistics
Dimensions: 234 x 16 x 156 mm, 437 g
Format: Hardcover
Pages: 232

